9th November 2023, 12:05 | #1 |
[M] Reviewer Join Date: May 2010 Location: Romania
Posts: 153,575
NVIDIA's Eos supercomputer just broke its own AI training benchmark record

Depending on the hardware you're using, training a large language model of any significant size can take weeks, months, or even years to complete. That's no way to do business: nobody has the electricity or the time to wait that long. On Wednesday, NVIDIA unveiled the newest iteration of its Eos supercomputer, powered by more than 10,000 H100 Tensor Core GPUs and capable of training a 175-billion-parameter GPT-3 model on 1 billion tokens in under four minutes. That's three times faster than the previous record on the MLPerf industry-standard benchmark, which NVIDIA itself set just six months ago.

https://www.engadget.com/nvidias-eos...6.html?src=rss
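For anyone curious what those headline numbers imply about raw throughput, here is a rough back-of-the-envelope sketch in Python. The GPU count of 10,752, the common "6 * parameters * tokens" training-compute estimate, and the H100 dense BF16 peak figure are assumptions added for illustration, not figures from the article, so treat the result as an order-of-magnitude sanity check only.

Code:
# Back-of-the-envelope check of the Eos MLPerf numbers.
# Assumptions (not from the article): ~6 * params * tokens FLOPs for
# transformer training, 10,752 H100 GPUs, a full 4-minute wall clock,
# and roughly 989 TFLOPS dense BF16 peak per H100.

params = 175e9            # GPT-3 175B parameters
tokens = 1e9              # tokens processed in the benchmark run
gpus = 10_752             # assumed Eos GPU count for this MLPerf round
wall_clock_s = 4 * 60     # "under four minutes", rounded up to 240 s
h100_peak_flops = 989e12  # assumed dense BF16 peak per GPU

total_flops = 6 * params * tokens              # ~1.05e21 FLOPs for the run
aggregate_flops_per_s = total_flops / wall_clock_s
per_gpu_flops_per_s = aggregate_flops_per_s / gpus
utilization = per_gpu_flops_per_s / h100_peak_flops

print(f"Total training compute : {total_flops:.2e} FLOPs")
print(f"Aggregate throughput   : {aggregate_flops_per_s:.2e} FLOP/s")
print(f"Per-GPU throughput     : {per_gpu_flops_per_s / 1e12:.0f} TFLOP/s")
print(f"Implied utilization    : {utilization:.0%} of assumed BF16 peak")

Under those assumptions the run works out to roughly 400 TFLOP/s per GPU, or around 40% of the assumed BF16 peak, which is broadly in line with what large-scale training runs of this kind tend to report.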