Nvidia Unveils New Superchips for Complex AI Workloads

Nvidia, the world’s leading supplier of chips for artificial intelligence applications, revealed in a press release that it is launching its next generation of superchips created to handle the “most complex generative AI workloads.”
The new superchip, called the GH200 Grace Hopper, is the first to feature HBM3e memory. According to Interesting Engineering, it was designed by combining Nvidia's Hopper platform, which houses the graphics processing unit (GPU), with the Grace platform, which houses the CPU.
Although GPUs are traditionally associated with high-end graphics processing in computers and gaming consoles, their superior computational abilities have since been repurposed for applications like cryptocurrency mining and training AI models.
Microsoft's Azure has used Nvidia's chips to build large computing systems catering to the needs of OpenAI. The company provided the infrastructure to distribute the workload of training on large datasets, and OpenAI indeed used it to develop the GPT models that power ChatGPT.
Now Nvidia wants to build similar large data processing systems of its own, and has even launched a platform called Nvidia MGX with which businesses can train and run their own AI models in-house.
According to Interesting Engineering, the company created its superchips using its proprietary NVLink chip-to-chip (C2C) interconnect technology, which gives the GPU full access to the CPU's memory and delivers 1.2 TB of fast memory.
The GH200's HBM3e memory is 50 percent faster than the HBM3 used for computations today. With a combined bandwidth of 10 TB/s, the GH200 platform can process AI models that are 3.5 times larger, 3 times faster than Nvidia's previous platforms.
GPUs are in very high demand, since they are necessary both to train AI models and to run them afterward. As AI becomes increasingly mainstream, demand for these chips is only expected to grow. It is therefore not surprising that even tech giants like Google and Amazon are developing their own chips and offerings in the field.
Despite the competition, with the release of the GH200, Nvidia is cementing its position as the field's leading technology provider.