Meta has unveiled the first phase of a new AI supercomputer. Once the AI Research SuperCluster (RSC) is fully built later this year, the company believes it will be the fastest AI supercomputer on the planet, capable of “performing nearly 5 exaflops of mixed-precision computing.”
The company says the RSC will help researchers develop better AI models that can learn from trillions of examples. Among other things, those models should power better augmented reality tools and “perfectly analyze text, images and video together,” according to Meta. Much of this work is in service of CEO Mark Zuckerberg’s vision for the metaverse, in which he says AI-powered apps and products will play a key role.
“We hope that RSC will help us build entirely new AI systems that can, for example, provide real-time voice translations to large groups of people, each speaking a different language, so that they can seamlessly collaborate on a research project or play an augmented reality game together,” wrote technical program manager Kevin Lee and software engineer Shubho Sengupta.
RSC currently has 760 Nvidia DGX A100 systems with a total of 6,080 GPUs. Meta believes the current iteration is already among the fastest AI supercomputers on the planet. Based on early benchmarks, the company claims that RSC can, compared to its older configuration, run computer vision workflows up to 20 times faster and run the NVIDIA Collective Communication Library (NCCL) over nine times faster.
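The numbers above can be sanity-checked with a quick back-of-envelope calculation. This sketch assumes the figures from Nvidia’s public specs, not anything Meta has stated: 8 A100 GPUs per DGX A100 system, and roughly 312 TFLOPS of dense mixed-precision (BF16/TF32 tensor-core) throughput per A100.

```python
# Back-of-envelope check of RSC's current scale.
# Assumptions (from Nvidia's public specs, not from Meta):
#   - 8 A100 GPUs per DGX A100 system
#   - ~312 dense mixed-precision TFLOPS per A100
DGX_SYSTEMS = 760
GPUS_PER_DGX = 8
TFLOPS_PER_A100 = 312

total_gpus = DGX_SYSTEMS * GPUS_PER_DGX
peak_exaflops = total_gpus * TFLOPS_PER_A100 * 1e12 / 1e18

print(total_gpus)                # 6080, matching the figure quoted for RSC
print(round(peak_exaflops, 2))  # ~1.9 exaflops of theoretical peak today
```

The roughly 1.9-exaflop theoretical peak for the current phase is consistent with the “nearly 5 exaflops” target for the finished machine, since the GPU count is slated to grow by more than 2.5 times.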
Meta says that RSC can also train large-scale natural language processing models three times faster. As such, AI models that determine whether “an action, sound or image is harmful or benign” (e.g. to eradicate hate speech) can be trained more quickly. According to the company, this research will help protect people on current services like Facebook and Instagram, as well as in the metaverse.
In addition to building the physical infrastructure and systems to run RSC, Meta said it needed to ensure it had security and privacy controls in place to protect the real-world training data it uses. The company says that by using real-world data from its own production systems rather than publicly available datasets, it can advance its research more effectively, for example in identifying harmful content.
This year, Meta plans to increase the number of GPUs in the RSC to 16,000, which it says will boost AI training performance by more than 2.5 times. The company, which began work on the project in early 2020, wants RSC to be able to train AI models on datasets as large as an exabyte (the equivalent of 36,000 years of high-quality video).
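Both expansion figures hold up to a rough arithmetic check. The sketch below assumes the same ~312 dense mixed-precision TFLOPS per A100 from Nvidia’s public specs, and an assumed “high-quality video” bitrate of about 7 Mbit/s; neither assumption comes from Meta’s announcement.

```python
# Sanity checks on the expansion figures, under stated assumptions.
GPUS = 16_000
TFLOPS_PER_GPU = 312              # assumed dense mixed-precision peak per A100

peak_exaflops = GPUS * TFLOPS_PER_GPU * 1e12 / 1e18
print(round(peak_exaflops, 1))    # ~5.0, consistent with "nearly 5 exaflops"

EXABYTE_BITS = 1e18 * 8           # one exabyte, in bits
BITRATE = 7e6                     # assumed bits/second for high-quality video
SECONDS_PER_YEAR = 365.25 * 24 * 3600

years_of_video = EXABYTE_BITS / BITRATE / SECONDS_PER_YEAR
print(round(years_of_video))      # roughly 36,000 years, matching the comparison
```

In other words, the “nearly 5 exaflops” headline figure appears to be a straightforward product of the planned GPU count and per-GPU peak, and the 36,000-year video comparison implies a bitrate in the single-digit-megabit range.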
“We hope that this step-function shift in compute power will allow us to not only create more accurate AI models for our existing services, but also enable completely new user experiences, especially in the metaverse,” wrote Lee and Sengupta.
Other exascale systems are in the works in the US. An exascale system at the Department of Energy’s Argonne National Laboratory has been delayed, while the El Capitan supercomputer, which will help maintain the country’s nuclear stockpile, is expected to top 2 exaflops when it arrives next year.