Meta kicked off this week with a big announcement: It’s building an AI supercomputer—poised to become the world’s fastest of its kind, Meta claims.
Right now…the system’s processing power is on par with the world’s fifth-fastest supercomputer, but by July, Meta plans to grow its GPU count from 6,080 to 16,000.
Dubbed the AI Research SuperCluster (“RSC,” for short), the new system is powered in part by thousands of Nvidia’s high-performance GPUs. Meta engineers started work in 2020, designing its cooling, power, cabling, and networking from the ground up.
What it’s built for
Meta (Facebook at the time) began prioritizing AI years ago, starting with a dedicated AI research lab in 2013. The company’s metaverse focus goes hand in hand with its new project, since RSC has been designed for a slew of future-focused tasks.
- Think: possibly augmenting content moderation on all of Meta’s platforms, introducing new augmented reality experiences, improving speech recognition amid background noise, expanding the ability to process languages and accents, and more.
A note on data: Meta also designed RSC to be able to train on very, very large data sets—ones stacked with data the company says is equivalent to “36,000 years of high-quality video.” And unlike Meta’s past AI research projects, which “leveraged only open source and other publicly available data sets,” according to the company, RSC will make use of “user-generated data.”
- The company notes, though, that information will be encrypted until it’s time for the model to train, and then anonymized. The supercomputer is also isolated from the internet at large.
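Meta hasn’t published how that anonymization works, but the general idea—stripping or pseudonymizing identifiers before data reaches the training pipeline—can be sketched roughly. The function name, salt, and record fields below are all illustrative assumptions, not Meta’s actual system:

```python
import hashlib

# Assumption: a secret salt, held outside the training environment,
# so hashed IDs can't be rebuilt by simply hashing known usernames.
SALT = b"hypothetical-secret-salt"

def anonymize_record(record: dict) -> dict:
    """Replace the user identifier with a salted one-way hash,
    leaving the training content itself untouched."""
    out = dict(record)
    digest = hashlib.sha256(SALT + record["user_id"].encode()).hexdigest()
    out["user_id"] = digest[:16]  # pseudonymous ID, not reversible without the salt
    return out
```

Calling `anonymize_record({"user_id": "alice", "text": "hello"})` returns the same text with a consistent pseudonym in place of the username—enough for the model to learn, without raw identifiers in the training set.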
But, but, but: There’s an important distinction to be made here, between “AI supercomputers” and regular ones. A “world’s fastest” title for the former doesn’t mean quite the same thing as it would for the latter, since an AI supercomputer’s calculations don’t require the same level of accuracy as a “traditional” supercomputer’s.
- That means that by mid-2022, RSC will likely be able to carry out more calculations per second than some of the world’s fastest high-performance computers.
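The accuracy trade-off comes down to number formats: AI training commonly runs in 16-bit floating point, while traditional scientific supercomputing typically demands 64-bit. Halving (or quartering) the bits per number lets the same hardware push far more calculations per second, at the cost of precision neural networks can usually tolerate. A minimal NumPy illustration:

```python
import numpy as np

# Double precision (64-bit), the norm for traditional HPC workloads,
# keeps a tiny increment intact:
x = np.float64(1.0) + np.float64(1e-4)   # -> 1.0001

# Half precision (16-bit), common in AI training, rounds it away entirely,
# because the gap between adjacent float16 values near 1.0 is ~0.001:
y = np.float16(1.0) + np.float16(1e-4)   # -> 1.0
```

That lost increment would be unacceptable in, say, a climate simulation, but gradient-based training tends to shrug it off—which is why “fastest AI supercomputer” and “fastest supercomputer” are measured on different scales.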
Meta isn’t the only company leaning into AI supercomputers. Nvidia and Microsoft have built their own, though Microsoft’s, in partnership with OpenAI, has 10,000 GPUs compared to Meta’s planned 16,000.
Looking ahead: As tech giants build more powerful supercomputers, we’ll likely see them build larger language models—and build them more quickly. Meta says RSC will train large language models 3x faster than its previous systems, finishing in three weeks instead of nine. Expect to see the hot-button debate over the ethics of large language models heat up in the AI space and beyond.