In its latest updates, Meta shared new details on projects it is pursuing to make its data centers better suited to supporting artificial intelligence.
The owner of Facebook and Instagram said it designed a first-generation chip in 2020 as part of the Meta Training and Inference Accelerator (MTIA) program. The chip's objective is to improve the efficiency of the recommendation models used to serve ads and other content in news feeds.
According to sources, Meta was not planning to deploy its first in-house AI chip widely and was already working on a successor. The company's blog posts portray the first MTIA chip chiefly as a learning opportunity.
The company said it learned invaluable lessons from this initial program that it is incorporating into its future roadmap.
The first MTIA chip focused exclusively on an AI process known as inference, in which models that have already been trained on huge amounts of data make judgments about what content to show.
A Meta spokesperson declined to comment on deployment timelines or to elaborate on the company's plans to develop chips that could also train models.
Meta is currently working on a massive project to upgrade its AI infrastructure, after executives realized it lacked the hardware and software needed to support demand from product teams building AI-powered features.
Meta is also planning a large-scale launch of an in-house inference chip and has started work on a more ambitious chip capable of performing both training and inference.
Meta acknowledged that its first MTIA chip stumbled on high-complexity AI models, though it said the chip handled low- and medium-complexity models more efficiently than competitors' chips.
The MTIA chip consumes only 25 watts of power and uses an open-source chip architecture called RISC-V.
In addition to detailing how the chip works, Meta provided an update on its plans to redesign its data centers around more modern, AI-oriented networking and cooling systems.
The new design is expected to be 31% cheaper to build, and construction could proceed twice as quickly as for the company's current data centers.