Meta has begun testing its first in-house chip designed specifically for training artificial intelligence (AI) models. The company has deployed a small number of these custom processors to assess their performance and efficiency, and the results will determine whether it moves forward with large-scale production. The hardware belongs to Meta's Meta Training and Inference Accelerator (MTIA) family.
According to a recent Reuters report, the chips are being produced in collaboration with Taiwan Semiconductor Manufacturing Company (TSMC). Meta recently completed the tape-out, the final stage of the chip design process in which the finished design is sent to the manufacturer, and has begun deploying the chips on a small scale.
This initiative marks Meta's second foray into custom AI silicon. Last year, the company introduced MTIA accelerators tailored for AI inference. Until now, however, Meta lacked dedicated in-house hardware for training its Llama family of large language models (LLMs).
According to sources familiar with the matter, the overarching goal of developing in-house chips is to reduce the infrastructure costs of building and running advanced AI systems across a range of applications, including internal use, consumer products, and developer tools.
Earlier this year, Meta CEO Mark Zuckerberg announced that the expansion of the company's Mesa Data Center in Arizona had been completed and that the facility had begun operations. The newly developed training chips are expected to be deployed at this facility as well.
The report indicated that the new chips will initially power Meta's recommendation engines across its social media platforms, with plans to extend them to generative AI offerings, such as Meta AI, in the future.
In a January Facebook post, Zuckerberg said the company plans to invest up to $65 billion (roughly Rs. 5,61,908 crore) in AI-related projects in 2025. This investment covers not only the Mesa Data Center expansion but also additional hiring for its AI teams.