Meta Reportedly Testing Its First In-House Chipset Designed for AI Training



Meta has reportedly started testing its first in-house chipset designed to train artificial intelligence (AI) models. According to the report, the company has deployed a limited number of the processors to evaluate the performance and stability of the custom silicon, and based on how well the tests go, it will begin mass production of the hardware. These processors are said to be part of the Menlo Park-based tech giant's Meta Training and Inference Accelerator (MTIA) family of chipsets.

According to a report by Reuters, the tech giant developed the AI chipset in collaboration with chipmaker Taiwan Semiconductor Manufacturing Company (TSMC). Meta reportedly completed the tape-out, the final stage of the chip design process, recently and has now begun deploying the chips at a small scale.

This is not the company's first AI-centric chipset. Last year, it unveiled an inference accelerator, a processor designed for running AI models. However, Meta has not had an in-house hardware accelerator to train its Llama family of large language models (LLMs).

Citing anonymous sources within the company, the publication claimed that Meta's primary motivation for developing an in-house chipset is to bring down the infrastructure costs of deploying and running complex AI systems for its consumer-facing products and developer tools.

Notably, in January, Meta CEO Mark Zuckerberg announced that the company had completed the expansion of its Mesa data centre in Arizona, United States, and that the facility had begun operations. The new training chipsets are likely also being deployed at this site.

The report stated that the new chipset will first be used with Meta's recommendation engines, which power its various social media platforms, and that its use will later be extended to generative AI products such as Meta AI.

In January, Zuckerberg revealed in a Facebook post that the company plans to invest $65 billion (roughly Rs. 5,61,908 crore) in AI projects in 2025. The expansion of the Mesa data centre is part of this expenditure, which also includes hiring more employees for the company's AI teams.

