
Meta is testing its first in-house AI training chip, according to Reuters. The goal is to reduce the company's massive infrastructure costs and its dependence on NVIDIA. If the test succeeds, Meta plans to use the chip for training by 2026.

Meta has reportedly begun a small-scale deployment of the chip, a dedicated accelerator designed specifically for AI workloads, which can make it more power-efficient than general-purpose NVIDIA GPUs. The deployment follows the chip's first “tape-out,” a critical phase in silicon development in which a completed design is sent to a manufacturer for a test production run.

The chip is part of the Meta Training and Inference Accelerator (MTIA) series, the company’s family of custom in-house silicon focused on generative AI, recommendation systems, and advanced research.

Last year, the company began using an MTIA chip for inference, the process of running a trained model to produce predictions. Meta deployed the inference chip in the recommendation systems that power the Facebook and Instagram news feeds. Reuters reports that the training chip will serve the same purposes at first: the long-term plan for both chips is to start with recommendations and eventually extend to generative products such as the Meta AI chatbot.

Meta remains one of NVIDIA’s largest customers, having ordered billions of dollars’ worth of GPUs in 2022, but it is now shifting focus to in-house development. The move comes after the company previously scrapped an in-house inference chip that failed a small-scale test deployment similar to the one now underway for the training chip.
