Vishal Sharma, Amazon’s Vice President of Artificial General Intelligence, stated that “there’s scarcely a part of the company that is unaffected by AI” during his appearance at the Mobile World Congress in Barcelona, Spain. He also downplayed the idea that open-source models could reduce the need for computing resources and sidestepped the question of whether European companies would alter their GenAI strategies due to ongoing geopolitical tensions with the US.

In an on-stage interview with TechCrunch’s Mike Butcher at the 4YFN startup conference, Sharma, a former AI entrepreneur who now heads Amazon’s AGI efforts, discussed Amazon’s widespread deployment of AI across the company, including AWS, robotics in its warehouses, and the Alexa consumer product. He noted that Amazon has approximately three-quarters of a million robots in its warehouses, performing tasks such as picking and self-navigation, and that Alexa is likely the most widely deployed home AI assistant in existence.

Sharma emphasized that every aspect of Amazon’s operations is impacted by generative AI, citing the company’s recent announcement of Nova, a new family of four text-generating models, as an example. He explained that these models are tested against public benchmarks, highlighting the diversity of use cases and the need for specialized models tailored to specific applications, such as video generation or quick and predictable responses in the case of Alexa.

However, Sharma expressed skepticism about the prospect of reducing compute resources through the use of smaller, open-source models, stating that “as you begin to implement it in different scenarios, you just need more and more and more intelligence.” He also discussed Amazon’s Bedrock product, which allows companies and startups to mix and match various foundational models, including those from China’s DeepSeek.

Amazon is currently building a large AI compute cluster using its Trainium 2 chips in partnership with Anthropic, in which it has invested $8 billion. Meanwhile, Elon Musk’s xAI has released its latest flagship AI model, Grok 3, which was trained using an enormous data center in Memphis containing around 200,000 GPUs.

When asked about the level of compute resources required for such projects, Sharma commented that “my personal opinion is that compute will be a part of the conversation for a very long time to come.” He did not seem concerned about the emergence of open-source models from China, stating that Amazon is a company that believes in choice and is open to adopting trends and technologies that benefit customers.

Mike Butcher, TechCrunch, and Vishal Sharma, Amazon. Image Credits: Mobile World Congress

Sharma was also asked whether he believed Amazon was caught off guard by the emergence of OpenAI’s ChatGPT in late 2022. He disagreed with this assessment, pointing out that Amazon has been working on AI for approximately 25 years and had already built language models with billions of parameters.

Regarding the recent controversy surrounding Trump and Zelensky, and the subsequent cooling of relations between the US and European nations, Sharma was asked if he thought European companies might seek alternative GenAI resources in the future. He acknowledged that this issue was outside his area of expertise but suggested that technical innovation often responds to incentives, implying that some companies might adjust their strategies in response to changing circumstances.
