
Amazon challenges Nvidia with new AI chips for its data centers
The company steps up investment in custom solutions to reduce costs and increase the efficiency of its cloud services
Isabella V, 13 November 2024

Amazon is preparing to launch its new AI chips, designed to challenge Nvidia’s dominant position in the data center semiconductor sector. The company aims to reduce costs and improve the efficiency of its cloud services, with a close focus on customizing its own infrastructure. Competition in the sector is growing, as major technology players develop in-house solutions to support the growth of AI.

Key points:

  • Amazon is developing custom chips, with the new "Trainium 2" ready for launch.
  • The initiative is led by Annapurna Labs, the start-up acquired by Amazon in 2015.
  • The goal is to reduce operating costs and offer an alternative to Nvidia, the market leader.
  • Amazon’s spending on AI infrastructure is set to grow considerably in the coming years.

Amazon is ready to launch its new chips designed for AI, part of a broader strategy to reduce its dependence on Nvidia, the undisputed leader of the AI semiconductor market. The push in this new direction is led by the company’s cloud division, which is investing heavily in its own technologies. At the center of this effort is Annapurna Labs, the Texas-based start-up acquired in 2015 for 350 million dollars, which has become the beating heart of Amazon’s chip design. The next step for Amazon is the launch of "Trainium 2", a new generation of chips intended for training AI models, which according to the company’s plans will see large-scale deployment starting next month. The chip is already being tested by some big names in the tech industry, such as Anthropic, Databricks and Deutsche Telekom, and Amazon hopes it can compete effectively with Nvidia’s graphics processing units, which currently dominate the market.

Amazon’s approach to creating custom chips is part of a broader vertical integration of its cloud infrastructure, aimed at offering customers a highly specialized solution that is less dependent on external suppliers. Unlike traditional chips designed as general-purpose tools, the chips developed by Amazon are optimized for specific workloads, such as those required for AI, delivering higher efficiency in many respects. "Trainium 2", in fact, should allow Amazon Web Services (AWS) to reduce data center operating costs, with savings that could reach up to 40% compared to competing solutions. These margins, which may seem modest at smaller scale, can make a real difference in large-scale operations, as in the case of large AWS customers who spend millions of dollars on cloud services.

Competition in the AI semiconductor sector is becoming increasingly fierce, with other cloud giants such as Microsoft and Google also developing their own solutions in an effort to reduce their dependence on Nvidia and improve the efficiency of their operations. However, as Amazon tries to make its way in this market, its cloud division must contend with Nvidia’s current supremacy: in the second quarter of 2024, Nvidia recorded sales of over 26 billion dollars for its data center chips, a figure roughly equal to AWS’s overall revenue for the same period. Nvidia’s position remains robust, with a market share exceeding 90%, and Amazon is well aware of how difficult it is to unseat such a colossus.

Amazon’s infrastructure, however, stands out because the company is trying to build the entire system from scratch, including silicon, servers and software. This "integrated" approach gives Amazon greater control over the production cycle, reducing long-term costs and optimizing energy management, a fundamental concern for data centers running compute-intensive workloads. In addition, the ability to develop ad hoc chips for specific needs allows Amazon to improve performance over more generic solutions such as Nvidia’s, which, although very powerful, are not always optimized for every type of application.

The AI chip market is in full turmoil, with innovation extending beyond the chips themselves to the entire technological ecosystem that supports them. Large companies such as Apple, OpenAI and Meta are investing enormous resources in proprietary chips, moves driven by the desire to lower production costs, improve margins and gain greater independence from external suppliers. This race to develop in-house chip infrastructure is quickly changing the dynamics of the market, with technology giants trying to carve out a place for themselves in a high-growth sector such as AI.

In this context, cloud customers are increasingly drawn to providers that can guarantee not only performance but also control over costs and technology. Despite the tough competition from Nvidia, Amazon is investing massively in this direction, with infrastructure spending expected to reach 75 billion dollars in 2024, a significant increase over the previous year.

The AI chip market is now a battlefield on which a good part of the global technological future will be decided, and Amazon is ready to fight for a leading role.
