
AMD Introduces Next-Generation AI Chips; OpenAI One of Key Clients

Prime Highlights

  • AMD unveiled its MI400 and MI350 series chips to challenge Nvidia’s leadership in AI hardware.
  • OpenAI has confirmed it will incorporate AMD’s MI400 chips into its systems.

Key Facts

  • MI350 GPUs provide up to 4× the performance of the earlier MI300X series.
  • AMD’s Helios AI server racks will house up to 72 MI400 chips.
  • Key partners include OpenAI, Meta, Microsoft, Oracle, and Elon Musk’s xAI.

Key Background

At its San Jose “Advancing AI” event, AMD made a bold push into the artificial intelligence hardware market, unveiling its next-generation AI accelerators, the MI400 and MI350 series. The chips are aimed at powering large-scale data centers and challenging Nvidia’s near-monopoly on AI chips. The MI400 series is due to reach the market in 2026, and AMD also announced its Helios server rack platform, which can house up to 72 MI400 chips. These will be AMD’s most powerful AI offerings to date, designed to handle compute-intensive workloads such as large language models.

A highlight of the announcement was OpenAI CEO Sam Altman’s public endorsement of the MI400 series. Altman confirmed that OpenAI, the developer of ChatGPT, will deploy the chips in its future infrastructure. He also hinted at a broader collaboration with AMD, noting that the next-generation MI450 is already in development. This backing positions AMD as a credible and fast-moving rival to Nvidia in the AI arena.

In addition to the MI400, AMD introduced the MI350 GPU family, which offers four times the performance of its predecessor, the MI300X. These GPUs are set to ship in late 2025 and will power demanding workloads for partners such as Meta, Microsoft, Oracle, and xAI. The MI350 is designed for AI inference workloads and will be paired with AMD’s latest server-class EPYC processors.

AMD’s strategy contrasts with Nvidia’s by emphasizing open standards and interoperability. The Helios system will not use proprietary interconnects such as Nvidia’s NVLink, giving customers greater flexibility. This openness is likely to attract partners seeking alternatives to Nvidia’s closed ecosystem. The company also acquired ZT Systems, a move that adds more than 1,000 engineers and strengthens its ability to deliver full-rack, turnkey AI infrastructure solutions.

Despite the aggressive launch, AMD’s stock fell 2%, reflecting investor concerns about execution and near-term competition with Nvidia. Still, AMD expects its AI-related revenue to grow to tens of billions of dollars by the end of the decade, up from roughly $5 billion in 2024, signaling a long-term vision backed by growing industry support.
