🎤 Introducing "Advancing AI" in San Jose
AMD held its "Advancing AI" developer conference at the San Jose McEnery Convention Center on June 12, 2025.
The two-hour keynote by CEO Dr. Lisa Su brought together industry
leaders, press, and developers to showcase AMD’s ambitious journey in the AI
hardware space.
The stage was set amid vivid scenes of bustling developers,
live demos, and pre-event commentary emphasizing trust, openness, and the
collective progress of AI.
AMD Reinvents AI Hardware: MI350 & MI400 Series
At the heart of the keynote were AMD’s newly unveiled AI
accelerator chips:
- Instinct MI350 series: These chips mark AMD's refreshed entry into high-stakes AI, bolstering its earlier efforts in the data-center AI market.
- Instinct MI400 series: The more powerful next-gen line will debut in 2026 within an entirely new server architecture dubbed "Helios".
A standout, the MI355X, reportedly surpasses
Nvidia's Blackwell chips in key benchmark metrics, delivering up to 40%
more tokens per dollar than Nvidia's B200 GPUs.
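The tokens-per-dollar figure is an efficiency ratio: inference throughput divided by compute cost. A minimal sketch of how the metric works, using made-up throughput and pricing numbers (assumptions for illustration only, not AMD's or Nvidia's published figures):

```python
def tokens_per_dollar(tokens_per_second: float, cost_per_hour: float) -> float:
    """Tokens generated per dollar of compute spend."""
    seconds_per_hour = 3600
    return tokens_per_second * seconds_per_hour / cost_per_hour

# Hypothetical baseline: same throughput, but the challenger's effective
# cost is lower, yielding a 40% tokens-per-dollar advantage.
baseline = tokens_per_dollar(tokens_per_second=10_000, cost_per_hour=6.0)
challenger = tokens_per_dollar(tokens_per_second=10_000, cost_per_hour=6.0 / 1.4)
print(f"advantage: {challenger / baseline:.2f}x")  # 1.40x
```

The same 40% edge could equally come from higher throughput at equal cost, or any mix of the two; the metric deliberately collapses both into one number.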
Helios: AMD’s Open-Standard AI Rack
Lisa Su emphasized that AI is now about rack-scale
systems—not individual chips. To align with this, AMD previewed its Helios
server, featuring:
- 72 MI400 chips per rack (matching Nvidia's NVL72),
- integrated Zen 6-based Epyc CPUs,
- AMD's own Pensando NICs,
- and fully open networking standards, in contrast with Nvidia's proprietary NVLink.
Su said, “The future of AI is not going to be built by
any one company or in a closed ecosystem; it’s going to be shaped by open
collaboration across the industry.”
Open Source, Open Ecosystem
Throughout her keynote, Su stressed AMD’s commitment to an open
ecosystem:
- AMD is expanding its ROCm software platform to offer a viable, open-source alternative to Nvidia's CUDA.
- The Helios rack's networking architecture, including standards and specifications, will be openly shared, even with competitors.
- Su reiterated, "We have made outstanding progress building the foundational product, technology and customer relationships needed…placing AMD on a steep long-term growth trajectory."
Industry Support & Ecosystem Momentum
Lisa Su didn’t speak alone—industry heavy-hitters joined her
onstage, reflecting AMD’s growing hardware ecosystem:
- Sam Altman, OpenAI's CEO, revealed that OpenAI plans to adopt AMD's latest chips and collaborate on the MI450 design. He praised AMD's infrastructure ramp-up as "a crazy, crazy thing to watch".
- Executives from Meta, Oracle, xAI, and Crusoe showcased real-world AI use cases. Notably, Crusoe has committed to $400 million in AMD chip purchases.
Su added confidently that "seven of the top 10 AI companies
have deployed Instinct at scale," including Microsoft, Meta, Oracle, and OpenAI.
Market Outlook & Revenue Ambitions
Reflecting on market dynamics:
- Su forecast the AI accelerator market to surpass $500 billion by 2028, implying a CAGR of over 60% from 2023.
- AMD's AI revenue grew to over $5 billion in 2024, and the company projects it to scale into the tens of billions annually.
- Despite this bullish vision, AMD's stock dipped slightly (about 2% in one day, and about 4% year to date in 2025) amid broader investor caution.
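The ">60% CAGR" claim can be sanity-checked with back-of-the-envelope arithmetic. Assuming a 2023 AI-accelerator market of roughly $45 billion (an assumption for illustration; the keynote cites only the $500 billion 2028 figure):

```python
# Implied compound annual growth rate from an assumed 2023 base to the
# forecast 2028 market size: (target / base)^(1/years) - 1.
base_2023 = 45e9       # assumed 2023 market size (not from the keynote)
target_2028 = 500e9    # Su's 2028 forecast
years = 5

cagr = (target_2028 / base_2023) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # roughly 62% under these assumptions
```

Under this assumed base, the implied growth rate lands just above 60%, consistent with the keynote's framing.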
Analysts responded by boosting their price targets: Evercore
ISI to $144 (from $126), Roth Capital to $150, with Citi maintaining a neutral
$120 target.
The Competitive Landscape with Nvidia
This conference was clearly positioned as a challenge to
Nvidia:
- The MI355 and MI400 chips were compared favorably to Nvidia's Blackwell/B200 line, with performance and cost claims AMD framed as leaving Nvidia "in the dust", though Nvidia has not issued a direct response.
- AMD's rack-scale, open approach is an explicit counterpoint to Nvidia's proprietary NVLink, calling for a more collaborative industry model.
- The broader market signals (stock moves, forecasts, adoption) indicate AMD is making real inroads, though Nvidia continues to dominate.
Leadership & Vision from Su
Beyond tech announcements, Su used the stage to reinforce
her leadership ethos:
- She reiterated AMD technology's everyday impact, powering services such as Microsoft 365, Facebook, Zoom, Netflix, Uber, Salesforce, and SAP.
- She introduced "agentic AI": a coming wave of intelligent agents that will behave like billions of virtual users and therefore require massive AI compute across CPUs and GPUs in tandem.
- Again, Su emphasized "AI on every device," promising both cloud and edge innovation as integral to AMD's road map.
Throughout, Su’s tone was confident yet
measured—acknowledging AMD’s position as #2, while clearly plotting a path to
disrupt Nvidia’s dominance.
Final Takeaways
- Major launch: AMD introduced powerful new AI chips (MI350/400) and its upcoming Helios AI rack.
- Performance & cost challenge: The MI355X promises 40% better tokens-per-dollar efficiency vs. Nvidia's B200.
- Open strategy: Collaboration, open-source ROCm, and open rack standards mark a philosophical edge.
- Strong industry backing: Big-name support from OpenAI, Meta, Oracle, and Crusoe cements AMD's ecosystem.
- Growth projections: A vision of a $500 billion AI chip market by 2028, with revenue targets in the tens of billions.
- Cautious market response: A short-term stock pullback, but analysts raised targets, reflecting medium-term confidence.
- Competitive spark: AMD is challenging Nvidia directly, positioning itself as both complement and competitor.
In summary, Lisa Su’s “Advancing AI” keynote on June 12,
2025 delivered bold announcements and strategic positioning. AMD emerges
not simply as a chipmaker, but as a systems-level provider intent on open
innovation—offering not just silicon, but servers, networking, and software for
the AI era. As AI becomes central to both cloud and edge computing, Su has
charted a credible and comprehensive blueprint—one that demands attention from
partners, customers, and competitors alike.