AMD’s boilerplate descriptive text at the end of its press releases not only shows us how AMD sees itself, but also stands as a great statement of some very impressive achievements over half a century, with AMD only speeding up and strengthening its invention, innovation, integrity, independence and intelligence, ever sharpening its intellectual heft in the pursuit of excellence.

So, while the boilerplate text is usually at the end of a media release, let’s celebrate it for a moment.

“For more than 50 years AMD has driven innovation in high-performance computing, graphics and visualisation technologies. Billions of people, leading Fortune 500 businesses and cutting-edge scientific research institutions around the world rely on AMD technology daily to improve how they live, work and play. AMD employees are focused on building leadership high-performance and adaptive products that push the boundaries of what is possible,” with the invitation to get “more information about how AMD is enabling today and inspiring tomorrow” by visiting AMD everywhere you can online.

And it’s certainly true. It’s not hyperbole of any kind – all modern PlayStation and Xbox consoles have advanced, custom-designed AMD processors. AMD’s Ryzen processors for notebooks, PCs and workstations have been leading Intel in cores, performance and power usage for more than half a decade, and the latest versions now include an inbuilt, on-chip AI engine. Meanwhile, the Epyc server processors are so good that this time last year, the 4th-gen Zen 4 architecture of Epyc processors used half the power of Intel’s equivalents while offering FIVE TIMES the performance – and things improved even further when Zen 4c arrived six months ago.

Now, I have plenty more to say, and it’s all below, but I wanted you to see the AMD announcements visually if you wanted to. CNET has created an 8-minute supercut of the most important announcements. I have the full, uncut keynote below, but if you want to see it all quickly, press play on the video below – and either way, please scroll down and read on!

AMD’s MI series of AI accelerators – deployed by the dozens or hundreds within the data centres powering the generative AI systems that are so popular today – has been just as big a strength for AMD as the H100 has been for Nvidia. With AMD’s newest MI300X AI accelerator having launched earlier this week, with numbers and performance that seriously outclass Nvidia’s chip – and the successor Nvidia has already announced – AMD’s advancement of AI comes at just the right time to meet the supercharged demand for every AI service you can think of.

The sci-fi AI computers, minds, entities and more from my youth reading sci-fi novels are finally, actually coming true, and it is a tremendous thing to watch as a live participant, as we all are, in this time of change – we are living through it, not hearing stories about it from our grandparents. We are lucky to be alive now!

The magnitude of AMD’s advancing AI achievements as at December 2023 is striking – a year after the ChatGPT moment brought decades of work in AI into the bright and shining light of the zeitgeist, AMD’s hardware, software and users stand ready to help reshape reality in humanity’s eternal quest to generatively live the greatest life the universe’s only known (thus far) intelligent beings living in a modern civilisation can attain, and to go as far beyond that as sci-fi writers and futurists have dreamt.

Of course, up until now, that has all been done using the original intelligence – OI, or Organic Intelligence. We are all organic beings, and the vast majority of us are not yet Cybermen or Cyberwomen in any sci-fi imagined way, with brains directly augmented by body-implanted hardware. Most of our enhancements are still external – we use smartphones, tablets, computers, wearables, games consoles, workstations and more as external devices. Even VR headsets are still external; there are no Matrix-like plugs into the back of our brains yet, although Elon Musk’s Neuralink does show that future is coming, too!

After all, even spatial computing is external, and will one day give way to a direct brain-and-tech interface. But before we delve too far into a future that could easily be dystopian if the current talk about “responsible AI” isn’t followed up sincerely, the present is where we are at, and our choices today determine whether we move ever closer to a Terminator-like, Skynet-controlled dystopia, or the near-Utopia that humans could create, if we ever actually wanted to (and had a damn good AI partner by our sides who worked with us and didn’t want to become our owner or master).

Now, with all of that preamble before we get to the main stage, where AMD’s announcements are displayed and analysed, along with some exclusive video from the event that I was able to capture for everyone interested to watch, it’s worth taking a breath and expressing gratitude that this is the reality we live in (troubled though it is in many ways). Despite all the challenges the world faces, I still very strongly believe the best is yet to come, and these kinds of announcements are great examples that this is true.

Wow, I said a lot. Let’s continue with the AMD announcements!

AMD’s Advancing AI event in San Jose in early December 2023 saw AMD announce the availability of its new AMD Instinct MI300 family of accelerators and the AMD ROCm 6 open software ecosystem, delivering new advancements in generative AI – right when the world needs them and is obsessed by the ever-growing power and capability AI delivers to us as users.

Additionally, AMD announced the Ryzen 8040 Series processors, its newest mobile processors with a second-generation AI engine built in, compared to the first-gen AI engine built into the 7040 Series notebook processors released earlier this year, seriously extending AMD’s leadership in mobile offerings, with more power, efficiency and performance than the competition.

During the keynote, which is embedded in full below, AMD Chair and CEO, Dr. Lisa Su, showcased industry partners, including Dell, HPE, Lenovo, Microsoft, Oracle Cloud, and others, who are using AMD AI hardware to enable high-performance computing and generative AI applications across verticals, including:

  • Microsoft Azure ND MI300X v5 Virtual Machine (VM) series, optimised for AI workloads, which are powered by AMD Instinct MI300X accelerators.
  • El Capitan – the expected 2-exaflop supercomputer housed at Lawrence Livermore National Laboratory – which is powered by AMD Instinct MI300A APUs.
  • Dell’s PowerEdge XE9680 servers, which feature AMD Instinct MI300X accelerators and a Dell Validated Design with ROCm-powered AI frameworks.
  • HPE’s recently announced HPE Cray Supercomputing EX255a accelerator blade, also powered by AMD Instinct MI300A APUs.

Here is that keynote in full. It is just under two hours long, and I definitely recommend watching it in full. The article continues below, so please read on!

It’s important to note that Nvidia is not the only company out there with AI chips powering the ChatGPT-style revolution across so many companies. As you will have seen in the videos above, AMD’s MI300X processor outclasses the H100 on virtually every metric, and in response to a question I asked during one of the briefing sessions, AMD firmly believes the MI300X outclasses the as-yet unreleased Nvidia H200 chip, too.

AMD Chair and CEO Dr. Lisa Su said: “AI is the future of computing and AMD is uniquely positioned to power the end-to-end infrastructure that will define this AI era, from massive cloud installations to enterprise clusters and AI-enabled intelligent embedded devices and PCs.

“We are seeing very strong demand for our new Instinct MI300 GPUs, which are the highest performance accelerators in the world for generative AI. We are also building significant momentum for our data center AI solutions with the largest cloud companies, the industry’s top server providers, and the most innovative AI startups – who we are working closely with to rapidly bring Instinct MI300 solutions to market that will dramatically accelerate the pace of innovation across the entire AI ecosystem.”

I also captured, on video, a range of presentations that took place after the main keynote. I will share them in a subsequent article and will also add them to this article over the course of the weekend – apologies they are not here yet, but they are coming.

There’s also a fantastic developer contest to encourage developers to create as many AI-enhanced and AI-enabled apps as possible; full details are here.

So, here’s what else you’ll learn from the event:

Advancing Data Center AI from the Cloud to Enterprise Data Centers and Supercomputers

AMD was joined by multiple partners during the event to highlight the strong adoption and growing momentum for the AMD Instinct data center AI accelerators.

  • Microsoft detailed how it is deploying AMD Instinct MI300X accelerators to power the new Azure ND MI300X v5 Virtual Machine (VM) series optimised for AI workloads.
  • Meta shared that the company is adding AMD Instinct MI300X accelerators to its data centers in combination with ROCm 6 to power AI inferencing workloads and recognised the ROCm 6 optimisations AMD has done on the Llama 2 family of models.
  • Oracle unveiled plans to offer OCI bare metal compute solutions featuring AMD Instinct MI300X accelerators as well as plans to include AMD Instinct MI300X accelerators in their upcoming generative AI service.
  • The largest data center infrastructure providers announced plans to integrate AMD Instinct MI300 accelerators across their product portfolios. Dell announced the integration of AMD Instinct MI300X accelerators with their PowerEdge XE9680 server solution to deliver groundbreaking performance for generative AI workloads in a modular and scalable format for customers. HPE announced plans to bring AMD Instinct MI300 accelerators to its enterprise and HPC offerings. Lenovo shared plans to bring AMD Instinct MI300X accelerators to the Lenovo ThinkSystem platform to deliver AI solutions across industries including retail, manufacturing, financial services and healthcare. Supermicro announced plans to offer AMD Instinct MI300 GPUs across their AI solutions portfolio. Asus, Gigabyte, Ingrasys, Inventec, QCT, Wistron and Wiwynn also all plan to offer solutions powered by AMD Instinct MI300 accelerators.
  • Specialised AI cloud providers including Aligned, Arkon Energy, Cirrascale, Crusoe, Denvr Dataworks and Tensorwaves all plan to provide offerings that will expand access to AMD Instinct MI300X GPUs for developers and AI startups.

Bringing an Open, Proven and Ready AI Software Platform to Market

AMD highlighted significant progress expanding the software ecosystem supporting AMD Instinct data centre accelerators.

  • AMD unveiled the latest version of the open-source software stack for AMD Instinct GPUs, ROCm 6, which has been optimised for generative AI, particularly large language models. ROCm 6 boasts support for new data types, advanced graph and kernel optimisations, optimised libraries and state-of-the-art attention algorithms, which together with the MI300X deliver an approximately 8x improvement in overall text-generation latency on Llama 2 compared to ROCm 5 running on the MI250.
  • Databricks, Essential AI and Lamini, three AI startups building emerging models and AI solutions, joined AMD on stage to discuss how they’re leveraging AMD Instinct MI300X accelerators and the open ROCm 6 software stack to deliver differentiated AI solutions for enterprise customers.
  • OpenAI is adding support for AMD Instinct accelerators to Triton 3.0, providing out-of-the-box support for AMD accelerators that will allow developers to work at a higher level of abstraction on AMD hardware – see the sketch after this list for a sense of what that abstraction looks like.
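
To give a sense of what “a higher level of abstraction” means in practice, here is a minimal, generic Triton-style kernel sketch – the kind of Python-level GPU code Triton compiles down to whatever accelerator sits underneath, which is what out-of-the-box Instinct support would apply to. This is an illustrative example of standard Triton usage, not AMD-published code.

    import torch
    import triton
    import triton.language as tl

    @triton.jit
    def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
        # Each program instance handles one BLOCK_SIZE-wide slice of the vectors.
        pid = tl.program_id(axis=0)
        offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
        mask = offsets < n_elements
        x = tl.load(x_ptr + offsets, mask=mask)
        y = tl.load(y_ptr + offsets, mask=mask)
        tl.store(out_ptr + offsets, x + y, mask=mask)

    def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        out = torch.empty_like(x)
        n = out.numel()
        grid = lambda meta: (triton.cdiv(n, meta["BLOCK_SIZE"]),)
        add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
        return out

The kernel itself says nothing about the vendor underneath; Triton’s compiler handles the hardware-specific lowering, which is exactly why first-class Instinct support matters to developers.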

Read here for more information about AMD Instinct MI300 Series accelerators, ROCm 6 and the growing ecosystem of AMD-powered AI solutions.
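
To make the “open, proven and ready” framing a little more concrete: PyTorch’s ROCm builds expose the familiar torch.cuda API (HIP sits underneath), so code written for Nvidia GPUs generally runs on Instinct accelerators without changes. A minimal sketch, assuming a ROCm build of PyTorch is installed:

    import torch

    # On a ROCm build of PyTorch, the CUDA API is backed by HIP, so
    # torch.cuda.is_available() reports True on an AMD Instinct GPU.
    device = "cuda" if torch.cuda.is_available() else "cpu"

    # A trivial matmul to exercise the GPU path; real generative AI workloads
    # (e.g. Llama 2 inference) select the device in exactly the same way.
    x = torch.randn(4096, 4096, device=device)
    y = torch.randn(4096, 4096, device=device)
    z = x @ y
    print(z.device, z.shape)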

Continued Leadership in Advancing AI PCs

Windows notebook chips now get a second-generation AI brain!

With millions of AI PCs shipped to date, AMD announced its newest leadership mobile processors, the AMD Ryzen 8040 Series, which deliver robust AI compute capability.

AMD also launched Ryzen AI 1.0 Software, a software stack that enables developers to easily deploy apps that use pretrained models to add AI capabilities for Windows applications.
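
AMD’s published Ryzen AI flow routes quantised ONNX models to the on-chip NPU through ONNX Runtime. The sketch below is a rough illustration of that pattern, assuming the Ryzen AI ONNX Runtime package is installed; the model filename is hypothetical, and the Vitis AI execution provider name reflects the provider Ryzen AI’s documentation uses, with the CPU as a fallback.

    import numpy as np
    import onnxruntime as ort

    # Hypothetical quantised model exported for the Ryzen AI NPU.
    MODEL_PATH = "image_classifier_quantised.onnx"

    # The Vitis AI execution provider targets the NPU; the CPU provider is the
    # fallback if the NPU path is unavailable on this machine.
    session = ort.InferenceSession(
        MODEL_PATH,
        providers=["VitisAIExecutionProvider", "CPUExecutionProvider"],
    )

    input_name = session.get_inputs()[0].name
    dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
    outputs = session.run(None, {input_name: dummy})
    print(outputs[0].shape)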

Microsoft also joined to discuss how they are working closely with AMD on future AI experiences for Windows PCs.

The future? AMD gave us a sneak peek, too!

AMD also disclosed that the upcoming next-gen “Strix Point” CPUs, planned to launch in 2024, will include the AMD XDNA 2 architecture, designed to deliver more than a 3x increase in AI compute performance compared to the prior generation, enabling new generative AI experiences.

Once again, I have some exclusive videos and more to publish, and these will come over the course of the weekend. Until then you can read more about the announcements made during Advancing AI here.