AMD Releases MI300 Series AI Chips With More Than Twice the Memory of Nvidia's H100!

On Wednesday (December 7) local time, AMD launched its highly anticipated MI300 series, chips designed to handle a wide range of workloads in artificial intelligence applications.

The main focus of the MI300 series is large memory capacity and high bandwidth. According to AMD, the new chip packs more than 150 billion transistors, offers 2.4 times the memory of Nvidia's H100, and delivers 1.6 times the H100's memory bandwidth, further improving performance.
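
As a quick sanity check on the multiples above, the sketch below compares publicly listed spec figures; the MI300X and H100 numbers used here (192 GB at 5.3 TB/s versus 80 GB at roughly 3.35 TB/s) are assumptions drawn from vendor datasheets, not figures stated in this article.

```python
# Rough cross-check of the memory and bandwidth multiples quoted above.
# Spec values are assumptions from public vendor datasheets, not from the article.
MI300X_MEMORY_GB = 192          # AMD MI300X: HBM3 capacity
MI300X_BANDWIDTH_TBS = 5.3      # AMD MI300X: peak memory bandwidth (TB/s)
H100_MEMORY_GB = 80             # Nvidia H100 SXM: HBM3 capacity
H100_BANDWIDTH_TBS = 3.35       # Nvidia H100 SXM: peak memory bandwidth (TB/s)

memory_ratio = MI300X_MEMORY_GB / H100_MEMORY_GB              # 192 / 80 = 2.4
bandwidth_ratio = MI300X_BANDWIDTH_TBS / H100_BANDWIDTH_TBS   # ~1.58, close to the 1.6x cited

print(f"Memory capacity ratio:  {memory_ratio:.1f}x")
print(f"Memory bandwidth ratio: {bandwidth_ratio:.1f}x")
```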

The MI300 series includes two new AI data-center chips: one focused on generative AI applications and the other on supercomputers. The generative AI version, the MI300X, comes with advanced high-bandwidth memory for improved performance.

CEO Lisa Su said the new chip is comparable to the H100 at training artificial intelligence software and performs better at inference.

"I have to say this is really the most advanced product we've ever made.。"And it's the most advanced AI accelerator in the industry."。The company expects the new chips to bring the company more than $2 billion in sales in 2024.。

Su also updated AMD's forecast for the data center AI chip market. She now expects the market to grow by roughly 70% per year and surpass $400 billion by 2027. For comparison, IDC put the entire chip industry's revenue at $597 billion in 2022.
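
To see what that growth rate implies, the short sketch below compounds a market estimate forward to 2027. The roughly $45 billion starting point for 2023 is an assumption used only for illustration (a figure AMD has cited elsewhere); only the $400 billion-by-2027 target comes from the forecast above.

```python
# Back-of-the-envelope check of the growth forecast quoted above.
# The $45 billion 2023 starting point is an assumption for illustration;
# the $400 billion 2027 target is the figure cited in the article.
market_2023_bn = 45
market_2027_bn = 400
years = 2027 - 2023

# Implied compound annual growth rate: (end / start) ** (1 / years) - 1
cagr = (market_2027_bn / market_2023_bn) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.0%}")  # roughly 73%, consistent with ~70% per year
```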

"It's clear that demand is growing much faster than expected," said Su.。"

AMD also invited executives from Microsoft, Meta, and Oracle to the event, where they discussed how their companies are working with AMD. Microsoft said AMD's new chips would be available for evaluation by its cloud customers starting the same day.

The launch is arguably AMD's most important in five years and pits the company against Nvidia in the red-hot AI chip market. Such chips help develop AI models by churning through massive amounts of data, a task at which they outperform traditional computer processors.


AMD is ambitious, but challenging chip leader Nvidia will not be easy.

Analysts estimate that Nvidia currently holds at least 80% of the AI chip market, a market that also includes custom chips produced by companies such as Google and Microsoft. Nvidia does not break out its AI revenue, but the majority of the company's revenue comes from its data center division. In the first three quarters of this year, Nvidia reported data center revenue of $29.12 billion.

Those strong results have driven Nvidia's shares to record highs: the stock is up nearly 215% this year, leading the S&P 500. By contrast, AMD shares are up about 70% this year.

Not long ago, Nvidia released its latest AI chip, the H200. Based on Nvidia's Hopper architecture, the H200 is the first GPU equipped with HBM3e memory. With HBM3e, the H200 provides 141GB of memory at 4.8TB per second: nearly double the capacity and 2.4 times the bandwidth of the A100, and nearly double the capacity and 1.4 times the memory bandwidth of the H100. Nvidia describes it as "the most powerful GPU in the world". The chip is expected to start shipping in the second quarter of 2024.
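
As a rough cross-check of those comparisons, the sketch below works out the ratios from publicly listed figures; all three sets of specs (H200 at 141 GB and 4.8 TB/s, H100 SXM at 80 GB and 3.35 TB/s, A100 80GB at about 2.0 TB/s) are assumptions taken from vendor material rather than from this article.

```python
# Cross-check of the H200 comparisons quoted above.
# Spec values are assumptions from public vendor material, not from the article.
H200 = {"memory_gb": 141, "bandwidth_tbs": 4.8}
H100 = {"memory_gb": 80, "bandwidth_tbs": 3.35}
A100 = {"memory_gb": 80, "bandwidth_tbs": 2.0}

for name, other in (("A100", A100), ("H100", H100)):
    mem_ratio = H200["memory_gb"] / other["memory_gb"]
    bw_ratio = H200["bandwidth_tbs"] / other["bandwidth_tbs"]
    # vs A100: ~1.8x memory, ~2.4x bandwidth; vs H100: ~1.8x memory, ~1.4x bandwidth
    print(f"H200 vs {name}: {mem_ratio:.1f}x memory, {bw_ratio:.1f}x bandwidth")
```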

In addition, Intel, another competitor in the AI chip market, will host an event called "AI Everywhere" next week, where it will highlight new chips for data centers.

Goldman Sachs analyst Toshiya Hari said last month that AMD's 2024 sales forecast "supports the view that AMD is well positioned to participate in the large and growing next-generation AI computing market."

In addition, Raymond James analysts led by Srini Pajjuri said after previewing Wednesday's event: "In the long term, we see no reason why AMD can't have a 10-20% share of the $100 billion-plus AI chip market, which could mean double-digit revenue growth and margin expansion over the next 2-3 years."
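
Taken at face value, that scenario reduces to a simple range calculation, sketched below using only the figures quoted in the analysts' note; the math is illustrative, not a forecast.

```python
# Illustrative translation of the Raymond James scenario quoted above:
# a 10-20% share of a $100 billion-plus AI chip market.
market_bn = 100  # lower bound of the "$100 billion-plus" market size
for share in (0.10, 0.20):
    implied_revenue_bn = market_bn * share
    print(f"{share:.0%} share of a ${market_bn}B market ≈ ${implied_revenue_bn:.0f}B in AI chip revenue")
```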


Disclaimer: The views in this article are those of the original creator and do not represent the views or position of Hawk Insight. The content is for reference, communication, and learning purposes only and does not constitute investment advice. If it involves copyright issues, please contact us for deletion.