Micron has reaffirmed plans to begin high-volume shipments of its HBM3E memory in early 2024, while also revealing that NVIDIA is one of the primary customers for the new RAM. Meanwhile, the company stressed that the new product has been met with great interest across the industry, hinting that NVIDIA will likely not be the only customer to end up using Micron's HBM3E.

"The introduction of our HBM3E product offering has been met with strong customer interest and enthusiasm," said Sanjay Mehrotra, president and chief executive of Micron, at the company's earnings call.

Introducing HBM3E, which the company also calls HBM3 Gen2, ahead of its rivals Samsung and SK Hynix is a big deal for Micron, an underdog in the HBM market with roughly a 10% share. The company clearly pins high hopes on HBM3E, since being first to market should let it offer a premium product (to drive up its revenue and margins) ahead of its rivals (to win market share).

Typically, memory makers tend not to reveal the names of their customers, but this time around Micron emphasized that its HBM3E is part of its customers' roadmaps, and specifically named NVIDIA as a partner. Meanwhile, the only HBM3E-supporting product that NVIDIA has announced so far is its Grace Hopper GH200 compute platform, which features an H100 compute GPU and a Grace CPU.

"We have been working closely with our customers throughout the development process and are becoming a closely integrated partner in their AI roadmaps," said Mehrotra. "Micron HBM3E is currently in qualification for NVIDIA compute products, which will drive HBM3E-powered AI solutions."

Micron's 24 GB HBM3E modules are based on eight stacked 24 Gbit memory dies made using the company's 1β (1-beta) fabrication process. These modules can hit data rates as high as 9.2 GT/s, enabling a peak bandwidth of 1.2 TB/s per stack, a 44% increase over the fastest HBM3 modules available. Micron does not plan to stop at its 8-Hi, 24 Gbit-based HBM3E assemblies, either: it has announced plans to launch higher-capacity 36 GB 12-Hi HBM3E stacks in 2024, after it initiates mass production of the 8-Hi 24 GB stacks.
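The quoted figures follow directly from the data rate. A rough sketch of the arithmetic, assuming the standard 1024-bit HBM interface and a 6.4 GT/s data rate for the fastest standard HBM3 (two figures not stated in the article itself):

```python
# Peak-bandwidth arithmetic for an HBM stack:
# transfers/s × bits-per-transfer ÷ 8 bits-per-byte, converted to TB/s.
# The 1024-bit interface width is the standard HBM figure (an assumption,
# not stated in the article).

def hbm_peak_bandwidth_tbps(data_rate_gtps: float, bus_width_bits: int = 1024) -> float:
    """Peak bandwidth of one stack in TB/s."""
    return data_rate_gtps * bus_width_bits / 8 / 1000  # GB/s -> TB/s

hbm3e = hbm_peak_bandwidth_tbps(9.2)  # ~1.18 TB/s, i.e. the quoted "1.2 TB/s per stack"
hbm3 = hbm_peak_bandwidth_tbps(6.4)   # fastest standard HBM3, assumed 6.4 GT/s

# Capacity: eight 24 Gbit dies per 8-Hi stack, twelve for 12-Hi.
capacity_8hi_gb = 8 * 24 / 8    # 24 GB
capacity_12hi_gb = 12 * 24 / 8  # 36 GB

print(f"HBM3E: {hbm3e:.2f} TB/s, uplift over HBM3: {hbm3e / hbm3 - 1:.0%}")
print(f"8-Hi: {capacity_8hi_gb:.0f} GB, 12-Hi: {capacity_12hi_gb:.0f} GB")
```

The ~44% uplift in the article is simply the ratio of data rates (9.2 / 6.4), since the interface width is unchanged between HBM3 and HBM3E.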

"We expect to begin the production ramp of HBM3E in early calendar 2024 and to achieve meaningful revenues in fiscal 2024," added chief executive of Micron.

Source: Micron


7 Comments


  • charlesg - Friday, September 29, 2023 - link

    Typo? Second to last paragraph:

    "These modules can hit date rates as high as 9.2 GT/second"

    Should be "data rates"?
  • DougMcC - Tuesday, October 10, 2023 - link

    No it's correct, this is for applications in speed dating.
  • Kevin G - Friday, September 29, 2023 - link

    This could be useful for a Hopper refresh as Blackwell looks to be at least a full year away from shipping. Much like Hopper, I would expect Blackwell to be announced months in advance.
  • A5 - Monday, October 2, 2023 - link

    Could do a ~140GB version of H100, ship it at GTC in March right before announcing Blackwell for end of the year.
  • skinnyelephant - Saturday, September 30, 2023 - link

    For rtx 5000
  • sonny73n - Sunday, October 1, 2023 - link

    For rtx 50000

    They could've had hbm in consumer graphics long ago but they chose not to. They're also reducing memory bandwidths in new gpus. So NO, you won't get hbm for your gaming cards.
  • Unashamed_unoriginal_username_x86 - Monday, October 2, 2023 - link

    It's not coming to consumer but the bandwidth statement's pessimistic. They've already gotten the low hanging fruit from giant caches and without SRAM density improvements memory will need to pick up the slack.
    There's speculation going on about a 512 bit bus flagship or GDDR7 next gen, one of these is probably true and I don't think it's the former.
