Tesla emerges as surprising rival to AMD and Nvidia in quest to grab next-gen HBM4 memory for AI and supercomputers


- HBM4 chips poised to power Tesla’s advanced AI ambitions
- Dojo supercomputer to integrate Tesla’s high-performance HBM4 chips
- Samsung and SK Hynix compete for Tesla’s AI memory chip orders
As the high-bandwidth memory (HBM) market continues to grow, projected to reach $33 billion by 2027, the competition between Samsung and SK Hynix intensifies.
Tesla is fanning the flames: it has reportedly reached out to both Samsung and SK Hynix, two of South Korea’s largest memory chipmakers, seeking samples of their next-generation HBM4 chips.
Now, a report from the Korean Economic Daily claims Tesla plans to evaluate these samples for potential integration into its custom-built Dojo supercomputer, a critical system designed to power the company’s AI ambitions, including its self-driving vehicle technology.
Tesla’s ambitious AI and HBM4 plans
The Dojo supercomputer, driven by Tesla’s proprietary D1 AI chip, helps train the neural networks required for its Full Self-Driving (FSD) feature. This latest request suggests that Tesla is gearing up to replace older HBM2e chips with the more advanced HBM4, which offers significant improvements in speed, power efficiency, and overall performance. The company is also expected to incorporate HBM4 chips into its AI data centers and future self-driving cars.
Samsung and SK Hynix, long-time rivals in the memory chip market, are both preparing prototypes of HBM4 chips for Tesla. These companies are also aggressively developing customized HBM4 solutions for major U.S. tech companies like Microsoft, Meta, and Google.
According to industry sources, SK Hynix remains the current leader in the high-bandwidth memory (HBM) market, supplying HBM3e chips to Nvidia and holding a significant market share. However, Samsung is quickly closing the gap, forming partnerships with companies like Taiwan Semiconductor Manufacturing Company (TSMC) to produce key components for its HBM4 chips.
SK Hynix seems to have made progress with its HBM4 chip. The company claims that its solution delivers 1.4 times the bandwidth of HBM3e while consuming 30% less power. With a bandwidth expected to exceed 1.65 terabytes per second (TB/s) and reduced power consumption, the HBM4 chips offer the performance and efficiency needed to train massive AI models using Tesla’s Dojo supercomputer.
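To put those figures in perspective, here is a minimal back-of-the-envelope check in Python of the numbers quoted in the report. The 1.65 TB/s bandwidth and 1.4x speedup over HBM3e come from the article; the 2048-bit per-stack interface width is an assumption about HBM4 not stated in the source.

```python
# Quick sanity check of the quoted HBM4 figures (assumptions noted below).

HBM4_BANDWIDTH_TBPS = 1.65   # per-stack bandwidth quoted for SK Hynix's HBM4 sample
SPEEDUP_OVER_HBM3E = 1.4     # "1.4 times the bandwidth of HBM3e" per the report
INTERFACE_WIDTH_BITS = 2048  # assumed HBM4 per-stack bus width (not from the article)

# Implied HBM3e baseline bandwidth
hbm3e_tbps = HBM4_BANDWIDTH_TBPS / SPEEDUP_OVER_HBM3E
print(f"Implied HBM3e bandwidth: {hbm3e_tbps:.2f} TB/s per stack")  # ~1.18 TB/s

# Implied per-pin data rate under the assumed 2048-bit interface
pin_rate_gbps = HBM4_BANDWIDTH_TBPS * 1e12 * 8 / INTERFACE_WIDTH_BITS / 1e9
print(f"Implied per-pin data rate: {pin_rate_gbps:.1f} Gb/s")       # ~6.4 Gb/s
```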
The new HBM4 chips are also expected to feature a logic die at the base of the chip stack, which functions as the control unit for memory dies. This logic die design allows for faster data processing and better energy efficiency, making HBM4 an ideal fit for Tesla’s AI-driven applications.
Both companies are expected to accelerate their HBM4 development timelines, with SK Hynix aiming to deliver the chips to customers in late 2025. Samsung, on the other hand, is pushing its production plans with its advanced 4-nanometer (nm) foundry process, which could help it secure a competitive edge in the global HBM market.
Via TrendForce