Samsung has a crazy idea for AI that might just work: add a processor inside RAM


Samsung is framing its latest foray into the realm of processing-in-memory (PIM) and processing-near-memory (PNM) as a means to boost performance and lower the costs of running AI workloads.
The company has dubbed its latest proof-of-concept technology, unveiled at Hot Chips 2023, CXL-PNM: a 512GB card with up to 1.1TB/s of bandwidth, according to Serve the Home.
It would help address one of the biggest cost and energy sinks in AI computing: moving data between storage, memory, and the compute engines that process it.
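To see why data movement dominates, a rough back-of-envelope sketch helps. The figures below are illustrative assumptions, not Samsung's numbers, except for the 1.1TB/s bandwidth the article cites for the CXL-PNM card:

```python
# Illustrative sketch: why moving data between host and accelerator is costly.
# The model size and host link bandwidth are assumptions for illustration only;
# 1,100 GB/s is the CXL-PNM card bandwidth cited in the article.

MODEL_SIZE_GB = 100    # assumed working set that must be streamed each pass
HOST_LINK_GBPS = 64    # assumed host-to-accelerator link bandwidth (GB/s)
PNM_GBPS = 1100        # near-memory bandwidth cited for the CXL-PNM card (GB/s)

# Time to stream the whole working set once over each path, in seconds
host_pass = MODEL_SIZE_GB / HOST_LINK_GBPS
pnm_pass = MODEL_SIZE_GB / PNM_GBPS

print(f"Over host link:  {host_pass:.3f} s per pass")
print(f"Near memory:     {pnm_pass:.3f} s per pass")
print(f"Ratio:           {host_pass / pnm_pass:.1f}x")
```

Under these assumed numbers, keeping the computation next to the memory cuts per-pass streaming time by roughly 17x — which is the intuition behind PIM and PNM designs, quite apart from the energy saved by not driving data across the link at all.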
Samsung’s testing shows the card is 2.9 times more energy efficient than a single GPU, while a cluster of eight CXL-PNM cards is 4.4 times more energy efficient than eight GPUs. An appliance fitted with the cards also emits 2.8 times less CO2 and delivers 4.3 times better operational and environmental efficiency.
It relies on Compute Express Link (CXL) technology, an open standard for a high-speed processor-to-device and processor-to-memory interface that paves the way for more efficient use of memory and accelerators alongside processors.
The firm believes this card can offload workloads onto PIM or PNM modules, an approach it has also explored in its LPDDR-PIM. Samsung claims this will cut costs and power consumption, as well as extend battery life in devices by avoiding the over-provisioning of memory for bandwidth.
Samsung’s LPDDR-PIM boosts performance by 4.5 times by processing data inside the DRAM itself, reducing energy usage in the process. Despite an internal bandwidth of just 102.4GB/s, computing stays on the memory module, so there’s no need to transmit data back to the CPU.
Samsung has been exploring technologies like this for some years, although CXL-PNM is the closest it has come to date to incorporating the approach into what might soon become a viable product. It also follows the company's 2022 HBM-PIM prototype.
In collaboration with AMD, Samsung applied its HBM-PIM card to large-scale AI applications. The addition of HBM-PIM boosted performance by 2.6 times and energy efficiency by 2.7 times against existing GPU accelerators.
The race to build the next generation of components fit to handle the most demanding AI workloads is well and truly underway. Companies from IBM to d-Matrix are drawing up technologies that aim to oust the best GPUs.