New AI Memory Standard Could Make Super-Fast Chips Cheaper to Build

SPHBM4 memory chip supports AI acceleration with faster data throughput. [TechGolly]

Key Points

  • A new memory standard, SPHBM4, aims to cut the cost of AI chips.
  • It uses only 512 connection pins, but each pin works four times faster.
  • This allows manufacturers to use cheaper materials instead of expensive silicon.
  • The technology is designed for large data centers, not consumer PCs.

The super-fast memory that powers the AI boom has a design problem: it is getting too complex and expensive to build. The next generation of High Bandwidth Memory (HBM4) is expected to have more than 2,000 tiny connection pins. That number pushes the limits of manufacturers’ capacity. To address this, a team of top engineers is developing a smarter, simpler alternative.

The new standard is called Standard Package High Bandwidth Memory 4 (SPHBM4). Instead of doubling down on the number of pins, it does the opposite. SPHBM4 will use only 512 pins, a quarter of HBM4’s.

The trick is that each remaining pin runs at four times the data rate, moving far more data per connection. This keeps overall bandwidth unchanged while simplifying the physical design.
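The tradeoff above is simple arithmetic: a quarter of the pins, each four times as fast, yields the same aggregate bandwidth. A minimal sketch of that calculation, using a placeholder per-pin data rate (the actual SPHBM4 signaling rate has not been published):

```python
# Compare aggregate bandwidth: many slow pins vs. fewer fast pins.
# BASE_RATE is an illustrative assumption, not a published HBM4/SPHBM4 figure.

def aggregate_bandwidth(pins: int, gbps_per_pin: float) -> float:
    """Total interface bandwidth in Gb/s."""
    return pins * gbps_per_pin

BASE_RATE = 8.0  # hypothetical per-pin rate (Gb/s) for the wide interface

hbm4_style = aggregate_bandwidth(2048, BASE_RATE)       # many pins, modest rate
sphbm4_style = aggregate_bandwidth(512, BASE_RATE * 4)  # quarter the pins, 4x rate

assert hbm4_style == sphbm4_style  # same total throughput either way
```

Whatever the real per-pin rate turns out to be, the equality holds as long as the speed-up exactly offsets the reduction in pin count.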

This design has a significant advantage: it doesn’t require the extremely expensive silicon interposers currently used to connect memory to the main chip. Instead, it can use lower-cost, more common materials known as “organic substrates.” This change in materials could significantly lower the cost of building the massive AI accelerators produced by companies like Nvidia and AMD.

However, don’t expect to see this new memory in your home computer anytime soon. HBM has always been a specialized product for large data centers operated by companies such as Google, Amazon, and Microsoft. For these “hyperscalers,” even a small reduction in cost per chip can save them billions of dollars across their entire operation.

The new SPHBM4 standard is designed specifically for these customers, not for the consumer market.

For this to become a reality, the big players in the memory world need to get on board. Fortunately, major suppliers such as Samsung, Micron, and SK Hynix are already members of the group developing the new standard. This signals that the industry is serious about finding a more cost-effective way to power the next wave of artificial intelligence.

EDITORIAL TEAM
Al Mahmud Al Mamun leads the TechGolly editorial team. He served as Editor-in-Chief of a world-leading professional research magazine. Rasel Hossain supports the team as Managing Editor. Our team incorporates technologists, researchers, and technology writers, with substantial expertise in Information Technology (IT), Artificial Intelligence (AI), and Embedded Technology.