High Bandwidth Memory (HBM) is the type of DRAM commonly used in data center GPUs such as NVIDIA's H200 and AMD's MI325X. High Bandwidth Flash (HBF) is a stack of flash chips behind an HBM interface. What ...
There are many ways we might build out the memory capacity and memory bandwidth of compute engines to drive AI and HPC workloads better than we have managed thus far. But, as we were ...
The next generation of high-bandwidth memory, HBM4, was widely expected to require hybrid bonding to unlock a 16-high memory ...
SK hynix and Taiwan's TSMC have established an 'AI Semiconductor Alliance'. SK hynix has emerged as a strong player in the high-bandwidth memory (HBM) market thanks to the generative artificial ...
Future AI memory chips could demand more power than entire industrial zones combined. 6TB of memory in one GPU sounds amazing until you see the power draw. HBM8 stacks are impressive in theory, but ...
TL;DR: SK hynix CEO Kwak Noh-Jung unveiled the "Full Stack AI Memory Creator" vision at the SK AI Summit 2025, emphasizing collaboration to overcome AI memory challenges. SK hynix aims to lead AI ...
Memory startup d-Matrix claims its 3D stacked memory will run up to 10x faster than HBM. d-Matrix's 3D digital in-memory compute (3DIMC) technology is the ...