Seoul: Samsung Electronics said on Wednesday that it has started shipping its next-generation high-bandwidth memory chips, known as HBM4, to customers, marking an important step in the fast-growing artificial intelligence chip market.
The company said it has begun mass production and commercial shipments of HBM4, which is designed to handle large amounts of data at very high speeds. These chips are mainly used in advanced AI processors and data center systems that train and run complex AI models.
Samsung said demand for high-performance memory is expected to remain strong through this year and into 2027, as global technology companies continue to invest heavily in AI infrastructure. The new HBM4 chips are built to improve speed and power efficiency compared with earlier generations, making them suitable for next-generation AI accelerators.
The announcement comes at a time of intense competition in the memory chip industry. Rivals such as Micron Technology and SK Hynix are also working on similar advanced memory products to meet growing AI demand. Industry analysts say that having multiple suppliers could help large chip designers secure stable supplies and avoid shortages.
Major AI chipmakers, including Nvidia, rely on high-bandwidth memory to power their graphics processing units used in data centers around the world. As AI applications expand in areas such as cloud computing, automation, and digital services, the need for faster and more efficient memory continues to rise.
Investors responded positively to the news, seeing it as a sign that Samsung is strengthening its position in the high-end memory market. The company has been focusing on advanced chips as part of its broader strategy to compete more aggressively in the AI semiconductor space.
With global spending on AI technology increasing, Samsung’s move to ship HBM4 chips commercially is seen as an important milestone in the race to supply the core components behind the AI boom.