Chinese chipmaker ChangXin Memory Technologies (CXMT) has ordered and received production and testing equipment from US and Japanese suppliers to assemble and produce high-bandwidth memory (HBM), a key component in artificial intelligence (AI) computing, Nikkei Asia reported, as Beijing seeks to blunt the impact of Washington's export restrictions and reduce its dependence on foreign technology.
Currently, HBM is not on the US export control list, but Chinese companies do not yet have the capacity to produce this type of component at large scale.
Based in Hefei, eastern China, CXMT is the country's leading maker of dynamic random-access memory chips. Since last year, the company has prioritized the development of technologies that stack DRAM chips vertically to replicate the architecture of HBM chips, sources said.
DRAM chips are a key component for everything from computers and smartphones to servers and connected cars, allowing processors to quickly access data during computations. Stacking them into HBM would expand communication channels, allowing for faster data transfers.
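The bandwidth advantage comes largely from the much wider interface that stacking enables: a single HBM stack exposes a 1024-bit bus, versus 64 bits for a conventional DDR memory channel. As a rough illustration (using publicly specified figures for HBM2E and DDR5-4800, chosen here only as examples):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Theoretical peak bandwidth in GB/s: bus width in bytes x transfers per second."""
    return bus_width_bits / 8 * data_rate_gtps

# One HBM2E stack: 1024-bit interface at 3.2 GT/s per pin -> ~410 GB/s
hbm2e = peak_bandwidth_gbs(1024, 3.2)

# One DDR5-4800 channel: 64-bit interface at 4.8 GT/s per pin -> ~38 GB/s
ddr5 = peak_bandwidth_gbs(64, 4.8)

print(f"HBM2E stack: {hbm2e:.1f} GB/s, DDR5-4800 channel: {ddr5:.1f} GB/s")
```

Even at a lower per-pin data rate, the stacked design delivers roughly an order of magnitude more bandwidth per device, which is why HBM is paired with AI accelerators.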
HBM is a promising area for computational acceleration and artificial intelligence applications. The Nvidia H100 chip, the computing power behind ChatGPT, combines a graphics processor with six HBM stacks to enable fast, human-like responses.
Founded in 2006, CXMT announced late last year that it had begun domestic production of LPDDR5 memory chips—a popular type of mobile DRAM suitable for high-end smartphones. According to the company, Chinese smartphone makers such as Xiaomi and Transsion have already completed the integration of CXMT's mobile DRAM chips.
This advancement puts CXMT behind only US memory chip leader Micron and South Korea's SK Hynix in terms of technology, and ahead of Taiwan's Nanya Technology. However, CXMT accounted for less than 1% of the global DRAM market in 2023, while the three dominant companies – Samsung, SK Hynix and Micron – controlled more than 97%.
Meanwhile, HBM production is dominated by the world's two largest DRAM chipmakers, SK Hynix and Samsung, which together controlled more than 92% of the global market as of 2023, according to TrendForce. Micron, with roughly 4% to 6% of the market, is also looking to expand its share.
Producing HBM requires not only the ability to produce high-quality DRAM, but also specialized chip packaging techniques to link those chips together. China still does not have a local chipmaker that can produce HBM chips to accelerate AI computing.
(According to Nikkei Asia)