TrendForce recently released a report stating that, driven by rising orders from NVIDIA and from cloud service providers (CSPs) developing their own chips, memory manufacturers are actively expanding TSV production lines to boost HBM capacity. HBM shipments are estimated to grow by 105% in 2024.

The report points out that mainstream demand shifted from HBM2e toward HBM3 in 2023, with the two estimated to account for about 50% and 39% of demand, respectively.
As accelerator chips using HBM3 ramp up in volume, market demand will shift markedly to HBM3 in 2024, surpassing HBM2e with an estimated share of 60%. Benefiting from HBM3's higher average selling price (ASP), this shift will drive significant growth in HBM revenue next year.

In terms of the competitive landscape, SK hynix's HBM3 products are currently ahead of those of other manufacturers, and it is the main HBM supplier for NVIDIA's server GPUs.

Samsung focuses on meeting orders from other cloud service providers. With these additional customer orders, its market-share gap with SK hynix is expected to narrow significantly this year; the two companies are estimated to hold similar HBM shares in 2023-2024, together accounting for about 95% of the HBM market.
Micron is focusing on developing HBM3e products this year. Compared with the two Korean manufacturers' plans to greatly expand production, Micron's market share is expected to decline slightly this year and next due to the crowding-out effect.
HBM refers to high-bandwidth memory, a high-performance DRAM based on 3D stacking technology initiated by Samsung Electronics, Advanced Micro Devices (AMD), and SK hynix. It suits applications with high memory-bandwidth requirements, such as graphics processors and network switching and forwarding devices (e.g., routers and switches).