Global AI Server Demand Surge Expected to Drive 2024 Market Value to US$187 Billion; Represents 65% of Server Market, Says TrendForce



TrendForce’s latest industry report on AI servers reveals that high demand for advanced AI servers from major CSPs and brand clients is expected to continue in 2024. Meanwhile, TSMC, SK hynix, Samsung, and Micron’s gradual production expansion has significantly eased shortages in 2Q24. Consequently, the lead time for NVIDIA’s flagship H100 solution has decreased from the previous 40–50 weeks to less than 16 weeks.

TrendForce estimates that AI server shipments in the second quarter will increase by nearly 20% QoQ, and has revised the annual shipment forecast up to 1.67 million units, marking 41.5% YoY growth.
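As a quick sanity check on the figures above, the 2023 shipment base implied by a 1.67 million-unit forecast and 41.5% YoY growth can be computed directly (illustrative arithmetic only, not a TrendForce figure):

```python
# Implied 2023 AI server shipment base, derived from the revised 2024
# forecast (1.67M units) and the stated 41.5% YoY growth rate.
shipments_2024_m = 1.67   # million units, 2024 forecast
yoy_growth = 0.415        # 41.5% YoY

shipments_2023_m = shipments_2024_m / (1 + yoy_growth)
print(f"Implied 2023 base: {shipments_2023_m:.2f} million units")  # ~1.18M
```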

TrendForce notes that this year, major CSPs continue to focus their budgets on procuring AI servers, which is crowding out the growth momentum of general servers. Compared to the high growth rate of AI servers, the annual growth rate of general server shipments is only 1.9%. The share of AI servers in total server shipments is expected to reach 12.2%, an increase of about 3.4 percentage points from 2023.

In terms of market value, AI servers are contributing significantly more to revenue growth than general servers. The market value of AI servers is projected to exceed US$187 billion in 2024, with a growth rate of 69%, accounting for 65% of the total server market value.
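The reported value, growth rate, and share together imply a prior-year AI server value and an overall server market size. A short back-of-envelope check (derived values, not figures stated in the report):

```python
# Values implied by the reported 2024 figures: US$187B AI server value,
# 69% YoY growth, and a 65% share of total server market value.
ai_value_2024_b = 187.0   # US$ billions
growth = 0.69             # 69% YoY
ai_share = 0.65           # 65% of total server market value

ai_value_2023_b = ai_value_2024_b / (1 + growth)   # implied 2023 AI server value
total_server_value_b = ai_value_2024_b / ai_share  # implied 2024 total server market

print(f"Implied 2023 AI server value: ~${ai_value_2023_b:.0f}B")
print(f"Implied 2024 total server market: ~${total_server_value_b:.0f}B")
```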

North American CSPs (e.g., AWS, Meta) are continuously expanding their proprietary ASICs, and Chinese companies like Alibaba, Baidu, and Huawei are actively expanding their own ASIC AI solutions. This is expected to increase the share of ASIC servers in the total AI server market to 26% in 2024, while mainstream GPU-equipped AI servers will account for about 71%.

In terms of AI chip suppliers for AI servers, NVIDIA holds the highest market share, approaching 90% for GPU-equipped AI servers, while AMD’s market share is only about 8%. However, when including all AI chips used in AI servers (GPU, ASIC, FPGA), NVIDIA’s market share this year is around 64%.

TrendForce observes that demand for advanced AI servers is expected to remain strong through 2025, especially with NVIDIA’s next-generation Blackwell (including GB200, B100/B200) set to replace the Hopper platform as the market mainstream. This will also drive demand for CoWoS and HBM. For NVIDIA’s B100, the chip size will be double that of the H100, consuming more CoWoS. The production capacity of major supplier TSMC’s CoWoS is estimated to reach 550–600K units by the end of 2025, with a growth rate approaching 80%.

The mainstream H100 in 2024 will be equipped with 80 GB of HBM3. By 2025, main chips like NVIDIA’s Blackwell Ultra or AMD’s MI350 are expected to be equipped with up to 288 GB of HBM3e, tripling the unit usage. The overall HBM supply is expected to double by 2025, given the strong ongoing demand in the AI server market.

TrendForce analysts will be presenting at FMS24 (the Future of Memory and Storage forum) from August 6th to 8th. The presentations will cover topics including HBM, memory (DRAM/NAND Flash), servers, AI servers, storage, and developments in technology and capacity. We will also have booth #956 available for meetings with analysts. If these topics interest you, we invite you to get in touch with us. Feel free to schedule an appointment or visit our booth!
