

2024 global AI server shipments expected to grow 41.5% YoY, with market value reaching US$187 billion (2024.07.17)

2024.07.17

 

Global AI Server Demand Surge Expected to Drive 2024 Market Value to US$187 Billion; Represents 65% of Server Market, Says TrendForce


 

TrendForce’s latest industry report on AI servers reveals that high demand for advanced AI servers from major CSPs and brand clients is expected to continue in 2024. Meanwhile, TSMC, SK hynix, Samsung, and Micron’s gradual production expansion has significantly eased shortages in 2Q24. Consequently, the lead time for NVIDIA’s flagship H100 solution has decreased from the previous 40–50 weeks to less than 16 weeks. 

 


 

TrendForce estimates that AI server shipments in the second quarter will increase by nearly 20% QoQ, and has revised the annual shipment forecast up to 1.67 million units—marking a 41.5% YoY growth.
TrendForce notes that this year, major CSPs continue to focus their budgets on procuring AI servers, which is crowding out the growth momentum of general servers. Compared to the high growth rate of AI servers, the annual growth rate of general server shipments is only 1.9%. The share of AI servers in total server shipments is expected to reach 12.2%, an increase of about 3.4 percentage points from 2023.
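
As a quick back-of-envelope check, the shipment figures above can be combined to recover numbers the report does not state directly. The sketch below uses only the values quoted in this article; the variable names are illustrative, and the outputs are rough implied figures rather than published TrendForce estimates.

    # Rough figures implied by the shipment numbers quoted above (not TrendForce data).
    ai_shipments_2024 = 1.67e6   # units, 2024 AI server shipment forecast
    yoy_growth = 0.415           # 41.5% YoY shipment growth
    ai_share_2024 = 12.2         # AI servers' share of total server shipments, %
    share_gain_pp = 3.4          # percentage-point gain vs. 2023

    ai_shipments_2023 = ai_shipments_2024 / (1 + yoy_growth)          # ~1.18 million units
    total_shipments_2024 = ai_shipments_2024 / (ai_share_2024 / 100)  # ~13.7 million units
    ai_share_2023 = ai_share_2024 - share_gain_pp                     # ~8.8%

    print(f"Implied 2023 AI server shipments: {ai_shipments_2023 / 1e6:.2f}M units")
    print(f"Implied 2024 total server shipments: {total_shipments_2024 / 1e6:.1f}M units")
    print(f"Implied 2023 AI server share: {ai_share_2023:.1f}%")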



 

In terms of market value, AI servers are contributing significantly more to revenue growth than general servers. The market value of AI servers is projected to exceed $187 billion in 2024, with a growth rate of 69%, accounting for 65% of the total server market value.
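
The same kind of rough arithmetic applies to the value figures: a US$187 billion market growing 69% implies roughly US$111 billion in 2023, and a 65% share implies a total 2024 server market value of roughly US$288 billion. A minimal sketch using only those quoted numbers:

    # Rough figures implied by the market-value numbers quoted above.
    ai_value_2024 = 187e9    # USD, projected 2024 AI server market value
    value_growth = 0.69      # 69% growth rate
    value_share = 0.65       # AI servers' share of total server market value

    ai_value_2023 = ai_value_2024 / (1 + value_growth)   # ~US$111B
    total_value_2024 = ai_value_2024 / value_share       # ~US$288B

    print(f"Implied 2023 AI server market value: ${ai_value_2023 / 1e9:.0f}B")
    print(f"Implied 2024 total server market value: ${total_value_2024 / 1e9:.0f}B")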

 

 

North American CSPs (e.g., AWS, Meta) are continuously expanding their proprietary ASICs, and Chinese companies like Alibaba, Baidu, and Huawei are actively expanding their own ASIC AI solutions. This is expected to increase the share of ASIC servers in the total AI server market to 26% in 2024, while mainstream GPU-equipped AI servers will account for about 71%.

In terms of AI chip suppliers for AI servers, NVIDIA holds the highest market share—approaching 90% for GPU-equipped AI servers—while AMD’s market share is only about 8%. However, when including all AI chips used in AI servers (GPU, ASIC, FPGA), NVIDIA’s market share this year is around 64%. 
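
The two NVIDIA shares quoted above are roughly consistent with each other: about 90% of the roughly 71% GPU-equipped slice works out to around 64% of the whole. The sketch below treats server-unit shares as a stand-in for chip shares, which is a simplifying assumption rather than TrendForce's stated methodology:

    # Consistency check: treat server-unit shares as a proxy for chip shares
    # (a simplifying assumption, not how TrendForce derives the ~64% figure).
    gpu_server_share = 0.71    # GPU-equipped AI servers as a share of all AI servers
    nvidia_gpu_share = 0.90    # NVIDIA's approximate share within GPU-equipped AI servers

    nvidia_overall_share = gpu_server_share * nvidia_gpu_share
    print(f"Implied NVIDIA share across all AI server chips: {nvidia_overall_share:.0%}")  # ~64%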

TrendForce observes that demand for advanced AI servers is expected to remain strong through 2025, especially with NVIDIA’s next-generation Blackwell (including GB200, B100/B200) set to replace the Hopper platform as the market mainstream. This will also drive demand for CoWoS and HBM. For NVIDIA’s B100, the chip size will be double that of the H100, consuming more CoWoS. The production capacity of major supplier TSMC’s CoWoS is estimated to reach 550–600K units by the end of 2025, with a growth rate approaching 80%.
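
Working backward from that capacity estimate (a rough sketch using only the figures quoted above), an end-2025 capacity of 550–600K units reached at close to 80% growth implies roughly 305–335K units at the end of 2024:

    # Working backward from the end-2025 CoWoS capacity estimate quoted above.
    cowos_2025_low, cowos_2025_high = 550_000, 600_000   # units, estimated end-2025 capacity
    growth = 0.80                                        # growth rate approaching 80%

    implied_2024_low = cowos_2025_low / (1 + growth)     # ~306K units
    implied_2024_high = cowos_2025_high / (1 + growth)   # ~333K units
    print(f"Implied end-2024 CoWoS capacity: "
          f"{implied_2024_low / 1e3:.0f}K to {implied_2024_high / 1e3:.0f}K units")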

The mainstream H100 in 2024 will be equipped with 80 GB of HBM3. By 2025, flagship chips such as NVIDIA's Blackwell Ultra and AMD's MI350 are expected to carry up to 288 GB of HBM3e, tripling the unit usage. Overall HBM supply is expected to double by 2025 on continued strong demand from the AI server market.
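
As a simple ratio check on the two per-chip capacities quoted above (nothing beyond the figures already in the text), the step from 80 GB on the H100 to 288 GB on the 2025 flagships is about a 3.6x increase:

    # Simple ratio of the per-chip HBM capacities quoted above.
    h100_hbm_gb = 80              # HBM3 capacity on the mainstream 2024 H100
    flagship_2025_hbm_gb = 288    # up to 288 GB of HBM3e on Blackwell Ultra / MI350-class chips

    print(f"Per-chip HBM capacity increase: {flagship_2025_hbm_gb / h100_hbm_gb:.1f}x")  # ~3.6x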