
AI Data Center Compute, 2024–2026: Market Size, Concentration, and the Capex Engine

This article analyzes the rapid growth of the AI data center chip market, projected to reach $207B in 2025 and $286B by 2030, driven by hyperscaler investments and surging demand for accelerators. It details vendor concentration, regulatory headwinds, and the evolving landscape shaped by export controls and infrastructure constraints.

Market Definition and Scope

AI data center chips comprise GPUs and AI accelerators sold into data center AI workloads. Omdia estimates USD 123B of such revenue in 2024 and projects USD 207B in 2025, with the category reaching USD 286B by 2030. These figures cover silicon revenue only, not full-system value.

By contrast, TrendForce tracks AI server system value and units. It projects 2024 AI server value above USD 187B and shipments of about 1.67M units. Chips therefore represent a substantial, but not total, share of AI system value: dividing the two estimates puts chips at roughly two-thirds of 2024 system value.
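That two-thirds alignment is a simple quotient of the two headline figures cited above (Omdia's chip revenue over TrendForce's system value), as a quick check shows:

```python
# Back-of-envelope alignment of the two 2024 estimates (USD billions).
chip_revenue_2024 = 123.0    # Omdia: GPUs and AI accelerators
system_value_2024 = 187.0    # TrendForce: AI server market value

chip_share = chip_revenue_2024 / system_value_2024
print(f"Chips as share of 2024 system value: {chip_share:.0%}")  # → 66%
```

The remaining roughly one-third of system value covers CPUs, memory, storage, networking, power, and chassis content.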

Therefore, the two lenses converge: one on silicon, one on systems. TrendForce’s unit view adds mix and price context, while Omdia’s chip view highlights the silicon pool supporting those systems. Omdia notes “$123bn in GPUs and AI accelerators shipped in 2024, $207bn is expected in 2025.” TrendForce adds that 2024 AI servers represent roughly 65% of the total server market value.

Market Size and Growth Trajectory

In 2024, AI data center chip revenue reached about USD 123B. For 2025, Omdia forecasts USD 207B, implying roughly 68% YoY growth. Looking further out, Omdia projects USD 286B by 2030, while indicating growth could moderate as designs diversify.
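The moderation Omdia flags is visible in the arithmetic: the 2025 jump is steep, while the path from 2025 to 2030 implies a far lower compound rate (assuming a straight compound trajectory, which Omdia does not specify):

```python
# Omdia figures, USD billions.
rev_2024, rev_2025, rev_2030 = 123.0, 207.0, 286.0

yoy_2025 = rev_2025 / rev_2024 - 1                      # 2024 -> 2025 growth
cagr_2025_2030 = (rev_2030 / rev_2025) ** (1 / 5) - 1   # implied compound rate

print(f"2025 YoY growth: {yoy_2025:.1%}")        # ≈ 68.3%
print(f"2025-2030 CAGR:  {cagr_2025_2030:.1%}")  # ≈ 6.7%
```

A deceleration from ~68% annual growth to a single-digit implied CAGR is consistent with Omdia's view that custom ASICs and design diversification will temper the expansion.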

Meanwhile, AI server market value is set to surpass USD 187B in 2024. That is up about 69% YoY, according to TrendForce. Unit shipments are projected at approximately 1.67M, up 41.5% YoY. Consequently, the implied 2024 average selling price is about USD 112k per AI server. Unit growth trails revenue growth, which suggests stronger content per box and higher system ASPs.
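Both the ~USD 112k implied ASP and the content-per-box inference fall out of TrendForce's value and unit figures directly:

```python
# TrendForce 2024 figures.
market_value = 187e9   # USD, AI server market value
units = 1.67e6         # AI server unit shipments

implied_asp = market_value / units
print(f"Implied ASP per AI server: ${implied_asp:,.0f}")  # ≈ $111,976

# Revenue up ~69% YoY on units up ~41.5% YoY implies ASPs rose roughly:
asp_growth = (1 + 0.69) / (1 + 0.415) - 1
print(f"Implied ASP growth YoY: {asp_growth:.1%}")  # ≈ 19.4%
```

The ~19% implied ASP increase is what the text describes as stronger content per box: more accelerators, HBM, and interconnect per system.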

Moreover, the short-cycle demand tone remains firm. Dell’Oro reported accelerator revenue growth of 130% YoY in 3Q24. This aligns with rapid deployments for both training and inference. As a result, 2025 has begun with a high base and strong capacity adds underway.

Market Segmentation and Vendor Concentration

Vendor concentration remains elevated in 2024. TrendForce indicates NVIDIA approaches about 90% share of GPU-equipped AI servers, with AMD near 8%. Including all AI chips within AI servers (GPU/ASIC/FPGA), NVIDIA’s 2024 share is around 64% globally.

This share picture is reinforced by revenue scale. NVIDIA’s Data Center revenue reached USD 115.2B in fiscal 2025, up 142% YoY. Furthermore, NVIDIA reported fiscal Q2 2026 Data Center revenue of USD 41.1B, up 5% sequentially. These figures demonstrate continued momentum as new platforms ramp.

By contrast, AMD’s Data Center segment revenue was USD 4.3B in 3Q25, up 22% YoY. AMD also outlined long-term CAGRs at its Analyst Day, including a data center revenue CAGR above 60% and an AI data center revenue CAGR above 80% over 3–5 years. However, those figures are company projections rather than external forecasts.

Intel’s Data Center and AI (DCAI) segment shows mixed dynamics. In Q2 2025, server volume rose 13% YoY, while server ASPs fell 8% YoY. Year-to-date 2025, Intel recorded USD 361M of Gaudi accelerator inventory-related charges. Nevertheless, DCAI revenue was up USD 432M year-to-date versus the prior year. Overall, this indicates competitive intensity and pricing actions amid portfolio transitions.

Demand Drivers: Hyperscaler Investments

US hyperscalers continue to anchor demand. Dell’Oro estimates US hyperscalers are set to deploy more than five million training-capable accelerators in 2024. Additionally, accelerator revenue growth of 130% YoY in 3Q24 underscores the strength of the buildout.

Capex plans reinforce this trajectory into 2025. Amazon plans around USD 100B of capital expenditures in 2025, up from roughly USD 83B in 2024. The company cited AI as a primary driver of the increase. Alphabet now expects USD 91–93B of 2025 capex, also pointing to AI infrastructure needs. Moreover, Alphabet highlighted higher depreciation from technical infrastructure, with Q3 depreciation up 41% YoY. As a result, capex and depreciation lines both signal sustained investment in compute, networking, and facilities.

These budgets are likely to support both training clusters and expanding inference capacity. Consequently, they underpin demand for accelerators, CPUs, HBM, and high-speed interconnects. Near term, this investment mix supports elevated server ASPs and content per rack.

Revenue Performance: Major Vendors

NVIDIA’s Data Center segment delivered USD 115.2B in fiscal 2025, up 142% YoY. The company cited demand across training and inference. Fiscal Q4 2025 Data Center revenue reached USD 35.6B. Subsequently, fiscal Q2 2026 Data Center revenue came in at USD 41.1B, up 5% QoQ. Notably, management added that Blackwell Data Center revenue grew 17% sequentially in fiscal Q2 2026.

AMD reported Data Center revenue of USD 4.3B in 3Q25, up 22% YoY. At its 2025 Analyst Day, AMD articulated multi-year ambitions: a company-level revenue CAGR above 35%, a data center revenue CAGR above 60%, and an AI data center revenue CAGR above 80% over 3–5 years. However, these are management targets and carry execution risk.

Intel’s DCAI trends were more mixed. Server volume increased 13% YoY in Q2 2025, but server ASP declined 8% YoY. Year-to-date 2025 results included USD 361M of Gaudi-related inventory charges. Even so, DCAI revenue improved by USD 432M year-to-date versus the prior year. This combination suggests volume traction, pricing pressure, and inventory normalization in accelerators.

Regulatory Headwinds and Export Controls

Export controls are increasingly shaping shipment flows and product mix. On December 2, 2024, BIS announced additional controls on 24 types of semiconductor manufacturing equipment and certain software tools. The update also included high-bandwidth memory (HBM), expanding the scope beyond prior rules on advanced computing. The Bureau stated the measures enhance rules addressing advanced computing and supercomputing for PRC end use.

NVIDIA’s recent disclosures illustrate the impact. The company reported no H20 sales to China-based customers in fiscal Q2 2026. It also noted USD 650M in H20 sales to a customer outside China and a USD 180M release of previously reserved H20 inventory. Consequently, vendors are reallocating supply and adjusting product roadmaps to comply with evolving policy.

Such controls can constrain top-line potential in restricted markets and add volatility to quarterly mix. However, demand in other regions, especially among US hyperscalers, has partly offset the headwinds. The net effect varies by vendor exposure, product portfolio, and channel flexibility.

Outlook and Market Implications

Omdia projects the AI data center chip market could reach USD 286B by 2030. Near term, 2025 is modeled at USD 207B, up roughly 68% YoY from 2024. This trajectory coincides with strong hyperscaler capex and ongoing accelerator deployments.

Nevertheless, several cross-currents merit attention. First, regulatory actions continue to reshape China-bound shipments, including coverage of HBM. Second, vendor concentration remains high, with NVIDIA commanding both mindshare and supply capacity. Third, unit growth and rising ASPs suggest a premium mix, which may amplify cyclicality if budgets reset.

Moreover, capacity additions extend beyond chips. Power, cooling, networking, and data center real estate are gating factors. As a result, delivery timing and regional infrastructure constraints could influence quarterly revenue cadence. Forecasters also suggest that mix shifts toward custom silicon and specialized accelerators may redistribute share over time.

In the base case, hyperscaler investment plans support robust demand into 2025. Amazon’s planned USD 100B and Alphabet’s USD 91–93B of 2025 capex highlight this posture. Meanwhile, US hyperscalers’ 2024 deployments exceeding five million training-capable accelerators set a high installed base. Consequently, inference scaling should follow, sustaining demand for accelerators, CPUs, and memory.

In summary, AI compute remains a top budget priority, yet policy risk and supply bottlenecks persist. Market growth into 2025 appears strong, with concentration favoring NVIDIA, while AMD and Intel seek share gains. Export controls and infrastructure constraints are the main swing factors for shipment mix and regional revenue distribution.

Sources

  1. NVIDIA Newsroom — NVIDIA Announces Financial Results for Fourth Quarter and Fiscal 2025
  2. NVIDIA Newsroom — NVIDIA Announces Financial Results for Second Quarter Fiscal 2026
  3. AMD Press Room — AMD Reports Third Quarter 2025 Financial Results
  4. Intel (SEC filing) — Form 10-Q for quarter ended June 28, 2025 (Data Center and AI segment detail)
  5. Omdia (Informa Tech) — New Omdia Forecast: AI Data Center Chip Market to Hit $286bn, Growth Likely Peaking as Custom ASICs Gain Ground
  6. TrendForce — Global AI Server Demand Surge Expected to Drive 2024 Market Value to US$187 Billion; Represents 65% of Server Market
  7. Dell’Oro Group (PR Newswire) — AI Accelerator Revenues Soar 130 Percent in 3Q 2024, According to Dell’Oro Group
  8. CNBC — Amazon plans to spend $100 billion this year to capture ‘once in a lifetime opportunity’ in AI
  9. Alphabet Investor Relations — 2025 Q3 Earnings Call – Alphabet Inc.
  10. U.S. Department of Commerce, BIS — Commerce Strengthens Export Controls to Restrict China’s Capability to Produce Advanced Semiconductors for Military Applications