# NVIDIA — GPU Compute Monopoly
## Thesis
NVIDIA dominates the AI training and inference accelerator market with >80% share. The transition from the Hopper generation (H100) to Blackwell (B100/B200) creates a multi-year upgrade cycle.
## Key Metrics
- **Data Center Revenue:** $22.6B (Q4 FY25), +93% YoY
- **Gross Margin:** 73.5% (expanding with software/CUDA lock-in)
- **R&D Spend:** $3.2B/quarter — maintaining architectural lead
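For context, a quick back-of-envelope sketch in Python using only the figures above. The year-ago quarter and annualized run rate are derived values, not reported results, and applying the company-level gross margin to the data center line is a simplification.

```python
# Back-of-envelope math on the Key Metrics above (inputs are the memo's
# stated figures; outputs are simple derived arithmetic, not reported data).

dc_rev_q = 22.6       # data center revenue, $B, Q4 FY25 (per memo)
yoy_growth = 0.93     # +93% YoY (per memo)
gross_margin = 0.735  # 73.5% company-level gross margin (per memo)

prior_year_q = dc_rev_q / (1 + yoy_growth)  # implied year-ago quarter, $B
annual_run_rate = dc_rev_q * 4              # naive annualized run rate, $B
gross_profit_q = dc_rev_q * gross_margin    # implied quarterly gross profit, $B
                                            # (applies company margin to the
                                            # data center line; a simplification)

print(f"Implied year-ago quarter:       ${prior_year_q:.1f}B")
print(f"Annualized run rate:            ${annual_run_rate:.1f}B")
print(f"Implied quarterly gross profit: ${gross_profit_q:.1f}B")
```

This works out to roughly an $11.7B year-ago quarter, a $90B+ annualized run rate, and about $16.6B in implied quarterly gross profit.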
## Risks
- Custom silicon (Google TPU, Amazon Trainium) gaining share among hyperscalers
- U.S. export restrictions on China limiting the addressable market
- Potential margin compression as competition matures
## Catalysts
- B200 ramp in H2 2025
- Sovereign AI infrastructure spending (Middle East, India, Japan)
- Inference demand inflecting as AI applications scale
*NVDA Deep Dive, February 15, 2026*