The next major leap in foundation model capabilities is stalled by a scarcity of high-quality training data, a fundamental bottleneck for the industry.
The true value and defensibility in AI infrastructure lie in serving custom, post-trained models on dedicated capacity, as serving generic open-source models is a commodity race to the bottom on price.
NVIDIA's dominance in AI is secure for the near future due to its deeply integrated CUDA ecosystem and supply chain advantages, which competitors cannot easily replicate.
The demand for AI compute will vastly outstrip supply for the next five to ten years, creating a persistent scarcity that reinforces the value of existing infrastructure.
Enterprises that fail to integrate AI into their core products and workflows face an existential threat, as the technology represents a fundamental paradigm shift in business.
▶NVIDIA's Unshakeable Moat (Apr 2026)
Srivastava posits that NVIDIA's market leadership is deeply entrenched, resting on superior supply chain management and the indispensable CUDA developer ecosystem. He argues that the rapid 6-to-9 month cycle of new model releases makes it nearly impossible for hardware competitors to catch up, and predicts that no one will effectively challenge NVIDIA's position in the next couple of years.
For investors, this theme suggests that the most viable near-term investments in AI infrastructure may be in companies that are deeply integrated into the NVIDIA ecosystem, rather than those attempting to compete with it directly on hardware.
▶The Strategic Pivot to Custom Inference (Apr 2026)
In 2022, Base10 underwent a radical strategic shift, discontinuing three of its four products to focus exclusively on the AI inference market. This decision was catalyzed by the rise of powerful open-source models like Stable Diffusion, which created a need for specialized infrastructure to run custom, modified versions of these models at scale.
This demonstrates a leadership style that prioritizes decisive, high-conviction bets on emerging market trends, even at the cost of abandoning significant prior investments, such as a two-and-a-half-year-old application builder product.
▶The Coming Compute and Data Scarcity (Apr 2026)
Srivastava holds two key macro views on the future of AI development: a belief that the industry is running out of high-quality training data, which will slow breakthroughs in foundation models, and a conviction that there will not be enough compute capacity to meet demand over the next five to ten years. This scarcity underpins the high value of existing GPU clusters and the difficulty of procuring new ones.
This dual-bottleneck thesis suggests that companies with proprietary data for post-training and secured access to long-term compute resources will have a significant competitive advantage in the coming years.
▶The 'Last Market' and Enterprise Adoption (Apr–May 2026)
Srivastava describes the AI inference market as the 'last market,' suggesting that even in a world with AGI, the primary economic activity will be inference. He believes companies face an 'extinction moment' if they fail to integrate AI, yet estimates that 99% of the enterprise market has not yet adopted the technology, framing a massive and still largely untapped opportunity.
This perspective frames AI not as a feature but as a fundamental, existential layer of business operations, justifying the high costs and long-term contracts customers are willing to sign for dedicated inference capacity.