- Schreppfer consistently argues that the demand for AI compute is "effectively infinite" and is fundamentally constrained by the cost and availability of energy. (Apr 2026)
- He repeatedly emphasizes that the AI industry's focus is shifting from the initial training of large models to the ongoing, massive computational cost of inference and reasoning. (Apr 2026)
- A core tenet of his strategy, demonstrated at Meta with PyTorch and Llama, is that open-sourcing foundational AI technology is crucial for ensuring broad access and accelerating industry-wide progress. (Apr 2026)
- He consistently highlights the immense challenge facing US energy infrastructure, stating that the grid needs a 5x expansion by 2050 even before accounting for AI's explosive growth. (Apr 2026)
- Schreppfer presents a strategic tension in hardware development: he champions custom silicon (ASICs) for its potential 10x performance advantage, while warning that this path carries the significant risk of the chip becoming "worthless" if the underlying algorithm changes. (Apr 2026)
- He expresses a nuanced view of AI's progress, acknowledging that Meta may have been "too slow" to scale up models, while also asserting that the industry is now hitting "diminishing returns" from that very scaling strategy. (Apr 2026)
- He is highly optimistic about AI's potential for productivity (e.g., generating reports in minutes), yet pragmatic about its current limitations in deep tech, where he claims AI-driven discovery is often "sub 10%" of the total effort compared to manufacturing scale-up. (Apr 2026)
- While advocating massive investment in AI compute, he also points out fundamental capability gaps in current models, specifically noting that LLMs lack the "associative long-term memory" characteristic of human intelligence. (Apr 2026)