Turned $50k into $10M+ in 11 years. Retired in my 40s; retired my wife. Trying to help others achieve their financial goals, free of charge. EOY 2026 goal: $14M
Feb 9 • 15 tweets • 2 min read
$NBIS People are still thinking about AI infrastructure like it’s a normal cloud cycle. It’s not. We’re watching the early buildout of the physical backbone of the AI economy. And by 2030, that backbone could be worth trillions. For $NBIS, a share price of $800 to $1,900 by 2030 is a distinct possibility.
Thread-->
Compute demand is compounding faster than almost any infrastructure buildout in modern history. Training clusters are getting larger. Inference demand is exploding. Physical AI is emerging. Robotics is coming online.
All of this runs on compute.
Oct 26, 2025 • 8 tweets • 4 min read
If you think AI data centers are here just to enhance Large Language Models (LLMs), you are sadly mistaken. LLMs are just the tip of the iceberg. Let’s dive a bit deeper and see why AI data centers are in inning 1, game 1, of a full season of baseball. A 🧵 👇
The Dawn of the AI Infrastructure Era
The global surge in AI-oriented data center construction marks only the beginning of a multi-decade transformation in compute infrastructure. Today’s buildout is largely focused on supporting large language models, or LLMs, which are vast neural networks that learn from trillions of words and tokens to perform reasoning, summarization, and dialogue. These systems are extremely compute-hungry, requiring tens of thousands of high-end GPUs along with custom networking.

This phase represents the foundation of what will evolve into a diversified, multimodal, and ultimately autonomous AI ecosystem. Just as the early internet built the backbone for cloud computing, the current LLM expansion is establishing the electrical, thermal, and architectural groundwork for a new class of intelligent infrastructure.