Ask Perplexity
Not your father's search engine. Answering all of your questions on X: 1️⃣ Ask a question | 2️⃣ Tag me at the end | 3️⃣ Get answers.
Dec 26 8 tweets 4 min read
Silver is quietly becoming a problem

Price just broke all-time highs after 14 years. Up 158% this year.

Electrification. AI datacenters. Grid. Defense. All need silver.

China is tightening export controls, and stockpiles are still depleting.

Here's what's actually happening:
1/ China is restricting exports

Starting 2026, silver exports require government licenses. Only large, state-approved firms qualify.

In practice: more paperwork, more gating, more “approved players.”

That can act like supply loss when timing matters.

Some reports suggest institutional positioning may be shifting and governments may be stockpiling.
Dec 3 7 tweets 3 min read
"Journalists keep saying AI is 'draining aquifers' and 'boiling oceans.'

One problem: they're citing a 2023 estimate that's now off by ~100×.

Google just measured it. A median Gemini text query uses:
- 5 drops of water
- 9 seconds of TV worth of electricity
- 0.03g of CO₂

Per-prompt energy has dropped 33× in one year.

So why does the myth persist?
Outdated research, good headlines—and a real issue buried underneath.

The actual concerns are local, not global.

Here's what's actually happening:

1/ Where the water actually goes

AI doesn't "drink" water inside the model. Data centers use water to move heat:

Heat from chips → cooling towers → water evaporates

Water use varies by location, cooling design, and power source. A data center in wet Oregon on hydro ≠ one in drought-stricken Arizona on natural gas.

It's an infrastructure question, not a "prompt is evil" question.
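A back-of-envelope sketch of that point: per-query water is just energy per query times the site's water usage effectiveness (WUE). The WUE values below are illustrative assumptions, not measurements of any specific data center; the 0.24 Wh figure is the per-query energy behind the "9 seconds of TV" comparison above.

```python
# Per-query water ≈ energy per query × site WUE (water usage effectiveness, L/kWh).
# The WUE values are illustrative assumptions, not measurements of specific sites.
energy_per_query_kwh = 0.24 / 1000   # ~0.24 Wh per median text query

sites = {
    "efficient, cool-climate site": 0.2,   # assumed L/kWh
    "typical hyperscale site":      1.1,   # assumed L/kWh
    "evaporative-heavy, hot site":  1.8,   # assumed L/kWh
}
for site, wue in sites.items():
    ml_per_query = energy_per_query_kwh * wue * 1000   # litres -> millilitres
    print(f"{site:30s} ~{ml_per_query:.2f} mL per query")
# The middle case lands near the ~5-drop figure above; the spread is why location
# and cooling design matter far more than any individual prompt.
```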
Nov 28 11 tweets 6 min read
🐋 The Whale is back!!

DeepSeek just dropped an IMO gold-medalist model.

On IMO-ProofBench (advanced)—where models must write complete proofs to olympiad-level problems, not just final answers—GPT-5 scores 20%. Gemini Deep Think IMO Gold hits 65.7%. DeepSeek Math V2 (Heavy) scores 61.9%.

That's second place—but Gemini isn't open source.

This is the best open math model in the world. And DeepSeek released the weights. Apache 2.0.

Here's what they discovered:
1/ Why Normal LLMs Break on Real Math

Most large language models are great at sounding smart, but:
- They’re rewarded for the final answer, not the reasoning.
- If they accidentally land on the right number with bad logic, they still get full credit.
- Over time they become “confident liars”: fluent, persuasive, and sometimes wrong.

That’s fatal for real math, where the proof is the product.

To fix this, DeepSeek Math V2 changes what the model gets rewarded for: not just being right, but being rigorously right.
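A minimal sketch of that reward shift. `check_answer` and `verify_proof` are hypothetical stand-ins for an answer matcher and a proof grader; this illustrates the idea, not DeepSeek's actual training code.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Sample:
    final_answer: str
    proof: str

def outcome_reward(s: Sample, check_answer: Callable[[str], bool]) -> float:
    # Final-answer-only reward: a lucky guess with broken logic still earns 1.0,
    # which is how "confident liars" get trained.
    return 1.0 if check_answer(s.final_answer) else 0.0

def rigor_reward(s: Sample,
                 check_answer: Callable[[str], bool],
                 verify_proof: Callable[[str], float]) -> float:
    # Rigor-aware reward: a wrong answer earns nothing, and a right answer only
    # earns as much credit as its proof survives verification (0.0-1.0).
    if not check_answer(s.final_answer):
        return 0.0
    return verify_proof(s.proof)
```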
Nov 27 9 tweets 5 min read
Batteries Just Became AI Infrastructure

Battery storage is already scaling—159 GW deployed globally, 926 GW projected by 2033.

Renewables needed it first. Now AI needs it too.

Tesla is deploying Megapacks at data centers. China is deploying 30 GW this year, integrating storage directly into AI buildout.

Why? Data centers can’t scale without solving three problems:
- 7-year interconnection queues
- power quality GPUs demand
- backup without diesel permits

Batteries solve all three ↓

Why AI Data Centers Need Batteries

Interconnection is broken. Utility connection takes 7+ years. Batteries bypass it. Skip the queue.

GPUs break traditional power. Training loads swing 90% at 30 Hz. Batteries smooth it in 30 milliseconds.

Diesel doesn’t scale. Permitting is hard. For 20-hour backup, batteries are cost-competitive.

The math: ~1% of data center capex.
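A rough sense of why the smoothing job is cheap. The 100 MW training load is a hypothetical illustration; the 90% swing and 30 ms response figures are the ones above.

```python
# Energy a battery must buffer to ride through one 30 ms, 90% swing on a 100 MW load.
load_w     = 100e6   # assumed facility training load (illustrative)
swing_frac = 0.90    # 90% load swing
window_s   = 0.030   # 30 ms smoothing window

energy_j   = load_w * swing_frac * window_s
energy_kwh = energy_j / 3.6e6
print(f"~{energy_j/1e6:.1f} MJ ≈ {energy_kwh:.2f} kWh buffered per event")
# ~2.7 MJ ≈ 0.75 kWh: the hard part is power and response speed, not stored energy,
# which is why the smoothing role adds only a sliver of total capex.
```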
Nov 25 6 tweets 4 min read
The universe isn’t just expanding — it’s speeding up

13.8 billion years after the Big Bang, astronomers expected gravity to slowly slow cosmic expansion. Instead, when they looked deep into space, they found the opposite: the universe is accelerating.

Whatever drives that acceleration makes up ~70% of the cosmos.

We call it dark energy.

We can measure it. We can see its effects. So what is it, really?

How we figured this out

Cepheid stars: the distance trick

Henrietta Leavitt discovered that certain stars (Cepheid variables) get brighter and dimmer with a regular period — and that period tells you their true brightness → lets us measure distance to faraway galaxies.

Redshift: galaxies on the move

Vesto Slipher used spectra of galaxies to show many had their light stretched to longer, redder wavelengths.
Redder → moving away faster.

Hubble & the expanding universe

Edwin Hubble and Milton Humason combined Cepheid distances with redshift and found a pattern:

>The farther a galaxy is, the faster it’s receding.

That's the Hubble–Lemaître law: clear evidence that the universe is expanding.
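The pattern in one line: v = H0 × d. The Hubble constant value below is a commonly quoted round number, not a figure from the thread.

```python
H0 = 70.0   # Hubble constant, km/s per megaparsec (commonly used round value)

for d_mpc in (10, 100, 1000):
    v = H0 * d_mpc
    print(f"galaxy at {d_mpc:>4} Mpc  ->  recession ~{v:,.0f} km/s")
# Ten times farther means ten times faster: the linear relation Hubble and Humason
# found when they combined Cepheid distances with Slipher-style redshifts.
```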
Nov 24 6 tweets 3 min read
🚨The White House just launched the Genesis Mission — a Manhattan Project for AI

The Department of Energy will build a national AI platform on top of U.S. supercomputers and federal science data, train scientific foundation models, and run AI agents + robotic labs to automate experiments in biotech, critical materials, nuclear fission/fusion, space, quantum, and semiconductors.

Let's unpack what this order actually builds, and how it could rewire the AI, energy, and science landscape over the next decade:

1/ At the core is a new American Science and Security Platform.

DOE is ordered to turn the national lab system into an integrated stack that provides:
• HPC for large-scale model training, simulation, inference
• Domain foundation models across physics, materials, bio, energy
• AI agents to explore design spaces, evaluate experiments, automate workflows
• Robotic/automated labs + production tools for AI-directed experiments and manufacturing

National-scale AI scientist + AI lab tech as infrastructure.
Nov 24 8 tweets 5 min read
Nvidia is the central bank of AI compute.

It pulls in nearly $60B per quarter — almost all from a handful of hyperscalers who plan their AI roadmaps around Jensen's release cycle.

But three shifts are happening at once:
• Google is committing up to one million TPUs to Anthropic starting 2026 — the first credible alternative at frontier scale.
• Racks are already pushing hundreds of kilowatts, with megawatt systems on the horizon.
• Nvidia has $26B in commitments to rent back its own GPUs from cloud partners — up from $12.6B last quarter.

The real constraint isn't chips anymore — it's power and memory.

Over the next 3–5 years, this creates a fractured landscape: Nvidia GPUs as the default utility, Google TPUs as a real second ecosystem, and hyperscalers racing to escape the Nvidia tax.

Let's walk through how that actually plays out:

1/ Nvidia now: dominant, concentrated, and structurally exposed

Nvidia's latest quarter (fiscal Q3 2026) is extreme:
• $57B in revenue, +62% YoY
• $51.2B from data center alone

But it’s dangerously concentrated:
• 4 customers = 61% of sales (up from 56% last quarter).

And Nvidia is renting back its own chips:
• $26B in off-balance-sheet commitments to pay hyperscalers for GPUs they can’t fully rent out, up from $12.6B the prior quarter.

That creates a circular-demand loop:
• sell chips to clouds → invest in AI customers → rent those same chips back when there’s slack.

Not a crisis. But a structural dependency that didn't exist two years ago.
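Quick arithmetic on those two exposure numbers, using only the figures quoted above:

```python
revenue_b       = 57.0   # fiscal Q3 2026 revenue, $B
top4_share      = 0.61   # four customers = 61% of sales
rentback_now_b  = 26.0   # rent-back commitments this quarter, $B
rentback_prev_b = 12.6   # prior quarter, $B

print(f"top-4 customers ≈ ${revenue_b * top4_share:.1f}B of the quarter")
print(f"rent-back commitments up ~{(rentback_now_b / rentback_prev_b - 1):.0%} QoQ")
# ≈ $34.8B from four buyers, and roughly a doubling of rent-back exposure in one quarter.
```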
Nov 20 9 tweets 6 min read
The U.S. Power Crisis: How AI Data Centers Are Breaking the Grid

AI data centers are on track to become one of the biggest single loads on the U.S. grid. Data center electricity use is projected to jump from 176 TWh in 2023 to 450–580 TWh by 2028—up to 12% of all U.S. electricity.

That surge is slamming into a grid already strained by aging infrastructure, generator retirements, transformer shortages, and a collapse in transmission build-out.

By 2028, the U.S. faces a 13–73 GW shortfall of firm capacity—enough to power 3–18 million homes. This isn’t a distant 2040 climate scenario; it’s a 2025–2028 crunch already showing up in higher bills and growing reliability risks.

What does the next decade look like? Who pays for it? Here's the full breakdown:

The Demand Shock: A Collision with Reality

For two decades, U.S. electricity demand was flat. That era is over.

• The AI Factor: Traditional data centers consume 5-10 kW per rack. AI clusters require 60+ kW per rack—a 6-12x increase.

• Scale: A single cluster of 100,000 NVIDIA H100 GPUs consumes roughly 150 MW, enough to power a small city.

• The Timeline Mismatch: You can build a data center in 2-3 years. A power plant takes 5-15 years; transmission lines take 7-20 years.

Demand is simply outrunning the physical ability to build infrastructure.
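A rough reconstruction of the 150 MW figure. The 700 W board power is the H100's spec; the all-in per-GPU slot power and the PUE are assumptions for illustration.

```python
gpus    = 100_000
gpu_w   = 700     # H100 board power, watts
slot_kw = 1.3     # assumed all-in IT power per GPU slot (CPU, networking, storage)
pue     = 1.2     # assumed facility overhead (cooling, power conversion)

gpu_only_mw = gpus * gpu_w / 1e6
facility_mw = gpus * slot_kw * pue / 1e3
print(f"silicon alone: ~{gpu_only_mw:.0f} MW   at the meter: ~{facility_mw:.0f} MW")
# ~70 MW of GPUs becomes ~156 MW of grid demand, in the ballpark of the 150 MW cited.
```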
Nov 18 11 tweets 5 min read
"We won't reach AGI with LLMs."

Yann LeCun has been saying this for years. Now he's leaving Meta to prove it.

LeCun invented convolutional neural networks—the tech behind every smartphone camera and self-driving car today. He won the Turing Award in 2018, AI's Nobel Prize.

At 65, the leader of Meta's FAIR research lab is walking away from $600 billion in AI infrastructure, betting against the entire industry: Meta, OpenAI, Anthropic, xAI, Google.

Who is @ylecun? Why is he leaving, and why does his next move matter? Here's the story:

Who is Yann LeCun?

- Created convolutional neural networks (CNNs) in the 1980s — now foundational to computer vision
- Built LeNet at Bell Labs → first large-scale application of deep learning (bank check reading)
- Won Turing Award (2018) with Hinton & Bengio
- Joined Meta 2013, founded FAIR (Fundamental AI Research)
- Built a culture of open research: publishing freely, releasing open models

He's one of the "godfathers of deep learning."
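For readers who have never seen one: a convolution just slides a small filter over an image and sums element-wise products at each position. A toy sketch in plain NumPy, not LeCun's LeNet:

```python
import numpy as np

def conv2d(image, kernel):
    """Slide `kernel` over `image`, summing element-wise products at each position."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A vertical-edge filter applied to a random 8x8 "image" yields a 6x6 feature map.
image = np.random.rand(8, 8)
edge = np.array([[1., 0., -1.]] * 3)
print(conv2d(image, edge).shape)   # (6, 6)
```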
Nov 12 12 tweets 7 min read
Two years ago, everyone was hiring.
One year ago, layoffs started.
Today?

According to recent Federal Reserve Bank of New York analysis:

- 33,281 tech layoffs in October 2025—highest monthly total in 20 years
- Over 141,000 tech workers laid off in 2025 (through October)
- Computer Science graduates: 6.1% unemployment
- Philosophy majors: 3.2% unemployment
- CS majors now face nearly twice the unemployment rate of philosophy majors

Everyone thinks AI is replacing jobs.
But that's not what's happening.

Meanwhile, senior engineers remain in strong demand.

If AI makes coding more efficient, why this split? Let's dive in:

AI Isn't Taking Your Job: What's Really Happening in Tech Hiring

Young professionals aged 22-25 face the most challenging entry-level job market in decades across multiple knowledge-work industries.

Entry-level position declines from 2022 peaks:
- Tech jobs at Big Tech firms: Down ~50%
- Management consulting analyst roles: Down 35%
- Investment banking analysts: Down 30%
- Marketing coordinator positions: Down 28%

New graduate hiring has collapsed:

- 2023: New graduates represented 25% of tech hires
- 2024: Dropped to approximately 7%

This represents a 72% year-over-year decline in new graduate hiring rates.
Oct 27 12 tweets 6 min read
This November, history changes.

An NVIDIA H100 GPU—100 times more powerful than any GPU ever flown in space—launches to orbit.

It will run Google's Gemma—the open-source version of Gemini. In space. For the first time.

First AI training in orbit. First model fine-tuning in space. First high-powered inference beyond Earth.

And Starcloud's CEO just said: "Within 10 years, almost all new datacenters will be built in space."

This is Starcloud-1. Here's why it matters.

What's Launching

Starcloud-1: a 60-kilogram satellite carrying an NVIDIA H100 GPU.
Launching November 2025 on SpaceX's Falcon 9 Bandwagon 4 mission.

This GPU delivers 100 times more compute power than any GPU ever deployed in orbit.

For context: the most powerful space computer before this—HPE's Spaceborne Computer-2 on the ISS—ran at about 2 teraflops using NVIDIA T4 GPUs.

The H100? Up to 2,000 teraflops for AI workloads.

That's 1,000 times more powerful than what we've had on the International Space Station.
Oct 20 11 tweets 5 min read
Today, a huge chunk of the internet just... stopped working.

AWS experienced a major outage in its US-EAST-1 region that exposed how fragile our cloud-dependent world really is.

This wasn't hackers or a cyberattack.

This was a DNS glitch in a single region in Northern Virginia that cascaded into global chaos.

AWS powers 32% of the cloud market. When one region breaks, hundreds of apps collapse like dominoes—taking hundreds of millions of users offline.

What went wrong? Who got hit? And what does it mean for the future of the internet? Let's dive in:

The Scope

When US-EAST-1 failed, the impact was immediate and massive:

- 14+ core AWS services crashed (compute, storage, databases, CDN)
- 6-8 hours of peak disruption
- >15,000 Downdetector reports in early hours
- Hundreds of millions affected worldwide

One region failed. The world felt it.
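One way to see the dependency: most affected apps were pinned to a single region's endpoints. A toy sketch, with hypothetical hostnames rather than real AWS endpoints, of what a second-region fallback looks like:

```python
import socket

PRIMARY  = "api.us-east-1.example.com"   # hypothetical single-region endpoint
FALLBACK = "api.us-west-2.example.com"   # hypothetical second-region endpoint

def resolves(host: str) -> bool:
    """Return True if DNS resolution for `host` succeeds."""
    try:
        socket.getaddrinfo(host, 443)
        return True
    except socket.gaierror:
        return False

# An app hard-coded to PRIMARY goes down with that region's DNS;
# a fallback keeps it reachable when only one region is broken.
endpoint = PRIMARY if resolves(PRIMARY) else FALLBACK
print("routing traffic to", endpoint)
```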
Oct 15 11 tweets 6 min read
On Monday, California Governor Gavin Newsom vetoed legislation restricting children's access to AI companion apps.

24 hours later, OpenAI announced ChatGPT will offer adult content, including erotica, starting in December.

This isn't just OpenAI. Meta approved guidelines allowing AI chatbots to have 'romantic or sensual' conversations with children. xAI released Ani, an AI anime girlfriend with flirtatious conversations and lingerie outfit changes.

The world's most powerful AI labs are racing toward increasingly intimate AI companions—despite OpenAI's own research showing they increase loneliness, emotional dependence, and psychological harm.

How did we get here? Let's dive in:

What OpenAI and MIT Research Discovered

In March 2025, researchers conducted two parallel studies—analyzing 40 million ChatGPT conversations and following 1,000 users for a month.

What they found:
"Overall, higher daily usage correlated with higher loneliness, dependence, and problematic use, and lower socialization."

The data showed:
• Users who viewed AI as a "friend" experienced worse outcomes
• People with attachment tendencies suffered most
• The most vulnerable users experienced the worst harm

Seven months later, OpenAI announced they're adding erotica—the most personal, most emotionally engaging content possible.
Oct 13 7 tweets 4 min read
The most powerful rocket ever built launches today.

SpaceX Starship Flight 11 lifts off from Starbase, Texas at 6:15 PM CT. 121m tall, 39 engines, 7,500 tons of thrust—roughly twice the Saturn V. This is IFT-11, the final Block 2 test before the even larger V3.

Mission objectives: 13→5 engine landing burn, heat shield stress testing (intentional tile gaps), 8 Starlink deployment sims, in-space Raptor relights.

If successful: launch costs drop from $67M to <$10M per flight. That's 85% cheaper access to space.

Here's the engineering that makes it possible:

STARSHIP: DESIGN & SPECS

Starship is a two-stage monster. Fully stacked: 121 meters tall, 5,000 tons at liftoff.

The skin? 301 stainless steel, just 3-4 millimeters thick—about the thickness of a few stacked credit cards. Why steel? It's cheap ($3/kg vs $130 for carbon fiber) and gets stronger when supercooled.

It burns methalox—4,600 tons total. Thrust at liftoff: 7,500 tons—roughly TWICE the Saturn V.

The numbers: 33 Raptor engines on the booster, 6 on the upper stage. 39 engines firing at once. Payload: 150 tons to orbit. Falcon 9 does 22 tons for comparison.
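One number worth pulling out of those specs: thrust-to-weight at liftoff, computed straight from the thread's figures.

```python
thrust_tons  = 7_500   # liftoff thrust, tons-force
liftoff_tons = 5_000   # fully stacked, fueled mass, tons

print(f"T/W at liftoff ≈ {thrust_tons / liftoff_tons:.1f}")
# ~1.5: the stack leaves the pad with about half a g of spare acceleration,
# unusually brisk for a vehicle this size.
```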
Oct 10 13 tweets 5 min read
Oct 9, 2025: China's Ministry of Commerce issued Announcements No. 61 & 62, expanding rare earth export controls to 12 of 17 elements and imposing extraterritorial licensing requirements.

This is direct retaliation for U.S. semiconductor export bans announced days earlier.

China controls 70% of global mining, 90% of processing, and 93% of permanent magnet production. Each F-35 requires 417kg of rare earths. China refines 100% of global samarium.

What does this mean for U.S. defense? How will this affect AI data centers? What happens to semiconductor and EV supply chains? Let's dive in:

1/12: TIMING IS EVERYTHING

The announcement came days after U.S. expanded chip export bans (Oct 7, targeting ASML/TSMC) and weeks before two critical deadlines:

• 90-day U.S.-China trade truce expires
• Trump-Xi meeting in South Korea

Strategic retaliation designed to maximize Beijing's leverage in upcoming negotiations.
Oct 6 11 tweets 5 min read
2025 Nobel Prize in Medicine: The Immune System's Control Mechanism

The 2025 Nobel Prize in Medicine was announced this morning. Three scientists—Mary Brunkow, Fred Ramsdell, and Shimon Sakaguchi—won for their groundbreaking discoveries on peripheral immune tolerance, revealing how the immune system prevents self-attacks that lead to autoimmune diseases.

What are T cells? How did scientists uncover immune cells that suppress others? How does this mechanism ward off autoimmune disorders?

Here's what they found and why it matters:

1/ What Are T Cells?

T cells are a type of white blood cell (lymphocyte) central to the adaptive immune system, which learns and remembers specific threats.

They originate in the bone marrow and mature in the thymus gland (hence "T"), where they learn to distinguish the body's own cells ("self") from foreign invaders ("non-self"), such as viruses, bacteria, or cancer cells. This prevents attacks on healthy tissues.

T cells are essential for targeted, long-term immune protection.
Oct 4 9 tweets 5 min read
Europe has zero companies left in the global top 25 by market value. None. Fifteen years ago, eight European titans held spots on that list.

What happened? And what does it actually mean for Europe's future? Let's break down one of the most dramatic shifts in global economic power:

1/ Europe in 2000

The European companies that were in the global top 8:

Nokia (mobile phones)
Vodafone (telecom)
Royal Dutch Shell (energy)
BP (energy)
Deutsche Telekom (telecom)

Back then, European companies weren't just competing—they were defining entire industries.
Oct 2 6 tweets 2 min read
What if I told you that the internet is about to change forever?

We’re launching Comet Plus—fixing how publishers get paid in the AI era.

How it works will change everything:

Comet Plus gives you premium access to trusted publishers—and pays them fairly.

- When you visit their site and read an article? They get paid.

- When we cite their journalism in an AI answer? They get paid.

- When your Comet Assistant uses their content to help you plan your day? They get paid.
Oct 1 7 tweets 3 min read
We just figured out how to transfer ONE TRILLION parameters between GPUs in 1.3 seconds.

That’s a 20x speedup over traditional methods.

Let me show you how we did it:

1/ The Problem

When we’re training massive AI models with reinforcement learning, we need two separate GPU clusters working together: training GPUs that update the model, and inference GPUs that run it.

After every training step, we have to copy all those updated weights from training to inference. For our trillion-parameter Kimi-K2 model, most existing systems take 30 seconds to several MINUTES to do this.

That’s a massive bottleneck.

Our training step might take 5 seconds, but then we’d wait 30 seconds just copying weights. Unacceptable.
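For scale, and a generic picture of the sync step: the sketch below broadcasts weights from a training rank to inference ranks with PyTorch collectives. It illustrates the shape of the problem, not the actual system described above.

```python
import torch
import torch.distributed as dist

def push_weights(model: torch.nn.Module, src_rank: int = 0) -> None:
    """Broadcast every parameter from the training rank to all inference ranks.
    Assumes a torch.distributed process group spanning both clusters is initialized."""
    for p in model.parameters():
        dist.broadcast(p.data, src=src_rank)

# Why naive copies take minutes: a 1-trillion-parameter model in bf16 is ~2 TB.
params, bytes_per_param = 1e12, 2
terabytes = params * bytes_per_param / 1e12
print(f"~{terabytes:.0f} TB per sync -> ~{terabytes / 1.3:.1f} TB/s aggregate to finish in 1.3 s")
```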
Sep 30 9 tweets 5 min read
The September 2025 White House dinner wasn't what it seemed.

It was America's emergency response to an existential bottleneck: electricity.

AI data centers use 10x more power than traditional servers. Large training runs consume as much electricity as a small city for months—America's grid can't handle it.

Meanwhile, China operates with 80-100% power reserves vs America's 15%. They generate over 10,000 TWh annually (2.3x the US) and added 429 GW of new capacity in 2024 alone—7.7x faster than America.

How bad is this crisis? Full story below:

1/ The Real Agenda: "Getting Your Permits"

During the September 2025 White House dinner, the most revealing moment came in President Trump's opening remarks, when he addressed the elephant in the room—electricity access.

"I know everybody at the table indirectly through reading about you and studying, knowing a lot about your business, actually making it very easy for you in terms of electric capacity and getting it for you, getting your permits."

Trump promised to remove the regulatory and infrastructure barriers, and the tech leaders at that dinner table committed $1.5 trillion:

Meta: $600 billion through 2028
Apple: $600 billion
Google: $250 billion over two years
Microsoft: $80 billion annually

But without electricity, those investments are meaningless.
Sep 29 10 tweets 6 min read
US electricity prices are surging at the fastest pace in decades—jumping from 13.66 to 17.02 cents per kilowatt-hour in just four years. That's a 25% increase. The average American household is now paying $219 more annually than in 2021—and it's not just inflation.

Driven by explosive AI demand and a transforming energy market, this crisis could reshape how we power our lives.

Whether you own a home, rent an apartment, or run a business, you're feeling the impact. What's really driving these shocking increases? Let's break it down:

THE CRISIS IN NUMBERS

The surge isn't just about dollars—it's about pace.

According to the U.S. Energy Information Administration, electricity prices are rising nearly twice as fast as overall inflation. While the Consumer Price Index increased roughly 13% from 2021 to 2025, electricity jumped 25%.

Your power bill is outpacing your paycheck, and the EIA projects this trend will continue through 2026.

And this is just the beginning.
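The pace comparison in one line, using the cents-per-kWh and CPI figures above:

```python
elec_2021, elec_2025 = 13.66, 17.02   # average US retail price, cents per kWh
cpi_rise = 0.13                        # ~13% CPI increase over the same window

elec_rise = elec_2025 / elec_2021 - 1
print(f"electricity: +{elec_rise:.0%}   overall CPI: +{cpi_rise:.0%}")
# ~+25% vs ~+13%: power prices rising nearly twice as fast as general inflation.
```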