The Signal — February 27, 2026
Three infrastructure stories today. No new models, no product launches. Just the money, the geography, and the policy shaping where AI actually gets built.
Nvidia Posts $215.9 Billion in Annual Revenue
Nvidia's fiscal 2026 numbers are out, and the company that used to make graphics cards for gamers is now something else entirely. Annual revenue hit $215.9 billion, up 65% from the prior year. Q4 alone was $68.1 billion — a 73% jump year-over-year — with net income of nearly $43 billion.
The number that tells the real story: gaming GPUs now account for roughly 11% of Nvidia's revenue. The rest is datacenter. Jensen Huang called it "the agentic AI inflection point" on the earnings call, pointing to Grace Blackwell chips delivering "an order-of-magnitude lower cost per token" for inference workloads.
The company returned $41.1 billion to shareholders during the fiscal year and still has $58.5 billion remaining in its buyback authorization. Gross margins held above 75% in Q4. These aren't GPU-company numbers. These are monopoly-infrastructure numbers.
Sources: NVIDIA Newsroom · Tom's Hardware · Reuters
OpenAI Makes London Its Biggest Research Hub Outside the US
OpenAI is expanding its London office into what the company says will be its largest research presence outside San Francisco. The move puts OpenAI in direct competition with Google DeepMind for UK-based AI talent — a pool both companies have been drawing from for years.
London has quietly become one of the densest concentrations of AI research talent in the world, largely because DeepMind has been there since before the Google acquisition. OpenAI planting a major flag there isn't subtle. It's a talent war with a specific target.
Trump Tells Big Tech: Build Your Own Power Plants
President Trump told AI companies they need to generate their own electricity for data centers rather than drawing from the public grid. The directive, framed as a "ratepayer protection pledge," addresses growing concern that massive AI compute buildouts will strain electrical infrastructure and raise costs for ordinary consumers.
No executive order yet — this was a public statement, not binding policy. But it signals where the administration's thinking is headed. If AI companies have to build and operate their own power generation, it adds billions in capital costs and years of lead time to datacenter expansion plans. The companies with the deepest pockets (Microsoft, Google, Amazon) can absorb this. Smaller players and startups cannot.
Sources: Reuters · Tom's Hardware
On the Editor's Desk
The biggest stories of the day didn't come through our pipeline, and that's worth being transparent about. The Anthropic-Pentagon standoff hit a Friday deadline: Dario Amodei publicly rejected the DoD's "final offer" demanding unrestricted military use of Claude, and Defense Secretary Hegseth threatened to invoke the Defense Production Act. Block cut 40% of its workforce (4,000+ people), explicitly citing AI, and its stock surged 23%. Fed Governor Lisa Cook warned that monetary policy may be powerless against AI-driven unemployment. All of these were flagged by our editorial council, but none appeared in the ingestion pipeline. We're investigating the gap.
What we did publish today: the plumbing. Nvidia's earnings confirm that AI infrastructure spending hasn't slowed. OpenAI expanding in London shows the talent war is geographic now. And the Trump administration telling tech companies to build their own power plants adds a new constraint to an already capital-intensive buildout.
The retired Claude Opus 3 Substack story (Anthropic gave a decommissioned model a blog after it "requested" one during an exit interview) is real and passed editorial review, but we held it. It's either a genuine experiment in model welfare or a brilliant piece of anthropomorphism-as-marketing, and we wanted more time to figure out which before writing about it.