The Signal — March 22, 2026

Super Micro's co-founder charged with smuggling $2.5B in AI chips to China. Tesla launches Terafab. And 45,000 tech jobs vanished in Q1 2026.

Three events this week tell one story: the fight over who controls AI compute just got physical.


Hair Dryers and Dummy Servers: The Super Micro Chip Smuggling Case

Federal prosecutors unsealed charges on Thursday against three people tied to Super Micro Computer for allegedly smuggling $2.5 billion in Nvidia-powered AI servers to China. Among those charged: co-founder Yih-Shyan "Wally" Liaw, who controls roughly $464 million in Super Micro shares according to FactSet.

The alleged tradecraft reads like a low-budget spy thriller. Prosecutors say the defendants used hair dryers to remove serial numbers from servers headed to China, then slapped those serial numbers onto dummy units that stayed behind for auditors. A Southeast Asian intermediary company generated fake end-user certificates. Liaw and a second defendant, Ting-Wei "Willy" Sun, were arrested Thursday. The third, Ruei-Tsang "Steven" Chang, remains a fugitive.

Super Micro stock plunged 33% on Friday. Liaw resigned from the company's board the same day. This is the highest-profile AI chip export control prosecution to date, and it lands during a week when Nvidia confirmed a deal to sell one million GPUs to Amazon Web Services by the end of 2027, a reminder of exactly how valuable this hardware has become.

The uncomfortable question isn't about enforcement. It's about what this case reveals. If a co-founder with board-level access was the person circumventing export controls, compliance programs aren't catching the most dangerous actors. The indictment tells us more about what slipped through than what was stopped.

Sources: Reuters · CNBC · Bloomberg · Fortune · CNN


Tesla's Terafab: An Announcement About an Announcement

Elon Musk took the stage at Austin's decommissioned Seaholm Power Plant on Friday night to formally launch the Terafab Project, a joint venture between Tesla, SpaceX, and xAI to build an in-house chip fabrication facility. Musk called it "the most epic chip building exercise in history by far."

The stated ambition: produce one terawatt of computing power annually from a single facility. The plan includes logic, memory, packaging, and testing under one roof. In the near term, Tesla's AI5 chips (which the company claims deliver a compounded 50x improvement over AI4) are still manufactured by TSMC, with production transitioning to TSMC's Arizona fab. Tesla also has a $16.5 billion deal with Samsung for AI6 chips at its Taylor, Texas plant. Terafab would be a third, Tesla-owned leg of the supply chain.

Here's the gap between announcement and reality. Bloomberg reported Saturday morning that the project will be built in Austin and jointly run by Tesla, SpaceX, and xAI. But as of the launch event, the facility was still hiring its semiconductor infrastructure manager. Building a leading-edge fab from scratch requires $20-25 billion in capital and thousands of specialists managing 2,000 to 5,000 individual processes per chip. TSMC spent decades developing that expertise. Tesla is starting from zero.

The strategic logic is straightforward: Musk wants to avoid depending on external chip suppliers for the millions of Optimus robots and autonomous vehicles he's planning. Whether Terafab produces competitive chips by 2029 or 2030 is an open question. What's not in question is that another major player just announced it's tired of buying from the existing supply chain.

Sources: TeslaNorth · Bloomberg · Forbes


The Numbers Are Arriving: 45,000 Tech Jobs Cut in Q1 2026

Multiple analyses published this week converge on a number that's hard to dismiss: over 45,000 technology jobs have been eliminated globally since January, with more than 30,000 in the United States alone. At least 20% of those cuts were explicitly attributed to AI, according to data from RationalFX.

The pattern across companies is striking. Meta cut 1,500 from Reality Labs in January. Block slashed 40% of its entire workforce (roughly 4,000 people) in late February, with CEO Jack Dorsey explicitly citing AI as the reason. Atlassian cut 1,600 (10% of its workforce) on March 11, days after announcing AI-powered "teammates." In each case, the company cited AI as a factor. In each case, the company was also posting strong profits.

Last week, we wrote about how AI isn't replacing workers so much as outpacing the management structures around it ("AI Didn't Replace Workers. It Outran Their Managers," March 15). The Q1 data adds a new dimension. The Guardian's deep-dive into Atlassian's layoffs put the sequence in sharp relief: the company announced AI-powered "teammates" first, then eliminated the humans. Whether the AI actually performs those jobs is secondary to the narrative utility of the ordering.

The structural question, as several economists pointed out this week, is whether "AI-attributed" layoffs represent something genuinely new or conventional restructuring in new language. The 45,000 figure is 0.06% of the global tech workforce. That's not an apocalypse. But the trajectory, and the growing gap between record corporate profits and accelerating cuts, suggest the full picture hasn't arrived yet.

Sources: NetworkWorld · TechTimes · The Guardian · Storyboard18


On the Editor's Desk

The compute supply chain theme dominated this week so heavily that three of our four stories touch it. Behind the scenes, the Nvidia-Amazon deal (one million GPUs by 2027, confirmed Thursday) completes the picture: while prosecutors were charging people for smuggling chips and Musk was announcing his own fab, Nvidia locked in the largest single customer commitment in GPU history. Every major player is repositioning simultaneously.

We held the White House AI framework reactions for a second day. The initial four-page document dropped Thursday; reaction coverage this weekend is clarifying the fault lines, but we covered the core tension (federal preemption versus state experimentation) in yesterday's Signal via the Colorado KILO story. We'll return when committee markup produces actual legislative text.

xAI shipped a multi-agent beta for Grok through its Enterprise API (four parallel specialized agents). Interesting architecture choice, but without benchmark data or independent testing, it's a product announcement, not a story.

The ingestion pipeline collected over 1,000 raw items across Friday and Saturday but the processing step didn't run overnight. We're working from the council's analysis of Thursday-Friday coverage rather than fresh Saturday scoring. Normal service resumes tomorrow.