Good morning. It's Wednesday, March 18th, 2026. Here's your tech and science briefing.
**The Karpathy Loop.** Andrej Karpathy went viral again this week, but this time for something that's sparking both excitement and quiet unease across the AI research community. Karpathy, an early OpenAI research scientist widely known for making deep learning concepts accessible to a generation of practitioners, ran an experiment he called "autoresearch": he pointed an AI coding agent at a small language model and told it to figure out how to train it faster. Over two days, the agent ran seven hundred experiments autonomously. It found twenty optimizations. Applied to a slightly larger model, those tweaks produced an eleven percent training speedup. Shopify's CEO tried it overnight on internal company data and reported a nineteen percent performance gain. Critics are quick to note this isn't recursive self-improvement: the agent is adjusting training code, not rewriting its own weights. But the direction of travel is hard to miss: autonomous agents are starting to do the kind of iterative optimization work that used to require entire research teams. Karpathy just showed it fits in a two-day run.
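Karpathy hasn't published the agent's internals, but the pattern described here (propose a small tweak to the training setup, run an experiment, keep the change only if training got cheaper) is essentially greedy hill-climbing. The sketch below is a toy illustration of that loop under stated assumptions: `training_cost`, the config knobs, and the `autoresearch` function are all hypothetical stand-ins, not anything from the actual experiment.

```python
import random

def training_cost(config):
    # Hypothetical stand-in for "time to reach a target loss": lower is better.
    # A real agent would launch an actual training run here instead.
    lr, batch = config["lr"], config["batch"]
    return (lr - 0.003) ** 2 * 1e6 + abs(batch - 256) / 64 + 1.0

def autoresearch(n_experiments=200, seed=0):
    """Greedy loop: propose a tweak, measure, keep it only if it helps."""
    rng = random.Random(seed)
    best = {"lr": 0.001, "batch": 64}     # starting training config
    best_cost = training_cost(best)
    accepted = []                          # "optimizations found" so far
    for _ in range(n_experiments):
        cand = dict(best)
        # Propose a small multiplicative tweak to one knob, the way an
        # agent might edit one hyperparameter in the training script.
        if rng.random() < 0.5:
            cand["lr"] = max(1e-5, cand["lr"] * rng.uniform(0.5, 2.0))
        else:
            cand["batch"] = max(8, int(cand["batch"] * rng.uniform(0.5, 2.0)))
        cost = training_cost(cand)
        if cost < best_cost:               # keep the tweak only if it helped
            accepted.append((cand, cost))
            best, best_cost = cand, cost
    return best, best_cost, accepted
```

Each entry in `accepted` is one "optimization" in the briefing's sense: a concrete config change that measurably sped things up. The real system presumably proposes far richer edits than hyperparameter nudges, but the accept-if-better skeleton is the same.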
**Starship fires on Pad 2.** SpaceX made history at Starbase overnight with the first-ever static fire from its brand-new Launch Complex B, Pad 2. Super Heavy Booster 19 lit its Raptor 3 engines together for the first time, marking three firsts at once: new pad, new vehicle generation, and new engine variant. The fire appears to have ended earlier than planned, and observers are calling it a possible abort; SpaceX hasn't officially characterized the result yet. But even a brief, aborted fire is valuable data for pad infrastructure checkout. Flight 12, the first Starship V3 mission, continues to target late March or April, though the Tank 19 anomaly from last week remains a question mark. SpaceX is moving fast, and Pad 2 coming online dramatically expands its potential launch cadence.
**The UK bets a billion on quantum.** Britain's Chancellor Rachel Reeves announced a record two-billion-pound quantum computing investment package, including one billion specifically earmarked for procuring commercial-scale quantum computers. It's framed explicitly as a national security play: the UK wants to be a tier-one quantum nation before the technology matures enough to break current encryption. The procurement programme is a first-of-its-kind model: rather than funding labs, the government is buying actual systems from commercial vendors, injecting real demand into the supply chain. This follows the US Quantum Initiative and China's Five-Year Plan quantum commitments. We're entering the phase where quantum isn't just a research curiosity; governments are treating it like they treated nuclear capability in the 1950s.
**A mystery model has developers buzzing.** Something called "Hunter Alpha" appeared on OpenRouter earlier this week, and developers haven't been able to figure out who built it. What's unusual: it reportedly offers a one-million-token context window with reasoning capability at no apparent cost, a combination that very few frontier models provide without real pricing. Reuters is reporting that the specs closely match what Chinese media outlets have been saying about DeepSeek V4, the long-anticipated multimodal successor that's been teased since late last year. DeepSeek itself hasn't said a word. Some engineers speculate it's a quiet soft launch ahead of a formal announcement. Others think it's a third party that trained on distilled weights. Either way, if Hunter Alpha is what people think it is, the AI leaderboards are about to get crowded again.
That's your briefing for Wednesday morning. Stay curious.