Good morning. It's Friday, April 10th, and the Pacific sky off San Diego is about to welcome some unlikely visitors home.
**Artemis II is coming back.** Today marks the splashdown of NASA's Artemis II mission, the first crewed flight around the Moon in over fifty years. Astronauts Reid Wiseman, Victor Glover, Christina Koch, and CSA's Jeremy Hansen launched on April 1st aboard the Orion spacecraft and have spent the last nine days on a ten-day lunar flyby. They're currently prepping for reentry, with splashdown scheduled for around 5:07 PM Pacific off the coast of San Diego. After the heat shield erosion concerns from the uncrewed Artemis I flight, NASA has been watching every sensor reading. The crew has been experiencing some orthostatic intolerance, a normal aftereffect of days in microgravity, but all systems are nominal. If this goes well, it's the final major validation step before Artemis III puts boots on the actual lunar surface. And here's a nice touch: NASA is streaming the splashdown live on Netflix, a far cry from the grainy CBS coverage of Apollo.
**RoboSense just posted numbers that should wake up everyone in the perception hardware space.** The Shenzhen-based LiDAR company shipped 330,300 units in Q1 2026, up 204% year over year. But the real headline is this: for the first time ever, RoboSense's *robotics* LiDAR sales exceeded its automotive ADAS segment. The robotics division sold over 185,000 units, a staggering 1,458% year-over-year increase. These are going into robotic lawn mowers, autonomous delivery bots, humanoid robots, and commercial cleaning robots. RoboSense is now the top supplier across five robotics categories. What this tells us is that LiDAR is no longer just an automotive play. It's becoming the standard perception layer for anything that moves autonomously in the physical world, and the volume numbers are starting to look like those of a semiconductor company, not a boutique sensor maker.
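For context, those growth percentages can be inverted to estimate what last year's volumes must have been. A minimal sketch, assuming the figures are simple year-over-year percentage growth (the `implied_prior_year` helper is illustrative, not from any RoboSense filing):

```python
# Back out the implied Q1 2025 volumes from the reported
# Q1 2026 shipments and year-over-year growth percentages.

def implied_prior_year(current_units: int, yoy_growth_pct: float) -> float:
    """Return last year's volume implied by current volume and YoY growth."""
    return current_units / (1 + yoy_growth_pct / 100)

total_2025 = implied_prior_year(330_300, 204)       # all LiDAR shipments
robotics_2025 = implied_prior_year(185_000, 1_458)  # robotics segment only

print(f"Implied total Q1 2025 shipments:    ~{total_2025:,.0f}")
print(f"Implied robotics Q1 2025 shipments: ~{robotics_2025:,.0f}")
```

In other words, robotics went from a roughly twelve-thousand-unit sideline a year ago to the company's largest segment, which is what makes the inflection so striking.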
**Micron is betting on Physical AI with a strategic investment in SiMa.ai.** The memory giant announced it's putting capital into SiMa.ai to tightly integrate Micron's LPDDR5X memory into SiMa's Modalix machine learning system-on-chip platform. The pitch is simple: run complex workloads such as large language models and vision language models at the edge, where robots and autonomous systems actually operate, without the power draw of a data center card. SiMa.ai's initial system-on-modules are already shipping and designed to drop into existing robotics and autonomous vehicle platforms. This is part of a broader trend: the compute architecture for Physical AI is diverging from the cloud GPU model. When your robot needs to see, reason, and act in real time, it can't round-trip to the cloud. Edge silicon with integrated high-bandwidth memory is going to be the bottleneck, and the opportunity.
**And finally, NVIDIA is using National Robotics Week to flex its entire Physical AI stack.** The company's blog this week highlighted RoboLab, a new high-fidelity simulation benchmark built on Isaac and Omniverse for training generalist robot policies across diverse environments, and Cosmos Reason, demonstrated in a Doosan Robotics palletizing system that uses a single camera image to infer box contents, detect damage, and adapt grip and placement strategy in real time. Meanwhile, Aigen is post-training Cosmos foundation models on agricultural data to build systems that generalize across millions of farming scenarios. The message is clear: the bottleneck for physical AI isn't hardware anymore; it's training data and simulation fidelity. And NVIDIA wants to own the entire pipeline, from synthetic data generation through policy training to edge deployment on Jetson.
That's all for today. Have a great Friday.