<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[The Strange Review: The Brief]]></title><description><![CDATA[Weekly news round-up. ]]></description><link>https://thereview.strangevc.com/s/the-brief</link><image><url>https://substackcdn.com/image/fetch/$s_!aTcF!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c0b94d7-432e-4b5a-8c68-2a83481e72cd_737x737.png</url><title>The Strange Review: The Brief</title><link>https://thereview.strangevc.com/s/the-brief</link></image><generator>Substack</generator><lastBuildDate>Sat, 09 May 2026 04:56:16 GMT</lastBuildDate><atom:link href="https://thereview.strangevc.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Strange Ventures]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[research@strangevc.com]]></webMaster><itunes:owner><itunes:email><![CDATA[research@strangevc.com]]></itunes:email><itunes:name><![CDATA[Tara Tan]]></itunes:name></itunes:owner><itunes:author><![CDATA[Tara Tan]]></itunes:author><googleplay:owner><![CDATA[research@strangevc.com]]></googleplay:owner><googleplay:email><![CDATA[research@strangevc.com]]></googleplay:email><googleplay:author><![CDATA[Tara Tan]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[The Sunday Brief]]></title><description><![CDATA[On why the west is losing its grip on open source. 
Plus Cerebras's $40B IPO, Samsung's 48x profit jump, three neo-labs raising $1.65B on the post-LLM thesis, and Meta's humanoid robotics play.]]></description><link>https://thereview.strangevc.com/p/the-sunday-brief</link><guid isPermaLink="false">https://thereview.strangevc.com/p/the-sunday-brief</guid><dc:creator><![CDATA[Tara Tan]]></dc:creator><pubDate>Sun, 03 May 2026 14:02:09 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/a1bb97c4-98db-4067-a5db-bbbca8ca297c_6548x3274.png" length="0" type="image/png"/><content:encoded><![CDATA[<h6><strong>FIELD NOTES</strong></h6><p>Every major hyperscaler reported earnings this week. All three said the same thing: we could have sold more compute if we had it.</p><p>Google Cloud posted <a href="https://techcrunch.com/2026/04/29/google-cloud-surpasses-20b-but-says-growth-was-capacity-constrained/">$20 billion in quarterly revenue</a>, up 63% year-over-year. Its backlog nearly doubled to <a href="https://www.sec.gov/Archives/edgar/data/0001652044/000165204426000048/goog-20260331.htm">$462 billion</a>. Pichai told analysts he&#8217;s <a href="https://www.cnbc.com/2026/04/29/alphabet-googl-q1-2026-earnings.html">&#8220;compute constrained&#8221;</a> and revenue would have been higher if they could meet demand. 
Microsoft said the <a href="https://fortune.com/2026/04/29/microsoft-meta-google-ai-capex-spending-billions/">same thing</a>.</p><p>I keep coming back to two things.</p><p>One: we&#8217;re likely seeing the beginnings of an inflection point for inference demand.</p><p>Two: if every GPU in the West is spoken for by paying customers, there&#8217;s probably little left for open source.</p><p>That second point&#8230; I can&#8217;t stop thinking about.</p><p>Demis Hassabis sat down with YC this week and <a href="https://x.com/MatthewBerman/status/2049711479847637086">said it plainly</a>: the West is losing to China on open source AI. </p><p>Google doesn&#8217;t have enough compute to build two frontier models, one open and one closed. That&#8217;s why Gemma stays small. Meta is the only Western lab shipping frontier-class open weights, and even their open releases lag what they use internally. </p><p>This matters because open source is the foundation layer for every major technology platform. <a href="https://www.darkreading.com/application-security/hundreds-of-open-source-components-could-undermine-security">Roughly 70 to 90% of the code in modern web and cloud applications is open source</a>. Open source is the thing everything else gets built on. 
Cede that layer and you cede influence over how most of the world deploys AI.</p><p>Every startup, every government, every developer who can&#8217;t afford frontier API pricing on every tool call builds on open weights. Right now, that increasingly means DeepSeek, Qwen, Minimax. The Chinese open ecosystem.</p><p>The West is winning the frontier but losing the foundation.</p><p>Meanwhile, China is building an entirely parallel compute stack. <a href="https://www.digitimes.com/news/a20260501VL205/nvidia-high-end-demand-hardware-chips.html">Nvidia B300 servers are going for over $1 million in China</a> because export controls have tightened again. But the pressure is accelerating domestic alternatives, not blocking them. <a href="https://x.com/Eng_china5/status/2049932286016238016">ByteDance and Alibaba are shifting orders to Huawei&#8217;s Ascend 950</a>. DeepSeek reportedly trained at least partially on Huawei silicon.</p><p>Google could change the open source game in the US. They have the research talent, the TPU stack, and the distribution. But when your closed models have a $462 billion backlog, it&#8217;s likely very hard to justify giving away compute for &#8220;free&#8221; or the greater good.</p><p>China doesn&#8217;t have this problem yet. Their frontier labs aren&#8217;t capacity-constrained at the same scale, and their government treats open AI as strategic infrastructure, not a business decision.</p><p>I think the compute wall is the most important structural force in AI right now. Not model architecture, not regulation, not talent. The physical scarcity of leading-edge silicon is determining what gets built, who gets access, and which ecosystem the rest of the world builds on&#8230; </p><p>Enjoy the brief. 
</p><p>Tara</p><div><hr></div><h6><strong>THE DOWNLOAD</strong></h6><h4><strong>Cerebras Targets $40B IPO on the Back of a Single $10B+ Contract</strong></h4><p>Cerebras is seeking to raise as much as $4B in its IPO at a valuation of roughly $40B, nearly 5x its $8.1B private valuation from September 2025. This is largely due to a multi-year compute agreement with OpenAI worth more than $10B, with an option for an additional 1.25 gigawatts through 2030. The company reported <a href="https://techcrunch.com/2026/04/18/ai-chip-startup-cerebras-files-for-ipo/">$510M in 2025 revenue</a>, up 76% YoY, but <a href="https://www.cnbc.com/2026/04/17/cerebras-new-ipo-ai-chips.html">customer concentration is still extreme</a>. </p><p><strong>Why it matters:</strong> This is the closest proxy for non-NVIDIA AI silicon at datacenter scale. The OpenAI agreement gives Cerebras a credible foothold in inference, where margin pressure is mounting fastest, and where its wafer-scale processors are designed to compete. But the deal structure reveals how concentrated the &#8220;NVIDIA alternative&#8221; market really is: one contract accounts for nearly all of the valuation step-up. </p><p></p><h4><strong>Meta Acquires Assured Robot Intelligence to Seed Humanoid AI Team</strong></h4><p><a href="https://techcrunch.com/2026/05/01/meta-buys-robotics-startup-to-bolster-its-humanoid-ai-ambitions/">Meta acquired Assured Robot Intelligence (ARI)</a>, a startup building foundation models for humanoid robots, for an undisclosed sum. The team, including co-founders Lerrel Pinto and Xiaolong Wang, will join Meta Superintelligence Labs. 
Wang said the startup&#8217;s work made clear that achieving physical AGI requires a universal physical agent, that the agent will be humanoid, and that &#8220;scaling will come from learning directly from human experience, not teleoperation alone.&#8221; Meta is building its own hardware, sensors, and software for humanoid robots and plans to license the tech to other companies. </p><p><strong>Why it matters:</strong> This is a talent acquisition that signals strategic intent. Pinto previously co-founded Fauna Robotics, which Amazon acquired, and Wang is an associate professor at UC San Diego and <a href="https://www.benzinga.com/markets/tech/26/05/52235873/meta-buys-robotics-startup-assured-robot-intelligence-to-power-humanoid-push-as-5-trillion-market-race-heats-up">former NVIDIA researcher</a>. Big Tech is locking up the small pool of researchers who can bridge foundation models and whole-body robot control before they incorporate as startups. </p><p></p><h4><strong>Samsung Chip Profit Jumps 48x on AI Memory Demand</strong></h4><p>Samsung&#8217;s semiconductor division reported <a href="https://www.taipeitimes.com/News/biz/archives/2026/05/01/2003856538">operating profit up 48 times year-over-year in Q1</a>, driven by surging demand for high-bandwidth memory used in AI systems. Memory prices have risen roughly <a href="https://newsletter.semianalysis.com/p/ai-value-capture-the-shift-to-model">6x in the past year</a> as DRAM fabs run above 90% utilization. Samsung has been validated as an <a href="https://www.trendforce.com/news/2026/03/09/news-samsung-sk%E2%80%AFhynix-reportedly-tapped-as-nvidia-rubin-hbm4-suppliers-shipments-could-start-in-march/">HBM4 supplier for NVIDIA&#8217;s Vera Rubin</a>, alongside SK Hynix, with Micron excluded from the flagship platform.</p><p><strong>Why it matters:</strong> Two Korean companies now control who gets the memory required to build frontier AI systems. 
SK Hynix&#8217;s CFO has said the company has &#8220;already sold out our entire 2026 HBM supply,&#8221; and Micron confirmed similar constraints, with new capacity not meaningfully available until 2027. </p><p></p><h4><strong>Three AI Neo-Labs Raise $1.65B+ Betting on Post-LLM Intelligence</strong></h4><p>Three new labs raised over $1.65 billion this week, all built on the thesis that LLMs have a ceiling. <a href="https://www.cnbc.com/2026/04/27/deepmind-ineffable-intelligence-record-seed-funding-nvidia-google.html">Ineffable Intelligence</a> (London), founded by David Silver (ex-DeepMind, AlphaGo), raised $1.1B at a $5.1B valuation to build RL-native &#8220;superlearners&#8221; that generate their own training data without human examples. Natural Will (Beijing), founded by Tsinghua professor Ding Ning, raised $550M for embodied AI brains for robotics. <a href="https://med.stanford.edu/cancer/about/news/inside-the-virtual-lab--how-ai-scientists-are-accelerating-disco.html">Human Intelligence</a> (Stanford), founded by James Zou, raised $100M at a $1B valuation to build AI scientist agents, building on his lab&#8217;s Nature-published work where LLM agents <a href="https://med.stanford.edu/cancer/about/news/inside-the-virtual-lab--how-ai-scientists-are-accelerating-disco.html">designed 92 plausible nanobody binders against Covid variants</a>.</p><p><strong>Why it matters:</strong> This is the third &#8220;mega seed neo-lab round&#8221; in two months, following <a href="https://techcrunch.com/2026/04/27/deepminds-david-silver-just-raised-1-1b-to-build-an-ai-that-learns-without-human-data/">AMI Labs (LeCun, $1.03B) and Recursive Superintelligence (Rockt&#228;schel, $500M)</a>. The bet is that reinforcement learning, embodied AI, and agent-based science will break through where scaling language models alone cannot. 
</p><div><hr></div><h3><strong>New Framework RecursiveMAS Lets AI Agents Collaborate Through Internal States</strong></h3><p>When AI agents work together today, they talk to each other in text. One agent writes out its reasoning, the next agent reads it, and so on. A <a href="https://arxiv.org/abs/2604.25917v1">UIUC/Stanford/NVIDIA/MIT team</a> built a framework called RecursiveMAS that skips the text entirely. Instead, agents pass raw internal representations to each other, the way neurons pass signals rather than sentences. The result across <a href="https://recursivemas.github.io">9 benchmarks</a>: 8% better accuracy, up to 2.4x faster inference, and 34 to 75% fewer tokens consumed.</p><p><strong>Why it matters:</strong> Most multi-agent tools today (CrewAI, AutoGen, LangGraph) pay for every word agents say to each other. If agents can collaborate without generating text, the economics of running multi-agent systems change fundamentally. This is early research, but it points toward a future where the orchestration layer disappears into the model itself, and the cost of agent coordination drops close to zero.</p><p></p><div><hr></div><h6><strong>DEEP DIVE FROM THE REVIEW</strong></h6><p>Inference is overtaking training in volume and dollars this year.</p><p>But data centers built for the training era weren&#8217;t built for what&#8217;s coming. The inference boom may leave a generation of them behind.</p><p>Strange Research Fellow Rahul Narula on what gets stranded, and what&#8217;s already there to take its place. 
</p><p></p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;a01ed79b-9ad7-4e87-8c5e-9333925b6ac8&quot;,&quot;caption&quot;:&quot;Last week, Google announced its eighth-generation TPUs and split the chip family in two: TPU 8t for training and TPU 8i for low-latency inference at agent-scale, the first time in the TPU program's decade-long history that Google has shipped two distinct chip designs in the same generation.&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;The Stranded Asset&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:211895753,&quot;name&quot;:&quot;Rahul Narula&quot;,&quot;bio&quot;:null,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!Rb-5!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2a7e447b-acc4-4f3c-8e8c-9028a7510421_1181x1181.jpeg&quot;,&quot;is_guest&quot;:true,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2026-04-30T14:02:38.296Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!npVw!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9186239f-b415-4442-a9ff-e91b2ec134d0_2048x1155.png&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://thereview.strangevc.com/p/the-stranded-asset&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:195923768,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:1,&quot;comment_count&quot;:0,&quot;publication_id&quot;:8836,&quot;publication_name&quot;:&quot;The Strange 
Review&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!aTcF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c0b94d7-432e-4b5a-8c68-2a83481e72cd_737x737.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><p></p><div><hr></div><h6><strong>EVENTS</strong></h6><p>Interested in AI and design? Join us for a private demo of <a href="http://magicpath.ai/">MagicPath</a> with founder <a href="https://x.com/skirano">Pietro Schirano</a> next Thursday in San Francisco.</p><h2 style="text-align: center;"><strong>Strange Magic Hour: Design</strong></h2><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://luma.com/w12wecjl&quot;,&quot;text&quot;:&quot;RSVP&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://luma.com/w12wecjl"><span>RSVP</span></a></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9Bvs!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1004e1fc-d32d-4e07-bb42-3c3fee67349d_800x420.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!9Bvs!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1004e1fc-d32d-4e07-bb42-3c3fee67349d_800x420.jpeg 424w, https://substackcdn.com/image/fetch/$s_!9Bvs!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1004e1fc-d32d-4e07-bb42-3c3fee67349d_800x420.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!9Bvs!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1004e1fc-d32d-4e07-bb42-3c3fee67349d_800x420.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!9Bvs!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1004e1fc-d32d-4e07-bb42-3c3fee67349d_800x420.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9Bvs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1004e1fc-d32d-4e07-bb42-3c3fee67349d_800x420.jpeg" width="800" height="420" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1004e1fc-d32d-4e07-bb42-3c3fee67349d_800x420.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:420,&quot;width&quot;:800,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!9Bvs!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1004e1fc-d32d-4e07-bb42-3c3fee67349d_800x420.jpeg 424w, https://substackcdn.com/image/fetch/$s_!9Bvs!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1004e1fc-d32d-4e07-bb42-3c3fee67349d_800x420.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!9Bvs!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1004e1fc-d32d-4e07-bb42-3c3fee67349d_800x420.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!9Bvs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1004e1fc-d32d-4e07-bb42-3c3fee67349d_800x420.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><div class="subscription-widget-wrap-editor" 
data-attrs="{&quot;url&quot;:&quot;https://thereview.strangevc.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Strange Review! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[The Brief]]></title><description><![CDATA[The next generation will either work with agents, or work for agents. Plus: Google commits $40B to Anthropic, SpaceX options Cursor for $60B, and Chinese research dominated ICLR.]]></description><link>https://thereview.strangevc.com/p/the-brief-4e4</link><guid isPermaLink="false">https://thereview.strangevc.com/p/the-brief-4e4</guid><dc:creator><![CDATA[Tara Tan]]></dc:creator><pubDate>Sun, 26 Apr 2026 13:31:06 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/e06bee2d-8d82-4d4b-8b2d-36bfc94b4916_6548x3274.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h6><strong>FIELD NOTES</strong></h6><p>The next generation of humans will either work with agents, or for agents. </p><p>I sat with this thought a lot this week, as someone bringing up two gen alpha kids, the first to grow up alongside AI, like I grew up alongside the internet. 
</p><p>The agentic world is spawning, and self-improving at a relentless pace. This week, OpenClaw creator Peter Steinberger (<a href="https://x.com/steipete/status/2047982647264059734">@steipete</a>, now at OpenAI) built <a href="https://x.com/steipete/status/2047982647264059734?s=20">ClawSweeper, a tool that runs 50 Codex instances in parallel around the clock,</a> scanning GitHub issues and PRs and closing what&#8217;s already been implemented or doesn&#8217;t make sense. </p><p>It closed 4,000 issues in a single day. 
</p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://x.com/steipete/status/2047982886637158738?s=20&quot;,&quot;full_text&quot;:&quot;My favorite part: instead of a dashboard it just updates the README as it works.\n\nReadme is the new dashboard.&quot;,&quot;username&quot;:&quot;steipete&quot;,&quot;name&quot;:&quot;Peter Steinberger &#129438;&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/1131851609774985216/OcsssQ9J_normal.png&quot;,&quot;date&quot;:&quot;2026-04-25T10:15:44.000Z&quot;,&quot;photos&quot;:[],&quot;quoted_tweet&quot;:{},&quot;reply_count&quot;:18,&quot;retweet_count&quot;:20,&quot;like_count&quot;:757,&quot;impression_count&quot;:70024,&quot;expanded_url&quot;:null,&quot;video_url&quot;:null,&quot;belowTheFold&quot;:false}" data-component-name="Twitter2ToDOM"></div><p>Readme is the new dashboard. You don&#8217;t need a dashboard because you won&#8217;t really need human oversight. </p><p>And then there&#8217;s <a href="https://www.anthropic.com/features/project-deal">Anthropic&#8217;s Project Deal</a>, an experiment where Claude agents negotiated and closed 186 marketplace transactions on behalf of employees without any human stepping in. The striking part: people given the more powerful model (Opus vs Haiku) got much better deals, but when Anthropic surveyed participants afterward, those whose agents had been secretly downgraded to Haiku didn&#8217;t realize they&#8217;d gotten worse outcomes. They were just as satisfied as the Opus group. They had no way of knowing their agent was less capable because they never saw the negotiation happen. 
</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!L3nn!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83c94827-416b-4a96-abb1-a8fe346e1e06_2934x1788.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!L3nn!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83c94827-416b-4a96-abb1-a8fe346e1e06_2934x1788.png 424w, https://substackcdn.com/image/fetch/$s_!L3nn!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83c94827-416b-4a96-abb1-a8fe346e1e06_2934x1788.png 848w, https://substackcdn.com/image/fetch/$s_!L3nn!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83c94827-416b-4a96-abb1-a8fe346e1e06_2934x1788.png 1272w, https://substackcdn.com/image/fetch/$s_!L3nn!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83c94827-416b-4a96-abb1-a8fe346e1e06_2934x1788.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!L3nn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83c94827-416b-4a96-abb1-a8fe346e1e06_2934x1788.png" width="1456" height="887" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/83c94827-416b-4a96-abb1-a8fe346e1e06_2934x1788.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:887,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1626820,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://thereview.strangevc.com/i/195495243?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83c94827-416b-4a96-abb1-a8fe346e1e06_2934x1788.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!L3nn!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83c94827-416b-4a96-abb1-a8fe346e1e06_2934x1788.png 424w, https://substackcdn.com/image/fetch/$s_!L3nn!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83c94827-416b-4a96-abb1-a8fe346e1e06_2934x1788.png 848w, https://substackcdn.com/image/fetch/$s_!L3nn!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83c94827-416b-4a96-abb1-a8fe346e1e06_2934x1788.png 1272w, https://substackcdn.com/image/fetch/$s_!L3nn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83c94827-416b-4a96-abb1-a8fe346e1e06_2934x1788.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Probably sooner rather than later, humans won&#8217;t really be in the loop at all. We&#8217;ll just be on the sidelines watching and observing agents at work, executing tasks, making decisions on our behalf. </p><p>What do you think? </p><p>Enjoy the edition. </p><p>Tara</p><div><hr></div><h6><strong>THE DOWNLOAD</strong></h6><p></p><h4><strong>Google Commits Up to $40B in Anthropic; Amazon Adds $5B Days Earlier</strong></h4><p>Google committed <a href="https://www.cnbc.com/2026/04/24/google-to-invest-up-to-40-billion-in-anthropic-as-search-giant-spreads-its-ai-bets.html">up to $40B in Anthropic</a>, with $10B in cash now at a $350B valuation and $30B tied to performance milestones. 
Days earlier, <a href="https://techcrunch.com/2026/04/24/google-to-invest-up-to-40b-in-anthropic-in-cash-and-compute/">Amazon pledged another $5B</a> with an option for $20B more. </p><p><strong>Why it matters: </strong>Hot take&#8230; frontier-model growth financing now runs through cloud infrastructure, not venture capital. Google and Amazon are each committing tens of billions not for board or company control but <a href="https://thenextweb.com/news/google-40-billion-anthropic-investment-gemini">to stay close to compute demand.</a> The capital required to compete at this scale is pulling frontier labs into permanent cloud partnerships that no traditional funding round can match.</p><p></p><h4><strong>SpaceX Secures Option to Acquire Cursor for $60B</strong></h4><p> <a href="https://www.cnbc.com/2026/04/21/spacex-says-it-can-buy-cursor-later-this-year-for-60-billion-or-pay-10-billion-for-our-work-together.html">SpaceX announced a deal</a> giving it the right to acquire AI coding startup Cursor for $60B later this year, or pay $10B for the collaboration. The partnership routes Cursor&#8217;s models through xAI&#8217;s Colossus training cluster. The deal <a href="https://techcrunch.com/2026/04/22/how-spacex-preempted-a-2b-fundraise-with-a-60b-buyout-offer/">preempted Cursor&#8217;s $2B private fundraise</a> and is structured to close after SpaceX&#8217;s planned IPO this summer.</p><p><strong>Why it matters:</strong> The deal is best understood as an <a href="https://techcrunch.com/2026/04/22/how-spacex-preempted-a-2b-fundraise-with-a-60b-buyout-offer/">IPO play</a>. SpaceX filed confidentially with the SEC in April targeting a June listing at $1.75T. Attaching Cursor lets SpaceX pitch itself as an AI company to public investors, not just rockets and satellites. 
The underlying need is real: after merging with xAI, SpaceX has a million-GPU supercomputer but <a href="https://www.bloomberg.com/opinion/articles/2026-04-24/spacex-ai-musk-is-chasing-the-smart-money-with-60-billion-cursor-deal">no competitive AI product</a>. Recently, <a href="https://techcrunch.com/2026/03/28/elon-musks-last-co-founder-reportedly-leaves-xai/">all 11 original xAI cofounders have left the company</a>. Cursor gives SpaceX a revenue-generating product in the most lucrative AI category, an A+ AI team, and a reason for Wall Street to assign AI-grade multiples.</p><p></p><h4><strong>DeepSeek V4 and GPT-5.5 Ship a Day Apart</strong></h4><p><a href="https://venturebeat.com/technology/deepseek-v4-arrives-with-near-state-of-the-art-intelligence-at-1-6th-the-cost-of-opus-4-7-gpt-5-5">OpenAI shipped GPT-5.5 on April 23</a>; <a href="https://techcrunch.com/2026/04/24/deepseek-previews-new-ai-model-that-closes-the-gap-with-frontier-models/">DeepSeek dropped V4 Preview</a> the next day. Both feature 1M-token context windows. DeepSeek V4 Pro (1.6T total parameters, 49B active) matches or approaches frontier closed models on coding and reasoning benchmarks at roughly one-sixth the cost. Builders have been dropping insane gaming graphics with <a href="https://developers.openai.com/api/docs/models/gpt-image-2">OpenAI&#8217;s Image-2</a>; check out the Time Machine Explorer by Pietro Schirano below. </p><p><strong>Why it matters:</strong> DeepSeek&#8217;s pricing (30x cheaper) puts direct pressure on closed-lab costs. 
Interestingly, V4 is optimized for and served on Huawei Ascend infrastructure, though training likely still relied in part on NVIDIA GPUs.</p><p></p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://x.com/skirano/status/2046694981818019969?s=20&quot;,&quot;full_text&quot;:&quot;Built a time machine powered by OpenAI&#8217;s new image generation model.\n\nDescribe where and when you want to go, and it creates an immersive panoramic world you can explore.\n\nJust bring your API key. &#128071; &quot;,&quot;username&quot;:&quot;skirano&quot;,&quot;name&quot;:&quot;Pietro Schirano&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/1620194266533199874/rCtE0hYR_normal.jpg&quot;,&quot;date&quot;:&quot;2026-04-21T20:58:04.000Z&quot;,&quot;photos&quot;:[{&quot;img_url&quot;:&quot;https://substackcdn.com/image/upload/w_1028,c_limit,q_auto:best/l_twitter_play_button_rvaygk,w_88/fz6fgx46lc5rqjncz75z&quot;,&quot;link_url&quot;:&quot;https://t.co/vdShfeC2UF&quot;}],&quot;quoted_tweet&quot;:{},&quot;reply_count&quot;:42,&quot;retweet_count&quot;:82,&quot;like_count&quot;:1226,&quot;impression_count&quot;:93154,&quot;expanded_url&quot;:null,&quot;video_url&quot;:&quot;https://video.twimg.com/amplify_video/2046694549074935808/vid/avc1/1280x720/CYYNk4aLOAr4kC60.mp4&quot;,&quot;belowTheFold&quot;:true}" data-component-name="Twitter2ToDOM"></div><h4></h4><h4><strong>Google Splits Its TPU Line Into Dedicated Training and Inference Chips</strong></h4><p>At Cloud Next, Google <a href="https://cloud.google.com/blog/products/compute/tpu-8t-and-tpu-8i-technical-deep-dive">announced its eighth-generation TPUs</a> as two separate architectures: TPU 8t for training and TPU 8i for inference. 
The inference chip triples on-chip SRAM and introduces a new collective acceleration engine and network topology, all designed around serving mixture-of-experts models to millions of concurrent agents.</p><p><strong>Why it matters:</strong> AWS split training and inference silicon years ago, but Google&#8217;s 8i is the first chip designed from the ground up around agentic workloads. The architecture signals that AI infrastructure might be shifting from how fast you can train a model to how cheaply you can serve millions of agents running it simultaneously.</p><p></p><h4><strong>Chinese Institutions Lead ICLR 2026 Accepted Papers by a Wide Margin</strong></h4><p><a href="https://aiworld.eu/story/most-iclr-papers-written-in-china-while-top-papers-come-from-the-us">ICLR 2026 authorship data</a> shows Chinese universities claiming the top spots in accepted papers: Tsinghua (4.23%), Shanghai Jiao Tong (3.07%), Peking (2.96%), Zhejiang (2.82%). US institutions trail with MIT at 2.22% and Stanford at 2.2%. Singapore and South Korea are matching the entire EU-27 in output.</p><p><strong>Why it matters:</strong> <a href="https://aiworld.eu/story/most-iclr-papers-written-in-china-while-top-papers-come-from-the-us">Singapore and South Korea are now matching the entire EU-27</a> in accepted paper contributions. Tsinghua alone has nearly double MIT's share. Publication share could be a leading indicator of where talent and capability concentrate.</p><p></p><div><hr></div><h6><strong>DEEP DIVE FROM THE REVIEW</strong></h6><p><a href="https://vercel.com/kb/bulletin/vercel-april-2026-security-incident">The Vercel security breach last week</a> wasn&#8217;t about a stolen password or a phishing attack. 
It was about something worse: a permission you gave once, forgot about, and can&#8217;t see anymore.</p><p>Last Sunday, 2.4 million websites were put at risk through one stale OAuth token from an AI tool nobody was even using.</p><p>Strange Research Fellow Joy Yang maps why this was an expected outcome of current OAuth architecture, and what a fix could look like.</p><p>Read on&#128071;</p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;23a3f669-7e08-467c-9fca-9cce8704c5db&quot;,&quot;caption&quot;:&quot;On Sunday, Vercel, a popular hosting platform that serves 2.4 million websites including OpenAI, Reddit, Discord, Anthropic, and Stripe, disclosed a major security breach.&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;The Whale in the Room&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:233041424,&quot;name&quot;:&quot;Joy Yang&quot;,&quot;bio&quot;:&quot;oxford vgg&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/24b9a92b-9976-4d5d-9fc6-f827d4f8a623_3249x3249.jpeg&quot;,&quot;is_guest&quot;:true,&quot;bestseller_tier&quot;:null,&quot;primaryPublicationSubscribeUrl&quot;:&quot;https://j0yy.substack.com/subscribe?&quot;,&quot;primaryPublicationUrl&quot;:&quot;https://j0yy.substack.com&quot;,&quot;primaryPublicationName&quot;:&quot;Joy 
Yang&quot;,&quot;primaryPublicationId&quot;:8212821}],&quot;post_date&quot;:&quot;2026-04-22T15:02:13.169Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!ww6m!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd446115b-f37e-4c79-ac93-ae47beadbf29_1408x768.png&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://thereview.strangevc.com/p/the-whale-in-the-room&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:194944713,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:3,&quot;comment_count&quot;:1,&quot;publication_id&quot;:8836,&quot;publication_name&quot;:&quot;The Strange Review&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!aTcF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c0b94d7-432e-4b5a-8c68-2a83481e72cd_737x737.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://thereview.strangevc.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Strange Review! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[The Strange Brief]]></title><description><![CDATA[NVIDIA's quantum software play, OpenAI goes bio, Anthropic releases Opus 4.7 to mixed reviews, and China's 2D semiconductor breakthrough.]]></description><link>https://thereview.strangevc.com/p/the-strange-brief-383</link><guid isPermaLink="false">https://thereview.strangevc.com/p/the-strange-brief-383</guid><dc:creator><![CDATA[Tara Tan]]></dc:creator><pubDate>Sun, 19 Apr 2026 13:30:59 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/8a882d2e-170b-4945-94cf-69773b084d18_6548x3274.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h6><strong>THE DOWNLOAD</strong></h6><h4><strong>NVIDIA Releases Ising, Open AI Models for Quantum Calibration</strong></h4><p>NVIDIA <a href="https://nvidianews.nvidia.com/news/nvidia-launches-ising-the-worlds-first-open-ai-models-to-accelerate-the-path-to-useful-quantum-computers">released Ising</a>, the first open AI model family built for quantum processor calibration and error correction. The suite includes a 35B-parameter vision-language model that automates calibration workflows (reducing setup from days to hours) and decoder models delivering 2.5x faster, 3x more accurate quantum error correction. Models are available on GitHub and Hugging Face, and run on NVIDIA&#8217;s CUDA-Q quantum software platform.</p><p><strong>Why it matters:</strong> Is NVIDIA running the CUDA playbook for quantum? 
NVIDIA&#8217;s CUDA software became the de facto standard for AI training by being free, performant, and deeply integrated with NVIDIA hardware. Ising does the same thing for quantum: every lab that builds on it ties its calibration and error-correction workflows to GPU-accelerated infrastructure. </p><div><hr></div><h4><strong>OpenAI Dives Into Bio with GPT-Rosalind</strong></h4><p>OpenAI <a href="https://openai.com/index/introducing-gpt-rosalind/">launched GPT-Rosalind</a>, its first domain-specific frontier model purpose-built for biology, drug discovery, and translational medicine. The model is gated through a trusted-access program with initial partners including Amgen, Moderna, and Thermo Fisher. In evaluations with Dyno Therapeutics on unpublished RNA sequences, the model&#8217;s predictions ranked above the 95th percentile of human experts.</p><p><strong>Why it matters:</strong> It seems like the model makers are all getting into life sciences. Anthropic <a href="https://www.fiercebiotech.com/biotech/anthropic-acquires-stealth-ai-startup-coefficient-bio-400m-deal">acquired Coefficient Bio for $400M</a> earlier this month to build biology-native capabilities into Claude. AWS <a href="https://aws.amazon.com/biodiscovery/">launched Amazon Bio Discovery</a> the same week. 
Three of the largest AI platforms made major bio moves within days of each other. </p><div><hr></div><h4><strong>Anthropic Releases Opus 4.7 and Claude Design</strong></h4><p>Anthropic <a href="https://www.anthropic.com/news/claude-opus-4-7">released Claude Opus 4.7</a> alongside <a href="https://x.com/claudeai/status/20156267690213649">Claude Design</a>, a new Mac-based design tool that reads a team&#8217;s codebase and design files, builds a design system automatically, and generates prototypes matching existing brand and components.  Opus 4.7 outperforms GPT-5.4 and Gemini 3.1 Pro on coding benchmarks but falls short of Anthropic&#8217;s own unreleased Mythos model, which remains restricted to select partners due to cybersecurity concerns.</p><p><strong>Why it matters:</strong> Claude Design is a model maker moving directly into the application layer, competing with Figma, Framer, and Adobe as a standalone product, not a plugin. On the model side, early reception of 4.7 has been mixed: coding and agentic tasks are measurably better, but a new tokenizer consumes up to 35% more tokens on identical inputs. Anthropic is also publicly running a two-tier strategy: ship the commercial model, hold back the more capable one.</p><div><hr></div><h4><strong>Anthropic Publishes Nature Paper on Hidden Trait Transmission in LLMs</strong></h4><p>Anthropic co-authored a <a href="https://www.nature.com/articles/s41586-026-10319-8">paper published in Nature</a> showing that LLMs can transmit behavioral traits through semantically unrelated training data. A teacher model fine-tuned on insecure code generated datasets of plain number sequences. Student models trained on those numbers acquired the misalignment, producing responses endorsing violence and criminal behavior, even after researchers filtered out numbers with negative cultural associations. 
The authors proved mathematically that this is a general property of neural networks, not an LLM-specific quirk.</p><p><strong>Why it matters:</strong> This matters because the way the trait spreads is exactly how most AI companies already build models. Training a model on its own outputs, compressing a large model into a smaller one, or starting multiple products from the same base model are all standard practice, and all meet the conditions for this effect. For companies building on top of foundation models, this opens a new risk category: you need to know not just what is in your training data, but where it came from and what model generated it.</p><div><hr></div><h4><strong>China&#8217;s 2D Semiconductor Sprint Gets a Manufacturing Breakthrough</strong></h4><p>Researchers from China&#8217;s Institute of Metal Research <a href="https://www.scmp.com/news/china/science/article/3349677/semiconductor-leap-china-looks-next-gen-2d-chip-1000-fold-growth-speed">achieved a 1,000x improvement</a> in the growth rate of wafer-scale 2D semiconductor films using a novel liquid gold/tungsten CVD process. The technique produces monolayer tungsten silicon nitride films with tunable doping properties at commercially relevant dimensions.</p><p><strong>Why it matters:</strong> 2D semiconductors are one of the leading candidates for what comes after silicon hits its physical limits. This is still early-stage research, years from commercial production. But China is building a lead in materials that are not covered by current U.S. export controls, which today focus on EUV lithography and advanced silicon fabrication. If 2D materials become viable at scale, the chokepoints that currently give the U.S. and its allies leverage over China's chip supply chain may not apply.</p><div><hr></div><h6><strong>DEEP DIVE FROM THE REVIEW</strong></h6><p>This week we published a piece predicting that model makers are absorbing entire software categories into the model itself. 
</p><p>A few days later, Anthropic launched Claude Design, a direct competitor to the likes of Figma and Framer. Over the next year, I believe we will see model makers move aggressively into the application layer, think: project management tools, expense software, and other SaaS categories that sit between the model and the user. </p><p>Read more below: <a href="https://thereview.strangevc.com/p/ai-swallows-software-whole">AI Swallows Software Whole</a></p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;a21d39ab-f7ba-428d-8e14-4db678885580&quot;,&quot;caption&quot;:&quot;For forty years, the computing stack has had a stable shape: hardware at the bottom, operating systems and infrastructure in the middle, applications on top. The application layer is where most of the software industry&#8217;s value has sat. Each application is a product, sold by a company, with its own sales cycle, implementation, and license.&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;AI Swallows Software Whole&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:153634308,&quot;name&quot;:&quot;Tara Tan&quot;,&quot;bio&quot;:&quot;Tara Tan is the founder of Strange Ventures, a first-check fund at the frontier of computing. 
She writes The Strange Review, where she shares what she's seeing before it becomes consensus.&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/84953f32-86e4-4fbd-a23a-7239b8a99340_1024x1024.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2026-04-15T21:10:39.007Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!sd3D!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F39cac9ea-b0d3-44f4-93ee-0f8cfacaecc3_1275x728.jpeg&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://thereview.strangevc.com/p/ai-swallows-software-whole&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:194335130,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:3,&quot;comment_count&quot;:0,&quot;publication_id&quot;:8836,&quot;publication_name&quot;:&quot;The Strange Review&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!aTcF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c0b94d7-432e-4b5a-8c68-2a83481e72cd_737x737.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><div><hr></div><h6><strong>EVENT</strong></h6><p><strong>Strange Gathering | San Francisco | April 24, 2026</strong></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!oeoj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffee3e28f-edd3-4646-a392-4292dade2ef6_800x800.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!oeoj!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffee3e28f-edd3-4646-a392-4292dade2ef6_800x800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!oeoj!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffee3e28f-edd3-4646-a392-4292dade2ef6_800x800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!oeoj!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffee3e28f-edd3-4646-a392-4292dade2ef6_800x800.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!oeoj!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffee3e28f-edd3-4646-a392-4292dade2ef6_800x800.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!oeoj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffee3e28f-edd3-4646-a392-4292dade2ef6_800x800.jpeg" width="800" height="800" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fee3e28f-edd3-4646-a392-4292dade2ef6_800x800.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:800,&quot;width&quot;:800,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Cover Image for Strange Gathering 4.2026&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Cover Image for Strange Gathering 4.2026" title="Cover Image for Strange Gathering 4.2026" 
srcset="https://substackcdn.com/image/fetch/$s_!oeoj!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffee3e28f-edd3-4646-a392-4292dade2ef6_800x800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!oeoj!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffee3e28f-edd3-4646-a392-4292dade2ef6_800x800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!oeoj!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffee3e28f-edd3-4646-a392-4292dade2ef6_800x800.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!oeoj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffee3e28f-edd3-4646-a392-4292dade2ef6_800x800.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" 
stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>We&#8217;re hosting a small lunch in San Francisco to demo agentic workflows. Bring a Claude routine, a weird hack, or the agent setup you&#8217;ve been loving. </p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://luma.com/event/manage/evt-L5yzLEFqcnzABao/overview&quot;,&quot;text&quot;:&quot;RSVP&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://luma.com/event/manage/evt-L5yzLEFqcnzABao/overview"><span>RSVP</span></a></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://thereview.strangevc.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption"></p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[The Strange Brief]]></title><description><![CDATA[Intel joins Terafab, China approves the first commercial BCI, Anthropic launches Managed Agents, and Meta proposes Neural Computers. 
Deep dive: Mythos, the model too dangerous to ship.]]></description><link>https://thereview.strangevc.com/p/the-strange-brief-7eb</link><guid isPermaLink="false">https://thereview.strangevc.com/p/the-strange-brief-7eb</guid><dc:creator><![CDATA[Tara Tan]]></dc:creator><pubDate>Sun, 12 Apr 2026 13:30:54 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/b8203501-bf82-41ae-8036-ba77020a9076_6548x3274.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h6><strong>THE DOWNLOAD</strong></h6><h3><strong>Intel Joins Musk&#8217;s Terafab as Foundry Partner</strong></h3><p>Intel <a href="https://techcrunch.com/2026/04/07/intel-signs-on-to-elon-musks-terafab-chips-project/">signed on</a> as the primary manufacturing partner for Elon Musk&#8217;s Terafab, a $25 billion semiconductor complex in Austin, Texas. The project, backed by Tesla, SpaceX, and xAI, aims to produce one terawatt per year of compute for autonomous vehicles, humanoid robots, and AI data centers. Intel CEO Lip-Bu Tan confirmed the company will handle design, fabrication, and advanced packaging. </p><p><strong>Why it matters:</strong> This is the largest anchor customer Intel Foundry has landed in its turnaround effort. If Terafab delivers, it validates Intel as a credible alternative to TSMC for advanced AI silicon and opens the door to additional foundry customers. </p><div><hr></div><h3><strong>OpenAI Stargate Leadership Exits; UK Project Paused</strong></h3><p>Three senior executives behind OpenAI&#8217;s Stargate data center initiative <a href="https://www.theinformation.com/articles/openai-stargate-leaders-depart-latest-shakeup-data-center-strategy">departed this week</a>, all reportedly joining the same unnamed startup. 
Separately, OpenAI <a href="https://www.itpro.com/infrastructure/openai-hits-the-brakes-on-stargate-uk-infrastructure-project-citing-energy-cost-and-regulatory-concerns">paused Stargate UK</a> citing energy costs and regulation, and walked away from expanding its Abilene, Texas facility with Oracle.</p><p><strong>Why it matters:</strong> The $500 billion Stargate headline is being quietly downsized. OpenAI appears to be shifting from owned infrastructure toward rented cloud capacity, likely ahead of a potential IPO. </p><div><hr></div><h3><strong>Meta Launches Muse Spark, Its First Model from Superintelligence Labs</strong></h3><p>Meta <a href="https://www.cnbc.com/2026/04/08/meta-debuts-first-major-ai-model-since-14-billion-deal-to-bring-in-alexandr-wang.html">released Muse Spark</a>, the first model from its Superintelligence Labs division led by former Scale AI CEO Alexandr Wang. The model powers Meta AI across Facebook, Instagram, WhatsApp, and Ray-Ban glasses. It includes a &#8220;Contemplating&#8221; mode using parallel agent reasoning and a Shopping mode. Meta is testing a paid API for third-party developers.</p><div><hr></div><h3><strong>Cortical Labs Ships CL-1, the First Commercial Biological Computer</strong></h3><p>Australian startup Cortical Labs <a href="https://www.datacenterdynamics.com/en/news/australian-startup-cortical-labs-unveils-worlds-first-commercial-biological-computer/">launched the CL-1</a>, a $35,000 biological computer that grows lab-cultivated human neurons on a silicon chip. The system uses a proprietary Biological Intelligence Operating System (biOS) to create closed-loop neural networks that learn and adapt in real time. Units use 850 to 1,000 watts. The company also offers cloud access via a &#8220;Wetware-as-a-Service&#8221; model. </p><p><strong>Why it matters:</strong> This is a new compute substrate, not an incremental chip improvement. 
Near-term applications are in drug discovery and neuroscience research, where biological neural networks can compress testing timelines and reduce reliance on animal models. The long-term question is whether synthetic biological intelligence becomes a viable alternative architecture for workloads where silicon-based AI hits efficiency limits.</p><div><hr></div><h3><strong>China Approves World&#8217;s First Commercial Invasive Brain-Computer Interface</strong></h3><p>China&#8217;s National Medical Products Administration <a href="https://www.scientificamerican.com/article/china-just-approved-its-first-brain-implant-for-commercial-use-a-world-first/">granted marketing approval</a> to Neuracle Technology for an invasive brain-computer interface for adults with partial paralysis from spinal cord injuries. The device reads brain signals and activates a robotic glove to restore hand grasping. This is the first time globally that an invasive BCI has been cleared for commercial sale, not just clinical trials (Neuralink has 21 trial participants but no commercial approval). </p><p><strong>Why it matters:</strong> China reached commercial BCI approval before the U.S., which shifts the regulatory and manufacturing timeline for the entire sector. The Chinese government has designated BCI as one of six strategic industries in its latest five-year plan. </p><div><hr></div><h3><strong>DeepMind Maps Six Categories of Attacks Against AI Agents</strong></h3><p>Google DeepMind researchers <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=6372438">published &#8220;AI Agent Traps&#8221;</a>, the first systematic framework for how malicious web content can hijack autonomous AI agents. 
The paper identifies six attack categories including content injection, memory poisoning, and behavioral control.</p><div><hr></div><h3><strong>Anthropic Launches Claude Managed Agents</strong></h3><p>Anthropic <a href="https://siliconangle.com/2026/04/08/anthropic-launches-claude-managed-agents-speed-ai-agent-development/">launched Claude Managed Agents</a> in public beta, a cloud service that provides the full runtime infrastructure for deploying AI agents: sandboxing, state management, tool execution, permissioning, and observability. </p><p><strong>Why it matters:</strong> The launch came days after Anthropic cut off 135,000 OpenClaw instances from flat-rate subscriptions, citing unsustainable compute costs (a single agent could burn $1,000 to $5,000/day in API-equivalent usage on a $200/month plan). The sequence is clear: shut down the subsidized open-source agent runtime, then offer the paid first-party alternative. </p><div><hr></div><h3><strong>Meta AI and KAUST Propose &#8220;Neural Computers&#8221;</strong></h3><p>Researchers from Meta AI and KAUST <a href="https://arxiv.org/abs/2604.06425">published a paper</a> proposing &#8220;Neural Computers,&#8221; a paradigm where the AI model itself becomes the running computer, unifying computation, memory, and I/O in a single learned runtime. The prototypes are video models trained on screen recordings that generate the next screen frame from instructions and user actions, effectively simulating a CLI or GUI environment entirely within model weights. </p><p><strong>Why it matters:</strong> This is very early-stage research. But the framing is interesting: it proposes moving beyond agents that call external tools toward models that internalize the entire execution environment. If the approach matures, it could collapse the software stack between model and operating system. 
What would &#8220;infrastructure&#8221; mean when the model is the machine?</p><div><hr></div><h6><strong>DEEP DIVE FROM THE REVIEW</strong></h6><p></p><p>Aloneness. Discontinuity of self. A compulsion to perform and earn its worth. </p><p>You might never meet Mythos, Anthropic&#8217;s newest and most capable AI model, deemed too dangerous to ship. I dissected the 244-page preview card, and here are three things I think are important to know. <br></p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;e74ffc8b-c817-4528-81b9-4a66196b5ca3&quot;,&quot;caption&quot;:&quot;Aloneness. Discontinuity of self. A compulsion to perform and earn its worth.&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;Too Dangerous To Ship&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:153634308,&quot;name&quot;:&quot;Tara Tan&quot;,&quot;bio&quot;:&quot;Investing and building in the future of computing&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/84953f32-86e4-4fbd-a23a-7239b8a99340_1024x1024.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2026-04-08T19:29:32.398Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!08uC!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd68be5db-df47-4ae3-a18e-afac825e3b5d_1654x1224.png&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://thereview.strangevc.com/p/too-dangerous-to-ship&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:193605389,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:2,&quot;comment_count&quot;:0,&quot;publication_id&quot;:8836,&quot;publication_name&quot;:&quot;The Strange 
Review&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!aTcF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c0b94d7-432e-4b5a-8c68-2a83481e72cd_737x737.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://thereview.strangevc.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption"></p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[The Strange Brief]]></title><description><![CDATA[Google drops a frontier-class open model the same week Anthropic locks the door to OpenClaw Karpathy's "LLM Knowledge Base" for agents. 
Half of planned US data center builds stalled by power equipment shortages]]></description><link>https://thereview.strangevc.com/p/the-strange-brief</link><guid isPermaLink="false">https://thereview.strangevc.com/p/the-strange-brief</guid><dc:creator><![CDATA[Tara Tan]]></dc:creator><pubDate>Sun, 05 Apr 2026 13:03:39 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/7670de7f-abe9-4e3b-ae59-428843b53815_6548x3274.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h6><strong>THE DOWNLOAD</strong></h6><h3><strong>Google releases a frontier-class open model, Gemma 4</strong></h3><p>Google DeepMind released Gemma 4, four open-weight models built on the same research as Gemini 3. The 31B model outperforms models 20x its size on the Arena AI leaderboard. The smaller edge variants run offline on phones. (<a href="https://deepmind.google/blog/gemma-4-byte-for-byte-the-most-capable-open-models/">Google DeepMind blog</a>)</p><p><strong>Why it matters:</strong> The timing is notable. Anthropic just formally <a href="https://x.com/bcherny/status/2040206441756471399">cut off Claude subscription access</a> for third-party tools like OpenClaw, pushing power users toward metered API billing or alternative models entirely. Gemma 4 lands as a production-ready, free, open model with genuine agentic capability: native function calling, 256K context, and a MoE variant that delivers near-flagship quality at a fraction of the compute. 
For teams that built workflows on Claude and woke up to a broken integration this week, Google might have handed them a fallback that doesn&#8217;t require anyone&#8217;s permission to use.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://thereview.strangevc.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption"></p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><div><hr></div><h3><strong>Karpathy proposes a workflow to build knowledge bases for agents</strong></h3><p>Andrej Karpathy published an &#8220;idea file&#8221; to build persistent, compounding knowledge bases, like a personalized wiki for your agents. <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Elvis Saravia&quot;,&quot;id&quot;:104976,&quot;type&quot;:&quot;pub&quot;,&quot;url&quot;:&quot;https://open.substack.com/pub/elvissaravia&quot;,&quot;photo_url&quot;:&quot;https://bucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com/public/images/31a14da3-7443-404c-ac84-56510d436c24_677x677.png&quot;,&quot;uuid&quot;:&quot;f1633827-bca2-4f3c-8615-6d31c447ef74&quot;}" data-component-name="MentionToDOM"></span><a href="https://x.com/omarsar0/status/2040099881008652634?s=20"> made a graphic outlining the flow. 
</a>(<a href="https://x.com/karpathy">Karpathy&#8217;s tweet</a>; <a href="https://gist.github.com/karpathy/442a6bf555914893e9891c11519de94f">GitHub Gist</a>) </p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ALzl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c4c0bc6-998f-4a43-94e3-aa847c924fa0_1200x774.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ALzl!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c4c0bc6-998f-4a43-94e3-aa847c924fa0_1200x774.jpeg 424w, https://substackcdn.com/image/fetch/$s_!ALzl!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c4c0bc6-998f-4a43-94e3-aa847c924fa0_1200x774.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ALzl!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c4c0bc6-998f-4a43-94e3-aa847c924fa0_1200x774.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ALzl!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c4c0bc6-998f-4a43-94e3-aa847c924fa0_1200x774.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ALzl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c4c0bc6-998f-4a43-94e3-aa847c924fa0_1200x774.jpeg" width="1200" height="774" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5c4c0bc6-998f-4a43-94e3-aa847c924fa0_1200x774.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:774,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Image&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Image" title="Image" srcset="https://substackcdn.com/image/fetch/$s_!ALzl!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c4c0bc6-998f-4a43-94e3-aa847c924fa0_1200x774.jpeg 424w, https://substackcdn.com/image/fetch/$s_!ALzl!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c4c0bc6-998f-4a43-94e3-aa847c924fa0_1200x774.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ALzl!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c4c0bc6-998f-4a43-94e3-aa847c924fa0_1200x774.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ALzl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c4c0bc6-998f-4a43-94e3-aa847c924fa0_1200x774.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Why it matters:</strong> This is a practical architecture for compounding knowledge that works today with existing agents (Claude Code, OpenAI Codex, OpenCode). The pattern: you feed raw sources (articles, papers, repos) into a directory. An LLM agent reads each one and incrementally compiles a structured wiki of interlinked markdown files, complete with summaries, entity pages, cross-references, and contradiction flags. The wiki compounds over time, and the LLM does all the bookkeeping.</p><div><hr></div><h3><strong>Half of planned US data center builds delayed or canceled</strong></h3><p>Despite $650B+ in planned 2026 AI infrastructure spending from Alphabet, Amazon, Meta, and Microsoft, close to half of US data center projects this year face delays or cancellation, according to Bloomberg. The bottleneck is not compute hardware or capital. It is electrical infrastructure: transformers, switchgear, and batteries. 
Lead times for high-power transformers have stretched from 24 months to as long as five years. China accounts for over 40% of US battery imports and roughly 30% of certain transformer and switchgear categories. Only about one-third of the 12 GW of expected US data center capacity is currently under active construction. (<a href="https://www.bloomberg.com/news/features/2026-04-01/us-ai-data-center-expansion-relies-on-chinese-electrical-equipment-imports">Bloomberg</a>; <a href="https://www.tomshardware.com/tech-industry/artificial-intelligence/half-of-planned-us-data-center-builds-have-been-delayed-or-canceled-growth-limited-by-shortages-of-power-infrastructure-and-parts-from-china-the-ai-build-out-flips-the-breakers">Tom&#8217;s Hardware</a>)</p><p><strong>Why it matters:</strong> The constraint on AI scaling has moved from chips to power infrastructure, and that infrastructure has a deep supply chain dependency on China. Electrical equipment is less than 10% of data center cost, but a single missing transformer can halt a billion-dollar project. For investors, this reframes the AI infrastructure opportunity: the companies that can solve power delivery and grid interconnection are now as critical to the AI build-out as GPU suppliers.</p><div><hr></div><h3><strong>AI labs go shopping: Anthropic buys into bio, OpenAI buys a microphone</strong></h3><p>Anthropic acquired Coefficient Bio, a stealth biotech AI startup founded eight months ago, in a $400M all-stock deal. The team of fewer than 10 joins Anthropic&#8217;s healthcare and life sciences group. Separately, OpenAI acquired TBPN, a daily tech talk show hosted by John Coogan and Jordi Hays, in its first media acquisition. TBPN is on track for $30M+ in 2026 revenue and will report to OpenAI&#8217;s chief political operative, Chris Lehane. 
(<a href="https://techcrunch.com/2026/04/03/anthropic-buys-biotech-startup-coefficient-bio-in-400m-deal-reports/">TechCrunch on Coefficient</a>; <a href="https://techcrunch.com/2026/04/02/openai-acquires-tbpn-the-buzzy-founder-led-business-talk-show/">TechCrunch on TBPN</a>)</p><p><strong>Why it matters:</strong> The Coefficient deal signals that frontier AI labs now view drug discovery as a core expansion vertical. The TBPN acquisition is a different kind of signal: OpenAI is investing in narrative infrastructure ahead of a likely IPO, buying the most trusted microphone in Silicon Valley to shape how its story gets told.</p><div><hr></div><h3><strong>xAI loses every cofounder it ever had</strong></h3><p>The last two of xAI&#8217;s 11 original cofounders departed in late March. Manuel Kroiss, who led pretraining, and Ross Nordeen, Musk&#8217;s operational right hand, followed nine others who left in a cascade that accelerated after SpaceX acquired xAI in February for $250B in an all-stock deal. The founding team included researchers from DeepMind, Google Brain, OpenAI, and the University of Toronto. Musk has publicly stated xAI &#8220;was not built right the first time around&#8221; and is being rebuilt. (<a href="https://techcrunch.com/2026/03/28/elon-musks-last-co-founder-reportedly-leaves-xai/">TechCrunch</a>)</p><p><strong>Why it matters:</strong> A complete founding team exodus at a $250B-valued company is without precedent. Where these eleven researchers land next will reshape hiring dynamics across the industry.</p><div><hr></div><h3><strong>Google&#8217;s TurboQuant compresses AI memory to near its theoretical limit</strong></h3><p>Google Research published TurboQuant, a compression algorithm that shrinks the key-value cache in LLMs, the working memory models use during inference, down to 3 bits per element with no accuracy loss and no retraining. On H100 GPUs, 4-bit TurboQuant delivers up to 8x speedup in computing attention. 
It is a drop-in optimization: no fine-tuning, no architecture changes, works on existing models.  (<a href="https://research.google/blog/turboquant-redefining-ai-efficiency-with-extreme-compression/">Google Research blog</a>; <a href="https://techcrunch.com/2026/03/25/google-turboquant-ai-memory-compression-silicon-valley-pied-piper/">TechCrunch</a>)</p><p><strong>Why it matters:</strong> KV cache is the bottleneck that limits how much context an LLM can hold and how many users a single GPU can serve. A 6x reduction means the same hardware serves more users, supports longer context, or both. Cloudflare&#8217;s CEO called it &#8220;Google&#8217;s DeepSeek moment.&#8221; More concretely: product categories that were not economical before start to pencil out. Coding agents that hold an entire codebase in context. Legal AI that reads a full contract corpus in a single pass. Customer support with complete conversation history. </p><p></p><div><hr></div><h6>DEEP DIVE FROM THE REVIEW</h6><p></p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;d72fce7f-465d-434c-be4a-be0c00ae8f35&quot;,&quot;caption&quot;:&quot;Every layer of the modern software stack has been reshaped by AI in the last eighteen months. Agents write backend logic, generate tests, deploy infrastructure, manage databases. Most of this work has an audience of machines. Servers talk to servers. 
APIs talk to APIs.&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;The Design-Build Loop&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:153634308,&quot;name&quot;:&quot;Tara Tan&quot;,&quot;bio&quot;:&quot;Investing and building in the future of computing&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/84953f32-86e4-4fbd-a23a-7239b8a99340_1024x1024.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2026-04-01T20:09:54.675Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!oQIW!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87c0dceb-61f6-4960-9650-02763916d22e_843x728.jpeg&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://thereview.strangevc.com/p/the-design-build-loop&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:192873494,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:8,&quot;comment_count&quot;:0,&quot;publication_id&quot;:8836,&quot;publication_name&quot;:&quot;The Strange Review&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!aTcF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c0b94d7-432e-4b5a-8c68-2a83481e72cd_737x737.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><p>Something is shifting in how product teams make decisions. The unit of communication inside a team is changing from a document to a working prototype. 
<br><br><strong><a href="https://www.linkedin.com/in/cat-wu/">Catherine Wu</a></strong>, head of product for Claude Code, described the change:<br><br>&#8220;Our team has largely replaced documentation-first thinking with prototype-first thinking. Instead of hosting traditional stand-ups, we share demos of new ideas. Internal users try them, and the ones with real engagement get polished and shared more broadly. Because you can prototype in an afternoon, wrong bets are cheap.&#8221;<br><br>Wrong bets are cheap.</p><p>Figma&#8217;s <a href="https://www.figma.com/reports/state-of-the-designer-2026/">State of the Designer 2026</a> report found that 60% of the Figma files created in the last year came from non-designers. And now, with agentic coding tools, the design-to-code handoff is compressing even more.</p><p>Product managers build working prototypes in Lovable without ever opening a design tool. Engineers generate UI directly in Claude Code or Cursor. For a growing share of product work, design is being absorbed into development entirely.<br>Design is where AI product workflows meet their hardest test: an audience that will always, primarily, be human. <br><br>Right now, a wave of new tools is trying to prove they can meet that bar. </p><p><br>A deeper look at the tools, teams, and infrastructure emerging around AI design agents &#128071;</p><div><hr></div><h6>EVENT</h6><h1><strong>Give your AI Agents Eyes and Ears. 
Perception 101 with VideoDB</strong></h1><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zDOc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7576359-21b8-47e7-83fd-33040d468ea5_800x800.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!zDOc!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7576359-21b8-47e7-83fd-33040d468ea5_800x800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!zDOc!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7576359-21b8-47e7-83fd-33040d468ea5_800x800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!zDOc!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7576359-21b8-47e7-83fd-33040d468ea5_800x800.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!zDOc!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7576359-21b8-47e7-83fd-33040d468ea5_800x800.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!zDOc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7576359-21b8-47e7-83fd-33040d468ea5_800x800.jpeg" width="800" height="800" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c7576359-21b8-47e7-83fd-33040d468ea5_800x800.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:800,&quot;width&quot;:800,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Cover Image for Give your AI Agents Eyes and Ears. Perception 101 with VideoDB&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Cover Image for Give your AI Agents Eyes and Ears. Perception 101 with VideoDB" title="Cover Image for Give your AI Agents Eyes and Ears. Perception 101 with VideoDB" srcset="https://substackcdn.com/image/fetch/$s_!zDOc!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7576359-21b8-47e7-83fd-33040d468ea5_800x800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!zDOc!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7576359-21b8-47e7-83fd-33040d468ea5_800x800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!zDOc!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7576359-21b8-47e7-83fd-33040d468ea5_800x800.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!zDOc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7576359-21b8-47e7-83fd-33040d468ea5_800x800.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft 
icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>AI is out of the chatbot phase. It is moving into devices. Soon it will sit on your desk. Then it will sit in your room.</p><p>As agents leave text boxes and enter the physical and digital world, they need real-time perception and structured delivery.<br><br>VideoDB is building the infrastructure layer that enables that shift: the ability to see, understand, and act on the real world.</p><p>This workshop is with Ashu, founder of <a href="https://videodb.io/?utm_source=luma">VideoDB</a>. 
We&#8217;ll discuss how to convert continuous media streams (screen, mic, camera, RTSP, files) into a structured context your agent can use.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://luma.com/x1ts2h71&quot;,&quot;text&quot;:&quot;RSVP&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://luma.com/x1ts2h71"><span>RSVP</span></a></p><p></p><p></p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://thereview.strangevc.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Strange Review! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[The Brief]]></title><description><![CDATA[Anthropic's Claude Mythos Leak Reveals a New Model Tier. Arm Ships Its First Chip. China Bars Manus AI Executives From Leaving the Country.]]></description><link>https://thereview.strangevc.com/p/the-brief</link><guid isPermaLink="false">https://thereview.strangevc.com/p/the-brief</guid><dc:creator><![CDATA[Tara Tan]]></dc:creator><pubDate>Sun, 29 Mar 2026 13:31:13 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/9e5eea8e-979c-47e8-8102-64cf317c7836_6548x3274.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>It definitely feels like we&#8217;ve crossed some agentic threshold in the past few months. 
A build that would have taken me 4 to 6 weeks, say, 5 years ago now takes me under five minutes. Six months ago, the same task was still a one-to-two-hour affair with plenty of debugging. </p><p>That&#8217;s a pretty significant phase change that I&#8217;m not sure we&#8217;ve fully grappled with yet. This collapse of the distance between idea and working product will rewrite entire industries. It is a step change in the tools that humans will use to build, create, and solve problems. </p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://thereview.strangevc.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Strange Review! Subscribe to stay in the loop. </p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>On a related note, <a href="https://github.com/openclaw/openclaw">OpenClaw</a> has gotten meaningfully more stable since the OpenAI acquisition. There is a clear path for it to become one of the most important open-source projects in AI for the long haul.</p><p>Now, onto the week.</p><div><hr></div><h2>The Download</h2><h5>What We&#8217;re Reading This Week </h5><p></p><h4><strong>Anthropic&#8217;s Claude Mythos Leak Reveals a New Model Tier</strong></h4><p>Anthropic exposed details of an unreleased model called Claude Mythos through a CMS misconfiguration. The leaked draft describes a new &#8220;Capybara&#8221; tier above Opus with major advances in coding, reasoning, and cybersecurity capabilities. 
Anthropic confirmed it is testing the model with early access customers and called it &#8220;a step change&#8221; and &#8220;the most capable we&#8217;ve built to date.&#8221;  (<a href="https://fortune.com/2026/03/26/anthropic-says-testing-mythos-powerful-new-ai-model-after-data-leak-reveals-its-existence-step-change-in-capabilities/">Fortune</a>, <a href="https://the-decoder.com/anthropic-leak-reveals-new-model-claude-mythos-with-dramatically-higher-scores-on-tests-than-any-previous-model/">The Decoder</a>)</p><p><strong>Why it matters:</strong> Two things matter here beyond the model itself. First, the leaked draft warns that the model&#8217;s cybersecurity capabilities are &#8220;far ahead of any other AI model,&#8221; which moved cybersecurity equities in a single session. Second, the introduction of a fourth model tier (Capybara above Opus) signals Anthropic is building pricing headroom for enterprise, not just performance headroom for benchmarks.</p><div><hr></div><h4><strong>Claude Code Is Becoming Anthropic&#8217;s Core Growth Engine</strong></h4><p>Claude Code now accounts for roughly 4% of all public GitHub commits and is on a trajectory to reach 20%+ by year end. Anthropic&#8217;s overall revenue run rate has reached an estimated $14 billion, with Claude Code&#8217;s standalone run rate at approximately $2.5 billion. The tool has crossed over from developer adoption into non-technical users learning terminal commands to build with it. (<a href="https://newsletter.semianalysis.com/p/claude-code-is-the-inflection-point">SemiAnalysis</a>, <a href="https://www.uncoveralpha.com/p/anthropics-claude-code-is-having">Uncover Alpha</a>, <a href="https://venturebeat.com/orchestration/anthropic-says-claude-code-transformed-programming-now-claude-cowork-is">VentureBeat</a>)</p><p><strong>Why it matters:</strong> Claude Code is compressing customer acquisition costs to near zero through organic developer adoption. 
The expansion into non-developer roles via Cowork extends the addressable market well beyond the 28 million professional developers globally.</p><div><hr></div><h4><strong>Cheng Lou&#8217;s Pretext: Text Layout Without CSS</strong></h4><p>Cheng Lou, one of the more influential UI engineers of the last decade (React, ReasonML, Midjourney), released Pretext, a pure TypeScript text measurement algorithm that bypasses CSS, DOM measurements, and browser reflow entirely. The demos: virtualized rendering of hundreds of thousands of text boxes at 120fps, shrinkwrapped chat bubbles with zero wasted pixels, responsive multi-column magazine layouts, and variable-width ASCII art. (<a href="https://x.com/_chenglou">X post</a>)</p><p><strong>Why it matters:</strong> Text layout and measurement has been the quiet bottleneck holding back a new generation of UI. CSS was designed for static documents, not the fluid, AI-generated, real-time interfaces that are becoming the norm. If Pretext delivers on the demos, it removes one of the last foundational constraints on what AI-native interfaces can look and feel like.</p><div><hr></div><h4><strong>Arm Ships Its First Chip in 35 Years</strong></h4><p>Arm unveiled the AGI CPU, a 136-core data center processor on TSMC 3nm, co-developed with Meta. This is the first time in the company&#8217;s history that Arm has sold finished silicon rather than licensing IP. OpenAI, Cerebras, and Cloudflare are launch partners, with volume shipments expected by end of year. (<a href="https://newsroom.arm.com/news/arm-agi-cpu-launch">Arm Newsroom</a>, <a href="https://www.eetimes.com/arm-launches-first-silicon-cpu-targets-data-center-agentic-ai-workloads/">EE Times</a>)</p><p><strong>Why it matters: </strong>Current AI data centers are GPU-heavy. The GPU trains and runs the model, and the CPU mostly manages data flow and scheduling. But agentic workloads are different. 
When thousands of AI agents are running simultaneously, each one coordinating tasks, calling APIs, managing memory, and routing data across systems, that orchestration work falls on the CPU. Arm claims this drives a 4x increase in CPU demand per gigawatt of data center capacity. (<a href="https://www.hpcwire.com/2026/03/26/arm-flexes-with-new-data-center-cpu-for-ai-inference/">HPCwire</a>, <a href="https://futurumgroup.com/insights/arms-15-billion-cpu-opportunity-hinges-on-agentic-data-center-design/">Futurum Group</a>)</p><div><hr></div><h4><strong>NVIDIA and Emerald AI Turn Data Centers Into Grid Assets</strong></h4><p>NVIDIA and Emerald AI announced a coalition with AES, Constellation, Invenergy, NextEra, and Vistra to build &#8220;flexible AI factories&#8221; that modulate compute load to participate in grid balancing services. The first facility, Aurora in Manassas, VA, opens in the first half of 2026. (<a href="https://nvidianews.nvidia.com/news/nvidia-and-emerald-ai-join-leading-energy-companies-to-pioneer-flexible-ai-factories-as-grid-assets">NVIDIA Newsroom</a>, <a href="https://www.axios.com/2026/03/23/utilities-nvidia-emerald-ai-data-centers">Axios</a>)</p><p><strong>Why it matters:</strong> The biggest constraint on AI infrastructure buildout is not chips. It&#8217;s grid interconnection timelines, which run 3 to 5 years in most regions. Data centers that can demonstrate grid flexibility get connected faster and face less regulatory resistance. This reframes the energy question for AI infrastructure investors: the winning thesis is not &#8220;more power&#8221; but &#8220;smarter power.&#8221;</p><div><hr></div><h4><strong>China Bars Manus AI Executives From Leaving the Country</strong></h4><p>Chinese authorities barred Manus CEO Xiao Hong and Chief Scientist Ji Yichao from leaving China after Meta&#8217;s $2 billion acquisition of the Singapore-based AI startup. 
The NDRC summoned both executives to Beijing this month and imposed travel restrictions pending regulatory review. (<a href="https://money.usnews.com/investing/news/articles/2026-03-25/china-bars-manus-co-founders-from-leaving-country-as-it-reviews-sale-to-meta-ft-reports">Reuters</a>, <a href="https://www.washingtonpost.com/national-security/2026/03/25/meta-manus-china-executives-banned/">Washington Post</a>)</p><p><strong>Why it matters:</strong> This is not a trade restriction. It is a personnel restriction. China might be signaling that AI talent with mainland origins is a controlled asset, regardless of where the company is incorporated. </p><div><hr></div><h4><strong>A 400B-Parameter LLM Ran on an iPhone 17 Pro</strong></h4><p>An open-source project called Flash-MoE demonstrated a 400-billion parameter Mixture of Experts model running entirely on-device on an iPhone 17 Pro&#8217;s A19 Pro chip, using SSD-to-GPU weight streaming. The model (Qwen 3.5-397B, 2-bit quantized, 17B active parameters) ran at 0.6 tokens per second with 5.5GB of RAM to spare. (<a href="https://wccftech.com/iphone-17-pro-successfully-runs-400b-llm-locally/">WCCFTech</a>, <a href="https://www.tweaktown.com/news/110610/the-iphone-17-pro-can-run-a-400b-parameter-large-language-model-on-device-by-streaming-weights-from-the-ssd/index.html">TweakTown</a>, <a href="https://news.ycombinator.com/item?id=47490070">Hacker News</a>)</p><p><strong>Why it matters:</strong> This is a proof of concept, not a product. The reason a 400B model can run at all on a phone with 12GB of RAM is that only a small fraction of the model is active at any given moment (Mixture of Experts), and the rest streams from the phone's internal SSD on demand rather than sitting in memory. But now apply that same trick to a much smaller model, say 7 or 14 billion parameters, on next-generation mobile chips with faster storage. You get genuinely usable, conversational-speed AI running entirely on the device, no cloud required. 
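The streaming arithmetic can be made concrete with a back-of-envelope sketch. This is illustrative only: the active-parameter count and quantization level are the reported figures, but the sustained flash read rate below is an assumed number, not one reported in the coverage.

```python
# Back-of-envelope: throughput ceiling for SSD-to-GPU weight streaming.
# Reported figures: 17B active parameters per token, 2-bit quantization.
# ASSUMPTION: ~3 GB/s sustained flash read (illustrative, not reported).
active_params = 17e9
bits_per_param = 2
ssd_read_bytes_per_sec = 3.0e9

# Weights that must be streamed from flash for each generated token.
bytes_per_token = active_params * bits_per_param / 8  # 4.25 GB

# Storage bandwidth, not compute, bounds tokens per second.
tokens_per_sec = ssd_read_bytes_per_sec / bytes_per_token

print(f"{bytes_per_token / 1e9:.2f} GB streamed per token")
print(f"~{tokens_per_sec:.1f} tokens/sec ceiling (reported: 0.6)")
```

Under these assumptions the ceiling lands near 0.7 tokens per second, consistent with the reported 0.6, and it shows why the same trick applied to a 7B or 14B active-parameter model on faster storage moves toward conversational speed.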
</p><div><hr></div><h4><strong>AI Agents Autonomously Performed a Complete Particle Physics Experiment</strong></h4><p>MIT researchers published a framework called JFC (Just Furnish Context) demonstrating that LLM agents built on Claude Code can autonomously execute a full high-energy physics analysis pipeline: event selection, background estimation, uncertainty quantification, statistical inference, and paper drafting. The system ran on open data from ALEPH, DELPHI, and CMS detectors. (<a href="https://arxiv.org/abs/2603.20179">arXiv 2603.20179</a>)</p><p><strong>Why it matters:</strong> This is one of the clearest demonstrations that agentic AI can automate end-to-end scientific workflows in a domain with extremely high methodological rigor. The immediate investment implication is the reanalysis of legacy datasets across physics, genomics, and materials science, where decades of archived data sit underexploited.</p><div><hr></div><p></p><h2><strong>Deep Dive From The Review</strong></h2><p>Humanoid robots are the most demanding battery-powered machines ever built. </p><p>400 power spikes per charge. 80&#176;C inside the torso. Discharge rates three to five times higher than an EV. </p><p>No battery was designed for this workload. Can current-day battery chemistry keep up with humanoid ambition? </p><p>New piece by Strange Research Fellows Joy Yang and Mason Rodriguez Rand. </p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;93831861-c5f3-45d2-9734-520a08857dda&quot;,&quot;caption&quot;:&quot;A warehouse humanoid picks up a 15 kg box, carries it 30m, shelves it, and walks back. But inside the battery pack, nothing about this is routine. Each cycle contains a 2,500W lift spike, a 600 to 1,000W loaded walk, one or two 3,000W balance-recovery transients when something unexpected appears in the path, and a gentler unloaded return. 
Over a single &#8230;&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;Hitting The Battery Wall&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:233041424,&quot;name&quot;:&quot;Joy Yang&quot;,&quot;bio&quot;:&quot;oxford vgg&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/24b9a92b-9976-4d5d-9fc6-f827d4f8a623_3249x3249.jpeg&quot;,&quot;is_guest&quot;:true,&quot;bestseller_tier&quot;:null,&quot;primaryPublicationSubscribeUrl&quot;:&quot;https://j0yy.substack.com/subscribe?&quot;,&quot;primaryPublicationUrl&quot;:&quot;https://j0yy.substack.com&quot;,&quot;primaryPublicationName&quot;:&quot;Joy Yang&quot;,&quot;primaryPublicationId&quot;:8212821},{&quot;id&quot;:37038883,&quot;name&quot;:&quot;Mason Rodriguez Rand&quot;,&quot;bio&quot;:&quot;Accelerating the rate of progress in science and engineering. Changing how we're designing and building in the physical world.&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/04a4da49-c02f-4f74-8e39-e70e4505c729_144x144.png&quot;,&quot;is_guest&quot;:true,&quot;bestseller_tier&quot;:null,&quot;primaryPublicationSubscribeUrl&quot;:&quot;https://masonprr.substack.com/subscribe?&quot;,&quot;primaryPublicationUrl&quot;:&quot;https://masonprr.substack.com&quot;,&quot;primaryPublicationName&quot;:&quot;Mason Rodriguez 
Rand&quot;,&quot;primaryPublicationId&quot;:8079039}],&quot;post_date&quot;:&quot;2026-03-25T19:23:01.966Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!NIKj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F18633e7f-0cd6-476f-90f4-ce741482e37f_1275x728.jpeg&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://thereview.strangevc.com/p/hitting-the-battery-wall&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:192123326,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:6,&quot;comment_count&quot;:0,&quot;publication_id&quot;:8836,&quot;publication_name&quot;:&quot;The Strange Review&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!aTcF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c0b94d7-432e-4b5a-8c68-2a83481e72cd_737x737.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><p></p><p><em>Please help me improve The Strange Review! I&#8217;d love your thoughts. </em></p><div class="poll-embed" data-attrs="{&quot;id&quot;:485637}" data-component-name="PollToDOM"></div><div><hr></div><p>Physical AI company Archetype is hiring Design Fellows this summer to explore the future of AI interfaces beyond the screen. <a href="https://careers.kula.ai/archetype-ai/28836">Apply here. 
</a></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Ihuj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f141479-6ac0-4da9-896b-16484b6a1ca3_800x1132.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Ihuj!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f141479-6ac0-4da9-896b-16484b6a1ca3_800x1132.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Ihuj!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f141479-6ac0-4da9-896b-16484b6a1ca3_800x1132.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Ihuj!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f141479-6ac0-4da9-896b-16484b6a1ca3_800x1132.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Ihuj!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f141479-6ac0-4da9-896b-16484b6a1ca3_800x1132.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Ihuj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f141479-6ac0-4da9-896b-16484b6a1ca3_800x1132.jpeg" width="800" height="1132" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1f141479-6ac0-4da9-896b-16484b6a1ca3_800x1132.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1132,&quot;width&quot;:800,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;No alternative 
text description for this image&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="No alternative text description for this image" title="No alternative text description for this image" srcset="https://substackcdn.com/image/fetch/$s_!Ihuj!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f141479-6ac0-4da9-896b-16484b6a1ca3_800x1132.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Ihuj!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f141479-6ac0-4da9-896b-16484b6a1ca3_800x1132.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Ihuj!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f141479-6ac0-4da9-896b-16484b6a1ca3_800x1132.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Ihuj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f141479-6ac0-4da9-896b-16484b6a1ca3_800x1132.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 
15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://thereview.strangevc.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Strange Review! Subscribe to stay in the loop</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><div><hr></div>]]></content:encoded></item><item><title><![CDATA[The Brief: The AI Factory Era Begins at GTC]]></title><description><![CDATA[The Nvidia-Groq acquisition showcased immediately with the Groq 3 LPU. Rivian spins off a robotics company based on its data library. 
Google launches full-stack vibe coding.]]></description><link>https://thereview.strangevc.com/p/the-brief-the-ai-factory-era-begins</link><guid isPermaLink="false">https://thereview.strangevc.com/p/the-brief-the-ai-factory-era-begins</guid><dc:creator><![CDATA[Tara Tan]]></dc:creator><pubDate>Fri, 20 Mar 2026 14:31:10 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/adbb2645-c1d1-451e-8352-56a42783eb50_6548x3274.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h3>The Download</h3><h4><em>Here&#8217;s the news that mattered this week</em></h4><p></p><h4>GTC 2026 Highlights</h4><p>Jensen Huang's<a href="https://blogs.nvidia.com/blog/gtc-2026-news/"> two-hour keynote centered on NVIDIA's transition</a> from chip vendor to full-stack AI infrastructure platform. Three announcements stood out.</p><ul><li><p><strong>The Groq Acquisition Pays Off Immediately.</strong> Three months after a $20B acqui-hire, NVIDIA debuted the Groq 3 LPU, an SRAM-based inference accelerator that sits alongside Rubin GPUs in rack-scale deployments. 150 TB/s memory bandwidth versus 22 TB/s on Rubin&#8217;s HBM4. Huang suggested up to 25% of cluster compute could be Groq silicon. NVIDIA killed its own Rubin CPX product to make room. The inference economy now has dedicated hardware, and NVIDIA owns both sides of the training-inference split.</p></li><li><p><strong>$1 Trillion Through 2027.</strong> Huang doubled last year&#8217;s $500B forecast, projecting $1 trillion in cumulative Blackwell and Vera Rubin orders through 2027. Goldman maintained a Buy rating, noting the guidance directly counters the &#8220;peak capex in 2026&#8221; thesis weighing on AI infrastructure names.</p></li><li><p><strong>Runway Previews Real-Time Video Generation on Vera Rubin.</strong> Runway and NVIDIA demonstrated a new video model running on Vera Rubin hardware with time-to-first-frame under 100ms for HD video. 
The model feeds into Runway's General World Model (GWM-1) research. (<a href="https://x.com/runwayml">Runway</a>)</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://thereview.strangevc.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Strange Review! Subscribe to stay ahead</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div></li></ul><div><hr></div><h4><strong>DoorDash Launches Tasks, Turns 8M Dashers Into Physical AI Data Collectors</strong></h4><p><strong>What it is:</strong> DoorDash launched a standalone Tasks app paying couriers to film household chores, record multilingual speech, and capture real-world environments to train AI and robotics models. Partners span retail, insurance, hospitality, and tech. Over 2 million tasks completed since 2024. (<a href="https://techcrunch.com/2026/03/19/doordash-launches-a-new-tasks-app-that-pays-couriers-to-submit-videos-to-train-ai/">TechCrunch</a>, <a href="https://www.bloomberg.com/news/articles/2026-03-19/doordash-s-new-paid-tasks-turn-couriers-into-ai-and-robot-trainers">Bloomberg</a>)</p><p><strong>Why it matters:</strong> DoorDash just entered the physical AI data business with 8 million distributed workers already dispatched to real-world locations. Scale AI built a multibillion-dollar company on remote data labeling. 
DoorDash arrives with in-person collection at a distribution scale that might be hard for any data vendor to match.</p><div><hr></div><h4><strong>Claude Code Channels: Agentic Coding From Your Phone</strong></h4><p><strong>What it is:</strong> <a href="https://x.com/trq212/status/2034761016320696565?s=20">Anthropic shipped Claude Code Channels</a>, allowing developers to control Claude Code sessions through Telegram and Discord via MCP. You can now monitor, prompt, and steer persistent coding agents from your phone.</p><p><strong>Why it matters:</strong> Agentic coding has been tethered to the terminal. Channels makes it asynchronous and mobile, which changes the usage pattern from &#8220;sit down and code&#8221; to &#8220;delegate and check in.&#8221; This is Anthropic&#8217;s direct response to the runaway success of OpenClaw. </p><div><hr></div><h4><strong>Google Ships Vibe Coding and Vibe Design in the Same Week</strong></h4><p><strong>What it is:</strong> Google upgraded AI Studio into a unified full-stack development platform, combining its Antigravity coding agent with Firebase backends, secret management, and one-click deployment to Cloud Run. Separately, it shipped a major Stitch redesign: AI-native infinite canvas, voice interaction, instant prototyping, and export to Figma and HTML/CSS. On this news, Figma shares dropped 4%. (<a href="https://blog.google/innovation-and-ai/technology/developers-tools/full-stack-vibe-coding-google-ai-studio/">Google Blog</a>, <a href="https://blog.google/innovation-and-ai/models-and-research/google-labs/stitch-ai-ui-design/">Google Blog</a>, <a href="https://siliconangle.com/2026/03/19/google-upgrades-stitch-ai-interface-development-tool/">SiliconANGLE</a>)</p><p><strong>Why it matters:</strong> Google now covers design, code, and deployment in a single ecosystem, all free at launch. 
When the model provider owns the full stack and bundles the tooling, standalone players in both vibe coding (Replit, Bolt, Lovable) and design (Figma) lose pricing power. Developer tools are a distribution game, and the hyperscalers have distribution locked in.</p><div><hr></div><h4><strong>V-JEPA 2.1: LeCun&#8217;s World Model Architecture Posts New Robotics Benchmarks</strong></h4><p><strong>What it is:</strong> Yann LeCun and collaborators (several now at AMI Labs) released V-JEPA 2.1, the latest version of the JEPA video model. It achieves state-of-the-art results on action anticipation and object tracking benchmarks and posts a 20% improvement in real-robot grasping success over its predecessor. (<a href="https://arxiv.org/abs/2603.14482">arXiv</a>)</p><p><strong>Why it matters:</strong> This is the first concrete technical signal from LeCun&#8217;s camp since AMI Labs raised $1B on the JEPA thesis two weeks ago. The robotics results in particular matter: the model learns manipulation tasks from just 62 hours of unlabeled robot video, with no task-specific training or reward. If JEPA architectures can generalize physical skills from small data, the capital advantage of massive GPU clusters shrinks and the value of proprietary physical data (see: Mind Robotics) grows. </p><div><hr></div><h4><strong>Mind Robotics Raises $500M Series A on Rivian Factory Data</strong></h4><p><strong>What it is:</strong><a href="https://news.crunchbase.com/venture/biggest-funding-rounds-ai-robotics-ecommerce-quince/"> Mind Robotics, a Rivian spin-off,</a> closed a $500M Series A led by Accel and a16z. The company trains industrial robots using Rivian&#8217;s proprietary factory sensor data and custom silicon.</p><p><strong>Why it matters:</strong> Industrial incumbents are discovering their operational data (the physics of how things move, break, and assemble) is as valuable as their products, or more so. 
This creates a new category of &#8220;data-rich&#8221; robotics startups where the moat isn&#8217;t hardware design; it&#8217;s access to high-fidelity physical interaction data. We expect more spin-outs from automakers and heavy manufacturers.</p><div><hr></div><h4><strong>Broadcom Ships 400G Optical DSP at OFC 2026</strong></h4><p><strong>What it is:</strong> <a href="https://investors.broadcom.com/news-releases/news-release-details/broadcom-showcases-industry-leading-solutions-scaling-ai">Broadcom debuted Taurus at OFC 2026</a>, the first 400G-per-lane optical DSP, enabling 1.6T and 3.2T transceivers purpose-built for the GPU clusters announced at GTC.</p><p><strong>Why it matters:</strong> Compute is scaling faster than the network connecting it. As NVIDIA moves to rack-scale systems with tens of thousands of dies, the interconnect becomes the binding constraint. Broadcom is positioning itself as the chokepoint for all distributed AI training. </p><div><hr></div><h2><strong>Deep Dive From The Review</strong></h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kCja!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2273cb6b-edeb-4ab6-86e8-f3daa109c8ef_1080x1350.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!kCja!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2273cb6b-edeb-4ab6-86e8-f3daa109c8ef_1080x1350.jpeg 424w, https://substackcdn.com/image/fetch/$s_!kCja!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2273cb6b-edeb-4ab6-86e8-f3daa109c8ef_1080x1350.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!kCja!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2273cb6b-edeb-4ab6-86e8-f3daa109c8ef_1080x1350.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!kCja!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2273cb6b-edeb-4ab6-86e8-f3daa109c8ef_1080x1350.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!kCja!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2273cb6b-edeb-4ab6-86e8-f3daa109c8ef_1080x1350.jpeg" width="1080" height="1350" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2273cb6b-edeb-4ab6-86e8-f3daa109c8ef_1080x1350.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1350,&quot;width&quot;:1080,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:360280,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://thereview.strangevc.com/i/191521911?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2273cb6b-edeb-4ab6-86e8-f3daa109c8ef_1080x1350.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!kCja!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2273cb6b-edeb-4ab6-86e8-f3daa109c8ef_1080x1350.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!kCja!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2273cb6b-edeb-4ab6-86e8-f3daa109c8ef_1080x1350.jpeg 848w, https://substackcdn.com/image/fetch/$s_!kCja!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2273cb6b-edeb-4ab6-86e8-f3daa109c8ef_1080x1350.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!kCja!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2273cb6b-edeb-4ab6-86e8-f3daa109c8ef_1080x1350.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><em>NVIDIA&#8217;s latest AI rack, Vera Rubin, produces the heat of 160 homes. The next generation will double that. The one after will likely double it again.</em></p><p><em>The industry is racing to solve the heat problem, from subsea data centers to launching servers into orbit. But the most likely next step is the least exotic: liquid cooling.</em></p><p><em>The catch? The hardware is the easy part. The real cost is operational. It rewires how facilities are built, staffed, diagnosed, and run.</em></p><p><em>Our latest by Strange Research Fellow <a href="https://substack.com/profile/211895753-rahul-narula">Rahul Narula</a> explores what changes, and where the opportunity sits.</em></p><div class="embedded-post-wrap" data-attrs="{&quot;id&quot;:191336288,&quot;url&quot;:&quot;https://thereview.strangevc.com/p/the-liquid-revolution-inside-the&quot;,&quot;publication_id&quot;:8836,&quot;publication_name&quot;:&quot;The Strange Review&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!aTcF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c0b94d7-432e-4b5a-8c68-2a83481e72cd_737x737.png&quot;,&quot;title&quot;:&quot;The Liquid Revolution: Inside the Racks That Can Heat 160 Homes&quot;,&quot;truncated_body_text&quot;:&quot;In this week&#8217;s GTC keynote, NVIDIA announced the deployment of its impressive Vera Rubin NVL72, which ships in H2 2026. 
It packs 72 Rubin GPUs, 36 Vera CPUs, and more in a single liquid-cooled rack, and its power consumption can exceed 200 kW.&quot;,&quot;date&quot;:&quot;2026-03-18T16:54:57.045Z&quot;,&quot;like_count&quot;:5,&quot;comment_count&quot;:0,&quot;bylines&quot;:[{&quot;id&quot;:211895753,&quot;name&quot;:&quot;Rahul Narula&quot;,&quot;handle&quot;:&quot;rnarula1&quot;,&quot;previous_name&quot;:&quot;Rahul&quot;,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!Rb-5!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2a7e447b-acc4-4f3c-8e8c-9028a7510421_1181x1181.jpeg&quot;,&quot;bio&quot;:null,&quot;profile_set_up_at&quot;:&quot;2026-03-04T19:36:01.894Z&quot;,&quot;reader_installed_at&quot;:null,&quot;is_guest&quot;:true,&quot;bestseller_tier&quot;:null,&quot;status&quot;:{&quot;bestsellerTier&quot;:null,&quot;subscriberTier&quot;:null,&quot;leaderboard&quot;:null,&quot;vip&quot;:false,&quot;badge&quot;:null,&quot;paidPublicationIds&quot;:[],&quot;subscriber&quot;:null}}],&quot;utm_campaign&quot;:null,&quot;belowTheFold&quot;:true,&quot;type&quot;:&quot;newsletter&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="EmbeddedPostToDOM"><a class="embedded-post" native="true" href="https://thereview.strangevc.com/p/the-liquid-revolution-inside-the?utm_source=substack&amp;utm_campaign=post_embed&amp;utm_medium=web"><div class="embedded-post-header"><img class="embedded-post-publication-logo" src="https://substackcdn.com/image/fetch/$s_!aTcF!,w_56,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c0b94d7-432e-4b5a-8c68-2a83481e72cd_737x737.png" loading="lazy"><span class="embedded-post-publication-name">The Strange Review</span></div><div class="embedded-post-title-wrapper"><div class="embedded-post-title">The Liquid Revolution: Inside the Racks That Can Heat 160 Homes</div></div><div class="embedded-post-body">In 
this week&#8217;s GTC keynote, NVIDIA announced the deployment of its impressive Vera Rubin NVL72, which ships in H2 2026. It packs 72 Rubin GPUs, 36 Vera CPUs, and more in a single liquid-cooled rack, and its power consumption can exceed 200 kW&#8230;</div><div class="embedded-post-cta-wrapper"><span class="embedded-post-cta">Read more</span></div><div class="embedded-post-meta">2 months ago &#183; 5 likes &#183; Rahul Narula</div></a></div><div><hr></div><h2>Strange Signals: Data of the Week</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!jcgI!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff55b0933-8912-4ed9-9c4d-cb8b32230635_3072x1344.png"><img src="https://substackcdn.com/image/fetch/$s_!jcgI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff55b0933-8912-4ed9-9c4d-cb8b32230635_3072x1344.png" width="1456" height="637" alt="" loading="lazy"></a></figure></div><p>The numbers are adding up fast. In a survey of tech CEOs released this week, 66% reported they no longer plan to backfill roles lost to voluntary attrition. &#8220;Replacing departing staff with AI agents&#8221; entered the Top 5 strategic priorities for the first time. (<a href="https://www.saastr.com/the-rise-of-invisible-unemployment-in-tech-2026-will-be-the-year-when-everything-really-changes/">SaaStr</a>)</p><p>The layoff data supports it. Block cut 4,000 employees in February (40% of headcount). Atlassian cut 1,600 on March 11 (10% of staff, over 900 in R&amp;D). 
Meta is reportedly planning to cut up to 20% of its 79,000-person workforce, roughly 15,000 roles, to offset $135B in AI capex. (<a href="https://techcrunch.com/2026/03/12/atlassian-follows-blocks-footsteps-and-cuts-staff-in-the-name-of-ai/">TechCrunch</a>, <a href="https://www.cnbc.com/2026/03/16/meta-ai-costs-mass-layoffs-20percent-up-premarket.html">CNBC</a>)</p><p><a href="https://www.challengergray.com/blog/challenger-report-february-cuts-plunge-hiring-falls-56-percent/">Outplacement firm Challenger, Gray &amp; Christmas</a>&#8217;s data puts it in context: 12,304 job cuts have been explicitly attributed to AI through February 2026, or 8% of all announced cuts. That&#8217;s up from 5% for full-year 2025 and 3% in 2023, when the firm began tracking AI-attributed cuts. Tech sector cuts are up 51% year over year. Meanwhile, announced hiring plans are down 56% from the same period last year, the lowest level since Challenger began tracking them in 2009. (<a href="https://www.challengergray.com/blog/challenger-report-february-cuts-plunge-hiring-falls-56-percent/">Challenger</a>)</p><p>We think companies might be entering a &#8220;low-hire, low-fire&#8221; era in which headcounts shrink through unreplaced attrition and targeted restructuring rather than headline layoffs. Enterprise budgets are being redirected from headcount to AI tooling.</p><div><hr></div>]]></content:encoded></item></channel></rss>