📰 subtl daily briefing

Good morning, founders and builders. The AI arms race just got a lot cheaper — and a lot more disruptive. DeepSeek dropped a bombshell open-source model this week, Meta quietly laid off 8,000 people to pay for its $72B AI bet, and one analyst named Jeremy replaced a 100-person team with a Claude subscription. Let's get into it.

In today's briefing

  • 1. DeepSeek V4 Undercuts Competitors by 90%
  • 2. Meta Cuts 8K Jobs to Fund AI Buildout
  • 3. Open-Source Models Built for Agentic AI
  • 4. B2B Buyers Demanding Shorter Contracts
  • 5. Evan Spiegel: Distribution Is the Only Moat
  • Quick hits on other news
Latest Developments
AI

🚀 DeepSeek V4 Launches With 1M-Token Context at 10% the Cost of ChatGPT

The Rundown: DeepSeek released V4 Pro with a 1-million-token context window and pricing at just $4/million output tokens — roughly 10% of what ChatGPT and Claude charge — while openly acknowledging it trails frontier models by 3-6 months.

The details:

  • DeepSeek V4 Pro features 1.6 trillion parameters and uses novel Compressed Sparse Attention (CSA) and Heavily Compressed Attention (HCA) to achieve 1M-token context on a fraction of the VRAM of its predecessor
  • Output pricing sits at $4/million tokens, compared to roughly $40/million for ChatGPT and Claude — a 90% cost reduction for developers building on top of frontier-class models
  • DeepSeek self-disclosed that V4 trails GPT-5.4 and Gemini 3.1 Pro by 3-6 months on key benchmarks, a rare move of transparency from an AI lab
  • The model is released under a permissive MIT license, making it freely deployable for commercial use without royalties
Why it matters: For founders building AI-native products, DeepSeek V4 is a direct shot across the bow at your cost structure. If you're paying OpenAI or Anthropic rates for inference, you now have a legitimate open-source alternative at a fraction of the price. The MIT license means you can self-host, fine-tune, and ship without vendor lock-in. The 1M-token context window also opens entirely new product categories — long-form document analysis, codebase-level reasoning, and persistent agent memory — that were economically unviable at previous price points.
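To make the pricing gap concrete, here is a minimal sketch comparing monthly inference bills at the two rates cited above; the 50M-token monthly volume is an illustrative assumption, not a figure from the source.

```python
# Inference-cost comparison using the per-token prices cited above:
# $4/M output tokens for DeepSeek V4 vs ~$40/M for frontier APIs.
# The monthly token volume is a made-up example figure.

PRICES_PER_MILLION = {"deepseek_v4": 4.00, "frontier_api": 40.00}

def monthly_cost(output_tokens: int, price_per_million: float) -> float:
    """Dollar cost for a given number of output tokens."""
    return output_tokens / 1_000_000 * price_per_million

tokens_per_month = 50_000_000  # assumed product volume (example only)

for name, price in PRICES_PER_MILLION.items():
    print(f"{name}: ${monthly_cost(tokens_per_month, price):,.2f}/month")

saving = 1 - PRICES_PER_MILLION["deepseek_v4"] / PRICES_PER_MILLION["frontier_api"]
print(f"relative saving: {saving:.0%}")  # matches the 90% figure above
```

At any volume the ratio is fixed by the per-token prices, so the 90% saving holds regardless of the assumed usage.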

📰 Source: The Neuron / AlphaSignal

AI

💼 Meta Cuts 8,000 Jobs — The First Major 'AI Layoff at an AI Company' Moment of 2026

The Rundown: Meta eliminated 8,000 roles — 10% of its workforce — redirecting that payroll toward its $72 billion AI capital expenditure plan, marking a defining moment in how Big Tech is restructuring around AI.

The details:

  • Meta cut 8,000 jobs (10% of total staff), the largest single workforce reduction the company has made since its 2023 'Year of Efficiency' layoffs
  • The cuts are explicitly tied to funding Meta's $72B AI capex plan for 2026, covering data centers, GPUs, and infrastructure
  • A SemiAnalysis analyst nicknamed 'Jeremy' spent $6,000/day on Claude tokens and singlehandedly rebuilt a product that a 100-person team had spent a decade building — illustrating the new productivity equation driving these decisions
  • Google separately committed up to $40 billion to Anthropic, signaling that hyperscaler AI investment is accelerating even as headcounts shrink
Why it matters: The 'Jeremy' story is the most important data point here for founders and operators. Meta's layoffs aren't about a failing business — they're about a fundamentally different ratio of humans to output. One highly AI-fluent employee with the right tools and $6K/day in compute is now outproducing teams of 100. For startups, this is both a threat and an opportunity: the companies that figure out how to hire 'Jeremys' and build AI-amplified workflows will have a structural cost and speed advantage over everyone else. The question isn't whether AI will replace roles — it's whether your team is on the replacing or the replaced side of that equation.
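The 'Jeremy' equation is easy to sanity-check. A back-of-the-envelope sketch, where only the $6,000/day Claude spend comes from the story; the working-day count and fully loaded engineer cost are our assumptions, not figures from the source.

```python
# Rough economics of one AI-amplified engineer vs a 100-person team.
# Only the $6,000/day compute spend is from the story; the headcount
# cost figures below are illustrative assumptions.

compute_per_day = 6_000        # daily Claude token spend (from the story)
working_days = 260             # assumed working days per year
cost_per_engineer = 250_000    # assumed fully loaded annual cost (not from the source)
team_size = 100                # the team whose decade of work Jeremy rebuilt

annual_jeremy = compute_per_day * working_days + cost_per_engineer
annual_team = team_size * cost_per_engineer

print(f"one AI-amplified engineer: ${annual_jeremy:,}/yr")
print(f"100-person team (assumed): ${annual_team:,}/yr")
print(f"cost ratio: {annual_team / annual_jeremy:.1f}x")
```

Even with $6K/day in tokens, the single-engineer setup comes in at roughly an order of magnitude cheaper than the team it replaced under these assumptions, which is the productivity equation the layoffs are pricing in.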

📰 Source: The Neuron

AI

🤖 Kimi-K2.6 and Open-Source Models Are Being Built Specifically for Agentic AI

The Rundown: A new wave of open-source models, led by Kimi-K2.6 and Qwen3.6-27B, is being architected from the ground up for long-horizon agentic tasks, not just chat, signaling a fundamental shift in how the open-source ecosystem positions itself.

The details:

  • Kimi-K2.6 tops the Artificial Analysis Intelligence Index among all open models, featuring native multimodal support, 256K context, and proven performance on agent swarm orchestration over long task horizons
  • Kimi-K2.6 carries a modified MIT license requiring UI attribution for any product exceeding 100M monthly active users or $20M in revenue — a notable caveat for scaling startups
  • Qwen3.6-27B, released under Apache 2.0, is runnable locally on M-series MacBook Pros and scores competitively on agentic coding benchmarks without the complexity of a Mixture-of-Experts architecture
  • Both models reflect a deliberate design philosophy shift: MoE architectures, massive context windows, and tool-use capabilities built for autonomous multi-step agents, not just single-turn queries
Why it matters: The open-source model landscape is no longer just about raw benchmark scores — it's converging on agentic use cases, which is exactly where enterprise software value is being created right now. For founders building agent-based products, the ability to run competitive models locally (Qwen3.6 on a MacBook Pro) or deploy Kimi-K2.6 for complex multi-agent orchestration without paying frontier API costs is a genuine competitive unlock. Watch the licensing terms carefully though — Kimi's revenue threshold clause could create unexpected legal exposure for fast-growing startups.
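The attribution trigger in Kimi's modified MIT license reduces to a simple threshold check. A minimal sketch using the two thresholds stated above (100M MAU or $20M revenue); the function and its names are illustrative, not part of the license text.

```python
# Sketch of the Kimi-K2.6 modified-MIT attribution trigger described
# above: UI attribution is required once a product exceeds 100M
# monthly active users OR $20M in revenue. Thresholds are from the
# text; the code itself is an illustrative model of the clause.

MAU_THRESHOLD = 100_000_000
REVENUE_THRESHOLD = 20_000_000

def attribution_required(mau: int, annual_revenue: float) -> bool:
    """True once either threshold in the license clause is exceeded."""
    return mau > MAU_THRESHOLD or annual_revenue > REVENUE_THRESHOLD

print(attribution_required(mau=5_000_000, annual_revenue=3_000_000))    # early stage
print(attribution_required(mau=120_000_000, annual_revenue=5_000_000))  # MAU trips it
```

The practical takeaway is the OR: crossing either threshold alone triggers the clause, so a low-revenue consumer product with viral MAU growth hits it just as fast as a high-revenue B2B one.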

📰 Source: AlphaSignal


Everything else in the news today

  • Caleb Franzen (Cubic Analytics) says Bitcoin remains in a bear market until it reclaims the 2-day 200 MA at ~$95,650 — he's using Williams %R to time re-accumulation entries
  • Michael McGuiness (Etherealize) argues ETH is more secure than Bitcoin due to proof-of-stake economics and projects a path to $250K if ETH reprices as a monetary asset rather than a tech token
  • Milk Road's head of research predicts at least one of Anthropic, OpenAI, xAI, or Google Gemini will go bankrupt, citing zero moat in AI models and inference costs that scale with engagement
  • Goldman Sachs, BlackRock, and Morgan Stanley are all launching new ETF products tied to MicroStrategy's institutional Bitcoin strategy, driving fresh institutional BTC inflows
  • Boys' high school volleyball participation surged 76% over the last decade, creating a recruiting pipeline small colleges are now using to survive enrollment crises
  • Division II and III schools now have athletes comprising ~25% of all students, up from ~15% in 2004, making niche sports a critical enrollment and financial lifeline
  • Hartwick College launched a men's volleyball team this year as part of a broader survival strategy that also includes slashing tuition and adding new majors after enrollment dropped ~30% since the early 2010s
  • Google committed up to $40 billion to Anthropic — one of the largest single AI investment commitments from a hyperscaler to date
  • Kimi-K2.6's modified MIT license requires UI attribution for any product exceeding 100M MAU or $20M in revenue — a clause fast-growing AI startups should flag before building on it
  • Qwen3.6-27B is fully runnable on M-series MacBook Pros under Apache 2.0, making enterprise-grade agentic coding viable without cloud API costs