Anthropic Leaked Claude Code's Entire Source. Again.
A debug file got bundled into a routine npm update of Claude Code on Monday. That file pointed to a zip on Anthropic's cloud storage containing the full source: nearly 2,000 files, 500,000 lines of TypeScript. Within hours, a mirror of the code on GitHub had racked up 41,500+ forks.
This is the second time in just over a year that Anthropic has accidentally shipped Claude Code's internals. The leaked code revealed dozens of unreleased feature flags, including a persistent background assistant, remote control from a phone or browser, and session review tools. Anthropic says no customer data was exposed.
Here's what gets me. The features in those flags sound genuinely useful. A background assistant that keeps working after you close the terminal? Remote control from your phone? That's the kind of stuff small dev shops would pay for tomorrow. The irony is that the leak probably generated more excitement about Claude Code than any launch event could have.
Perplexity AI Sued Over Sharing User Chats With Meta and Google
A class-action complaint filed Tuesday in San Francisco federal court alleges that Perplexity loads trackers into your browser the moment you log in. Those trackers allegedly give Meta and Google full access to your conversations with Perplexity's search engine, even in Incognito mode.
Perplexity denies it. A spokesperson said the company doesn't share user data with either firm and hadn't even been served with the lawsuit. But this comes on top of Amazon winning a court order blocking Perplexity's Comet shopping agent over unauthorized access.
If you're building products on top of AI search APIs, this matters. The trust layer between users and AI tools is still thin. One credible privacy scandal and your users start looking for alternatives. For small businesses using Perplexity in their workflow, it's worth watching how this plays out before going deeper.
Mistral's Open-Source TTS Model Beats ElevenLabs
Mistral released Voxtral TTS last week. It's a 4-billion parameter text-to-speech model that supports nine languages. The weights are free on Hugging Face. Human evaluations show it matches or beats ElevenLabs on naturalness, with a time-to-first-audio of 90 milliseconds.
The real kicker: it can clone a voice from less than five seconds of audio. Accents, inflections, speech quirks. All captured. The API runs $0.016 per 1,000 characters if you don't want to self-host.
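At that rate, API costs are easy to ballpark before committing. Here's a quick back-of-envelope sketch: the $0.016 per 1,000 characters rate comes from the pricing above, while the workload numbers (a 5,000-word article, ~6 characters per word) are hypothetical.

```python
# Back-of-envelope cost estimate for Voxtral's hosted API.
# Rate is the $0.016 per 1,000 characters quoted above;
# the workload figures below are made-up assumptions.
RATE_PER_1K_CHARS = 0.016  # USD

def tts_cost(num_chars: int) -> float:
    """Estimated API cost in USD for synthesizing num_chars of text."""
    return num_chars / 1000 * RATE_PER_1K_CHARS

# Hypothetical workload: a 5,000-word article at ~6 chars per word.
article_chars = 5000 * 6  # 30,000 characters
print(f"One article:        ${tts_cost(article_chars):.2f}")
print(f"100 articles/month: ${tts_cost(article_chars * 100):.2f}")
```

That works out to roughly $0.48 per article, or about $48 for a hundred articles a month, which is the kind of number that makes self-hosting attractive only at real volume.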
This is a big deal for anyone building voice features. ElevenLabs charges significantly more, and now there's an open-weight alternative that holds up in quality. If you're a solo dev or small team adding voice to your app, Voxtral just dropped your costs to nearly zero. Self-host it and your only expense is compute.
The Thread Connecting All Three
Every story this week points the same direction. The tools are getting cheaper and more accessible. The code behind the biggest AI products keeps finding its way into the open. And the companies building these tools still haven't figured out the trust basics around privacy and security.
For builders, that gap is the opportunity. Ship things people trust, using tools that cost less every month. That's the game right now.