Hey, it's Lucy,

In today's issue:

  • The AI gap that’s sinking employee retention 

  • What happens when LLMs get subpoenaed for your data? 

  • Our favorite note-taking app is trending 

🧠 This Week's Fix: The hidden reason you're losing your best people

The Microsoft AI Economy Institute dropped a stat that's been making the rounds: only 16% of the world's population is currently using AI. A lot of founders are reading that as a massive untapped market. 

In this week’s HyperFix breakdown on YouTube, we kept coming back to a more urgent question for anyone running a team right now: Are you prepared to invest in top AI talent? 

Here's what I suspect is happening inside a lot of companies right now: your most ambitious employees are already using AI on their own dime and learning skills on their own time. They're delivering faster results, while the employees who aren't investing on the side fall behind. And once the ones with real skills and talent have an advantage, they'll go elsewhere and take their skills and AI tools with them.

The employees who aren't self-teaching aren't lazy. They're busy doing the job you hired them to do. The gap between these two groups is growing every week you don't have a real policy in place.

As a founder, this is actually an easier problem to solve than you think. Google has a longstanding model of giving employees the second half of Friday as unstructured time for learning, side projects, and exploration.

And training doesn’t have to cost much. Anthropic just released free short-form courses that can take someone from zero to genuinely productive with AI pretty fast.

An AI tool stipend, protected learning time, or even just a clear internal policy on which tools are approved: these are low-cost signals that tell your team you're building something worth staying for.

🔒 Hot Takes: You have nothing to hide... until you do

I've been sitting with something Meredith Whittaker, Signal's president, said in an interview with Scott Galloway on the Prof G podcast.

She spoke about how your LLM queries could be subpoenaed, which is why end-to-end encryption is so important.

“Encryption either works for everyone or it works for no one. You can't have a backdoor for some users and real privacy for others,” she said. 

Signal operates with a very different philosophy from most tech companies.

Instead of collecting data and protecting it, Signal tries to avoid collecting it in the first place. If the data doesn't exist, it can't be leaked or subpoenaed.

Signal also publishes the subpoenas they receive and what (if anything) they turn over.

In this week’s YouTube breakdown, Xavier explains why cross-platform encryption is so hard: basically, both people have to be on the same app for it to work. That's why Signal and ProtonMail are great in theory but hard to actually adopt. Network effects keep everyone on WhatsApp and Gmail even when better alternatives exist.

This is also why it’s important for companies to have enterprise and custom AI tools and agents that protect company data and trade secrets, along with clear policies about what employees can and can’t input into LLMs (and why people should never be using personal AI tools for work!).

Other points Whittaker makes: 

  • She believes people have a right to private communication.

    That includes journalists, dissidents, whistleblowers, ordinary friends, and anyone else trying to live a full human life with room for intimacy, dissent, and independent thought.

  • On regulation, she points to meaningful consent.

    Not performative cookie banners. Real consent around whether companies should be allowed to collect and generate data about people in the first place.

  • At the core, she thinks tech companies have been given too much authority to define us.

    Through data collection, profiling, and algorithmic sorting, companies increasingly shape how we are understood by advertisers, institutions, and even governments.

  • She does not think consumers freely chose to give up privacy.

    Her view is that people want connection, convenience, inclusion, and participation. The problem is that modern digital life has made those needs increasingly dependent on surveillance-driven platforms.

  • Her broader message is that privacy loss is structural, not just personal failure.

    People are not careless because they do not care. They often lack meaningful alternatives and real control.

AI Kool-Aid: The note-taker that doesn't crash your meetings

Granola is an AI meeting note tool that’s been showing up on Ramp’s trending and fastest-growing lists of top SaaS vendors over the past month.

Here's what makes it different from Otter, Fireflies, Fathom, and every other note-taker you've probably signed up for and forgotten about: it doesn't join your meeting. If you’re tired of meetings with four people and six note-takers, Granola captures audio directly from your device and works in the background.

The interface feels like a souped-up Apple Notes, so you can take written notes alongside the recording, and it synthesizes everything together at the end.

The AI layer lets you query across all your past meetings. Ask it what your biggest open tasks are, what you got done this week, or what was decided in that client call three weeks ago.

The free version covers your last 30 days. Paid is around $15–20/month for full history. 

🔥 EXTRA HYPE

• John Oliver exposes data brokers

• OpenAI says its 5.3 model is less annoying and 5.4 is even better

• Software engineer claims AI has fried his brain

• The next $1 Trillion company won’t be a tech product

• OpenAI’s Head of Robotics resigns, citing concerns over a DOD contract

• How to talk to someone experiencing ‘AI Psychosis’

• More takeaways from The New Media Summit

📨 P.S. If you are building a newsletter and want to eliminate the weekly scramble, that is exactly what we help our customers do.

And if this sparked something, forward it to an AI-curious friend.

Keep Reading