Claude AI Users: Big Privacy Update Coming!

Anthropic—the company behind Claude AI—just announced a major privacy change, and if you use Claude, you’ll need to take action before September 28.

What’s changing?

  • Before: Regular users’ chats were auto-deleted after 30 days.

  • Now: Unless you opt out, your data could be stored for up to 5 years.

  • Who’s affected: All Claude Free, Pro, Max, and Claude Code users.

  • Who’s not affected: Claude for Work & Education customers.

Anthropic says this shift will help make Claude “smarter and safer.” But here’s the reality: AI companies like Anthropic, OpenAI, and Google need massive amounts of real-world conversations to keep training their models and stay competitive.

Why it matters

Your chats may contain sensitive information—business ideas, personal struggles, even snippets of code. Keeping this data for years (instead of 30 days) raises serious data privacy and ethics questions.

And here’s a red flag:

The new pop-up makes “Accept” big and bold, while the opt-out toggle is smaller and easy to overlook.

That design choice means many users could end up sharing their data without fully realizing it.

What you should do

👉 If you’re a Claude user, check your settings right away.
👉 Decide if you’re comfortable sharing your chats for model training—or if you’d rather opt out.

Your data is valuable, and this update puts the choice directly in your hands.

💡 Pro tip: Always read the fine print when AI companies update their policies. Convenience is nice, but privacy lasts longer.
