Anthropic will begin training its AI models on new user chat transcripts and coding sessions by default, and will extend data retention to five years for users who do not opt out; users must make a decision by September 28th, according to The Verge.
What changes for Claude users
The setting applies to new or resumed chats and coding sessions. Anthropic says previous chats or coding sessions will not be used for training unless a user resumes them. For users who click “Accept” now, training on their data and retention of up to five years will begin immediately, the report notes.
The updates cover Claude’s consumer subscription tiers — Free, Pro, and Max — including when those users access Claude Code from accounts tied to those plans. They do not apply to commercial tiers such as Claude Gov, Claude for Work, and Claude for Education, or to API usage, including via Amazon Bedrock and Google Cloud’s Vertex AI.
How the opt-out works
New users will choose their preference during signup. Existing users will encounter a pop-up reading “Updates to Consumer Terms and Policies,” noting an effective date of September 28, 2025, and offering a large “Accept” button. Beneath, a smaller line reads, “Allow the use of your chats and coding sessions to train and improve Anthropic AI models,” with a toggle that defaults to “On.” Users can defer with “Not now,” but must decide by September 28th.
Changing your decision
Users who want to opt out can toggle the setting to “Off” when the pop-up appears. Those who have already accepted and wish to reverse course can go to Settings, then Privacy, then Privacy Settings, and toggle “Help improve Claude” to “Off.” The change applies only to future data; data already used for training cannot be withdrawn.
Anthropic wrote, “To protect users’ privacy, we use a combination of tools and automated processes to filter or obfuscate sensitive data,” and stated, “We do not sell users’ data to third-parties.” The decisions and timelines described above were reported by The Verge, citing Anthropic’s blog post and consumer terms update.