Anthropic will start training its AI models on chat transcripts

Published Aug 31st 2025, 5:00 am
By Web Desk

Anthropic will start training its AI models on user data, including new chat transcripts and coding sessions, unless users choose to opt out. It’s also extending its data retention policy to five years, again for users who don’t opt out.
All users will have to make a decision by September 28th. For users who click “Accept” now, Anthropic will immediately begin training its models on their data and keeping that data for up to five years, according to a blog post published by Anthropic on Thursday.
The setting applies to “new or resumed chats and coding sessions.” Even if you do agree to Anthropic training its AI models on your data, it won’t do so with previous chats or coding sessions that you haven’t resumed. But if you do continue an old chat or coding session, all bets are off.
The updates apply to all of Claude’s consumer subscription tiers, including Claude Free, Pro, and Max, “including when they use Claude Code from accounts associated with those plans,” Anthropic wrote. But they don’t apply to Anthropic’s commercial usage tiers, such as Claude Gov, Claude for Work, Claude for Education, or API use, “including via third parties such as Amazon Bedrock and Google Cloud’s Vertex AI.”
New users will have to select their preference via the Claude signup process. Existing users must decide via a pop-up, which they can defer by clicking a “Not now” button — though they will be forced to make a decision on September 28th.
It’s worth noting, though, that many users may quickly hit “Accept” without reading what they’re agreeing to.
[Image: Anthropic’s new terms.]
The pop-up that users will see reads, in large letters, “Updates to Consumer Terms and Policies,” and the lines below it say, “An update to our Consumer Terms and Privacy Policy will take effect on September 28, 2025. You can accept the updated terms today.” There’s a big black “Accept” button at the bottom.
In smaller print below that, a few lines say, “Allow the use of your chats and coding sessions to train and improve Anthropic AI models,” with an on/off toggle next to it that is set to “On” by default. In practice, many users are likely to click the large “Accept” button without changing the toggle, even if they haven’t read it.
If you want to opt out, you can toggle the switch to “Off” when you see the pop-up. If you already accepted without realizing and want to change your decision, navigate to your Settings, then the Privacy tab, then the Privacy Settings section, and, finally, toggle to “Off” under the “Help improve Claude” option. Consumers can change their decision anytime via their privacy settings, but that new decision will just apply to future data — you can’t take back the data that the system has already been trained on.
“To protect users’ privacy, we use a combination of tools and automated processes to filter or obfuscate sensitive data,” Anthropic wrote in the blog post. “We do not sell users’ data to third-parties.”
