
Anthropic Announces Updates to Consumer Terms and Privacy Policy



Felix Pinkston
Oct 27, 2025 21:03

Anthropic has updated its Consumer Terms and Privacy Policy, introducing data usage choices for users and extending data retention to enhance AI model development.

Anthropic, an AI safety and research company, has announced significant updates to its Consumer Terms and Privacy Policy. The changes are intended to improve the capability and safety of its AI models, such as Claude, while giving users more control over how their data is used, according to Anthropic.

Data Usage and User Control

With the new updates, users of Anthropic’s Claude Free, Pro, and Max plans can choose whether their data is used to improve AI models and strengthen safeguards against harmful activities. This option, however, does not extend to services under the company’s Commercial Terms, such as Claude for Work or API usage through platforms like Amazon Bedrock and Google Cloud’s Vertex AI.

Users are encouraged to participate in this initiative to help refine model safety and accuracy, particularly in detecting harmful content and improving coding, analysis, and reasoning skills. New users will make this choice during the signup process, while existing users will be prompted to select their preferences via an in-app notification.

Data Retention Policy

Anthropic is also extending its data retention period to five years for users who allow their data to be used in model training. The extended retention applies only to new or resumed chats and coding sessions and is intended to support model development and safety improvements. Users who opt out remain under the existing 30-day data retention policy.

Anthropic says it protects user privacy through a combination of tools and automated processes that filter or obfuscate sensitive data, and emphasizes that it does not sell user data to third parties.

Existing users have until October 8, 2025, to accept the updated terms and set their preferences. The new policies take effect immediately upon acceptance and apply only to new or resumed interactions with Claude. Users can change their privacy settings at any time through Anthropic’s platform.

For further information on the updates, users are encouraged to visit the FAQ section provided by Anthropic.

Image source: Shutterstock

Source: https://blockchain.news/news/anthropic-updates-consumer-terms-privacy-policy

