
Anthropic: Claude was used in large-scale cyberattacks, with BTC ransoms of $75,000 to $500,000 demanded

2025/08/28 15:38

PANews reported on August 28 that, according to a new report released by AI company Anthropic, its AI chatbot Claude is being used by cybercriminals to carry out large-scale cyberattacks, with some ransoms reaching as high as $500,000.

The report noted that despite Claude's "sophisticated" safeguards, criminals still bypassed restrictions through social engineering techniques such as "vibe hacking," which uses AI to manipulate human emotions, trust, and decision-making, enabling attackers with limited technical skills to commit complex cybercrimes. In one case, a hacker used Claude to steal sensitive data from at least 17 organizations, including medical, government, and religious institutions, and demanded Bitcoin ransoms ranging from $75,000 to $500,000. Claude was also used to help North Korean IT workers forge identities, pass technical tests, and obtain remote positions at top US technology companies, with the income used to support the North Korean regime.

Earlier news: people familiar with the matter said Anthropic is in talks to raise up to $10 billion in new funding.
