The post Coalition Urges OpenAI to Scrap AI Ballot Measure Over Child Safety Concerns appeared on BitcoinEthereumNews.com.

Coalition Urges OpenAI to Scrap AI Ballot Measure Over Child Safety Concerns


In brief

  • A coalition of advocacy groups asks OpenAI to withdraw a California AI safety ballot initiative.
  • Critics say the measure would limit legal accountability and weaken protections for children.
  • While OpenAI has paused the campaign, the coalition says the company retains control of the initiative ahead of key deadlines.

A coalition of advocacy groups is urging ChatGPT developer OpenAI to withdraw a California ballot initiative that critics say could weaken protections for children and limit legal accountability for AI companies.

In a letter sent to OpenAI on Wednesday, reviewed by Decrypt, the group argues that the measure would lock in narrow child-safety protections, limit families’ ability to sue, and restrict California’s ability to strengthen AI laws in the future.

The letter, signed by more than two dozen organizations including AI policy non-profit Encode AI, the Center for Humane Technology, and the Electronic Privacy Information Center, asks OpenAI to dissolve its ballot committee and step back from the proposal while lawmakers work on legislation.

“The main demand here is for OpenAI to withdraw from the ballot,” Adam Billen, co-executive director of Encode AI, told Decrypt.

The dispute centers on a proposed “Parents & Kids Safe AI Act,” a California ballot initiative backed by OpenAI and Common Sense Media that would establish rules for how AI chatbots interact with minors, including safety requirements and compliance standards.

In the letter, the groups argue that those rules fall short. They say the measure defines harm too narrowly, limits enforcement, and restricts families’ ability to bring claims when children are harmed.

But OpenAI controls the actual ballot initiative, Billen said.

“OpenAI has the power to withdraw it or put the money in for signatures. All of the legal authority rests in their hands,” he said. “They have not actually withdrawn the initiative from the ballot. This is a common tactic in California, where you put an initiative up and put money in the committee.”

The letter points to the initiative’s definition of “severe harm,” which focuses on physical injury tied to suicide or violence, excluding a range of mental health impacts that researchers and families have raised as concerns.

It also highlights provisions that would bar parents and children from bringing claims under the initiative and limit enforcement tools available to state and local officials.

Another concern centers on how the proposal treats user data. The groups argue that its definition of encrypted user content could make it harder to access chatbot conversations that have served as key evidence in recent lawsuits.

“We read that as an attempt to block families from being able to disclose their dead children’s chat logs in court,” Billen said.

The letter also warns that the measure could be difficult to revise if passed. It would require a two-thirds vote in the legislature to amend and tie future changes to standards such as supporting “economic progress,” which advocates say could limit lawmakers’ ability to respond to new risks.

Billen said the initiative remains a factor in ongoing negotiations in Sacramento, even as OpenAI has paused its efforts to qualify it for the ballot.

“They have $10 million in the committee, and then you say to the legislature, if you don’t do what we want, we’ll put the money in and get the signatures and put this on the ballot, and if it passes, it will override whatever the legislature does,” he said. “So essentially, what’s happening now is they’re trying to steer and control what state legislators do through the use of the initiative as a threat they’re leaving on the table.”

OpenAI is not the only company facing scrutiny over chatbot-related harms. Earlier this month, the family of Jonathan Gavalas sued Google, claiming that Gemini fed a delusion that escalated to violence and, ultimately, his suicide. Billen, however, said OpenAI’s approach reflects a broader pattern in the tech industry.

“The lobbying playbook that’s getting used on AI from these big guys in particular—the Googles, the Metas, Amazons—is the same strategy that was used previously on other tech issues,” he said.

For now, the coalition is focused on getting OpenAI to withdraw the measure and allow lawmakers to move forward through the legislative process.

“It’s really important, particularly for the companies that are putting that technology out there, to not be the ones who are writing the rules that regulate them, because that’s not meaningful protections,” Billen said.

OpenAI did not immediately respond to Decrypt’s request for comment.


Source: https://decrypt.co/361638/coalition-openai-scrap-ai-ballot-measure-child-safety
