AI Safety Legislation Breakthrough: New York’s RAISE Act Creates Powerful New Regulations
In a landmark move that could reshape the artificial intelligence landscape, New York Governor Kathy Hochul has signed the RAISE Act, positioning the state as a national leader in AI safety legislation. This decisive action comes as federal efforts stall, creating a powerful new regulatory framework that tech giants must now navigate. For cryptocurrency and blockchain innovators watching regulatory trends, this development signals how governments are approaching emerging technology oversight with increasing urgency.
The RAISE Act establishes comprehensive requirements for AI developers operating in New York and creates what many are calling the strongest state-level AI regulation framework in the United States. Covered companies must disclose their safety protocols, report serious incidents within 72 hours, and answer to a newly created state AI office, with fines of up to $1 million for initial violations and $3 million for repeat offenses.
The path to Governor Kathy Hochul's signature was anything but smooth. State lawmakers initially passed the RAISE Act in June, but intense lobbying from the tech industry prompted Hochul to propose significant revisions. According to The New York Times, a compromise was reached: Hochul agreed to sign the original bill, while lawmakers committed to implementing her requested changes in the next legislative session.
This political maneuvering highlights the tension between innovation and oversight. State Senator Andrew Gounardes, one of the bill’s sponsors, didn’t mince words about the process: “Big Tech thought they could weasel their way into killing our bill. We shut them down and passed the strongest AI safety law in the country.”
New York isn’t pioneering this approach alone. California Governor Gavin Newsom signed similar legislation in September, creating what Hochul calls a “unified benchmark” between America’s two leading technology states.
| State | Legislation | Key Features | Penalties |
|---|---|---|---|
| New York | RAISE Act | Safety protocol disclosure, 72-hour incident reporting, new AI office | Up to $1M ($3M repeat) |
| California | Safety Bill (September) | Risk assessment requirements, transparency measures | Similar penalty structure |
Hochul emphasized the significance of this coordinated approach: “This law builds on California’s recently adopted framework, creating a unified benchmark among the country’s leading tech states as the federal government lags behind, failing to implement common-sense regulations that protect the public.”
The response from the technology sector has been mixed but revealing. Industry groups lobbied hard against the bill during negotiations, while companies such as OpenAI and Anthropic have publicly called for federal standards to create consistency.
This opposition comes amid broader federal pushback. President Donald Trump recently signed an executive order, backed by his AI czar David Sacks, directing federal agencies to challenge state AI laws. The move represents the latest attempt to limit state regulatory authority and will likely face legal challenges.
For businesses operating in the AI space, the RAISE Act creates immediate compliance considerations: documenting safety protocols for disclosure, building processes to report incidents within 72 hours, and preparing for oversight and potential penalties from the new state AI office.
The RAISE Act represents more than just another regulation—it signals a fundamental shift in how society approaches artificial intelligence governance. As states take the lead where federal government hesitates, we’re witnessing the emergence of a patchwork regulatory environment that could either spur innovation through clear guidelines or hinder it through conflicting requirements.
For cryptocurrency and blockchain professionals, this development offers valuable insights into regulatory trends affecting emerging technologies. The same tension between innovation and protection, between state and federal authority, plays out across multiple technological frontiers.
What companies does the RAISE Act affect?
The legislation primarily targets large AI developers operating in New York State, with specific thresholds to be defined in implementing regulations.
How does this relate to federal AI regulation?
The RAISE Act creates state-level requirements while federal legislation remains under discussion. Companies like OpenAI and Anthropic have called for federal standards to create consistency.
Who were the key political figures involved?
Governor Kathy Hochul signed the bill after negotiations with sponsors including State Senator Andrew Gounardes and Assemblyman Alex Bores.
What penalties can companies face?
Violations can result in fines up to $1 million for initial offenses and $3 million for subsequent violations, with additional enforcement through the new AI office.
When do these requirements take effect?
The legislation sets out an implementation timeline, with specific dates to be determined as the new regulatory office is created and its rules are finalized.
New York’s RAISE Act represents a transformative moment in AI safety legislation. By establishing clear requirements, substantial penalties, and dedicated oversight, Governor Kathy Hochul has positioned New York at the forefront of responsible AI development. This action, combined with California’s parallel efforts, creates powerful momentum for comprehensive AI regulation that balances innovation with public protection.
The coming months will reveal how effectively this framework operates in practice, how industry responds to the new requirements, and whether federal lawmakers follow the states’ lead. One thing is certain: the era of unregulated AI development is ending, and a new chapter of accountable innovation is beginning.