
xAI’s Grok 2.5: Open-Sourced, But Does It Pass the EU AI Act Test?

2025/08/25 23:38

In the fast-evolving world of AI, open-sourcing models has become a battleground for innovation, ethics, and regulation. Just recently, on August 25, 2025 (yes, that’s today!), Elon Musk announced that xAI has open-sourced Grok 2.5, its flagship model from last year, making the weights available on Hugging Face. This move echoes OpenAI’s earlier release on August 5, 2025, of two open-source models: gpt-oss-120b (120 billion parameters) and gpt-oss-20b (20 billion parameters), under the permissive Apache 2.0 license with an added usage policy.

Both companies are pushing the boundaries of general-purpose AI (GPAI) models — those versatile systems capable of tackling reasoning, coding, math, and more. But with great power comes great scrutiny, especially under the EU AI Act (Regulation (EU) 2024/1689), which sets strict rules for transparency, risk management, and open-source claims.

Inspired by a recent Medium article analyzing OpenAI’s models (check it out here), I’ll conduct a similar compliance check for Grok 2.5. Using publicly available info like model cards and announcements, we’ll evaluate its alignment with the Act’s requirements for open-source GPAI models. Spoiler: It’s not as straightforward as it seems. Note that this is a high-level analysis — true compliance needs official regulatory review.


Breaking Down the Models: Grok 2.5 vs. OpenAI’s Duo

Let’s start with the basics to set the stage.

  • Grok 2.5 (xAI): This beast clocks in at around 270 billion parameters, trained back in 2024 for text-based tasks like reasoning. What’s released? The model weights (a hefty ~500 GB across 42 files) and the tokenizer. But details on the full architecture, training code, or datasets? Slim to none — only hints like a June 2024 knowledge cutoff. The license is a custom “Grok 2 Community License Agreement”: revocable, allowing both commercial and non-commercial use, but it slaps on restrictions like banning use of the model to train or improve other AI models. No separate usage policy, just the license terms.
  • gpt-oss-120b and gpt-oss-20b (OpenAI): Smaller siblings at 120B and 20B parameters, trained on trillions of filtered tokens with chain-of-thought tweaks. Releases include weights, architecture details, the tokenizer (via tiktoken), and even some training code snippets. Licensed under Apache 2.0 — super permissive — with a usage policy encouraging responsible AI without heavy restrictions.

Training compute (measured in FLOPs) isn’t explicitly shared for Grok 2.5, but given its size, it’s probably under the 10²⁵ FLOPs mark that triggers “systemic risk” status — similar to GPT-3’s estimates (around 3.14 x 10²³ FLOPs). OpenAI’s models are in the same boat, as noted in the original article.
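We can sanity-check that "probably under 10²⁵" claim with the common 6·N·D approximation (training compute ≈ 6 × parameters × training tokens). The token count below is purely an illustrative assumption; xAI has not disclosed Grok 2.5's training data size.

```python
# Rough training-compute estimate via the 6 * N * D rule of thumb
# (N = parameter count, D = training tokens). The token count is an
# assumption for illustration, not a disclosed figure.

SYSTEMIC_RISK_THRESHOLD = 1e25  # FLOPs trigger for "systemic risk" in the EU AI Act

def estimate_training_flops(params: float, tokens: float) -> float:
    """Approximate total training FLOPs as 6 * N * D."""
    return 6 * params * tokens

# Grok 2.5: ~270B parameters; assume, say, 5 trillion training tokens.
grok_flops = estimate_training_flops(270e9, 5e12)
print(f"Grok 2.5 (assumed 5T tokens): {grok_flops:.2e} FLOPs")
print("Over systemic-risk threshold?", grok_flops > SYSTEMIC_RISK_THRESHOLD)
```

Under that assumption the estimate lands around 8 × 10²⁴ FLOPs, just below the threshold, though a substantially larger token budget would push it over.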

The EU AI Act: What Open-Source GPAI Models Need to Nail

The EU AI Act classifies a GPAI model as one that handles diverse tasks without a narrow focus (Article 3, point 63). For “open-source” ones (Recital 102; Article 53(2)), the bar is high: they must use a “free and open license” allowing unrestricted access, use, study, modification, and sharing — including derivatives — with no commercial bans or field limits.
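As a toy illustration, those licence criteria can be encoded as a simple checklist. The criterion names are my paraphrase of Recital 102, and the licence facts below reflect the public terms as described in this article, not a legal reading.

```python
# Toy checklist for the EU AI Act's "free and open licence" test
# (criteria paraphrased from Recital 102; not legal advice).

CRITERIA = ["access", "use", "study", "modify", "share_derivatives",
            "no_commercial_ban", "no_field_restrictions", "irrevocable"]

def is_free_and_open(license_props: dict) -> bool:
    """A licence qualifies only if every criterion holds."""
    return all(license_props.get(c, False) for c in CRITERIA)

# Grok 2 Community License, per its public terms as summarized above.
grok_2_community = {
    "access": True, "use": True, "study": True, "modify": True,
    "share_derivatives": True,
    "no_commercial_ban": True,        # commercial use is allowed
    "no_field_restrictions": False,   # bans training other AI models
    "irrevocable": False,             # licence is revocable
}

apache_2_0 = {c: True for c in CRITERIA}

print("Grok 2 Community License qualifies:", is_free_and_open(grok_2_community))
print("Apache 2.0 qualifies:", is_free_and_open(apache_2_0))
```

Two failed criteria (revocability and the field restriction) are enough to sink the claim, which is the crux of the analysis that follows.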

Key obligations include:

  • Article 53 (Baseline for All GPAI Providers): Technical docs on training/testing, copyright respect (like honoring opt-outs from Directive 2019/790), and usage info.
  • Article 55 (For Systemic Risks): If the model crosses 10²⁵ training FLOPs or is otherwise designated as posing systemic risk, add risk assessments, adversarial testing, and serious-incident reporting. Open-source models can snag exemptions from some of the Article 53 documentation rules if their license is truly open and they’re not monetized (e.g., no subscriptions) — but those exemptions fall away at the systemic-risk tier.
  • Exemptions and the Code of Practice: Genuine open-source (non-systemic) models skip some hurdles if the license is barrier-free. The EU’s GPAI Code of Practice (published in July 2025) offers a voluntary roadmap for safety, transparency, and copyright.
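The way these obligations stack can be sketched as a small decision function. This is a simplification of Articles 53 and 55 as summarized above (the obligation labels are my shorthand), not a faithful encoding of the Act.

```python
# Simplified sketch of which GPAI obligations apply, per the summary above.
# Labels and thresholds are shorthand, not legal logic.

def applicable_obligations(flops: float, open_license: bool, monetized: bool) -> list:
    obligations = ["copyright policy (Art. 53)"]  # applies to all GPAI providers
    systemic = flops > 1e25
    if systemic:
        # Systemic-risk models carry Article 55 duties; the open-source
        # exemption does not apply at this tier.
        obligations += ["technical documentation (Art. 53)",
                        "risk assessment, testing, incident reporting (Art. 55)"]
    elif not (open_license and not monetized):
        # Below the threshold, only genuinely open, non-monetized models
        # escape the documentation duties.
        obligations += ["technical documentation (Art. 53)"]
    return obligations

# gpt-oss: Apache 2.0, freely downloadable, presumably under the threshold.
print(applicable_obligations(flops=3e24, open_license=True, monetized=False))
# Grok 2.5: custom restrictive licence, so no open-source exemption.
print(applicable_obligations(flops=8e24, open_license=False, monetized=False))
```

Run the two cases and the asymmetry the article argues for drops out directly: same compute tier, different paperwork, purely because of the licence.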

The Act loves open-source for sparking innovation but calls out licenses with sneaky restrictions.

The Compliance Breakdown: Where Grok 2.5 Stands vs. OpenAI

Mirroring the original article’s style, here’s a table assessing compliance based on public data. Ratings: “Likely Compliant” (checks out), “Partial/Questionable” (iffy spots), or “Potential Non-Compliance” (red flags). I’ve included OpenAI for direct comparison.

[Embedded compliance table: https://medium.com/media/8218eea850b241bff8bd2a4f09d44233/href]

Wrapping It Up: Lessons for xAI and the AI World

Grok 2.5 ticks some boxes as a GPAI release but stumbles on open-source purity thanks to its custom license’s revocability and restrictions — potentially stripping away exemptions and inviting deeper EU scrutiny under Articles 53 and 55. OpenAI’s gpt-oss models, with their straightforward Apache 2.0 setup and better docs, seem to sail through more smoothly, qualifying for those sweet exemptions while hitting baselines.

If Grok 2.5’s FLOPs secretly top 10²⁵ (doubtful, but possible), the gaps widen. xAI could level up by switching to a standard open license and beefing up transparency. For anyone in AI, this highlights the Act’s push: Open-source is great, but only if it’s truly open.

Curious about the EU’s full guidelines? Dive into them or chat with regulators for the real deal. What do you think — will more companies follow suit, or tighten up? Drop your thoughts below!


xAI’s Grok 2.5: Open-Sourced, But Does It Pass the EU AI Act Test? was originally published in Coinmonks on Medium, where people are continuing the conversation by highlighting and responding to this story.

