
xAI’s Grok 2.5: Open-Sourced, But Does It Pass the EU AI Act Test?

2025/08/25 23:38

In the fast-evolving world of AI, open-sourcing models has become a battleground for innovation, ethics, and regulation. Just recently, on August 25, 2025 (yes, that’s today!), Elon Musk announced that xAI has open-sourced Grok 2.5, its flagship model from last year, making the weights available on Hugging Face. This move echoes OpenAI’s earlier release on August 5, 2025, of two open-source models: gpt-oss-120b (120 billion parameters) and gpt-oss-20b (20 billion parameters), under the permissive Apache 2.0 license with an added usage policy.

Both companies are pushing the boundaries of general-purpose AI (GPAI) models — those versatile systems capable of tackling reasoning, coding, math, and more. But with great power comes great scrutiny, especially under the EU AI Act (Regulation (EU) 2024/1689), which sets strict rules for transparency, risk management, and open-source claims.

Inspired by a recent Medium article analyzing OpenAI’s models (check it out here), I’ll conduct a similar compliance check for Grok 2.5. Using publicly available info like model cards and announcements, we’ll evaluate its alignment with the Act’s requirements for open-source GPAI models. Spoiler: It’s not as straightforward as it seems. Note that this is a high-level analysis — true compliance needs official regulatory review.


Breaking Down the Models: Grok 2.5 vs. OpenAI’s Duo

Let’s start with the basics to set the stage.

  • Grok 2.5 (xAI): This beast clocks in at around 270 billion parameters, trained back in 2024 on text-based tasks like reasoning. What’s released? The model weights (a hefty ~500 GB across 42 files) and the tokenizer. But details on the full architecture, training code, or datasets? Slim to none — only hints like a June 2024 knowledge cutoff. The license is a custom “Grok 2 Community License Agreement”: revocable, allows commercial and non-commercial use, but slaps on restrictions like banning its use to train or improve other AI models. No separate usage policy, just the license terms.
  • gpt-oss-120b and gpt-oss-20b (OpenAI): Smaller siblings at 120B and 20B parameters, trained on trillions of filtered tokens with chain-of-thought tweaks. Releases include weights, architecture details, the tokenizer (via OpenAI's tiktoken library), and even some training code snippets. Licensed under Apache 2.0 — super permissive — with a usage policy encouraging responsible AI without heavy restrictions.

Training compute (measured in FLOPs) isn’t explicitly shared for Grok 2.5, but given its size, it’s probably under the 10²⁵ FLOPs mark that triggers “systemic risk” status — compare GPT-3’s estimated training compute of around 3.14 × 10²³ FLOPs. OpenAI’s models are in the same boat, as noted in the original article.
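
As a sanity check, the widely used 6·N·D heuristic (roughly 6 FLOPs per parameter per training token for a dense transformer) puts a 270B-parameter model under the 10²⁵ threshold unless it was trained on more than roughly 6 trillion tokens. The token count below is a guess purely for illustration — xAI hasn’t disclosed it:

```python
def training_flops(n_params: float, n_tokens: float) -> float:
    """Standard 6*N*D back-of-the-envelope estimate of
    dense-transformer training compute."""
    return 6.0 * n_params * n_tokens

# Hypothetical token count: Grok 2.5's training-data size is not public.
est = training_flops(n_params=270e9, n_tokens=6e12)
print(f"{est:.2e}")  # 9.72e+24 -- just under the 1e25 systemic-risk line
```

If the real token count were much higher (say, 10T+ tokens), the estimate would cross 10²⁵ and the systemic-risk presumption in the Act would kick in.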

The EU AI Act: What Open-Source GPAI Models Need to Nail

The EU AI Act classifies GPAI as AI that handles diverse tasks without a narrow focus (Article 3, point 63). For “open-source” ones (Recital 102, Article 3 point 12), the bar is high: They must use a “free and open license” allowing unrestricted access, use, study, modification, and sharing — including derivatives — with no commercial bans or field limits.
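
As a back-of-the-envelope illustration (not legal advice), the licence criteria above can be written as a checklist. The `LicenseTerms` fields and the treatment of revocability as disqualifying are my own simplifications of the Act’s “free and open licence” language:

```python
from dataclasses import dataclass

@dataclass
class LicenseTerms:
    allows_use: bool
    allows_study: bool
    allows_modification: bool
    allows_sharing: bool          # including derivatives
    commercial_use_allowed: bool
    field_restrictions: bool      # e.g. "may not be used to train other models"
    revocable: bool

def qualifies_as_open(t: LicenseTerms) -> bool:
    """Rough reading of the free-and-open-licence test
    (Article 3 point 12, Recital 102)."""
    return (t.allows_use and t.allows_study and t.allows_modification
            and t.allows_sharing and t.commercial_use_allowed
            and not t.field_restrictions and not t.revocable)

# Filled in from the public licence texts as described in this article
grok_2_5 = LicenseTerms(True, True, True, True, True,
                        field_restrictions=True, revocable=True)
gpt_oss = LicenseTerms(True, True, True, True, True,
                       field_restrictions=False, revocable=False)
print(qualifies_as_open(grok_2_5), qualifies_as_open(gpt_oss))  # False True
```

Even this toy version makes the article’s point: Grok 2.5’s no-training-other-models clause and revocability fail the test, while Apache 2.0 passes.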

Key obligations include:

  • Article 53 (Baseline for All GPAI Providers): Technical docs on training/testing, copyright respect (like honoring opt-outs from Directive 2019/790), and usage info.
  • Article 55 (For Systemic-Risk Models): If training compute exceeds 10²⁵ FLOPs, or the model is otherwise designated as posing systemic risk, providers must add risk assessments, adversarial testing, and incident reporting. Open-source models can be exempted from some of Article 53’s transparency duties only if their license is genuinely free and open and they aren’t monetized (e.g., no subscription fees), and that exemption does not apply to systemic-risk models.
  • Exemptions and the Code of Practice: Genuine open-source (non-systemic) skips some hurdles if the license is barrier-free. The EU’s upcoming Code of Practice (rolling out in 2025) offers a voluntary roadmap for safety, transparency, and copyright.

The Act loves open-source for sparking innovation but calls out licenses with sneaky restrictions.

The Compliance Breakdown: Where Grok 2.5 Stands vs. OpenAI

Mirroring the original article’s style, here’s a table assessing compliance based on public data. Ratings: “Likely Compliant” (checks out), “Partial/Questionable” (iffy spots), or “Potential Non-Compliance” (red flags). I’ve included OpenAI for direct comparison.

[Compliance comparison table — embedded on Medium: https://medium.com/media/8218eea850b241bff8bd2a4f09d44233/href]

Wrapping It Up: Lessons for xAI and the AI World

Grok 2.5 ticks some boxes as a GPAI release but stumbles on open-source purity thanks to its custom license’s revocability and restrictions — potentially stripping away exemptions and inviting deeper EU scrutiny under Articles 53 and 55. OpenAI’s gpt-oss models, with their straightforward Apache 2.0 setup and better docs, seem to sail through more smoothly, qualifying for those sweet exemptions while hitting baselines.

If Grok 2.5’s FLOPs secretly top 10²⁵ (doubtful, but possible), the gaps widen. xAI could level up by switching to a standard open license and beefing up transparency. For anyone in AI, this highlights the Act’s push: Open-source is great, but only if it’s truly open.

Curious about the EU’s full guidelines? Dive into them or chat with regulators for the real deal. What do you think — will more companies follow suit, or tighten up? Drop your thoughts below!


xAI’s Grok 2.5: Open-Sourced, But Does It Pass the EU AI Act Test? was originally published in Coinmonks on Medium, where people are continuing the conversation by highlighting and responding to this story.
