The post Grok 2.5 model becomes open-source, as xAI's Grok 3 to follow soon appeared on BitcoinEthereumNews.com.

Grok 2.5 model becomes open source, with xAI's Grok 3 to follow soon

Elon Musk revealed that his xAI platform has open-sourced the Grok 2.5 model, with Grok 3 expected to be released in six months. He also highlighted that Chinese AI companies are his platform’s biggest competitors. 

According to xAI, the Grok 2.5 model, which contains 42 files and occupies 500GB, has been made available on Hugging Face for download. xAI’s instructions direct developers to download the files, set up the SGLang inference engine, and launch the inference server with the tokenizer to run the model. The model requires at least eight GPUs, each with more than 40 GB of video memory.
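The workflow xAI describes boils down to three steps: fetch the checkpoint from Hugging Face, install SGLang, and start a tensor-parallel inference server pointed at the weights and tokenizer. A minimal sketch of those steps is below; the repository name, local paths, and tokenizer filename are assumptions based on xAI's Hugging Face listing and may differ from the official instructions, so treat them as illustrative rather than authoritative:

```shell
# Install the Hugging Face CLI and SGLang.
pip install -U "huggingface_hub[cli]" sglang

# Download the Grok 2.5 checkpoint (~500 GB across 42 files).
# The repo name and local directory here are illustrative.
huggingface-cli download xai-org/grok-2 --local-dir /data/grok-2

# Launch the SGLang inference server with 8-way tensor parallelism.
# xAI says you need at least eight GPUs with >40 GB of VRAM each.
python3 -m sglang.launch_server \
  --model-path /data/grok-2 \
  --tokenizer-path /data/grok-2/tokenizer.tok.json \
  --tp 8 \
  --host 0.0.0.0 --port 30000
```

Once the server is up, it exposes an HTTP endpoint (port 30000 in this sketch) that clients can query for completions.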

Musk highlights China’s competition in the AI landscape

Musk confirmed in a post on X that Grok 2.5, which he considers his firm’s best model from last year, has been made available for public use. He added that Grok 3 will be open-sourced in about six months. He underscored the competitive nature of the AI landscape by noting that Chinese AI companies pose a major challenge beyond the competition from Google.

“The @xAI Grok 2.5 model, which was our best model last year, is now open source. Grok 3 will be made open source in about 6 months.” — Elon Musk (@elonmusk), August 23, 2025

Musk’s post specified that the model requires significant resources to run. Grok 2.5, which occupies 500GB, needs at least eight GPUs, each with more than 40 GB of video memory. The model weights are available on Hugging Face for download.

Some developers online have pointed out that these hardware requirements put the model out of reach for most of them to run locally.

xAI has also released a new update for the Grok app, the latest version being v1.1.58. The update adds AI-powered video generation features, which can now be accessed directly through the platform. Musk confirmed that the company will continue to improve the Grok app and pursue the open-source roadmap simultaneously.

Grok 2, the model that primarily shaped Grok 2.5, was benchmarked by xAI against leading models: performance data showed it achieving results competitive with Claude and GPT-4.

On the LMSYS leaderboard, Grok 2’s Elo score exceeded those of Claude and GPT-4. Such benchmarks placed the model in a strong position on graduate-level scientific knowledge (GPQA), general knowledge (MMLU, MMLU-Pro), and mathematical reasoning.

xAI faces licensing backlash over claims of anti-competitive terms

The open-source release has sparked debate online, focusing on the nature of the license and xAI’s failure to disclose the model’s exact parameter count on Hugging Face. AI engineer Tim Kellogg described the license as a custom one containing anti-competitive terms. Companies such as DeepSeek, Qwen, OpenAI, and Microsoft have used more permissive licenses such as Apache 2.0 and MIT.

The Grok app has faced controversy in the past, most recently over responses related to the white genocide conspiracy in South Africa. Cryptopolitan reported on it in May, highlighting the incident in which Grok gave responses questioning the Holocaust’s death toll and even described itself as “MechaHitler.”

Musk later described Grok 4 as a truth-seeking AI model, although it too has faced controversy. Some users online noticed that it referred to Musk’s own social posts when responding to contentious topics.

Chinese tech giants Baidu, Alibaba, and Tencent have also released more than ten model updates since January this year. In its latest release, Baidu expanded its input limit to over 1,000 characters and added conversational interaction with its AI chatbot. Nearly all of their models are open source, reflecting China’s ambition to be a global leader in AI. These firms pose a serious challenge for Western tech firms like OpenAI, Google, and xAI.


Source: https://www.cryptopolitan.com/musks-xai-open-sources-grok-2-5/

