
Someone Built a Working AI Chatbot Inside ‘Minecraft’—Here’s How


In brief

  • A Minecraft builder encoded a 5-million-parameter language model entirely via the game’s redstone logic, spanning hundreds of millions of blocks.
  • The system works (in principle) but is excruciatingly slow; responses can take hours even with speed boosts.
  • It’s a proof of concept, a technical art piece, and a reminder that computation can live in odd places—what matters is structure, not just substrate.

Over the weekend, a Minecraft maestro performed a striking feat: embedding a working language-model inference engine inside the immensely popular sandbox game. The catch? All of the “wiring” was done with Minecraft’s redstone system—no command blocks, no mods (beyond rendering assistance), just circuit logic built from in-game materials.

The result is quirky, painfully slow, but fascinating: a GPT-style chatbot inside a block world, built from redstone circuitry behaving as digital logic.

To understand the stunt, you need to know what redstone is. In Minecraft, redstone is the game’s analog to wiring and electronic components: power sources, repeaters, comparators, logic gates, signal delays—all built with blocks. Redstone circuits emulate digital behavior (on/off, logic operations) inside the game world.

Players long ago built calculators, memory units, and even CPUs inside Minecraft using redstone—and now they’re pushing it into AI territory. Sammyuri, the person who built the GPT hack, even built a recursive version of Minecraft within Minecraft using redstone.

In this project, the creator first trained a small language model outside of the game (in Python) using a dataset called TinyChat. That model has roughly 5,087,280 parameters, an embedding dimension of 240, a vocabulary of about 1,920 tokens, six layers, and a context window of 64 tokens. Most weights are quantized to 8 bits, though embedding and LayerNorm weights use higher precision. The redstone build itself spans roughly 1,020 × 260 × 1,656 blocks (≈ 439 million blocks in total). To film the build at scale, the creator used the Distant Horizons mod, which renders far-off structures so the entire contraption can be seen at once.
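Those dimensions are consistent with a fairly standard GPT-style layout. As a rough sanity check—assuming a vanilla architecture with a 4× MLP, an untied output head, and no biases, details the write-up doesn’t spell out—the stated figures land very close to the reported parameter count:

```python
# Hypothetical back-of-the-envelope check of the stated parameter count,
# assuming a vanilla GPT layout (4x MLP, untied output head, no biases).
d, vocab, layers, ctx = 240, 1920, 6, 64

tok_emb = vocab * d                # token embedding table
pos_emb = ctx * d                  # learned positional embeddings
per_layer = (4 * d * d             # attention Q/K/V/output projections
             + 2 * d * 4 * d       # two MLP matrices (d -> 4d -> d)
             + 2 * 2 * d)          # two LayerNorms (scale + shift)
head = vocab * d                   # untied unembedding matrix

total = tok_emb + pos_emb + layers * per_layer + 2 * d + head
print(total)  # lands within ~0.1% of the reported 5,087,280
```

The small remainder would come from architectural details (bias terms, tied vs. untied head) the article doesn’t specify.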

When you input a prompt (via an in-game interface), the redstone circuits carry out inference step by step: embedding lookups, feed-forward passes, matrix multiplications, and softmax approximations. According to a video demonstration, the elaborate redstone build took months to assemble.
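The builder’s actual circuit designs aren’t reproduced here, but the arithmetic those circuits must perform can be sketched in ordinary code. Below is a minimal, illustrative sketch—not Sammyuri’s scheme—of two operations the article names: 8-bit integer matrix multiplication and a lookup-table softmax approximation, the kind of tricks fixed-function hardware (or redstone) tends to favor over true floating point:

```python
import numpy as np

def int8_matmul(x_q, W_q, x_scale, w_scale):
    # Multiply int8-quantized tensors in int32, then rescale to real values.
    acc = x_q.astype(np.int32) @ W_q.astype(np.int32)
    return acc * (x_scale * w_scale)

def softmax_lut(logits, table_bits=8):
    # Softmax via a small exp lookup table instead of a true exp unit:
    # quantize the (shifted, non-positive) logits into 2**table_bits bins
    # covering [0, 8) in -z, then look up precomputed exp values.
    z = logits - logits.max()
    n = 2 ** table_bits
    idx = np.clip((-z * n / 8).astype(int), 0, n - 1)
    table = np.exp(-np.arange(n) * 8 / n)
    e = table[idx]
    return e / e.sum()
```

A quantized matmul like `int8_matmul` keeps the heavy arithmetic in integers (cheap to wire up), paying only one rescale per output; the lookup-table softmax trades a transcendental function for a table read, at the cost of a small, bounded approximation error.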

But the system is glacial. Even with an artificially boosted tick rate (~40,000× faster than normal, via MCHPRS), the response time is on the order of a couple of hours. At standard Minecraft speed, some estimate it would take over nine years to generate an answer.
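The two figures are mutually consistent; a quick back-of-the-envelope (taking the reported two-hour boosted response time as the starting point):

```python
# Scale the ~2-hour boosted response time back to normal tick rate.
boosted_hours = 2
speedup = 40_000          # MCHPRS tick-rate boost cited in the article
real_hours = boosted_hours * speedup
real_years = real_hours / (24 * 365)
print(round(real_years, 1))  # ~9.1 years, matching the "over nine years" estimate
```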

Still, the point of the project isn’t utility—it’s demonstration. At its heart, this build is a playful proof that neural inference can be recast into almost any logical substrate. It reminds us how much abstraction our software and hardware stacks hide: that architecture, medium, and speed are separate dimensions. In other words, this is a technical art piece, a conversation starter: what counts is computation, not necessarily where it runs.

Does this matter? Yes, it does

Obviously, this isn’t practical, but it’s cool because it demonstrates something profound: that logic and neural-style computation can be mapped into bizarre substrates—here, virtual redstone circuits. It’s a playful proof of the universality of computation, an artistic and educational showpiece, and a challenge to our assumptions about how and where “intelligence” can run.

It forces us to ask: what really matters in a model—architecture, medium, speed—and what other strange substrates might one try (optics, DNA, mechanical systems)?

Across the web, the build has ignited debates. Tom’s Hardware ran a piece breaking down the block count, the redstone logic, and the performance tradeoffs. On Hacker News, commenters marveled at the engineering but also pressed on limitations: “At normal redstone tick rate… it would take just over 9 years for a response.” Meanwhile, in the Minecraft subreddit, fans and skeptics alike debated how much of the achievement is spectacle vs. technical depth.

Some observers suggest the project borders on meme more than research. Indeed, the real training happened externally; Minecraft only hosts inference logic. It’s a showpiece more than a practical model. But that’s precisely what gives it charm—and purpose.


Source: https://decrypt.co/341930/someone-built-working-ai-chatbot-inside-minecraft

