
Someone Built a Working AI Chatbot Inside ‘Minecraft’—Here’s How

2025/09/30 08:55
4 min read

In brief

  • A Minecraft builder encoded a 5-million-parameter language model entirely via the game’s redstone logic, spanning hundreds of millions of blocks.
  • The system works (in principle) but is excruciatingly slow; responses can take hours even with speed boosts.
  • It’s a proof of concept, a technical art piece, and a reminder that computation can live in odd places—what matters is structure, not just substrate.

Over the weekend, a Minecraft maestro performed a striking feat: embedding a working language-model inference engine inside the immensely popular sandbox game. The catch? All of the “wiring” was done with Minecraft’s redstone system—no command blocks, no mods (beyond rendering assistance), just circuit logic built from in-game materials.

The result is quirky, painfully slow, but fascinating: a GPT-style chatbot inside a block world, built from digital redstone and analog logic.

To understand the stunt, you need to know what redstone is. In Minecraft, redstone is the game’s analog to wiring and electronic components: power sources, repeaters, comparators, logic gates, signal delays—all built with blocks. Redstone circuits emulate digital behavior (on/off, logic operations) inside the game world.
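For intuition, redstone's primitives map neatly onto Boolean logic: a redstone torch inverts its input, and merging two dust lines acts as an OR, from which every other gate can be composed. A minimal sketch in Python (these gate constructions are standard redstone practice, not details of this specific build):

```python
# Redstone primitives as Boolean functions: a torch is a NOT gate and
# merging two dust lines is an OR. Every other gate is a composition.
def torch(a: bool) -> bool:              # redstone torch: inverts its input
    return not a

def or_gate(a: bool, b: bool) -> bool:   # two dust lines merging
    return a or b

def and_gate(a: bool, b: bool) -> bool:  # De Morgan: NOT(NOT a OR NOT b)
    return torch(or_gate(torch(a), torch(b)))

def xor_gate(a: bool, b: bool) -> bool:  # (a OR b) AND NOT (a AND b)
    return and_gate(or_gate(a, b), torch(and_gate(a, b)))

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Adds two bits; returns (sum, carry)."""
    return xor_gate(a, b), and_gate(a, b)

print(half_adder(True, True))  # (False, True): 1 + 1 = binary 10
```

Chain enough adders and you get arithmetic units; chain those and you get the matrix math a neural network needs, which is essentially what this build does at enormous scale.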

Players long ago built calculators, memory units, and even CPUs inside Minecraft using redstone, and now they're pushing it into AI territory. Sammyuri, the creator behind this GPT build, has even made a recursive version of Minecraft running inside Minecraft on redstone.

In this project, the creator first trained a small language model outside the game (in Python) on a dataset called TinyChat. The model has 5,087,280 parameters, an embedding dimension of 240, a vocabulary of roughly 1,920 tokens, six layers, and a context window of 64 tokens. Most weights are quantized to 8 bits, though the embedding and LayerNorm weights use higher precision. The redstone build itself spans roughly 1,020 × 260 × 1,656 blocks (≈ 439 million blocks in total). To film the scale, the creator used the Distant Horizons mod, which renders far-away structures so the full build can be seen at once.
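The reported parameter count is consistent with one plausible accounting of a bias-free GPT-style layout (scale-only LayerNorms, a 4× MLP expansion, and an untied output head). This breakdown is an assumption rather than the creator's published architecture, but it lands exactly on the stated total:

```python
# Hypothetical parameter accounting for the stated config: d_model=240,
# vocab=1920, 6 layers, 64-token context. Assumes a bias-free GPT-style
# layout with scale-only LayerNorms and an untied output head.
d, vocab, layers, ctx = 240, 1920, 6, 64
d_ff = 4 * d                   # conventional 4x MLP expansion (assumption)

tok_emb  = vocab * d           # token embedding table
pos_emb  = ctx * d             # learned position embeddings
attn     = 4 * d * d           # Wq, Wk, Wv, Wo per layer
mlp      = d * d_ff + d_ff * d # two MLP projections per layer
norms    = 2 * d               # two scale-only LayerNorms per layer
final_ln = d
head     = d * vocab           # untied output projection

total = tok_emb + pos_emb + layers * (attn + mlp + norms) + final_ln + head
print(total)  # 5087280 under these assumptions
```

The match suggests the model is a fairly vanilla small transformer; nothing exotic is needed to hit the quoted figure.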

When you input a prompt, the redstone circuits carry out inference step by step: embedding lookup, feedforward passes, matrix multiplications, and softmax approximations. According to a video demonstration, the elaborate redstone build took months to assemble.
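A toy sketch of one such step helps make the 8-bit detail concrete: integer weights are multiplied with a wide accumulator and then rescaled, which is exactly the kind of fixed-width arithmetic a redstone circuit can implement. The shapes and values below are illustrative, not taken from the build:

```python
import numpy as np

# Toy sketch of one quantized inference step, mirroring the stages the
# article lists: embedding lookup, a matrix multiply on 8-bit weights,
# and a numerically stable softmax. Values are illustrative only.
rng = np.random.default_rng(0)
d, vocab = 240, 1920

embedding = rng.standard_normal((vocab, d)).astype(np.float32)  # higher precision
W_q = rng.integers(-128, 128, size=(d, d), dtype=np.int8)       # 8-bit weights
scale = 0.01                                                    # dequantization scale

token_id = 7
x = embedding[token_id]                       # embedding lookup
h = (x @ W_q.astype(np.int32)) * scale        # matmul with wide accumulation, rescaled
probs = np.exp(h - h.max())                   # stable softmax
probs /= probs.sum()
print(probs.sum())                            # 1.0 (up to float rounding)
```

In the real build these same stages are unrolled into redstone adders and comparators rather than NumPy calls, which is where the hundreds of millions of blocks go.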

But the system is glacial. Even with an artificially boosted tick rate (~40,000× faster than normal, via MCHPRS), the response time is on the order of a couple of hours. At standard Minecraft speed, some estimate it would take over nine years to generate an answer.
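The two figures are mutually consistent: a roughly two-hour response at a ~40,000× boosted tick rate works back to the "just over nine years" estimate quoted in discussions. The exact boosted response time is approximate, so this is a back-of-the-envelope check:

```python
# Back-of-the-envelope check of the reported timings: ~2 hours per
# response at a ~40,000x boosted tick rate implies the normal-speed
# figure quoted in discussions ("just over 9 years").
hours_boosted = 2                            # approximate boosted response time
speedup = 40_000                             # reported MCHPRS tick-rate boost
hours_normal = hours_boosted * speedup       # 80,000 hours
years_normal = hours_normal / (24 * 365.25)  # ≈ 9.1 years
print(round(years_normal, 1))
```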

Still, the point of the project isn’t utility—it’s demonstration. At its heart, this build is a playful proof that neural inference can be recast into almost any logical substrate. It reminds us how much abstraction our software and hardware stacks hide: that architecture, medium, and speed are separate dimensions. In other words, this is a technical art piece, a conversation starter: what counts is computation, not necessarily where it runs.

Does this matter? Yes, it does

Obviously, this isn’t practical, but it’s cool because it demonstrates something profound: that logic and neural-style computation can be mapped into bizarre substrates—here, virtual redstone circuits. It’s a playful proof of the universality of computation, an artistic and educational showpiece, and a challenge to our assumptions about how and where “intelligence” can run.

It forces us to ask: what really matters in a model—architecture, medium, speed—and what other strange substrates might one try (optics, DNA, mechanical systems)?

Across the web, the build has ignited debates. Tom’s Hardware ran a piece breaking down the block count, the redstone logic, and the performance tradeoffs. On Hacker News, commenters marveled at the engineering but also pressed on limitations: “At normal redstone tick rate… it would take just over 9 years for a response.” Meanwhile, in the Minecraft subreddit, fans and skeptics alike debated how much of the achievement is spectacle vs. technical depth.

Some observers suggest the project borders on meme more than research. Indeed, the real training happened externally; Minecraft only hosts inference logic. It’s a showpiece more than a practical model. But that’s precisely what gives it charm—and purpose.


Source: https://decrypt.co/341930/someone-built-working-ai-chatbot-inside-minecraft

