To try everything Brilliant has to offer, free for a full 30 days, visit brilliant.org/mattbatwings
You’ll also get 20% off an annual premium subscription.
ok man we got to address your genius 💀😭🙏
"Why should we try brilliant if we have you?"-me
i mean, neural networks, brilliant, it's all connected
Did you use a CNN to reduce the MNIST images and then input the weights into a feed-forward neural network?
Thanks I love brilliant
if you guys think this is insane: it took this guy like 2 weeks to make this all, start to finish. this man is a MACHINE
lol
cwaftymwastewman:3
Crazy 🎉
im sorry WHAT
@@土星猫 Ew
10 neurons? Bro just built me
Oh that means 10 of me = 1 of u
@@priyank5161 Oh that means 10 of me = 1 of you (100 of the original commenter)
@@Sphinxery101 woah how u only have 1/10 of neuron?
@@priyank5161 si
@@priyank5161 rip, 9/10 of their neuron got paywalled.
ChatGPT playing minecraft: ❌️
Minecraft running ChatGPT: ✅️
Yeah bro, they'll make a server to represent the internet, someone will recreate ChatGPT with redstone, make it learn a lot, and people will be able to use it. The problem is that redstone is very slow, so they'd have to speed up time so much that it responds in an "ok" time.
Someone NEEDS to make a chat GPT in minecraft, I Don't care if it uses command blocks, it would be so cool!
@@Centorym The GPT language models are so huge that, if we converted the whole model into redstone, the scale of the redstone machine would be so large that it wouldn't even fit within render distance!
For comparison, the ChatGPT model is somewhere around 10 million~10 billion times larger than this number-recognition model.
Yeah, I think command blocks are the only way to go, but even then the number of command blocks would be monumental!
And the labor of copying the entire model by hand...
I think the conversion process would have to be automated to be feasible
@@tung-hsinliu861 then we must settle for a very barebones version that has predetermined responses - although that'll be more of a magic 8ball ngl
@@Ari_Fudu but then it's not a neural network
I remember having my mind blown when I saw the first working computer in minecraft... the redstone was so enormous for the time. To see neural networks in minecraft a little over 10 years later is truly staggering. I'm no one of any real note but I just want you to know that you have impressed me and I am not easily impressed.
While I agree it’s impressive, a computer is much harder to build than basic neural networks (which are still very impressive).
The hard part about AI has always been the training. The evaluation at the end is quite trivial and, as he showed, is mainly basic multiplications and additions.
@Pwassoncru Imagine if you could train a model right in the game)
❤😊😊😊🎉🎉🎉😊❤🎉
Truly truly i say to you all Jesus is the only one who can save you from eternal death. If you just put all your trust in Him, you will find eternal life. But, you may be ashamed by the World as He was. But don't worry, because the Kingdom of Heaven is at hand, and it's up to you to choose this world or That / Heaven or Hell.
I say these things for it is written:
"Go therefore and make disciples of all nations, baptizing them in the name of the Father and of the Son and of the Holy Spirit, *teaching them* to observe all that I have commanded you; and behold, I am with you always, even to the end of seasonal". Amen."
-Jesus
-Matthew 28:19-20
@gamer__dud10 amen brother
@gamer__dud10 According to the bible I go to hell regardless of my actions for something I can't change. Why would I pray to someone who would stick me in hellfire for all eternity?
In a couple years I wouldn’t be surprised to see a video from you about building a LLM in Minecraft
this is basically that tho, you would just have to make it bigger
@@ligma445 Yeah, the core concepts are there, but implementing things like transformers and LSTMs would be a whole different beast
Your transcript for college, internships, and future jobs in computer science is gonna be so stacked
nice pun
@@gryphonvalorant That's what we call a "stack overflow"
BA DUM TSSS
I'm a manpage kinda guy lol
@@tanawatjukmongkol2178 Pfp (Profile Picture) and / or Banner Sauce (Source [Artist])? 🗿
@@SimoneBellomonte Murakami Shiina. I don't watch the anime, but it'd be funny having an anime profile picture carrying a C programming book. I'm a great proponent of "Advanced Programming in the Unix Environment" though (not the book she's holding). It helped me get through the time when I had to write my own shell and readline in C from scratch as a school project.
Imagine in an interview: "Do you have any achievements?" and you respond, "I made an AI." "Oh, which one?" "In Minecraft." "What"
You solved a number of difficult problems elegantly, but your amazing ability to communicate those ideas both visually and with narrative ease really stands out. Fantastic piece of content my dude.
E
Hello there
Bro, people out there are creating neural networks in Minecraft, and I'm struggling to open a chocolate bar while watching them
Bruh
I don't see how those two actions compare
if it makes you feel any better, most really advanced AI robots struggle to open a chocolate bar as well
Me struggling to open a FUCKING CHIP BAG (:
I still have a small wound on my finger after trying to open a water bottle
2024: Neural Networks in Minecraft
2027: Sentient AI in minecraft
Hmm, not that early. Even when it comes to hard, general purpose AI, we are not nearly there yet.
The problem with porting stuff to Minecraft is that redstone's built-in delays make everything a whole lot less efficient. A sentient AI in Minecraft that's the size of ChatGPT-4 or later (millions of times bigger than the one in this video, btw, I think) wouldn't even be able to fit in render distance. But good joke nonetheless, albeit just a tad lil' bit too predictable. 🗿
More like ChatGPT
@@SimoneBellomonte But think: we have mods to keep chunks hundreds of thousands of blocks away rendered, and even if not, maybe just enough players standing around would get around that problem.
Imagine also adding commands to make players do certain actions, like forcing them to right-click or jump... and adding a powerful AI made out of redstone that controls an AFK player. We would go from showing that Minecraft is Turing-complete to making Minecraft pass the Turing test lol
I approached the topic years ago when I was building a perceptron for playing tic-tac-toe. I had problems keeping the neurons and their weights small enough not to add too much delay. Today that's solved by saving each weight as a signal strength in a barrel. It's such a genius thing that was impossible back in my day. My perceptron was five times bigger and had shit accuracy because I had to limit the hidden layers (every weight and bias had to be saved in a separate RS NOR latch-based 8-bit register, plus every neuron needed an 8-bit multiplier and adder). Eventually I circled back to a rule-based solution, since tic-tac-toe is simple enough to implement in a smaller form factor, but it was deterministic and not really "very AI". I'm so proud and happy to see that quality redstone engineering is still alive and well, and that now you can do those things in a very nice and compact way.
This was 100% a brilliant partnership
It was 💀
😂
you were right !!
bad pun (·n·)-p
LMAO
the ONLY person on YouTube who managed to explain neural networks in seconds; it took me days of research to understand them well enough to build and explain them
Imma be perfectly honest I still ain't understand
@@TheMochaMadness skill issue 😔
@@giosee_ zoinks 😔
@@TheMochaMadness If a particular pixel of the input is lit up, chances are you can make a list of digits whose final drawing could include that pixel, as well as a list of digits that are very unlikely to include it in theirs. If you combine these lists from every pixel of the input, then the output tells you how likely it is that each digit was the one actually drawn.
Everything else (hidden layers, weights, biases, etc.) is just an algorithmic way to process and combine those "lists" of information, done in an ingenious manner that lets you automatically pre-generate the lists (a.k.a. get the values for the weights and biases) by "training" the network beforehand (which in essence is as simple as taking every possible final drawing, looking at the pixels that are lit up in each of them, and storing that information).
@@TheMochaMadness It's multiplying a bunch of numbers (the input pixels) by a bunch of set values (the weights), then adding a bias (should the neuron lean towards negative or positive activation?), then applying a nonlinearity (any function that can't be plotted as a single straight line)
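That reply maps directly onto code. Here's a minimal numpy sketch of the same idea (multiply, add a bias, apply a nonlinearity) with made-up layer sizes, not the ones from the video:

```python
import numpy as np

def dense_layer(x, W, b):
    """One MLP layer: weighted sum of the inputs, plus a bias, then a nonlinearity."""
    z = W @ x + b               # multiply pixels by weights, add the bias
    return np.maximum(z, 0.0)   # ReLU: a negative sum switches the neuron off

# Toy forward pass: 25 "pixels" -> 10 hidden neurons -> 10 digit scores
rng = np.random.default_rng(0)
pixels = rng.integers(0, 2, 25).astype(float)   # a flattened 5x5 drawing
W1, b1 = rng.normal(size=(10, 25)), rng.normal(size=10)
W2, b2 = rng.normal(size=(10, 10)), rng.normal(size=10)
scores = W2 @ dense_layer(pixels, W1, b1) + b2
print("guessed digit:", int(np.argmax(scores)))  # index of the largest score
```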
my brother in christ, IT TOOK ME TWO MONTHS TO MAKE A NETWORK FROM SCRATCH THAT SOLVED THE MNIST DATASET IN PYTHON AND YOU DID IT IN REDSTONE IN 2 WEEKS. I applaud you, you redstone genius
lol there's a video I love of some bloke just writing it in like half an hour :> watching it is a great way to lose confidence in your abilities
@@WoolyCow Link?
@@GustvandeWal YT doesn't play nice with links, but it's called "Building a neural network FROM SCRATCH (no Tensorflow/Pytorch, just numpy & math)"
@@WoolyCow Thx!
(Most people just copy the part after /watch?v= 🙂)
@@GustvandeWal oh lol i shouldve thought of that! thanks for the tip :D
Fun fact: the human brain has roughly a hundred billion neurons (100,000,000,000). Even so, making 10 in MINECRAFT is an amazing achievement
That's super cool! I like that you explained the difficulties you had and how you overcame them; it makes everything less mystical and really helps us understand why you do what you do
In 16-bit logic, you can replace division by 15 with a multiplication by -30583 (32-bit result), three shifts, and two additions. You can easily figure this out by compiling a function that returns its 16-bit argument divided by 15 with clang at -O2, and what's efficient to do in silicon (integers over floats, and multiplication over division) is almost always efficient in Minecraft too.
As for softmax: in 2021, researchers at NVIDIA created a hardware-efficient softmax replacement called "Softermax" that is realistically implementable in Minecraft.
I'm not a Minecraft expert, but I love seeing hardware implementations of functions, and Minecraft is no exception.
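For the curious, here is that divide-by-15 identity as a Python sketch, brute-force checked over the whole signed 16-bit range. The constant matches the comment; the exact shift/add sequence is my reading of what the compiler emits, so treat it as an assumption:

```python
def div15(x: int) -> int:
    """Truncating signed 16-bit division by 15 with no divide:
    one multiply, three shifts, two additions."""
    t = (x * -30583) >> 16      # shift 1: high half of the 32-bit product
    t += x                      # add 1: correction, since the multiplier is negative
    q = t >> 3                  # shift 2
    return q + ((x >> 15) & 1)  # shift 3 + add 2: round toward zero for negatives

# check every signed 16-bit input against true truncating division
assert all(div15(x) == int(x / 15) for x in range(-32768, 32768))
```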
Just because a function is hardware-efficient doesn't necessarily mean it can be easily or efficiently implemented in Minecraft, but it's an interesting point.
@@law1337 "what's efficient to do on silicon ... is almost always efficient in Minecraft too."
Java: *raises eyebrow*
I liked this because I'm curious. ☺️
@@law1337 Minecraft logic speed mostly depends on the distance redstone signals travel. There are instant rail-based propagation systems, but they require far more real estate, and all gates ultimately suffer redstone-tick delay. In short, builds with fewer gates are more efficient because they take less real estate, and less real estate translates to faster and more compact circuits. Similarly, in silicon, builds with fewer gates are more efficient because they have less capacitance delay, allowing shorter clock cycles for the synchronous logic, and because they consume less power.
So regardless of the underlying reasons, gate count per unit of logic is an efficiency metric that works the same way in both Minecraft and hardware - and multiplying instead of dividing uses fewer gates for the same result, yielding higher efficiency in both.
I just assumed the audience reading the comment was aware of the underlying reasons, which was a mistake on my part. I hope the explanation is correct and sufficient.
@@LtDan-fy7lc Minecraft redstone efficiency is bound by redstone ticks, not by the underlying Java implementation of Minecraft. A build consumes fewer redstone ticks if it has fewer gates, which is an efficiency metric equivalent to hardware's.
I hope I'm making sense.
The internet is such a cool place. Imagine having a degree and choosing to build real video games and software inside Minecraft and share them, as a job, instead of actually building the video games and software and making a living from that. The internet is so cool.
Gaming companies are so scummy and exploitative that honestly that ain't really a bad deal after all
A forum for all ppl from stupid kids to Elon Musk
@@VortexFlickens Not much of a flattering comparison for the stupid kids, don't ya think?
@@VortexFlickens you said stupid kids twice
Wow look at the stupid kids hating on Elon cause he’s successful. Someone made a joke, cope
We're getting to the point where pretty soon someone is gonna recreate the NES in Minecraft, or make Doom in Minecraft. I'm betting that within 10 years someone will get Doom, Super Mario Bros., or The Legend of Zelda running just off redstone
Idk about other games, but Doom already exists; someone ran it on his redstone computer (I believe it was called IRIS). I'll come back and edit this comment with the video's code.
(edit) _SvLXy74Jr4 Also, I have no idea if this has been done before
such an original comment
Modpunchtree already ran Doom on his CPU, IRIS; you can look up the video
@@proceduralism376 Yeah, and it's only 28~32 s per frame
An NES emulator in Minecraft would be CRAZY
That was inCREDIBLE. I'm floored. Not just by the redstone prowess but also by the ingenuity to be able to dissect these concepts and then rebuild them from scratch. Seriously impressive.
Bet bro is gonna make the observable universe with redstone next
14:19 Exponentiation is pretty simple: convert the exponent to binary, then for each bit that is set, multiply in the corresponding power. To get the list of corresponding powers, start with the base and square it at each step (so its exponent doubles: 1, 2, 4, ...). For example, for 5^7 you convert 7 to binary, which is 111, and then multiply 5, 25, and 625 to get 78,125, which is the correct answer.
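That comment, as a short Python sketch (non-negative integer base and exponent assumed):

```python
def power(base: int, exp: int) -> int:
    """Binary exponentiation: for each set bit of the exponent,
    multiply in the corresponding (repeatedly squared) power."""
    result = 1
    while exp:
        if exp & 1:        # this exponent bit is on
            result *= base
        base *= base       # 5 -> 25 -> 625 -> ... (the exponent doubles)
        exp >>= 1
    return result

assert power(5, 7) == 78_125   # 7 = 0b111, so 5 * 25 * 625
```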
Also known as the square-and-multiply algorithm
@@skaleee1207 Nice! I didn't know its official name. Originally I thought I was the first person to come up with it - I remember being quite proud of it. Later on I learned that it already existed, but I didn't know the name until now! That name is a lot simpler than my explanation and will let people find more information on it too. Thanks!
This is an exponential with Euler's number, though. Any output would be irrational and very messy. I understand why he would avoid this.
You could do it with base 2 (or 4); that's just changing the "temperature". In that case exponentiation is trivial (a bitshift). But you still have to do division.
That's like shift-and-add but for exp instead of mul
This is such a good demonstration that every hard problem is just a ton of smaller easier problems.
This is also a good demonstration that there is always someone out there smarter than you could ever be lol
Which is ironically exactly how neural networks work
I'm struggling with a 2x2 and this dude's making a neural network.
Don't worry dude! It just takes time! You should watch his Logical Redstone Reloaded series (both new and old); they're really helpful in understanding how computational redstone works. After that, just try to make an ALU. It's an amazing starting goal, and once you've made your own, you can confidently say you're proficient. I wish you luck on your journey
its 4
@@takyc7883 he is talking about the door
@@takyc7883 god damnit i laughed way too hard at that
@@MrFiveHimself That's assuming the commenter is not on bedrock.
this was the first time i was confused by the redstone and not the actual mechanics of the build
10:30 I love the music and I personally listen to it in my free time. For anyone else who wants to listen, the song is "rain", or try a playlist named "Lo-fi beats"
I will likely never fully understand these videos, but man are they impressive 👏
Incredible work! My brain is fried
I've never seen people not reply to a famous YouTuber lol
Hi knarfy
Are you gonna be doing "Breaking a neural network with your dumb ideas"?
Fried brain 🤤
@@ThatGuyNyan run knarfy RUN before this guy makes a 3 course meal from you
I just did a machine learning course last semester, and your 2-minute explanation of an MLP network was way easier to understand than our textbook's chapter that covered it. This entire build is insane, amazing work!
We got real AI in Minecraft before GTA 6
Diabolical
😭😭WE ONLY HAVE A COUPLE YEARS TO MAKE THESE JOKES; EVERYTHING WILL STOP BEING IMPRESSIVE SINCE ITS AFTER GTA 6
i came here looking for this comment LMFAO
@@goldfishglory we got GTA 6 before GTA 7 - some guy in 2093
@@NolanHOfficial true
Create a mod that lets you put a redstone build inside a single block. This block would have an input and an output, with the guts of it being shrunk down in a smaller dimension inside the block. Clicking on it takes you inside the block where you can build (or paste in) the redstone circuitry and connect it all to the output. Then, of course, you could have these circuits nested inside of each other so you could further compartmentalize the functions. This would compact these massive builds into a few blocks. Full functional encapsulation. Imagine the possibilities.
And you just keep the redstone's process as code rather than actually rendering the redstone inside the block. Also, you would have input blocks that take an input and transfer it to the output block inside the dimension
Compact Claustrophobia would be just fine! It even has a REAL computer WITH NETWORKING, programmable in Lua!
Congrats, you passed your PhD thesis.
It’s always a good day when a mattbatwings Video is on my recommended
Bro, you could not have gotten this recommended before the premiere
Same
@@CubeXC You can. Before a premiere starts, it can be recommended
Off-topic, but I recently started the second semester of my computer science degree in college and was like "omg, it's a mattbatwings thing" the whole lecture, because I had already learned most of the stuff they were talking about from you 💀
Cool
At this rate, in 5 years I'm going to see a video on my homepage from mattbatwings where he ports the entire Linux kernel into Minecraft
Well, they do say that Linux runs on just about anything
@@kaz49 dont give him ideas bro
The difference in space occupied between the MLP and the bar-chart graphing section is an amazing way to visualize the difference in logical functions and the use of, like, analog vs. digital signals.
Amazing and insane, keep it up man!
As a Minecraft player with over a year of experience in ML, I'm literally blown away!!! This is truly amazing.
Actually, you only need continuous values for training; for deployment you can drastically decrease the precision without losing accuracy, if you do it right.
There's a paper where they reduce it all the way to one bit per neuron, which would be a perfect fit for Minecraft.
(And I'm pretty sure 4 bits works too, which would fit signal-strength applications.)
quantization baby
@@AgamSama-u2t Imagine getting a binary quantization good at MNIST lol
Even for training, you can use quantization-aware or non-differentiable methods and meet parity on inference during training.
@@whatisrokosbasilisk80 That's what I'm talking about
I'm guessing this is the BNN paper by Courbariaux et al. from 2016? I'm skimming through the claims and it's insane what quantization can theoretically do.
Brother.
I spent a while learning how to make neural networks as a school project, and doing this from scratch, in redstone, is absolutely astonishing. Legend, Mattbat.
The first thing that comes to mind is a recent cutting-edge implementation of QAT (quantization-aware training) called BitNet 1.58; it operates on different principles than a standard MLP. It replaces the matrix multiplication with binary operators (addition, subtraction, or no-ops), so it's fast in inference deployment, and cheap in that you can sort of fit a single "unit" of weights into 1.58 bits (though it's easier to just do it as a 2-bit implementation with one state unused). It'd probably be way faster in a Minecraft context, as one of the biggest disadvantages in IRL deployment - that you need custom hardware to take full advantage of the speed improvements - isn't really a disadvantage in a bespoke system.
Anyway, the biggest difference is in the training process; it's trained at int8 or FP8 (if memory serves, it's been a little while) and is then downscaled to the 1.58-bit representation, but the information lost in that conversion to ternary values is preserved in a weight-reconstruction matrix, basically. The end goal is that the network is made aware that it will be converted to a ternary representation. Hence, "quantization *aware* training". So you might be able to preserve more of the accuracy of the floating-point model than you thought.
Strictly speaking, the full BitNet implementation is a Transformer network, but it should still apply to raw MLPs, given that they started with the FFN (essentially an MLP placed inside a more complex network with self-attention and a language head).
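For reference, the ternary (absmean) weight quantization that BitNet-style papers describe looks roughly like this - a simplified sketch, not the reference implementation:

```python
import numpy as np

def ternarize(W, eps=1e-8):
    """Absmean quantization: float weights -> {-1, 0, +1} plus one scale."""
    gamma = np.abs(W).mean() + eps              # per-tensor scale factor
    Wq = np.clip(np.round(W / gamma), -1, 1)    # round, then clamp to ternary
    return Wq.astype(np.int8), gamma

W = np.random.default_rng(0).normal(size=(4, 4))
Wq, gamma = ternarize(W)
x = np.ones(4)
y = gamma * (Wq @ x)   # the matmul is now just adds, subtracts, and skips
```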
Although I didn't actually understand what you were doing, it's always fascinating how people like you have pushed the Minecraft redstone community so far. Keep up the good work!
In simple words: he made AI
I can hear so many people I know saying, "I feel like an easier way to do that would be to just connect them directly, what is even the point of this machine?" People like that don't have the capacity to see beyond what's in front of them into what could be. God damn, this is cool.
That's incredible! Combining neural networks with Minecraft is pure genius. Keep up the amazing work!
I never thought a Minecraft video would teach me neural networks better than my teacher. Thanks for the upload
Amazing project, congrats. Note: instead of multiplying the weights by 100, you can perform post-training int8 quantization to maintain most of the original accuracy.
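A minimal sketch of that suggestion - symmetric per-tensor int8 post-training quantization, just the weight-rounding step, nothing framework-specific:

```python
import numpy as np

def quantize_int8(W):
    """Map float weights onto int8 with a single shared scale."""
    scale = np.abs(W).max() / 127.0             # largest weight maps to +/-127
    Wq = np.round(W / scale).astype(np.int8)
    return Wq, scale

W = np.random.default_rng(1).normal(size=(3, 3))
Wq, scale = quantize_int8(W)
W_hat = Wq.astype(np.float64) * scale   # dequantized; close to W, stored as int8
```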
bro what? you are a genuine genius. I do not mean this non-literally: you are a genius
Wow, how kind of you to give out the world download!!! Appreciate it! ❤
Wow, that's actually crazy, good on you!
mattbatwings in 1 year: I Made a Technological Singularity with just Redstone!
Nice! I also thought at first that you were going to train the model in Minecraft, but it seems that if that ever happens, it's going to be a whole other story
This is really good, bro, for visualising how computers work deep down in their tiny chips. Like, you're essentially blowing up a CPU to full size and literally WALKING through the details and wiring. You could be a goated CS major bro, you have so much f**ING talent. How old are you dude? Did you do uni, or are you currently doing uni? Like bro, go do a CS major or smth, you could make a shit ton of money just from research and development. You got like bottomless talent levels bro
Dude this is so amazing. To have the skill to make a machine like this, understand the math and computations behind it, minecraft knowledge, and the video production after it all? That's amazing
So a really cool detail is how you handled the floating-point limitation; it's actually really close to some quantization solutions. Look at the paper "The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits" if you have the time, you might find further optimizations there
This is great work! I never thought we would have machine learning with just Redstone.
There's a guy who made this in Minecraft 1 year ago lol
@@bintangramadan3217 Yeah but Mattbatwings is aware of that so maybe there will be something new?
You could not have seen it yet; stop saying stuff just to get likes. It was before the premiere
It's not machine learning; he just pasted the weights and biases into the neural network, rather than making it learn by itself like a machine learning algorithm would
@@mineq4967 yeah that's a bit deceptive tbh
bro had a couple of lectures on how neural networks work and decided to build it out of redstone
absolute madlad
Cool, now make it in hardcore mode
This is honestly incredible. I wish this was around when I was studying these concepts; it would have helped me understand backpropagation and softmax so much quicker
7:18
“I pressed F5”
Ok, now make an AI-assisted shape-drawing tool for your paint program.
e.g. draw a bad square, it draws a good square with the same width and height.
draw an ugly number, it fixes it by converting it to the closest possible number with correct dimensions.
that sounds like pure hell
I love it
nah
@@alluseri i like how google translate assertively translates this to "Now"
Oh my god.
This is the best neural network explanation I've ever heard. I've listened to a bunch of videos, and I always had some trouble with "what are the operations being done?"
Thanks for breaking it down: multiply, add up the results, apply the activation function, and here we go again.
It makes sense now: we need the multiplication so that when the activation function returns 0, it completely toggles off the neuron.
Damn, I love you
After not being able to build a Python simulation for one project of mine, the last thing I expected was to find a detailed explanation of how the particular part that wasn't working for me was working in a random Minecraft video I watched for fun in my free time. Thank you for the (probably unintended) help with my project, and the video was interesting in its own right as well.
respect for the sponsor's dish at the end of the episode
Can't say enough about how great it is that you showed prior work from others in the community before digging into your version. That's what we want to see in the community ❤
This guy in 2030: Building robots to colonize all solar system planets with just redstone!
It is obviously very impressive, but perhaps many people don't realize that he is essentially doing a kind of assembly programming, which many of you would not find so cool
I don't know why I only now started to actively seek out redstone computer YouTubers. Maybe it's because I recently got a job that involves low-ish level stuff. But now projects like this make me want to learn more about implementing even basic computing processes in my survival world.
It probably won't ever be useful in single player, but the learning is what I'm here for
at this point bro is gonna make hooman brain in redstone dang good job
Cringe
@@ziphy_6471 omg its linus no way 🔥🔥🔥
@@ALPRNX422 I have several children in my basement
@@ziphy_6471 cool
@@ALPRNX422 Will you be my next OwO UwU * turns up bulge *
Now please make a calculator where you can draw the numbers yourself (using the neural network and the calculator). That would be awesome
That would be so cool. It would also be fun if you could draw the plus sign (or whatever operation you were doing) as well.
Bro is bout to build a quantum computer in Minecraft… 💀
The multiplication and division portions can be simplified by taking the binary input and bit-shifting it
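That's the classic shift-and-add idea; here is a tiny Python sketch of the multiplication half (the same structure a typical binary multiplier uses):

```python
def mul_shift_add(a: int, b: int) -> int:
    """Multiply two non-negative integers using only shifts and adds."""
    acc = 0
    while b:
        if b & 1:      # lowest bit of b is set: add the shifted copy of a
            acc += a
        a <<= 1        # each next bit of b is worth twice as much
        b >>= 1
    return acc

assert mul_shift_add(13, 11) == 143
```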
I remember when I used to feel smart making a 2x4 flush piston door.
1 step closer to google in minecraft
There is a mod that uses blocks as a screen and connects to Google's URL, so technically you can watch YouTube in Minecraft
No mods. Pure redstone @@Fineas_Bondar
0:22 Loved it
It’s 10 seconds and there’s nothing to fucking love
@@Only_Some There is
NO DONT TAKE OUR REDSTONE ENGINEERS JOBS
you could probably have made your life easier with the network design but the confidence display is extremely cool.
Congratulations, that's so cool! I used to do a lot of redstone back then, so I love seeing people pushing the limits further and further with it :)
Amazing
I hope you mention the first guy who did a neural network thing in minecraft, recognising numbers
Edit: he did
Feels good to comment before watching the video...
Why would you comment this before watching the video? He mentioned the other guy very early on in the video
Twitter rot
I’m a time traveler and mattbatt has recently made a human brain in Minecraft
He also made a Time Machine in Minecraft which is how you’re here I assume
@@Meyer-gp7nq Naturally
This man is SEVERELY underrated. He made this whole thing in 11 days.
Nah bro, these Minecraft projects are getting crazier and crazier
How did u get around the network being bad at actual digit recognition, due to the MNIST data set all being perfectly centered?
A simple MLP can already learn a pretty good representation for this dataset, but one easy approach would be to transform the input images (e.g. skew, rotate) and add these as additional training samples; this makes the learned representations even more robust :)
@@ferguspick6845 tysm
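A small sketch of that augmentation step in Python (scipy's rotate/shift; the angle and offset ranges are arbitrary choices, not from any particular recipe):

```python
import numpy as np
from scipy.ndimage import rotate, shift

def augment(image, rng):
    """Return a randomly rotated and shifted copy of a 28x28 digit image."""
    out = rotate(image, angle=rng.uniform(-15, 15), reshape=False)
    out = shift(out, shift=rng.uniform(-2, 2, size=2))
    return np.clip(out, 0.0, 1.0)

rng = np.random.default_rng(0)
image = rng.random((28, 28))                             # stand-in for one MNIST digit
extra_samples = [augment(image, rng) for _ in range(5)]  # extra training data
```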
Wait what!???? Please tell me that this is just uploading the model into redstone and not all complex things like backpropagation to train the NN inside Minecraft...
For simple neural networks you don't really need backpropagation; you can just randomize the values until it gets better. It'll take longer to train and won't be as efficient, but it's way easier to do (see the sketch after this thread)
Bro, there's a Chinese guy who made a neural network in Minecraft 1 year ago lol
@@bintangramadan3217 Send the vid pls
Yes, it is uploading the model into redstone, dw
At least...
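As promised above, a tiny sketch of that no-backprop approach - random hill climbing, where `loss` is any score of the weights you can evaluate:

```python
import numpy as np

def random_search(loss, w, steps=10_000, sigma=0.1, seed=0):
    """Keep a random tweak to the weights whenever it lowers the loss."""
    rng = np.random.default_rng(seed)
    best = loss(w)
    for _ in range(steps):
        candidate = w + rng.normal(scale=sigma, size=w.shape)
        c = loss(candidate)
        if c < best:              # keep improvements, discard the rest
            w, best = candidate, c
    return w

# Toy example: pull w toward a target vector
target = np.array([1.0, -2.0, 3.0])
w = random_search(lambda v: float(np.sum((v - target) ** 2)), np.zeros(3))
```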
2:07 my little pony or what?
My little pony or cable news network
It's so crazy to think about, but these Minecraft videos were my motivation to study electrical engineering, which starts tomorrow. In a few years I might be able to do something like this!
imagine if it randomly displayed "HATE. LET ME TELL YOU HOW MUCH I'VE COME TO HATE YOU SINCE I BEGAN TO LIVE" and then recited the entire speech
Dude, MLP is My Little Pony
😂
Programmers 👇
You're definitely not a programmer if you like-beg
I got one thing to say… 🗿
@@IdkBruh-z8z I'm curious
@@Burueberii curious my ass
The amount of engineering that people get to live out through Minecraft is truly insane.
There is sammyuri, who built a completely programmable redstone engine; there is Matt, who reverse-engineered a learning AI; and there are countless other hobbyist engineers who warped Minecraft to their imagination and built stupidly impressive things. The insane part is that all of this is a hobby.
Just another comment saying I'm thoroughly impressed, both in your execution and your explanation of neural networks. Thank you!
The way you explained all of those deep learning terms in simple words is just marvelous!
The only reasons I don't watch this are that my -100 neurons cannot understand it, and that I'm scared
What I like to do with giant redstone projects like this is just break blocks until it stops working
I think you create videos that are worth subscribing for. Good job!
Minecraft never ceases to amaze me. No wonder it was used in schools for learning purposes; there's so much actual practical use for it as a learning tool. And that's just the vanilla game. If you dip into modding, there's sooo much more, from programming mods yourself to learn Java, to using mods centered around realistic mechanics to learn about real-world topics.
Without having watched the video, this shows you truly understand neural networks
Your channel should be sooo much bigger. The talent on show, the skill of the production and explanations, the references to other material you found - it's all ridiculous quality. Do you have any other channels? I'd be interested in non-Minecraft projects with videos of this quality, if you do anything like that
Check out Spu7Nix; he did insane projects in Geometry Dash.
The UI taking up more space than the actual "meat" of a program/machine is so very typical of development!
That said, this is wonderful. It really feels good when a redstone circuit works, works well, and even looks structured! Bravo
I finished a CNN machine-learning model for predicting ENSO (El Niño Southern Oscillation), a weather phenomenon in the southern Pacific Ocean, for a STEM fair. It didn't make it past regionals because I'm still in the junior league, and it is not as easy as this man explains and showcases in this video. Props to you!
Even with all of his explanations, I still have no clue how redstone can even do any type of math. I thought it was just for making stuff move in a game 😭
At 7:30, what you are doing is a well-known technique called quantization. It is particularly used on embedded systems, and the state of the art proposes a lot of optimizations and methods for working with small integers instead of floating-point numbers.
It's so weird and amazing to see technology advance in a game.