The funniest thing about this is that the only thing slowing the execution down was the time it took to write the result to disk, and even then it took only a few seconds. It's crazy to think that, shortly before I was born, calculations like these were the sort of thing one set up, ran, and then came back to the next day when they'd finished executing.
What I like about you is that the _wonder_ of modern computers seemingly isn't lost on you. Most people who see your demonstration would yawn and ask what the point was, because most people are content with magical black boxes in their lives that they don't know even the first thing about. The whole reason I got into Computer Science in college was because I wanted to understand HOW these machines work - REALLY work - at the most fundamental and detailed level possible. Now that I know those things, the wonder still hasn't left me because, every time I think about it, I realize that what I'm interacting with is the sum total of centuries of development in fields as disparate as Mathematics, Metallurgy, and Physics. Trying to comprehend the vast web of knowledge that had to be first discovered, and then correlated, before even the concept of a "computer" could be imagined is breathtaking.
I genuinely hope you stay this curious in the future.
not to mention the vast arrays of infrastructure required to power it, let alone create it
very deep but i agree computers are a very unique thing
What's really mind-blowing is that Ada Lovelace foresaw all of this, including LLMs. She got that insight from something only slightly more advanced than a programmable loom.
I work with numerical simulations for an automobile manufacturer, and I wonder if in a couple years the same could be said to FEA... TBF I hope so
One of the greatest comments I've ever seen. Truly inspirational for those who like computing and its principles. When learning networking, I was also fascinated by the inner workings of optical-fiber cables and how they transfer information using photons.
Programs like this are the ultimate proof of "computers aren't slow, software is slow". The fact that my computer can calculate the 100 millionth Fibonacci number 10 times in the time it takes to load up a chat program...
that and networks are also slow
@@marmaladetoast2431 this is before the network, I just mean the ui
That speaks more to the chat program being slow.
Computers aren’t slow, IO is slow (hdd etc)
Take your pick: either the chat launches in a fraction of a second, or your account belongs only to you. It's not just about network speed, it's also about security.
0:29 to 0:36 you can clearly hear the CPU fans rumbling and then calming down after a few seconds
20 mb text file is crazy😂
im sorry to disappoint you but 20868800 bytes is 20 *gigabytes*
@@aciddev_ uff
@@aciddev_ What? You mean kilobytes? ls -l outputs in bytes.
@@aciddev_ nope, megabytes - it has 2 groups of zeroes (or other digits) past the first: 20-868-800. 2 groups is millions, three is billions. mega = million, giga = billion, tera = trillion... maybe you meant kilobytes? in that case we add a group and it becomes giga
@@Simple-EDU my math aint mathin
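For what it's worth, the correction upthread checks out: each SI prefix step is a factor of 1000, so 20,868,800 bytes is roughly 20.9 MB, not 20 GB. A tiny sketch (the `human_size` helper is made up for illustration):

```python
def human_size(n_bytes: int) -> str:
    """Format a byte count with SI prefixes (each step is a factor of 1000)."""
    units = ["B", "kB", "MB", "GB", "TB"]
    size = float(n_bytes)
    for unit in units:
        if size < 1000:
            return f"{size:.1f} {unit}"
        size /= 1000
    return f"{size:.1f} PB"

print(human_size(20_868_800))  # → 20.9 MB
```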
god, I love computers so much, they sure are fast! but the human ingenuity it takes to create these kinds of optimizations is also great
Computers are amazingly fast. They have been for decades. Everyday tasks can be so fast that you don't realize anything happened. Basic and old systems are even faster in some ways. I was surprised when I made a reaction time tester on an Arduino Uno some years ago. System latency was on the order of a microsecond, and it showed. The PC had several centiseconds of latency: whatever the mouse/keyboard does, a USB poll, scheduling and drawing an update, waiting for vblank, transmitting to the panel, some buffering there, LCD switch time. It's freaky to watch high-speed video of it. Also to realize how much faster you are than the supposed human average, although who knows how that was determined. New gaming-oriented systems should be better, but instant is on another level.
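A reaction-time tester like the one described is easy to sketch on a PC too. This is a hypothetical minimal version, not the commenter's Arduino code; `measure_reaction` and its callback parameter are made-up names:

```python
import random
import time

def measure_reaction(wait_for_press) -> float:
    """Seconds between the stimulus appearing and wait_for_press() returning."""
    time.sleep(random.uniform(0.5, 2.0))  # random delay so you can't anticipate
    print("GO!")
    start = time.perf_counter()
    wait_for_press()  # blocks until the user reacts
    return time.perf_counter() - start

# Interactive use: measure_reaction(input) times an Enter-key press, which
# includes every hop the comment lists (keyboard scan, USB poll, scheduling,
# terminal redraw) stacked on top of your own reflexes.
```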
The Transistor is top 1 human invention
CPU: Its getting hot in here...
Fan: Are we going Chernobyl here?
1.3 seconds on an M2 Pro chip. And that's using just a single core out of 10
Computer science is incredible.
can you hear the computer burning? 0:49
I turned off the sound of the video but it didn't work
imagine holding yourself back from calculating the Fibonacci of 100 quadrillion
amazing
What is the terminal setup/font/CRT effect/colorscheme? I really wanna try it out
It would be interesting to see a prime numbers program.
can it calculate pi just as fast?
It's still amazing to me that Intel and AMD can fit all these numbers into their CPUs. And every year the number of numbers increases by doubling. Amazing stuff 10/10
what? they don't fit in a CPU word - words are only 64 bits. The algorithm uses the GMP library, which does bignum arithmetic by representing each number as an array of machine-word limbs on the heap.
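The point above is right that these numbers vastly overflow a 64-bit word. As a rough illustration (the video's actual code isn't shown here, and Python's built-in bignums stand in for GMP), this is the fast-doubling method, which reaches F(n) in O(log n) big-number multiplications using the identities F(2k) = F(k)·(2·F(k+1) − F(k)) and F(2k+1) = F(k)² + F(k+1)²:

```python
def fib(n: int) -> int:
    """Fast-doubling Fibonacci: O(log n) bignum multiplications."""
    def fib_pair(k: int) -> tuple[int, int]:
        """Return (F(k), F(k+1))."""
        if k == 0:
            return (0, 1)
        a, b = fib_pair(k // 2)       # a = F(m), b = F(m+1), m = k // 2
        c = a * (2 * b - a)           # F(2m)
        d = a * a + b * b             # F(2m+1)
        return (c, d) if k % 2 == 0 else (d, c + d)
    return fib_pair(n)[0]

print(fib(10))                        # → 55
print(len(str(fib(1_000_000))))       # digit count of the millionth Fibonacci number
```

The recursion only goes about log2(n) levels deep, so even n = 100,000,000 is a few dozen squarings of ever-larger integers; most of the wall time ends up in the final binary-to-decimal conversion and the disk write, just as the top comment observed.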
Is that a Linux distribution? (By the way, it's incredible how fast computers can be if you know what you're doing)
Yes, it's Fedora Linux (using cool-retro-term here for the phosphor/cathode type look)
@@softwave1662 Yeah, it looks really cool. One day maybe I'll try Linux, but for now it's too complex for me... I'm a developer, not an engineer
@@softwave1662 I would love to have a link or name for this phosphor/cathode type look
which Font and Terminal is this? I like the glowing effect it has
Make a prime numbers program
the million dollar program
What font do you use ?
I tend to use ibm_bios-2y and variants, from "The Ultimate Oldschool PC Font Pack" int10h.org/oldschool-pc-fonts/
How in the world..
what terminal are you using?
@rerereuj thx
what keyboard u use?
A keychron k6 with the switches swapped for Glorious Panda switches, and SA profile PBT keycaps