Thank you. This is the first time I've heard that new terminology. I will have to keep my eye on "neuromorphic" technology.
Happy you're back man
Thanks Robby!
Nice progress in the editing, interesting topic ~
Thanks!
Dude I missed you!
Thanks! Good to be back, much more to come!
facts
Why don't we create a supercomputer with thousands of these chips to mimic a whole human brain? We could call it the Heuristically Programmed Algorithmic Computer.
Spellbinding! Hey Neo, who would have thought you could produce something that creates awe from IT? New level reached, my friend, bravo!
Thanks so much!
Your videos are always great. They give me a bigger picture of what's happening in the world of technology right now and in the future. Keep up the good work 👍🏻
Thanks!
Great content as always. Keep safe during this period.
Thanks so much Kenneth. You keep safe as well.
You deserve more likes and views than anyone else's video on this topic. Respect, keep it up.
Solving neuromorphic architecture is one key to the future
BrainChip Inc. already has the most powerful self-learning neuromorphic chip out, called Akida; it will be available to order via Socionext soon.
Since the memory unit isn't separate, this type of technology could be prone to "forgetting" data just like a biological brain.
I'm thinking more of a hybrid with long-term CMOS-based storage that has slower recall speeds. (Kinda analogous to cache and RAM, respectively.)
Finally got a chance to look at the house and see how it goes and if you want to be a part of the address is it possible to get a better idea of what you want to do that 😉😉😉
The chip could be designed in a way to enable capturing and restoring 'snapshots' of the individual neuron states.
It could also be hooked up to traditional storage media and "taught" to use them like you would record data on a piece of paper.
Well, yes and no... in a complete system it can of course simply store weights and biases off chip and access them later... a kind of neuromorphic cache stored on a standard digital retrieval system. And storing a machine learning model is just that; it really is just text in the form of weights and biases. But internally? Well, yes, the processor will immediately "forget" its previous state on every single iteration... it's forgetting things millions of times a second, far faster than you can... lol. But ultimately the "memory" is a system that results from the current state of voltages between axons on the chip... which do change over time, yes... so yes, it can forget things, that's a thing.
Unlike your brain, however, which also has issues with recall... well, you use pen and paper or a smartphone to remind yourself... the machine can, as I said, store such things, and is even pretty much psychic, since it can also rely on the machines around it to help it out. You can't access someone else's head to get a clue about something you never learned... these machines, of course, can.
@@covoeus It is actually referred to in machine learning as a "save state"... well done.
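To make that "save state" idea concrete, here's a minimal Python sketch of a toy spiking network whose volatile state (membrane potentials and weights) gets dumped to conventional storage; the array names and file format are purely illustrative, not anything TrueNorth actually exposes:

```python
import numpy as np

# Toy on-chip state: membrane potentials and synaptic weights
# for a small spiking network (illustrative sizes only).
n_neurons = 1024
membrane_potential = np.random.rand(n_neurons).astype(np.float32)
weights = np.random.randn(n_neurons, n_neurons).astype(np.float32)

def save_snapshot(path):
    # Dump the volatile neuron state to conventional storage,
    # like jotting notes on paper so the chip can "remember" later.
    np.savez_compressed(path,
                        membrane_potential=membrane_potential,
                        weights=weights)

def restore_snapshot(path):
    # Reload the saved state back into the (simulated) chip.
    data = np.load(path)
    return data["membrane_potential"], data["weights"]

save_snapshot("neuron_state.npz")
membrane_potential, weights = restore_snapshot("neuron_state.npz")
```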
Gaming GPU?
Wow!! Terrific video of videos. The bad call you Dad! And the graphics... can you point me to a path for creating them?
Thanks so much for the comment!
My visual style is heavily influenced by Nerdwriter1, kaptainkristian and the vox playlist: earworm.
The best advice I can give is to just start creating content and you will build your skills over time!
Great advice and help. Thank you. I did a catalog for 20 years and built web with MS FrontPage. Now learning WP. Practice site: GeeWhiz.store, a fun testing site. Old site: RMConnection.com
Again great video!
liked the video, but the background music was a bit intense
There is a company called BrainChip that recently signed a contract with NASA to send a neuromorphic chip into space. The chip is called Akida, and its main function is to be embedded in sensors on rovers and the like, to make those sensors inherently smart rather than relying on a central computer.
Amazing!
fascinating
Awesome soundtrack dude 👍
Thanks!
"My CPU is a NEURAL NET processor, a learning computer..." When the art/movie mimicks life.
Great video! Happy to discover a new topic I haven’t heard of yet, especially already in product stage.
Just realized I haven’t watched your past 2 videos, will do this now.
Thanks!
What is the speed of a neuron??
"Biological neurons can fire around 200 times a second on average."
This is awesome!! So, is this "as simple" as replacing a GPU with an MSC160 board, putting aside the need to rewrite the CUDA kernels into whatever language, or waiting for Google to update TensorFlow?
How does data get into the TrueNorth chips? Can the system be set up to directly stream raw video data from a video camera into the MSC160, with some HW interrupt then triggering "the kernel" to run?
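One caveat: TrueNorth-style chips don't consume raw frames or run CUDA kernels; they consume spike events, so video has to be encoded into spikes first, either by an event camera or in software. Here's a minimal, purely illustrative rate-coding sketch (the function name is made up and not part of any IBM API):

```python
import numpy as np

def frame_to_spikes(frame, timesteps=100, rng=np.random.default_rng()):
    """Rate-code a grayscale frame: brighter pixels spike more often.

    frame: 2D array of pixel intensities in [0, 255].
    Returns a (timesteps, H, W) boolean array of spike events that a
    spiking/neuromorphic backend could consume one timestep at a time.
    """
    prob = frame.astype(np.float32) / 255.0          # spike probability per step
    return rng.random((timesteps, *frame.shape)) < prob

# Example: a fake 64x64 camera frame streamed as 100 timesteps of spikes.
frame = np.random.randint(0, 256, size=(64, 64))
spike_train = frame_to_spikes(frame)
print(spike_train.shape, spike_train.mean())  # spike density tracks brightness
```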
Sources would make it even better
Actually, not really. Folding@Home reached above 2 exaflops a few days after this video was posted
What's the FLOPS of a TrueNorth chip, anyone know?
The FLOPS calculation method is different for every kind of chip; also, real-world and theoretical values aren't the same.
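Worth noting that TrueNorth isn't really rated in FLOPS at all; IBM quotes it in synaptic operations per second (SOPS). A rough sketch using the commonly cited figures (~70 mW typical power, ~46 billion SOPS per watt), which aren't directly comparable to GPU TFLOPS:

```python
# Rough TrueNorth throughput from IBM's published figures (approximate).
power_w = 0.070               # ~70 mW typical power draw
sops_per_watt = 46e9          # ~46 billion synaptic ops per second per watt

total_sops = power_w * sops_per_watt
print(f"~{total_sops:.2e} synaptic operations per second")   # ~3.2e9 SOPS
# A synaptic op is far simpler than a 32-bit floating-point op,
# so this number can't be compared one-to-one with GPU FLOPS.
```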
IBM has now developed a chemical/magnetic-based, memristor-type chip for AI applications. It is not transistor-based.
Music Credit Please
And here Skynet is born.
I love C+
Can’t wait to see how Tesla’s chip holds up against this beast
The Tesla V100 SXM2 can do 15.7 FP32 TFLOPS.
Papr3ka Wow, that's 5.4 more than the PS5.
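Quick sanity check on that difference, taking the commonly quoted 10.28 FP32 TFLOPS figure for the PS5 GPU:

```python
# Check the comparison above (PS5 GPU is quoted at 10.28 FP32 TFLOPS).
v100_tflops = 15.7
ps5_tflops = 10.28
print(v100_tflops - ps5_tflops)   # ~5.4 TFLOPS difference
```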
Terminator is coming.
Can't we just use a C++ Converter of some kind?
So God should've just kept us simple, or whoa, maybe it was when we ate the fruit.
Sure makes Apple computers look like little kids toys
YAWN! This has been done TO DEATH already. It's just asynchronous dataflow architecture. Check out GreenArrays, etc. It's also a basic structure of the GOOD FPGAs and CPLDs. IBM has been known to do this -- make a project to just hype and troll to the extreme for marketing purposes. Anybody seen Watson these days?
It's pronounced "Von Noiman".
Pokhi chip is the latest.
Oh man, that gave me a nerdgasm. 😝
Lol
Yep, Skynet is on the Horizon, Enjoy life while you can... LOL...
I've always wanted to get into A.I./robotics as of late. I'm 27 now but don't wanna work in food manufacturing because it's a dead-end job; in the near future, A.I. could replace my job by 2030, and that's not in doubt. I think the McKinsey report said that 800 million jobs would be lost at the start of that decade due to advancements in automation/robotics/A.I., but that by 2022, 100 million new jobs will emerge from A.I./data science. I've started to learn C programming for robotics and Python for A.I. this month.
@@bassplayer807 I started working with PCs back in 1986, and shortly thereafter A.I. was a thing; I believe it was a thing before that. ELIZA was the A.I. of the time for the PC, while Apple introduced the Lisa. A.I. and sci-fi go way back, but given the newer designs of CPUs and GPUs, A.I. is more practical these days than decades ago.
C seems to have a serious future given its overall age, so you are wise to pick it up now; Python also has a solid future. Unfortunately for me, BASIC, Fortran and a few others were quickly phased out while C just kept chugging along, and I never got into C as deeply as I should have. Automation and/or A.I. will be the new slave labor of the future and will cut costs dramatically and also require less manual labor, but there will always be some form of human labor to maintain the robots until they become fully aware, and then all bets are off... LOL...
Too bad corporations turn excess profit into GREEDY PROFITS rather than reducing the product price. Greed is the worst drug the Earth has, and those controlled by it could not spend their wealth if they lived 5 lifetimes, yet GREED causes them to obtain MORE... Take Big Oil, for example: did you know their profit margin is 85 trillion dollars per year? Big Oil could pay off the US debt and buy every American a 50,000 USD vehicle and still profit over 20 trillion dollars each year.
Imagine what it would be like to get a new 50k vehicle for FREE every year; how nice would life be... And by every American I mean every man, woman, and child. NOPE, Big Oil HOARDS their profits rather than giving back to those that make them rich, but I digress... Cheers...
Congratulations! You are incredibly basic. You said the obvious comment, the comment which plagues all of the comment sections of almost every video about Artificial Intelligence.
Exactly why Neuralink exists. A direct hedge against Skynet.
HOLY SHIT!~
Too bad our memory is RAM, and as soon as power is lost, so is the data.....
#milesbennetdyson
OK humans, do this: 0.93625*0.123553. That is 1 FLOP; a supercomputer can do it 200,000,000,000,000,000 times per second. Yes, brains are smart, lol.
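To put that 200,000,000,000,000,000 (2x10^17, i.e. 200 petaFLOPS) in perspective, here's the arithmetic for how long a human doing one multiplication per second would need to match a single second of that machine:

```python
# How long would a human take to match one second of a 200-petaFLOPS machine?
ops = 2e17                       # multiplications done by the machine in one second
human_rate = 1.0                 # optimistic: one multiplication per second
seconds_per_year = 365.25 * 24 * 3600

years = ops / human_rate / seconds_per_year
print(f"~{years:.1e} years")     # roughly 6.3 billion years
```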
4 thumbs down? LUDDITES!
The reason why human brains don't use a lot of energy is probably because we gave something up while evolution was going on.
More likely because if the brain used any more energy, it would be extremely difficult to gather enough fuel to survive on a day-to-day basis.
runneypo It probably learned to use it efficiently, but damn, how we became self-aware is crazy to me.