“Don't sacrifice your salary for your ‘incredibly valuable’ stock ’cause, you know, that may not work”
= most practically useful quote of the lecture lol
@32:16. About ARM. Chromebooks and M1 MacBooks are ARM64. Amazon AWS has ARM64-based EC2 servers. ARM64 is gradually replacing x86_64.
I think the same way: ARM is growing very fast, and RISC-V too.
@@bnsonline5099 ARM is IP. As soon as RISC-V "matures" companies will jump on it and ARM might be in trouble. Not going to happen overnight but it's certainly closer than it was a few years ago.
@@bnsonline5099 RISC-V has one disadvantage. Because its licensing allows developers/manufacturers to add hardware extensions, there will be incompatibilities that x86 systems did not have. ARM doesn't have that issue. Also, at the moment RISC-V is more power-hungry for less performance than current ARM chips (roughly 2-4 years behind).
Specifically, I think it's called ARMv8-A (AArch64).
@@madmotorcyclist Well, if you're a developer, you could interrogate the RISC-V hart for any built-in extensions when your program starts. You could then fail fast with a reasonable error message, or fall back to emulation of the non-standard instructions at a performance cost.
Anybody developing a system should know at least something about the fields above and below (or before and after, depending on how you visualise processing stacks) their area. A software person needs to know something about the problem domain and about hardware. The hardware types need to understand software and materials science, and so on.
Agree 100%
I'm pretty surprised that this has only 800 likes. This is good information.
It's good, but it's very specialised. It's like asking how many copies of "Computer Architecture: A Quantitative Approach" have been bought (I have Edition 2, which I bought in 1996). It's never been a huge best-seller, because it's so specialised, and most computing types (especially, ironically, hardware people) don't tend to care about computer architecture. Nevertheless, it's one of the most influential computing books of all time, because the right people have copies and they've put the ideas into practice for the benefit of all of us. Anyway, 1.3K likes so far!
@@julianskidmore293 I sometimes think that the 2nd edition nailed it and that it is the best of all editions. Newer editions have newer examples, but sadly some things were moved to the appendices.
He looks like walter white of computers
I am not in the meth empire, I am in the silicon empire.
@@cognominal *Stares intently into camera* My Instruction sets are pure.
Dear Electronic engineer, please subscribe to my channel.
Fred Brooks Jr gave a talk at SIGGRAPH once.
He apologized for creating a notation that had syntactic significance for spaces...
Thanks a lot! Amazing lecture. Explains a lot about Apple's M1 and the trends in future server CPUs today.
59:10 is a typical "I bet you say that to all the girls" moment :)
amazing!! thank you
hello..can you help me to translate code from C language to RISC-V?
I'm not convinced that Neural Networks are THE answer to AI. Adding massive computer power does not mean anything if its not the answer.
Me too. Neural networks are not as data efficient as the human brain is. We need some radically different architecture for AI.
@@arunavaghatak6281 Radical? How about zeros and ones, and Xs?
@@arunavaghatak6281 But the brain is made of neural networks right? I don't really understand this criticism. People skeptical of neural networks, do they have a narrow definition of neural networks they don't like, one that excludes the biological brain?
Or are they critiquing neural nets writ large, all variants, including 'graph neural networks' and other artificial models, as well as the human brain itself?
@bitflogger why not? They may not be YET. What’s the difference between artificial intelligence and intelligence? Neural networks are exactly what gave birth to intelligence in the first place...
@@youtubiuttoni Neural networks are an abstraction of the real thing. A good enough abstraction? I assume it will get better with time, but will it ever become good enough? I've read that NNs go back 30 years; not yet? 30 years from now, not yet? Seems like fusion tech.
timing attacks 20:10
Yeppp. Very Amazing.
Coolest ACM A. M. Turing Award laureate ever
very good !!
In the talk he said Berkeley students made the chip? Can anyone tell me how they did it, in rudimentary steps? I need this for research, because it would be cool if I could do this.
It’s been a little more than a year since this lecture and so much has changed already... CISC is practically doomed and RISC V keeps getting bigger and bigger...
i just read his book on hardware software interface and get to be familiar with RISC-V, and came here to view some videos on it and finally come across the inventor’s lecture on computer architecture, and find that RISC is in such a development process right now...
RISC-V is still a couple of years behind AArch64 (when it comes to microarchitecture) and it's like triple the price. That said, RISC-V is valuable beyond imagination and it's on the right track!
"The longer the icon of RISC-V is on earth, the stronger it will become"
"RISC-V's power double every year"
@@cat-.- Thing is, it's much like ARM was in the first place: small and efficient, powerful enough but no more, and the power came with time and need. Now just look at what Apple is doing with the M3: it's seriously competing with x86 and is much more efficient. RISC-V will take time to get high-performance cores developed, and the desire to do so, which isn't hard considering it's free to use.
@@mikafoxx2717 I hope! But I don't see how, without major commercial adoption, RISC-V could compete with ARM and x86 in terms of uarch development. Vendors are very secretive and protective of their uarch. Oh, and besides, it's becoming increasingly clear that we need an open ISA for GPU compute as well!
Please do at least 1080p. We're about to end the second decade of the 21st century.
At least they got a decent mic on him. (those cuts to the audience mic aren't great tho)
Yeah plus 720p would have been fine if slides were bigger
But they aren't
Walter White shaved and got into computer science
WW is already bald
Great video! one thing though, I believe Mr Patterson actually drank out of someone else's water bottle!
good
CISC was always intended for humans to use; RISC makes more sense when machine code is being generated by a computer.
You mean like compilers :)
@@brokula1312 yes!
hello..can you help me to translate code from C language to RISC-V?
@@roaaalsadeq3530 no, i'm sure there's a compiler for it for free somewhere
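There is indeed a free compiler: both GCC and Clang target RISC-V. As a sketch of what the compiler does with plain C (the toolchain name below assumes a Debian-style riscv64-linux-gnu cross package; the assembly in the comment is roughly what -O2 would produce, written by hand for illustration, not captured output):

```c
/* A tiny C function and, in the comment, roughly what a RISC-V compiler
   emits for it. To try it yourself with a cross toolchain installed:
       riscv64-linux-gnu-gcc -O2 -S sum.c     (then read sum.s)      */

/* Sum the first n elements of a. A RISC-V compiler lowers this to
   something along the lines of:
       sum:                        # a0 = a, a1 = n
           li    a5, 0             # i = 0
           li    a4, 0             # total = 0
       loop:
           beq   a5, a1, done      # if i == n, exit loop
           slli  a3, a5, 2         # byte offset = i * 4
           add   a3, a0, a3
           lw    a3, 0(a3)         # load a[i]
           add   a4, a4, a3        # total += a[i]
           addi  a5, a5, 1         # i++
           j     loop
       done:
           mv    a0, a4            # return total
           ret
*/
int sum(const int *a, int n)
{
    int total = 0;
    for (int i = 0; i < n; i++)
        total += a[i];
    return total;
}
```

Which is exactly the point made above: nobody hand-writes this; the compiler does, and a small regular instruction set makes its job easier.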
I must be a computer, then: I hand-coded 6502 assembler with just an 8x8 table of opcodes in 1980.
Fight me.
Walter White
Hello Mr. Patterson, it's Aleisha Holt or Angel Olsen. I'm still here, no one is going to break my stride ... What does Google mean, the circle ...
Damian Marley life is a circle ⭕
Dear Electronic engineer, please subscribe to my channel.
GCC
So this guy voted for Biden. Ha! I'm smarter than him.
🤎💜
You're copying those big guys out there. The problem you got is those big guys aren't gonna tell you their Trade secrets. Did you tell of your Jujutsu.