I'm a computer engineering student and I'm truly impressed by how good these videos are! Congratulations!
Eh, these are really too shallow for any in-depth discussion, but a decent introduction. It's a good compromise if you want to talk about a bunch of topics but don't plan on really using any of them. That is, they're good information for the general public, but not enough for engineers.
It's almost as if we had some term for a quick overview of a subject laid out in segments. Like some sort of.....crash course.
You're totally right, these are all very introductory concepts, but well presented. You also have to assume they're intended for the curious general public and for high school students who'd like to start studying college material. I say this because I subscribed when I was in high school, and I know I'd have enjoyed this overview of what I've since come to know in depth.
+Ricardo Lages Yeah. She just explained, in a much simpler way, something my teacher took ages to explain and still left everybody confused...
lol, welcome to the hunt for good college teachers
I love how you explain the levels of abstraction. Each time something seems overcomplicated, I just remember that at the core of it all it's just a bunch of transistors wired up in clever ways, and at the core of those transistors are on-and-off electric signals. It helps me relax and admire the beauty and craftsmanship of computers.
I'm a computer engineer, and I have to say that I am really impressed by the quality of this series. Nice way to summarize a 4-year degree program! And that "new level of abstraction" thing is all that computer engineering really is. You might as well call it abstraction engineering :p
Ex-software engineer here. A brilliant software architect I used to work with would often use the phrase "... and let's go to another level of abstraction." On the downside, levels of abstraction, although they give you huge amounts of power, reduce clarity, so they shouldn't be overused; but that's mostly a software issue.
"There is no problem in computer science that cannot be solved with another level of indirection, except the problem of too many levels of indirection" - Quote from a long forgotten source.
I'm a complete amateur, so this is pretty hard to imagine and conceptualise. Is it sort of like a combination of the movies Inception and The Matrix, so that the further away we are from the first level of dream (or abstraction) the more Matrix-like it becomes? Or am I totally off? Thanks.
Joshua Cook, yep. I'd forgotten that quote. It should be a rule of software, framed and hanging on a wall.
Scattered Moon Shards A lot of people get confused by this at first. It's just like in biology: you have a fundamental level that's easy to understand, but not a useful way to view most problems. As you look at bigger systems (more abstract ones), you forget about the details and conceptualize them differently.
Back to the physics analogy, physics is fundamental, chemistry is a level of abstraction, biology adds another level, psychology adds another level, and sociology adds another level.
Generally, more abstract concepts are closer to how you normally think about problems, but they don't specify as many details.
I took a computer architecture course at university and we built an 8-bit ALU that could add, subtract, AND, OR, XOR, and do a few other logic functions I can't remember for sure (maybe multiply and divide). We designed a PCB and sent off our design to be manufactured. Then we soldered the chips onto the board ourselves. It was a really fun project.
Which university is that? Ours just did everything on paper, and the instructor was always in an angry, gloomy mood, reluctant to answer questions while suppressing the class's will to ask them. We never thoroughly understood how it worked.
I'm a 34-year-old software developer who's become burnt out.
This series is reminding me that this stuff is fun and exciting and not just "oh my god why is this library broken?"
As Einstein once said: "Intellectual growth should commence at birth and cease only at death." Life is like riding a bicycle: to keep your balance, you must keep moving.
I'm 38 and I got burned out so bad when I was 35 that I've been out of work for 3 years while recovering. All I can say is do whatever you need to do to take care of yourself and I hope you can find the spark again. :)
@@uemusicman were you a software engineer as well?
.dll files are a helluva drug.
Sorry, but what are the inputs hooked to? Why are they turning on or off in the first place? Is a 1 from A different from a 1 from B? What does the result do for you, whether it's 1 or 0? There are two ways to get a 1; does that make a difference in the bit result?
In binary when you add 1+1, you get 0 and Carrie Anne.
*Groan and facepalm*
Miguel d'Oliveira smooth
Was going to make a similar pun, got beaten.
@@IceMetalPunk must be the life of every party (if he ever gets invited to one).
10*
As a side note for those wondering how subtraction is done: it's also done through addition. Computers store negative integers in binary using a system called 2's complement, which is difficult to explain in a short text comment, but it means that adding X and the 2's complement representation of -Y gives exactly X - Y. So to subtract Y from X, you just negate Y (a very fast operation) and add the two together.
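A minimal C sketch of that idea, using 8-bit arithmetic so the wrap-around is easy to see (the specific numbers are just illustrative):

```c
#include <stdio.h>

int main(void) {
    unsigned char x = 12, y = 5;

    /* Two's complement negation: invert every bit, then add 1.
       In 8 bits, 5 becomes 251, which is the bit pattern for -5. */
    unsigned char neg_y = (unsigned char)(~y + 1);

    /* The exact same adder that computes x + y now computes x - y:
       12 + 251 = 263, which wraps around to 7 in 8 bits. */
    unsigned char diff = (unsigned char)(x + neg_y);

    printf("%d - %d = %d\n", x, y, diff);  /* prints: 12 - 5 = 7 */
    return 0;
}
```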
Interesting note: some early computers used other solutions. Try looking up 1's complement, the way to go if you prefer a negative 0.
Well, computers typically didn't want a negative 0, as it was generally seen as wasteful. Two's complement was created to make more efficient use of the available bit patterns. One's complement was used for its simplicity, back when computers were getting naturally faster all the time anyway.
Oh yeah, there are many reasons 2's complement is better. It's just a fun footnote in comp sci history, like sexadecimal (hexadecimal's original name), or using bytes to represent two-digit decimal numbers for human readability (binary-coded decimal).
How does it do divisions?
You can do it with a multiplication where the "decimal point" is shifted over. The multiplier circuit just sees two integers; you'd then need additional circuitry to account for where the point goes.
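A toy C sketch of that "multiply by a shifted reciprocal" idea, assuming division by the constant 5 and the illustrative fixed-point constant 6554 ≈ 2^15 / 5 (real hardware picks its constants and widths more carefully):

```c
#include <stdio.h>

int main(void) {
    /* Division by 5 without a divider circuit: multiply by a fixed-point
       approximation of 1/5 (6554 / 2^15 ~ 0.20001), then shift the
       "decimal point" back out. Exact over this small range. */
    for (unsigned x = 0; x <= 100; x += 25) {
        unsigned q = (x * 6554u) >> 15;
        printf("%3u / 5 = %u\n", x, q);  /* 0, 5, 10, 15, 20 */
    }
    return 0;
}
```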
I've just started watching this, Crash Course mythology, sociology and physics. Much to my surprise, I've enjoyed this one the most!
Try watching the history. You'll enjoy it
Crash Course World History is amazing. Many other good ones though. Astronomy is excellent as well. Really they are all worth watching. I find this one to be very good.
Maybe you have a future in computer science.
Being a professor of computer science, I have to say that I really think you are doing a great job! Keep it up!!!
It's somewhat worrisome just how happy this series makes me. Carrie Anne does a great job explaining the basics (and the abstractions that follow), the writers have put in just the right amount of detail and the animators are, as always, exemplary.
I spent a decade writing low level code for microcontrollers, bit slice processors and very early GPUs. I lived at the ALU level, passing data from register to register while checking flags. These days I program for the web and it's so much more abstract than physically pushing bits from A to B; I kind of miss the simplicity ;-)
This is such a great series, definitely one of the best Crash Courses. You guys are nailing it.
Mostly Focused Brings back memories of my classes at uni. Except that Crash Course actually makes it easy 😒
I'm studying computer engineering (sophomore), and this series turned out to be more valuable than my entire first semester! Thanks to everyone involved with the production. Cheers!! I'm going to watch every playlist on this channel during quarantine now.
Me: It's simple as 1+1!
ALU: Is it really?
*inhales deeply* good lord I am going to need all of my brain power to understand this.
Or do meth
In any scientific endeavor--especially computer science--it's a positive sign if you're feeling overwhelmed. That means you're trying. The only way to become an expert is to allow yourself to be constantly overwhelmed and to continue persevering. If you invest enough time and effort, all of a sudden you'll realize that things start clicking and making sense. No matter how difficult a concept seems, or how slowly you feel like you're learning, just don't give up, keep coming back, and one day you'll have mastered what you once thought was too difficult for you to learn. You can do it.
@@vjzb3 Dude, I needed to hear this. Thank you. Computer Science is so hard.
Justin Touchdown Yes it is. But you can certainly do it, my friend. Just set small goals and tackle one thing at a time. Also, try to start building things as early as you’re comfortable, because it can make the learning process a lot more enjoyable. Lastly, if you ever have questions, feel free to drop ‘em right here. I’d be glad to help out with anything if I’m able
@@parkersmith3886 and then do a little meth*
I'm not a computer engineering student, and I have to say, I have no idea what's going on.
me too
legit bro
same here man
I wanna learn too, but I can't understand anything at all...
Yeah, it would be nice if she slowed down just a little bit. Too much caffeine; it's kind of manic.
The more I watch CrashCourse, the more I want to complain about why this kind of video wasn't around when I was in high school. It could have helped me a lot whenever I needed to look up something I'd forgotten but couldn't understand what my textbook was trying to tell me.
I agree with you :-)
What kind of high school had you learning computer science?
For the longest time I've believed that electrical engineering coursework isn't intrinsically hard, but is made unnecessarily difficult by confusing, boring lectures from professors who've been given no incentive to rethink how best to teach the content. This video series is complete validation of that. The clarity with which CrashCourse/PBS Digital presented this is awe-inspiring. Thank you so much!
I can't believe crash course is now doing a computer science course. I'm kind of really proud that they've come so far
It's like I'm looking at my computer's soul.
You know about the theory of the ghost in the machine?
CrashCourse is amazing. I didn't think I'd ever get the privilege of understanding how a computer works or how a muscle works on the cellular level. Keep up the excellent work!
I study computer science, so I know all this stuff, but I still very much enjoy how you explain it all in such an elegant, easy way! It's important: so many people think of computers as a sort of magic, which they aren't. And since computers shape our lives, democratic societies need more people who are able to discuss computer-related issues. Thanks.
I'm a computer science graduate of a well-known university's four-year program. I learned a whole bunch of details and complexities, and I know how to explain and apply them, but I was never taught to see them as a whole. You could call me an expert compared with beginners, yet this series helped me understand how the whole complex thing is assembled from scratch. Thank you, Carrie.
I watched this series when it first came out, before I started uni. I loved it. It was so exciting to learn about these magical things that go on inside computers. Now I've just started my second year of computer science engineering, partly thanks to this course. It's so wonderful to go back and rewatch the episodes with a whole new understanding of the topics. This is truly a wonderfully produced series!
I make my living as a programmer and I've never heard such a good explanation of ALUs. Great job!
You better add a counter to the abstraction level animation
levelOfAbstraction++
You could also do
`++levelOfAbstraction`
Since ++ before or after a variable (++var vs var++) is essentially the same operation, although on older compilers ++var would produce fewer instructions when it wasn't part of an assignment or anything like that. A little bit of random micro-optimisation trivia: nowadays, if the result isn't being assigned to another variable, in the style of `int foo = ++var;` or `int bar = var++;`, both forms are optimised into the same instructions.
Yeah, I always use ++var when I can, just because I never trust the compiler to optimize that for me :P
(For the non-programmers reading this and wondering: ++x adds 1 to x and then returns the result. x++ returns the current value of x, but then adds 1 to it. You can't have instructions continue after returning a value, so in order to do x++, you first have to store the current value of x in a temporary variable, then add 1 to it, then return the value you stored; that's extra instructions compared to just the "add and return" of ++x.)
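A tiny C demonstration of the difference just described (the variable names are purely illustrative):

```c
#include <stdio.h>

int main(void) {
    int x = 5;
    int a = ++x;  /* pre-increment: x becomes 6 first, then a gets 6 */

    int y = 5;
    int b = y++;  /* post-increment: b gets the old 5, then y becomes 6 */

    printf("a=%d x=%d\n", a, x);  /* a=6 x=6 */
    printf("b=%d y=%d\n", b, y);  /* b=5 y=6 */
    return 0;
}
```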
Yeah, in this case it doesn't matter. It does matter, though, when post-increment and pre-increment make a semantic difference, as you mentioned. I usually just use post-increment/decrement by habit.
Javier Figueroa the number at the top of the elevator
In all the years I've been alive, nobody has put across the basics of what a computer is composed of better than this series. I just wish I'd had the chance to watch it as a kid, because it would have helped me grasp the concept of computing a lot more easily. Thank you.
This video on ALUs gives me such a warm feeling. Half adders, full adders, operands.
English is not my first language, so I very reluctantly set the speed to 0.75... It takes more time, but I'm less frustrated and understand each concept much better. Amazing course.
Really high quality education out there... Don't have enough words to congratulate this team appropriately...
Impressive. I'm a computer science student, and you make the subject so much more interesting than a professor does. Well done. I can't get enough of these videos.
First off, this is great. I'm coming back after two years, because intensive work started right when this first came out and I couldn't wait for the episodes one at a time.
I have to say, this is where you first drop the ball. After taking us step by step, in precise detail, through how each individual, fundamental piece is constructed, you suddenly zoom past the full adder without taking the time to outline how it works in detail or show examples. Also, I think it even helps to overstate the significance of XOR gates essentially being single-bit binary adders; to me, that acknowledges where the actual adding is happening in the circuit.
You move up the level of abstraction too quickly here. And I know you come back to it later, but it's really not initially apparent where the first C input of the full adder comes from.
v=VPw9vPN-3ac maybe this will help people if they want to go through it again slowly before moving past the full adder and actually understanding it.
Remember A**ange \m/
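For anyone who wants to walk through the full adder slowly before moving on, here's a minimal C sketch of the gate wiring described in the episode (the function is hypothetical, just for illustration):

```c
#include <stdio.h>

/* One-bit full adder, wired like the episode's diagram:
   sum  = A XOR B XOR Cin
   Cout = (A AND B) OR (Cin AND (A XOR B)) */
void full_adder(int a, int b, int cin, int *sum, int *cout) {
    int axb = a ^ b;               /* the XOR doing the actual "adding" */
    *sum  = axb ^ cin;
    *cout = (a & b) | (cin & axb);
}

int main(void) {
    /* Print the whole truth table. The Cin of each bit position is
       simply the Cout of the previous (less significant) position. */
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            for (int cin = 0; cin <= 1; cin++) {
                int s, co;
                full_adder(a, b, cin, &s, &co);
                printf("A=%d B=%d Cin=%d -> Sum=%d Cout=%d\n",
                       a, b, cin, s, co);
            }
    return 0;
}
```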
This is good not just because of the information she's giving;
it's more that she's capable of explaining it all.
I can't even explain why I'm single.
I love this series and I love Carrie Anne as a host! She's so good at explaining really complex stuff! That, and I love that she's so enthusiastic and even more of a nerd than I am xD
I am an architect who is always looking to improve myself. Computer science isn't my expertise, but I find it very intriguing. I truly appreciate the Crash Course video series online. The videos are done professionally and well presented. I want to thank you so much for sharing your profound knowledge.
I am speechless at how amazing this series is.
I'm really impressed at the quality of this series so far (this coming from a software engineer). The explanations so far are spot on. The more industrious students can Google the remaining content. For those that do... you need to have a good understanding of the material covered in the previous four episodes to be able to implement the remaining operations.
Next episode: flip-flops/latches!
This is the best series on crash course yet. I'm a middle school student but stuff is explained well and it's really enjoyable thx
I'm not anywhere near the field of computer science, but a lot of people in my life are. I've been trying to learn the basics so that I can understand what they're talking about when I ask them "so how are your classes?" or "what are you working on at the office?" Of all the things I've tried so far, I've found this series to be the most helpful (even if I do have to rewatch a few parts to fully comprehend them). Thanks so much for the great resource!
I think it is surprisingly more than a crash course! Just amazing! Thank you guys for all you are doing here!
I watched this series two years before I started studying Computer Engineering, and it helped me a lot! My teachers are lame and boring. I pity my classmates; they always get confused by the explanations in class, while for me it's a piece of cake because I watched this series. And I'm still watching: this is my third time!
While watching this, I kept thinking about quantum computing, where qubits can be in superpositions of states rather than just on/off. It's already hard to follow all the logic in our binary system; it blows my mind how complex a quantum system must be.
Once quantum computing is mastered and made easy to mass-produce, we will have a lot of power in our hands. Unfortunately, that also means hackers will have that power, and a lot of modern encryption methods are easily broken by quantum computing. I see advancements in quantum computing putting a lot more emphasis on security and encryption for future computer scientists.
I am an Indian and I live in West Bengal and I am building a computer, your video is helping me a lot, so I thank you from my heart.
+CrashCourse Texas Instruments made this ALU. It was part of their 7400 series TTL.
My professor added this playlist to his recommended references for our homework. This series is great. Thank you!
I subbed to Crash Course mainly because of this computer science course. I LOVE YOUR VIDEOS
SCI SHOW FAN 2.0 They're awesome! Great refresher!
Also she is hot
Same
We need this type of lecturers in our college...
Logic circuits make me bizarrely excited...
Also! You can't trivially tell if A > B just by testing (B - A) < 0! Joshua Bloch wrote a good article about how this trick only works if you know the subtraction won't overflow, but on true arbitrary numbers it will fail half the time. (Maybe ALUs have different ways around this, but programming languages do not...) Spread the knowledge far and wide :D
Great episode!
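For the curious, here's a small C demonstration of that failure mode, assuming the usual 32-bit two's complement int. (The wrap is forced through unsigned arithmetic here, since signed overflow is undefined behaviour in C; Java's int, which Bloch was writing about, wraps like this natively.)

```c
#include <stdio.h>
#include <limits.h>

int main(void) {
    int a = INT_MIN;  /* clearly LESS than b... */
    int b = 1;

    /* The "trick" claims a > b iff b - a < 0. With wrapping arithmetic
       (what the adder hardware actually does), 1 - INT_MIN overflows
       and wraps around to a negative value: */
    int diff = (int)((unsigned)b - (unsigned)a);

    printf("b - a wrapped to %d\n", diff);                 /* -2147483647 */
    printf("trick says a > b: %s\n", diff < 0 ? "yes (wrong!)" : "no");
    printf("real answer, a > b: %s\n", a > b ? "yes" : "no");
    return 0;
}
```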
You are truly the only one who explains all the layers in the computer, and I thank you for that. I'm finally starting to understand everything as a whole instead of each layer on its own.
"Sees video title" WHAT!?!? They went from the binary system to a full blown ALU?
A basic ALU is actually pretty simple; the complexity goes up the more operations you need to implement in hardware (a hardware multiplier is incredibly complex, like a few hundred logic gates for a 4-bit product).
Spencer White well they did discuss transistors and logic gates already so :)
It's incredibly clever too. I was mystified for days when I found a paper describing how to implement it for real. When I finally got how it worked, I felt as if my IQ had increased, though I don't believe in IQ.
Still, the video is misleading in saying that we've built an ALU at that point, when we've only built an adder. I kind of can't believe she promised to move on from the ALU after this video without ever explaining how the ALU actually works. I mean, it's definitely not obvious to a newbie; it's even less obvious than binary addition, which _was_ discussed in the series.
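On the hardware-multiply point above: here's a minimal C sketch of shift-and-add multiplication, the scheme a simple hardware multiplier unrolls into a cascade of adders (the function name is just for illustration):

```c
#include <stdio.h>

/* Shift-and-add multiplication: for each set bit of b, add a
   correspondingly shifted copy of a. It's long multiplication
   in base 2, one adder stage per bit of b. */
unsigned multiply(unsigned a, unsigned b) {
    unsigned result = 0;
    while (b != 0) {
        if (b & 1)
            result += a;  /* this bit of b contributes a shifted copy of a */
        a <<= 1;
        b >>= 1;
    }
    return result;
}

int main(void) {
    printf("13 * 11 = %u\n", multiply(13, 11));  /* 143 */
    return 0;
}
```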
I love how enthusiastically Carrie Anne speaks to us, smiling from ear to ear!
I hope the budget cut to PBS doesn't hit this channel hard...
The budget is only the Executive Branch's wish list. Congress can, and almost always does, set it aside in favour of its own budget, which the Executive is required by law to implement.
Sordatos Probably not. Even if the budget is passed and Crash Course, as a consequence, gets completely cut off from PBS funding, there would still be the money from Patreon support (and several other sponsors). So if you're worried, just sponsor them on Patreon!
"there would still be the money from the patreon support (and several other sponsors)"...as it SHOULD be....
Sordatos I'm commenting on and liking all the videos on Digital Studios now. I don't ever want to think I took this for granted while it lasted.
@Asitya Mathur What??? PBS gets its funding from the federal government and private donations. President Trump's recently proposed federal budget completely cuts PBS and many other small programs off from federal funds. While not final, if Congress approves the change, then yes, PBS will lose a significant source of revenue. Congress can't directly tell PBS to stop funding Crash Course, but it can remove the funds PBS needs to do so. Without federal funds there's basically no way PBS could continue paying for all the shows on TV and YouTube that it does; it would need to cut some of them. And that's the best-case scenario; it's possible that without federal funds PBS couldn't sustain anything and would completely shut down.
I am a computer engineering student, and I have to say this playlist is awesome and helpful. Greetings from Turkey :)
This is great, it's pretty much like a class but easier for people new to computer science to understand and far more interactive/interesting. haha, I'm a grade 12 student and I've been doing web programming/design, and I love to learn things about computers so this is perfect! Thanks
This is the best course yet
This show really helps you appreciate the machine you are sitting in front of. Wow o.O
That moment when Carrie Anne was legitimately excited when "we" constructed our ALU was way too cute.
It was a tough episode; I had to rewatch it twice at a slower pace to fully understand, but it was awesome nonetheless.
Thank you for your hard work
finally! new episode :)
CSRocks Hey, just letting you know. I checked out your vids and subscribed!
Thank you! I checked your channel as well, very nice content :) +1 sub ;)
The best-quality computer science series.
THAT’S WHY YOU CAN’T GO OVER SPEED 255 IN MINECRAFT
I was doing independent research for the sake of knowledge and this really helped. Blew my mind honestly.
Can't wait till you get to how algorithms work!
+Tommy Rosendahl She's going to explain it better later but...
Algorithms are just thought processes; they aren't executed directly by computers. They're defined through programs written in a programming language.
Program code is translated into assembly code. Assembly code is translated into machine code. Machine code is just a sequence of 0s and 1s that defines your very own program, and everything boils down to very simple commands (addition, subtraction, comparison, etc.), all executed by the ALU she just explained.
There are way more pieces than that, obviously (like the memory she's going to explain later), but you can see the bigger picture...
"They are defined through programs written in a programming language" Not necessarily. With an FPGA (Field Programmable Gate Array) one can build custom hardware architectures for the implementation of an specific algorithm using logic gates and some good old sequential logic. Now, if one is using the FPGA as a hardware accelerator, it is possible to use a program in a programming language to feed and receive data into the FPGA, however the whole point of it is that this program will only give it the ones and zeros of the data and some configurations, and the actual algorithm will be executed by the custom architecture.
Yeah, but FPGAs are generally written with a sort of programming language, such as VHDL.
verdatum Yes, but VHDL and Verilog are not programming languages; they are hardware description languages. Even if they have some behavioural synthesis options, after you compile your code you end up with a bitstream that is sent via JTAG and configures the registers and LUTs to implement your architecture.
The hype is real! Algorithms!!
This and the astronomy series are my favorite please continue this!!!
This is so amazingly clear and well explained!
That little outburst of excitement at about 8:30 killed me... Carrie Anne is precious and must be protected
Excellent explanation, my dear. This course should be a prerequisite to studying computers formally.
I'm greatly enjoying the series so far. It's appealing even to us old computer folk.
I'm still impressed that all of this is just made up of very tiny transistors.
I still can't imagine how they produce something so complicated and so small at the same time.
This is one of my favourite videos ever, it made me start to actually understand how computers work. I’m going to school for computer engineering starting next year and am very excited
Seriously? You're talking about [insert topic here] and you didn't even mention [insert overlooked stuff here]?
sounds like you don't need to watch this video
I just started computer programming, and I have to say, it is more helpful to learn about the computer itself first. Thank you so much for the videos.
Really enjoyed the show. I was, in theory, already familiar with the topic, but that was several years ago, so I really enjoyed the refresher! I don't suppose you'd have the appetite to go from transistors to op-amps in a future episode, in this same awesome format?
Why do you think it's called Crash Course? :) (smug smile)
I've downloaded this series a long time ago but have only watched it now, and it's great to be reminded of how fun Computer Science was.
Is it me, or has Carrie Anne's sitting posture improved greatly since her first episode?
Btw, is there going to be an episode about silicone computing versus human-brain computing? At this point there seems to be a lot of similarity, but human brains seem to be able to multiply without convoluted logic gates.
The brain is complicated. I mean, we don't use logic gates; we retrieve a memorized multiplication answer through a request from neurons that fetch it from our memory areas. For big numbers, we use an algorithm (the long multiplication we teach elementary school students). Computer engineers have experimented with simulating this using lookup-table hybrids with a faster algorithm; I think the faster algorithm is used on its own nowadays, IIRC.
You might as well be comparing Conway's Game of Life to a Turing machine. Are the conceptual models of both able to solve any computationally solvable problem? Sure. Do they really have similar operational features, or is the conversion from one to the other obvious? No. Not at all. It's like comparing a rifle to a chainsaw because they can both break things: yeah, there's something there, but not really.
Of course, go up a few levels of abstraction and they look similar again.
I would say that although the organisation of the brain is different from computers, there are low-level components that appear similar: neurons are like transistors, and neural nets are like logic gates.
As for how humans do multiplication, I agree with MegaZeroX7, humans generally use lookup tables (memorization) to determine the answer to simple multiplication. I use a lookup table as reference and then offset from there (I know 5 x 7 = -7 + (6 x 7), and I have 6 x 7 = 42 memorized, so 42 - 7 = 35).
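A toy C sketch of the lookup-table idea, mirroring the 5 x 7 example above (purely illustrative, not how any particular chip does it):

```c
#include <stdio.h>

/* Toy "memorized times table": precompute the single-digit products once
   (the lookup table), then answer by recall instead of computation,
   plus the offset trick from the comment above. */
int table[10][10];

int main(void) {
    for (int i = 0; i < 10; i++)
        for (int j = 0; j < 10; j++)
            table[i][j] = i * j;             /* the "memorization" step */

    printf("6 x 7 = %d\n", table[6][7]);     /* pure recall: 42 */
    printf("5 x 7 = %d\n", table[6][7] - 7); /* recall + offset: 42 - 7 = 35 */
    return 0;
}
```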
Unfortunately, we've still got a ton to learn about how the brain works. At the moment, it's still kind of difficult to compare them. However, this show might address Neural Networks, which is slightly related to how the mind works. And btw, it's "silicon" not "silicone". Silicone is for stuff like heat-resistant rubber cookware and breast augmentations. Silicon is a metallic element that can be modified into a semiconducting material :-)
"Is it me or has Carrie Anns sitting-posture improved greatly since her first episode?"
How do people notice these things?
I am currently in Advanced Computer Architecture and having trouble; your videos are super helpful and super clear! Keep up the amazing work.
I'm actually surprised at how low-level these videos get. Not complaining, just a bit of a shock, as many CS courses just gloss over architecture and go to algorithms.
Courses generally don't, but if you major in CS, you'll pretty much always end up taking a Computer Hardware course that covers all this, but in still-greater detail.
I mean it is an overview, not like a CS degree on YT.
I'm a software engineer and I think this is a great course series, very helpful
I think I'm in love with Computer Science, I'm only 15!
I recommend looking at codeacademy or khan academy to learn more!
No one cares
Rachel, your enthusiasm for life reminds me of my dear daughter, sweet and loving and with a bright future ahead of her. She killed herself a few months ago at the age of 17. No one suspected that she had depression and was constantly fighting her inner demons.
Lol, this thread
@@mazapanputrefacto3299 sorry for your loss but why do you feel the need to comment that? Perhaps you need to seek therapy instead of commenting something like this to a young girl
The abstraction bit had me in tears.
Love the videos! Always helps me out with mine :)
Few Minute Programming If you don't mind me asking, what software do you use to make your videos?
This computer series is just brilliant, as the content taught is to the point.
Logic + Electricity = Math apparently.
Well math comes from logic
Carmen Acuna Math IS logic bro
Logic=math
Electricity= a way for a computer to interpret logic, and thus interpret math
I hope you will continue making more videos on computer science. Hopefully, you will eventually get to programming languages like C, C++, Java, etc. I'd love to see how you teach them to us. Your tutorials are brilliant; they make our brains swell.
You're missing the row 101 → 10 (inputs A=1, B=0, Cin=1 give carry 1, sum 0) from your full adder truth table.
The full adder table is missing a row: (A, B, C) = (1, 0, 1)
That being said, great series and keep up the awesome work!
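For reference, here's the complete full-adder truth table, with the row flagged above included:

```
A B Cin | Sum Cout
0 0  0  |  0   0
0 0  1  |  1   0
0 1  0  |  1   0
0 1  1  |  0   1
1 0  0  |  1   0
1 0  1  |  0   1   <- the missing row
1 1  0  |  0   1
1 1  1  |  1   1
```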
A very interesting series so far, I'm excited to learn more
thank you all for this course!
Can we get courses for different types of math? (trigonometry, algebra, calculus etc.) I see you have already started with statistics so I think that would be the way to go. Also Crash Course Music and Art would be awesome!
This episode (especially the ALU) was where it all clicked for me. Computers make a lot more sense now.
"We just made a full adder!"
A "Black Adder"?.... I'll get my coat.
Antti Björklund meh. Stay. It wasn't that funny.
These are my favourite PBS videos.
Why does that Intel ALU have the Texas Instruments logo on it?
Presumably they designed it. Here is its data sheet: www.ti.com/lit/ds/symlink/sn54ls181.pdf
Whenever they say "Intel 74-181", they really mean Texas Instruments. Intel was still a start-up making RAM chips at the time that beast was being incorporated into the minicomputers of the era (late '60s, early '70s). See this for more info: www.righto.com/2017/03/inside-vintage-74181-alu-chip-how-it.html
CORRECTION: I'm scouring the internet and there's debate if it was TI or Fairchild that introduced it first... but I can't find any definitive source. It definitely wasn't Intel though. 😲
Andrew Farrell wow, they even put the logic gate diagrams in there! Very cool.
Eddie Santos thanks for the info! Intel certainly has come a long way since then.
Her way of explaining things makes it look so simple that everyone can get it. Thank you, I'm really impressed.
They say Intel, but why is there a Texas Instruments logo on it?
INCREDIBLE SERIES !!!! I LOVE YOU ALL
Get ready to put your flip-flops on ;)
I love the quality of this video. I just started a CS master's program and this topic has been racking my brain until I watched this. Thanks!
I need a 1/2 speed button :-)
YouTube has one. Click the gear thingy on the lower right of the video.
You can choose x0.25, but it's useless because the audio is cut. Or, if you are up for the challenge, x2!
Carry Anne (like carry bit), you're my personal hero! I'm teaching Computer Architecture to college students, and though I have a huge passion for the subject, I've always thought the scientific community has a very sterile, inhuman approach to what is one of the most fascinating subjects in computer science! So, again, a huge thanks for the "new wave" approach, contribution, and inspiration!
Did she just say "Thanks for watching i'll cpu later".....?
10:56
This video is GOLD. It should have been viewed billions of times. But then I remember: the real number is always less.