1. Make a tool
2. Use the tool to make an even better tool
3. Repeat for thousands of years
4. Colonize the galaxy
I was so hooked on this video! I wanna be an electrical engineer with a concentration in computer engineering and I love it
When I fix the WiFi in my house, my family thinks I can become an engineer all of a sudden
Yeah, WiFi is a standard that is made to make wireless simpler to implement and use. That's what you should tell your family. Sure it can lead to becoming an engineer, but it's meant to let everyone use wireless.
If you implement QoS rules, enforce DNS redirecting to block a few websites (porn and such), and use MAC addresses instead of passwords for authentication, maybe you're on the path to becoming a network technician. If you install another router to watch the packets exchanged between the original router and the client, then you're definitely on the path to becoming a network hacker. But an engineer? Nah.
Sometimes it's the opposite: when people ask me "oh, what did you study?" and I answer "Computer Engineering", I get "oh, so you can help me install my printer?"
Computer Engineers assemble!
After we fix some bugs.
"Some" being the operative word. We can never fix all the bugs :( They just keep multiplying, like... well, like bugs.
If I asked if the "assemble" part was supposed to be a wordplay would it ruin it
@@DragoniteSpam STORE your question and LOAD it later
@@essigautomat ಠ_ಠ
Software engineer / applications developer / programmer / coder. There are many words for what I do, but please, don't call me an IT guy! :P
Worse: It by Stephen King.
So you can fix my computer?
First sentence: "Whether you watch this video on your laptop, smartphone or smart watch …"
Me: Is nobody using desktop computers any more?
The 3% of energy produced in the world being used by computers was interesting. It would be even more interesting to know what percent is being used for doing cryptocurrency calculations.
I fix my computer by banging my hammer on it expecting it to work.
this is not the russian space station
me too
Kevin Burris I still do it -\(xOx)/
* insert subscribe to me joke here*
Amateur, I use my sledgehammer.
5:11 I think there's an error with the Moore's law graph. The scale of the vertical axis is already logarithmic, but the graph still looks like an exponential on a linear scale would. "Doubling every x years" should look like a straight line.
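To see why "doubling every x years" should read as a straight line on a log axis, here's a quick sketch in plain Python (the transistor counts are idealized, starting from the Intel 4004's roughly 2,300): taking log2 of an exponentially growing series gives equally spaced values, i.e. a constant slope.

```python
import math

# Idealized transistor counts doubling every 2 years (Moore's law, simplified)
years = [1971 + 2 * i for i in range(10)]
counts = [2300 * 2 ** i for i in range(10)]  # starting from the Intel 4004's ~2300

# On a logarithmic axis, exponential growth becomes linear:
logs = [math.log2(c) for c in counts]
slopes = [logs[i + 1] - logs[i] for i in range(len(logs) - 1)]

# Constant slope => a straight line on a log-scale plot
print(all(abs(s - 1.0) < 1e-9 for s in slopes))  # True: log2 rises by exactly 1 per doubling
```

So if a graph with a logarithmic vertical axis still curves upward like an exponential, the plotted growth is faster than a fixed doubling period.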
Aw, no mention of CrashCourse Computer Science?
They e.g. covered a lot of the electrical-engineering-y hardware basics on the lowest levels, so have a look if your interest was sparked by this episode.
I didn’t cache that last part
Cuh
😂😂
😂😂😂
Cache me ousside
🙂🙄
Fun fact. The IBM 1130 was simulated on an IBM 360 to determine how the single card bootstrap loader would work. The 12 rows of the standard punch card were expanded to the 16 bits of the 1130's words and the instructions were designed to make it work. OS, compilers, assemblers, peripherals, everything was simulated first.
Another thing to consider with energy efficiency is _what_ is running on the computer: a well-written C++ program will arguably be far faster and more efficient than a well-written Java or Node program, since the C++ program traverses fewer layers of abstraction and is often optimised for specific hardware.
Engineering never ends because there’s always a new invention applied in a way that helps the population
boss : i plan to build some software. my friend says it's a good business.
tech director : my friend says building software could take too long, give unexpected results, risk cost overruns, suffer unstable dependencies, need costly long-term support, and still face too much competition.
boss : how about we talk about building some software?
6:39 eh, that's very misleading? Components are getting more powerful, and general memory requirements are increasing... but they're also getting significantly more efficient: either getting more performance out of the same power targets, reducing overall power usage, or a mixture of both. There are also Instructions Per Clock (IPC) improvements, where more tasks can be completed in a single clock cycle. And just about all new internal components will downclock or even enter ultra-low-power sleep modes when not in use to save energy when you don't need the performance; this is a major reason why we have laptops getting 10+ hours of usage on a charge and mobile devices that can last days. Even the ultra-high-end is more efficient now than it has ever been, despite there being multiple times more processing cores.
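A rough way to see how performance can rise without raising clock speed: throughput is approximately cores × IPC × clock, so IPC and core-count gains raise it at a fixed frequency. A toy model (all numbers illustrative, not real chip specs):

```python
def throughput(cores: int, ipc: float, clock_ghz: float) -> float:
    """Rough instructions-per-second model: cores * instructions-per-cycle * cycles-per-second."""
    return cores * ipc * clock_ghz * 1e9

# Illustrative numbers only: an older chip vs a newer one at the same clock speed
old = throughput(cores=4, ipc=2, clock_ghz=3.0)
new = throughput(cores=8, ipc=4, clock_ghz=3.0)

print(new / old)  # 4.0: four times the throughput with no clock-speed increase
```

This is of course a simplification (real workloads don't scale perfectly across cores), but it shows why "more efficient" and "more powerful" aren't contradictory.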
Ok...
That's true about them getting more efficient and powerful recently, but what we have to remember is that that efficiency and power come as a result of the innovative engineering that chip designers are using and, in some cases, even inventing. So although they are more powerful and efficient, all of that wouldn't be possible using the methods found in chip architecture even only 5 years ago. Reiterating the point of this series: it's to show that with new engineering techniques and new ways of designing systems, we can achieve all of this, but not without the challenges that come before success. ^^
/r whoosh @@minter8701
Yes, but with diminishing returns. Leakage in the smaller transistors will counteract some of the power savings due to the smaller size. Also the smaller size results in increases in the power density - more power per unit area. That makes it much harder to cool the circuits. Why have computer chip speeds pretty much settled into the approximately 2-4 GHz range? It's because it is so hard to cool them if run at faster clock rates.
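The cooling argument can be made concrete with the standard dynamic-power approximation for CMOS, P ≈ C·V²·f (switched capacitance × voltage squared × frequency). A sketch with purely illustrative values:

```python
def dynamic_power(capacitance: float, voltage: float, frequency: float) -> float:
    """Approximate dynamic (switching) power of CMOS logic: P = C * V^2 * f."""
    return capacitance * voltage ** 2 * frequency

# Illustrative values only (not a real chip)
base = dynamic_power(capacitance=1e-9, voltage=1.0, frequency=3e9)
doubled_clock = dynamic_power(capacitance=1e-9, voltage=1.0, frequency=6e9)

print(doubled_clock / base)  # 2.0: doubling the clock doubles the heat to remove
```

And in practice it's worse than linear: running at a higher frequency usually also requires a higher voltage, and power scales with the *square* of voltage — one reason clocks settled into the 2-4 GHz range.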
An issue you approach but don't quite mention explicitly is dark silicon. The chips are getting smaller, but the concentrated heat prevents the whole chip from running at once. If we did run power through the whole chip it would break, so we need to leave some of the chip "dark". This leaves less of a benefit from jamming more transistors on the chip. One way around this is, as was said, to reduce the heat. Another idea has been to design hardware "accelerators" that can do one job very well, and to specialize different parts of the chip for different roles. This way we keep some benefits from shrinking the chip while not requiring the whole chip to be running.
Can you talk about industrial design?
Moore’s Law < The Law Of Accelerating Returns
Goes to lecture, does an entire 3 hours on Logic, Memory, and Hardware, comes home, puts on Crash Course: Today we're learning about Moore's Law... UGH!
Hardware is already fast (not that it cannot be improved). Software is what makes a computer slower.
This is great I just decided I wanna major in this lmao
That's what I'm majoring in
sirjigsalot same
Although CS and CE require the same amount of mathematics on paper, the engineering classes use that mathematics a lot more frequently. I would say that CE is not the major to fall back on if your mathematics is becoming difficult; rather, CS is the one to fall back to.
One of the best host really love your vids keep the good work rolling
There is a great difference between software and hardware engineers that's not even touched in this. It's telling which divisions you guys have chosen to focus on.
If only Moore’s law continued for another century.
Adnan A it will, it’ll work in other forms e.g. how many nanotubes can we fit on a transistor. But Kurzweil covered this issue in his law of accelerating returns.
You mean one Moore time?
Logic gates may become nano scale mechanical switches rather than transistors. See Eric Drexler's book Engines of Creation: The Coming Era of Nanotechnology.
jimbert50 Do you think that read is still worth it? Seems like an older book
jimbert50 thanks for the recommendation I’ll give it a look!
I'm proud to be a computer engineer. I just wish I was part of the computer industry ;(
As computer chips become more complex I'm sure their price will sky rocket. Oftentimes when I diagnose a control board as the broken part my customers often choose to buy a new appliance instead of having me replace it. Sometimes the control boards cost almost as much as the appliance itself!
so... i'm watching this on my computer... and she explains to me what a computer is... fascinating! :D
computer? computer. computer computer
i can only hear computer ..computer ..computer ....
Thank you for being the rare case of getting Moore's law right.
I'm an electrical engineering and computer science dual major student. I think that qualifies me as a computer engineer
It doesn't work like that
marvin ochieng no it does
Electrical engineers mostly work with the hardware, computer scientists work with the software, and the in-between is the computer engineer
@Hernando Malinche Electrical Engineering is much harder; it requires more analytical math (Calculus 1, 2, 3, differential equations, etc.). In Computer Science, if you've got Calculus 1 and discrete mathematics, then you're alright.
@@karimselemani5157
I can confirm. I switched from electrical engineering to computer science, because I was too lazy to study all that math.
This is NOT 'computer engineering'. This is 'Introduction to Computer Pieces'.
I was using a Desktop thank you very much!
7:00 do all engineers have goatees on this mirror earth?
The quality of the Crash Course Engineering series has really been falling, with numerous poor word choices and misunderstandings of key concepts. At 6:12, it is very misleading to say that nanotechnology is an alternative to standard transistor methods. Instead, nanotechnology is a natural progression of electronic engineering which fits into the Moore's Law narrative, where numerous advances in engineering solutions make up what is known as nanotechnology.
TLDR: 6:12, Nanotechnology is not an alternative, it is the current standard.
Love this video 🔥🔥🔥🔥🔥🔥🔥
But I want “Moore” of this like “Cache” course: Computer science.
I have no clue what she just said but I could watch her say it all day long!!!
Should've expected this series to trickle back down to computers somehow because I thought this was CC Computer Science again.
Does anyone else feel that this series is severely lacking in terms of generally useful and relevant information? No specific explanation of embedded systems in this episode is simply heartbreaking. In case it helps, I've always found Computer Engineering is best explained from the ground up — by which I mean from a physics point of view up to high-level programming abstractions. For future episodes, please pick an angle and stick with it throughout instead of jumping around so much, because these episodes are not nearly long enough for such a broad scope.
Makes sense
I’m gonna become a Computer Engineer
5:09 The start of the section on Moore's Law. You likely already know what's being said before this point.
Who doesn't know Moore's law?
thanks dude
More Moore. More Moore. We want more Moore!
Moore's law never ends. All Processor manufacturers have to do is start making negative nanometer chips. Problem solved!
@vvenomm 492 Not certain; however, perhaps there is some way to manufacture negative nanometer chips. I am no computer engineer though, so don't take my word for it!
As for the end of Moore's Law: you assume it can only progress via scaling down transistor geometry, but there are other ways to double the number of transistors on a single chip.
Please don't say "make the chip bigger"
@@andiCNH No.
My brother is in college for computer engineering ^-^
I need more tech because I’m lazy asf
Crashcourse, a suggestion or maybe a request: do you plan to create a video about agricultural engineering? And am I correct that it encompasses all four pillars of engineering, namely chemical, civil, electrical, and mechanical?
Agricultural Biosystems Engineering to be specific...
The main thing starts at 3:20.
Why would you watch on your smart watch?
because you want to look smart watching on your smart watch.
why wouldn't you
nice
Whoa, they actually did it! I asked for this video in the comments on the aerospace engineer video
I'm sure they have plenty of interesting information planned
I love watching cat videos
Hahaha! Cell phones from the 90's... Try watching kids today figure out how a Rotary Dial Phone works.
I'm a grown man and I still can't figure those things out. Much less find a working one!
How are we gonna be able to prevent the singularity? That's the most important question I have
Great topic! Without a quantum leap in computer engineering, true AI will not be possible...
The hardware to pull off a general AI absolutely already exists. The next hurdle is in software, storage density, and datasets; if those were in the right place, we could absolutely build a facility (probably larger than any datacenter on Earth, mind) for it to run.
I don't know about that. I'm a software engineer, and while I admit that I don't do a whole lot with machine learning in my work or even as a hobby, I'm of the belief that people overestimate the barriers to general AI. We're not there yet, but we're a lot closer than most people think... and don't even get me started on the people who believe there's something spiritual/magical/special about "minds" that can't be replicated in a computer. Because they're just delusional.
or just more stream processors/cuda cores running at slightly higher clock speeds.
"demands more energy" would be more efficient than "generates more energy demand". I think...
Which one is better, industrial or computer engineering? I'm having a hard time deciding, and which one is in demand?
those eyes... *drooling*
Why is there no playlist for this series on the channel?? Edit: oh, I thought this was computer science, not engineering
Finally the video about my career
EPIC
I watched a video partially about reducing computer energy use on a power-sucking dual Xeon workstation from 2005. Am I a bad person?
(Joking, although my basement system I watched this on is a dual Xeon system from 2005 and it indeed does suck power... ).
This course has but 5 episodes left. What course r u going to do next Shini?
Actually there are 11 more! There will be 46 total!
@@crashcourse noice 😎
I love you
Talk about Mechatronics
Nitpicking, but there is a difference between memory and storage. I wouldn't store a picture of my cat fitting into a tiny box on volatile memory. I would store it on a non-volatile HDD or SSD though 😉
I am watching this video on my ⌚
❤️
But what about logic gates?
What about them?
How are they connected to the computers? They hadn't mentioned them — they're made of transistors, right?
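Right — logic gates are built *from* transistors (a CMOS NAND gate uses four), and every other gate can be composed from NAND alone. A sketch treating each gate as a boolean function:

```python
def nand(a: int, b: int) -> int:
    """NAND gate: in CMOS this is built directly from 4 transistors."""
    return 0 if (a and b) else 1

# Every other gate can be composed from NAND alone:
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

print([and_(1, 1), or_(0, 1), not_(1)])  # [1, 1, 0]
```

Chains of gates like these form adders, registers, and ultimately the CPU — that's the bridge from transistors up to "the computer".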
Can you make the video on Cyber security
Didn't Carrie-Anne cover that as part of CC computer science
nicer
❤️❤️
I'm gonna become a Computer Engineer
I say we just shrink silicon
5:10 "Calculations per sec per $1000"
Say what now? Where are these human brains being sold? And why is the price not changing over the years?
Soon there will be smart air conditioners
There has been for a while now
@@andiCNH i mean super smart touch screen internet air conditioners
@@Sobra_. ...yeah, they already exist.
IceMetalPunk I meant ones that you can watch this video on
@@Sobra_. technically you can. Just connect a Samsung home automation hub to a tablet and wallmount it
I’m new so why did Craig leave?!?!
And 90 percent of that 3 percent of the energy budget is used by people running an AMD system with an R9 295 and an AMD FX-9590
iron man dies on the avengers end game
watching this on an iphone with a 5nm chip inside of it...
i'm watching on a desktop computer, so, neither laptop, smartphone nor smartwatch...
When I'm drunk I watch this
Also brutal street fights
And mukbangs
WTF
I want my next CPU to be a quattuortrigintillion core quantum processor powered by interdimensional dark energy. But it'll probably just be a 16 core Intel :'(
Maybe in the year 3000. :) Hahahaha.
@@erik-ic3tp lol
Mat Broomfield,
You don’t know when it’s really going to happen. 3000 is just a cool number.😂
@@erik-ic3tp Ha ha. I would love to live long enough to see humans travel to other planets with life on them. That's my biggest regret of being much nearer the end of my life than the beginning.
Mat Broomfield,
Same for me too. I’m only 19 years old (for about 2 weeks).🙂
Amdahl's Law > Moore's Law
Change my mind.
You're free to believe whatever you want, sir.
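For anyone wondering why someone would rank it above Moore's Law: Amdahl's Law bounds the speedup you can get from parallelism. If only a fraction p of a program can be parallelized, the speedup on n cores is 1 / ((1 − p) + p/n). A quick sketch:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Amdahl's law: overall speedup when fraction p of the work runs across n workers."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% of the work parallelized, speedup is capped near 10x:
print(round(amdahl_speedup(p=0.9, n=10), 2))    # 5.26
print(round(amdahl_speedup(p=0.9, n=1000), 2))  # 9.91
```

In other words, more transistors (and more cores) only help as much as the software's serial fraction allows.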
Hooters, cooters, computers. ;)
Hello, I'm an incoming college student. I'm really in a tight spot right now deciding what course to pick: Computer Engineering or I.T.? My first choice is Computer Engineering, but my worry is whether it has many job opportunities. One thing's for sure, there are a lot of opportunities for I.T. here in my country, but I really want to be a computer engineer. Also, can a Computer Engineering graduate work in I.T.-related fields? Thanks.
Please show subtitles for kids plz plz
Except Moore's law isn't done yet. These Crash Course vids have really fallen off.
Soon you will be able to watch this video on your....
Can they go in robotics
2:49 What's up with that keyboard? All the weird stuff to the side. G U S I O C
Nanomachines, son!
Picomachines, son! Femtomachines, son!
u perty
Buggy like Windows 10
Could you speak a little faster?
Sorry, didn't hear the "intel"
Hey, did you know:
99% of Indian YouTubers who add 'Tech' to their channel name don't know anything about tech.
The 1% of Indian YouTubers who don't add 'Tech' to their channel name know so much about tech.