Why your laptop charger is so hot
- Published: Sep 7, 2024
- Turns out, Nikola Tesla is partly to blame. Liz Scheltens explains, with a little help from NPR's Planet Money. Subscribe to their awesome podcast here: n.pr/1RZaOeT
Subscribe to our channel! goo.gl/0bsAjO
Vox.com is a news website that helps you cut through the noise and understand what's really driving the events in the headlines. Check out www.vox.com to get up to speed on everything from Kurdistan to the Kim Kardashian app.
Check out our full video catalog: goo.gl/IZONyE
Follow Vox on Twitter: goo.gl/XFrZ5H
Or on Facebook: goo.gl/U2g06o
This video is massively wrong. Converting AC to DC is about 98% efficient. The voltage step-down is what generates most of the heat, and all your devices want DIFFERENT voltages, so a single unit isn't an option. Even if it were, the low-voltage distribution through your house would waste much more power than the existing arrangement.
Not to mention that this video makes it look as if a DC power grid is just as feasible as our current AC power grid, without mentioning that during the era of Tesla and Edison, Edison's DC power grid in New York required several times more wire than our current AC power grid to feed the same number of homes. Choosing AC over DC wasn't an arbitrary choice, and DC is the only one of the two that can be stored in a battery.
rc xb if stepping down the tension of the current was so inefficient, we wouldn't use high tension on long distance wiring.
If converting between AC and DC was so inefficient, we wouldn't use HVDC on long distance wiring, but we do. (There's no such thing as "tension of the current".)
Over long runs, the (relatively low) resistance of the wires becomes quite significant. So much so that the losses of the transformers on both ends is more efficient than the losses of higher current over the long line. Also, HUGE transformers can be much more efficient than a little laptop power brick can practically be, so it's not proof of anything.
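The tradeoff described above (transformer losses at each end versus I²R losses along the wire) can be sketched numerically. All figures below (a 10 kW load, 2 Ω of line resistance, 98% efficient transformers) are assumed for illustration only, not taken from the video:

```python
# Rough comparison: delivering 10 kW over a line with 2 ohms of total
# resistance, either directly at 240 V, or stepped up to 24 kV and back
# down through two transformers that are each 98% efficient (assumed).

def direct_loss(power_w, volts, r_ohm):
    """I^2 * R loss when transmitting at the given voltage."""
    current = power_w / volts
    return current ** 2 * r_ohm

def transformer_route_loss(power_w, hv_volts, r_ohm, xfmr_eff=0.98):
    """Total loss: step-up transformer + line + step-down transformer."""
    step_up = power_w * (1 - xfmr_eff)
    on_line = power_w * xfmr_eff
    line = (on_line / hv_volts) ** 2 * r_ohm
    step_down = (on_line - line) * (1 - xfmr_eff)
    return step_up + line + step_down

P, R = 10_000.0, 2.0
print(f"direct at 240 V:        {direct_loss(P, 240, R):.0f} W lost")
print(f"via 24 kV transformers: {transformer_route_loss(P, 24_000, R):.0f} W lost")
```

Even with two imperfect transformers in the path, the high-voltage route loses roughly a tenth as much, which is the commenter's point.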
98% efficient? So that 2% is heat?
Except for desktop computer power supplies, where a lot of them are only 80% or 82% efficient.
This video is so incorrect. AC-DC converters don't generate heat. Stepping down voltage generates heat.
Technically you're wrong but I think it's a matter of you not being clear what you're trying to say. AC to DC conversion absolutely positively generates heat, but perhaps in a more negligible amount in comparison to stepping down voltage. Case in point would be home solar inverters (not talking about micro inverters). Pretty much all of them have heat dissipation features in their designs, and even when the system DC voltage is near the 240VAC in your home, they still get hot.
+teraxiel Solar converters are DC-AC converters, not AC-DC. And yes, of course something that's 98% efficient still generates heat in the physical sense, just not in the usual sense of the word. It would make the power supply marginally warmer but not hot. You are always free to choose the definition of a word that fits.
Stepping voltage down, in and of itself, does NOT generate heat. That is, if the conversion is 100% efficient. But in real life, nothing is ever 100% efficient.
So what I failed to mention is that solar arrays will often, depending on the specifics of the installation, generate voltages higher than the 240V typical of north american homes. It is stepping down voltage, even if it is DC to AC and not the other way around. My point was that regardless of the conversion, whether changing voltage up or down or AC to DC or DC to AC, they ALL generate heat, just some more than others.
stop calling them "converters"
if it's AC-DC you're *rectifying* it
if it's DC-AC you're *inverting* it
and no. A step down transformer produces little heat. The reason for the mineral oil they sometimes have is to prevent arcing due to the high voltage. The resistors, the transformer [LITTLE, never said NO heat] and the rectifier all heat up in the AC adapter.
again I can't stress enough, stop calling them CONVERTERS. That's CONVERTING DC-DC voltage. These are outdated, expensive, and inefficient. Today something similar that converts DC voltage is a subcircuit, that uses transistor and resistor systems in small electronics.
Huh. Now I have to question Vox's credibility on music, art and other topics I enjoy on this channel. As an electrical engineer, this was an incredibly dumb and shallow video on AC/DC power supplies.
1. AC is not only preferable because it travels well over long distances but because you can use transformers to easily step up/step down voltages which reduces power losses due to resistance over long wires and you can implement a three phase power system which uses way less copper wiring and significantly reduces infrastructure/maintenance costs.
2. You exaggerate the power loss of AC to DC conversion due to heat dissipation. It is 98% efficient. Most of the power loss happens due to voltage being stepped down.
3. A single AC to DC converter for all devices is a very dumb idea. Yes, electronics use DC, but not all DC is created equal. Each device charges/operates at a specific voltage and current, often different from one another. The charger not only converts AC to DC, but steps down voltage (linear regulator/buck converter) and may have a choke/inductor to provide steady current flow in a specific amp range. Having a single AC to DC converter for your entire house just adds an unnecessary step in between.
Also, let's be real - the only reason the Apple charger heats up is because it is grossly underspec'ed to reduce size. If they had doubled the power delivery capacity, it wouldn't be operating at its maximum margin most of the time (especially when the MacBook's CPU goes under load).
Just felt the chargers for my two PCs, both are warm, nowhere close to burning. And one of them has been on for weeks.
darexinfinity I have a super compact dell one which also gets too hot to touch. It's simple physics. Smaller components can dissipate less power.
Mine for my gaming laptop has been plugged in for months and it guzzles power. Not even hot.
Actually, DC travels better over long distances too, due to the near elimination of inductive loss. However, the big problem with DC is the electronics required to control it. DC can be stepped up and down almost as efficiently as AC using switch-mode converters, but that requires advanced control circuitry and power electronics not available in Edison's time. Now we have the integrated circuits for the advanced control, and the power semiconductor devices for current handling, together creating the required control components. Also, AC switchgear usually doesn't have to consider arc quenching too much, since the current will go to zero on its own at some point, but HVDC current never stops, creating a lot of technical burdens in the switchgear. The solution to HVDC switching is, again, semiconductors, combined with mechanical switches. Have you heard of the ±1100 kV HVDC line in China?
This information is misleading at best =/ You make it sound like we stick to AC just because it's easier, but the truth of the matter is, having a DC power grid would waste so much more energy than how we have it now.
Furthermore, having one DC converter for a house is somewhat impractical. There are devices that run on all sorts of different voltages. Laptops tend to like 19V. Desktops have 12V, 5V, and 3.3V all powering different things inside of them. Not to mention small devices that require all sorts of different voltages. Oh, and running all that DC through the house is going to incur significant voltage droop due to the long wires. There's a reason we run AC to the wall and only convert to DC when absolutely necessary.
+Rick Budzak Listen to this man! he knows what's up!
+Rick Budzak
DC voltage reduction is much simpler though. A solid-state voltage regulator is pretty much the only thing needed. On the other hand, 120V AC to DC conversion usually requires a transformer, which is heavy and quite inefficient.
It is reasonable to imagine 24V DC available in home outlets, with each device only needing small voltage reducers for 12V and 5V.
Some factories do have 24V DC lines running through the whole factory, because many measurement instruments are standardized to work at 24V.
Makes sense when everything runs on the same voltage. I know of places that do that with 48v as well. That's not the case for the average home, though.
+Rick Budzak
Well, it might not be a perfect voltage match, but each device requiring its own little voltage reducer does make more sense than each of them having its own transformer and rectifier.
+Rick Budzak It's important to note that while laptops have inputs at 18-19V or the like, they most likely convert it down another time to 5V and/or 3.3V for digital logic.
This video should have had a disclaimer *for stupid people*, because the explanation is silly. It doesn't really explain why your laptop charger is so hot; it goes off on some lightly related topic.
The Laptop charger gets hot from converting 110v or 240v down to 19v.
that voltage conversion is where the heat is generated.
this video should be titled " what does the laptop charger do?"
It did answer it as the above says. It just wanted to educate the masses who forgot to pay attention in physics
+Mark Pickett Wrong. Conversion from AC to DC generates almost no heat. It's done with a bridge rectifier if you want to use both sides of the AC wave. You can use a single diode if you want to use just half of the wave. The OP is correct. The heat is generated in the step-down process.
I bridge rectify 110vAC regularly at home. The component is a 4 pin IC that's about 1/4 of a square centimeter and has no heat sink or fins.
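A back-of-envelope calculation supports this: the heat in a mains-side bridge rectifier is set by the diode forward drops, not by the power being converted. The 65 W load, 120 V mains, and 0.7 V drop below are assumed typical values, not measurements:

```python
# Estimate of heat from a mains-side bridge rectifier in a 65 W laptop
# supply. In a full bridge, two diodes conduct at any instant, each
# dropping roughly 0.7 V (assumed typical silicon forward drop).

def bridge_loss(power_w, mains_v, diode_drop=0.7):
    """Approximate bridge rectifier dissipation in watts."""
    current = power_w / mains_v        # rough average input current
    return 2 * diode_drop * current    # two conducting diodes in series

loss = bridge_loss(65, 120)
print(f"~{loss:.2f} W lost in the bridge ({100 * loss / 65:.1f}% of 65 W)")
```

Under these assumptions the bridge dissipates well under a watt, around 1% of the throughput, which is why a tiny unfinned IC can handle it.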
+Mark Pickett
I'm pretty certain the heat is generated from the transformer that steps mains power down to 12V or whatever voltage you're using.
Bridge rectifiers (AC-DC) shouldn't generate that much heat....... so this video is essentially bullshit.
+Bobby Newmark I think you're getting confused between linear regulator and switching-mode power supplies. Linear regulator supplies will put off a lot of heat because they burn off the excess energy, while a switching power supply can do the same thing in a smaller space, but in the end costs more because of the components involved. That's how USB wall chargers can be so small now and still put out little to no heat.
then again i could be wrong or just talking out my ass, but might be something to think about.
codenamegamma In the case of USB wall power supplies, the amperage and voltage of the output are pretty low. The linear and switching supplies are very similar; they only differ in the steps before the voltage step-down. Switching supplies rectify the voltage first, then ramp it up and convert it back to a high-frequency AC, then step it down to a low voltage. The high frequency of the AC allows the transformer to be much smaller. Then it rectifies it again. Not a lot of heat is generated by the switch-mode supply because the power passing through it is not very high and its efficiency is very high. In switch mode about 10-20% of the input power will be lost to heat, where 30-40% will be lost in the linear.
Cost-wise it's not much more expensive to build a switch-mode power supply. I typically don't build them unless the power use is higher than 10 watts; most of my circuits are only 2 to 5 watts.
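The 30-40% linear loss figure above falls out of a simple relation: a linear regulator passes the full load current while dropping the whole voltage difference across itself, so its best-case efficiency is Vout/Vin. A minimal sketch with assumed example voltages:

```python
# Best-case linear regulator efficiency is Vout/Vin: the pass element
# conducts the full load current while dropping (Vin - Vout) as heat.

def linear_efficiency(v_out, v_in):
    """Upper bound on linear regulator efficiency (ignores quiescent current)."""
    return v_out / v_in

def linear_heat_watts(v_out, v_in, load_amps):
    """Power dissipated in the pass element at a given load current."""
    return (v_in - v_out) * load_amps

# Example: regulating 12 V down to 5 V at 2 A.
eff = linear_efficiency(5.0, 12.0)
heat = linear_heat_watts(5.0, 12.0, 2.0)
print(f"best-case efficiency: {eff:.0%}, heat at 2 A: {heat:.0f} W")
```

A switcher sidesteps this because its losses come from component imperfections rather than the voltage ratio, which is why it can stay cool across a wide input range.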
"Alternating current works just fine for things like light bulbs"....shows an illustration of an LED light bulb....smh.
Now lets plug this transformer into this 300V battery bank!
They didn't really explain why AC won out over DC. Centralized power generation and the efficiency of transporting AC over distance.
*****
25 cars in series!
leerman22
dang
+Aaron Hall
Actually, over long distances DC is better than AC (as you don't create EM waves), but it is/was much easier to transform AC from high voltage to low voltage and the other way around. With modern electronics it is easily possible to do the same with DC.
Ain't nobody talk against Nikola Tesla! He was the man from the future! He did so many things for mankind but got little recognition while alive.
+Core Man They didn't talk badly about Tesla in the slightest... in fact, they praised him. :/
The White man trumps all
+Core Man yeah its tragic no one appreciates him :(
He even died broke in a motel or something like that
+Neil Kline That was Edison with Topsy the elephant, to prove that AC was deadly. Your whole post is about Edison, not Tesla. You might want to read up on history first.
What a misleading title, no science into the conversion. No explanation for the heat other than lost energy.
Why...? The video didn't deliver on its promises. Pretty shitty move to be honest. Also, trying to knock Tesla... -.-
The video wasn't even correct either, bridge rectifiers don't generate that much heat. The real heat comes from the transformer having to step down comparably high voltages to like 19V
You expected science in a Vox video? I feel sorry for you sweet innocent summer child
There's a transformer in that box: 2 coils designed and wired in such a way that the voltage drops from 230V to whatever your power supply runs on (for example, 5V for most phones). Using a transformer also allows galvanic isolation, which for safety reasons is really nice to have (plus it eliminates interference). Then there's a converter, which changes the current from AC to DC using a Graetz bridge of rectifier diodes. After that there's a filtering system which blocks current of too high or too low frequency. The optional last one is a stabilizer, which provides lower-ripple DC current if you need it (mostly used when a high amount of power is needed). The stabilizer is also what causes the charger to go warm or even hot.
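One detail worth adding to a rundown like the one above: the mains voltage quoted is an RMS figure, and the parts after the transformer see √2 times more at the waveform peaks. A quick check, assuming an ideal sine wave:

```python
# The 230 V mains figure is an RMS value; the instantaneous peaks that
# rectifier diodes and the filter capacitor actually see are sqrt(2)
# higher (assuming an ideal, undistorted sine wave).
import math

def peak_from_rms(v_rms):
    """Peak voltage of an ideal sine wave with the given RMS value."""
    return v_rms * math.sqrt(2)

print(f"230 V RMS peaks at about ±{peak_from_rms(230):.0f} V")
```

This is why components in the box must be rated well above the nominal mains voltage.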
Thumbs down for the Mickey Mouse explanation. This question is better left to the SciShow or Asap Science.
+MrBlaq Yeah, this was a bit too trivial and didn't have enough depth of explanation of the advantages of both AC and DC. I usually like VOX vids, but there are many other channels and vids explaining this which are much much better. Stick to doing what you do best VOX, don't just release vids on topics because you need to release more vids. Quality, not quantity
+cdrwilson The only thing missing in the explanation is the why of using DC. DC is needed for charging batteries. Also, AC is a noisy source of energy, and electronics need a quiet source of energy like DC, as the buzzy AC current interferes with using voltages to evaluate computations. AC to DC converters don't just convert the power but also isolate the AC buzz from the DC current.
I mean, the question at hand was why a laptop charger gets really hot, which they did answer really quickly and straightforwardly, so the rest can be thought of as extra. But yeah, it didn't go too in depth about ac vs. dc.
+MrBlaq its also wrong... so yeah
Yeah I dunno why they got somebody who apparently talks about business and/or economics for a living to explain something about electrical engineering.
I've come to expect more of you Vox. This didn't even answer its own questions.
"Why does the power converter get hot?" "Well when the power converter converts power the power converter gets hot."
"Why do our devices need dc?" "Well they do, so you need a power converter."
+Kabitu1 They answered the questions with no explanation because it is already so fucking simple. The power converter or rectifier gets hot because to convert AC power to DC power you need to use capacitors, resistors and MOSFETs. Resistors get hot... they create resistance... energy is converted from electrical energy to thermal. Why do devices need DC? Because DC is consistent, smooth, clean power. Your hairdryer is literally turning on and off so fast you can't even see it happening. You wouldn't even be able to turn your laptop on using AC; it would just turn off. Not to mention that AC, like they said, pulses forwards and backwards. So whenever the AC pulses backwards through your laptop it would be moving through the laptop's electronics backwards. Think about that: for current to be usable by a processor it needs to go through a power delivery system. If it were to move backwards it would just go straight into the processor with no system to control it. Hence the blue smoke.
this is what happens when you ask NPR for facts
+Bret H good quality informed with a good insightful audience....
+TekSupport ? what the H?
+TekSupport Actually, the majority of the power loss comes from the transformation of high voltage to low voltage. So the video gives inaccurate information.
This is wrong on so many levels. Not the quality I expect from vox. Please do more research next time.
ekolimits this is okay enough for a non-engineer-minded person (read: woman)
^ lol, what an idiot, wtf
explain why it's wrong
He's not being sexist, he's just saying that the woman in the video was a journalist that had nothing to do with engineering.
I completely agree. No research whatsoever. It would be fine if all devices were the same voltage. Also didn't mention anything about safety either, such as the demonstration with the horse, that Tesla performed.
"I've invited Audrey Quinn from NPR's Planet Money team to help explain what's going on with my super hot laptop charger."
I would have thought an electrical engineer would have been a better bet.
How does having a larger converter for your home save energy? I feel like you jumped to that conclusion without explaining the logic there. Wouldn't a larger AC/DC converter work just like your laptop charger, heating up and losing power?
+Croo ookie I think the idea is that one large converter can be more efficient (therefore lose less energy to heat) than a lot of small converters. However, I don't know if this is actually the case.
***** Yea, could be. Or it could be the case that manufacturers of electronics know how to make their specific products in the most efficient way possible... But yea who knows, they didn't really explain that part.
+Blake Jones It's very likely the case for places like apartment complexes, but I'm not sure about homes. But considering how many devices my home has, I'd say it's probably right.
+Croo ookie It's not quite practical, because each device may need DC power, but at different voltages. My cell phone runs at 4.3VDC, my electric shaver runs at 5VDC, my cordless drill runs at 20VDC, my cordless shop-vac runs at 12VDC, my laptop runs on 11.1VDC, etc. AC devices are fairly standardized in houses: your incoming power is usually 240VAC, which is actually two "legs" of 120VAC. Everything in your house runs on 120VAC except your oven and dryer, which need the full 240VAC.
redbeard2001 So you're saying having a larger dedicated AC to DC converter wouldn't be practical because each device would need another DC conversion anyway?
0:46 - 0:50 is literally the most explanation given
...and their explanation is not even the primary cause of heat.
Wrong. It is getting so hot because the electronics inside are dirt cheap and waste more energy (as heat) than more expensive and efficient electronics would.
Thank GOD.. I thought I was the only one who didn't understand this. I'm so confused. CRAZY. Thank you.
@@frankfahrenheit9537 Actually, you're wrong too. The reason for heat loss isn't the quality of the electronics, but simply that changing the voltage of the wall outlet to only what's needed in the laptop generates heat. This is because energy cannot be created or destroyed, only transferred or transformed. Nothing to do with "cheap" electronics.
you are no electronics engineer, are you?
I am. Cheap means bad means hot.
I think Vox is starting to deteriorate into Buzzfeed with all these low quality videos. I appreciate the explanation, but it goes on quite a few different tangents and doesn't explain things as detailed as before. People want the answer to the question, not a plug towards environmental protection.
+Wayne Ho it heats up due to the induction-based transformer coil that steps down the power... it's the main heat producer
I understood that entirely before watching the video, and the gist of my comment was directed to the direction of the video. Thank you for your attempt of educating me though.
Wayne Ho you seem nice... don't be angry at Vox... the crazy internet people will throw a rant at you... YouTube comments is a strange place
+Wayne Ho
If Vox made a video about butter, they'd find some way to interject their particular flavor of political ideology into it............ along with some glaring technical inaccuracies, like the fact that it's the goddamn transformer that generates the heat, not the bridge rectifier.
+Bobby Newmark Ah ha! You can see the glaringly clear ideological inserts in irrelevant topics in these recent videos; I totally see your point.
2:07 "the US has one of the best most reliable power grids in the world"
Me in 2021: *Laughs in texas blackout caused by unreliable power grid*
Well some things require different voltage and amperage so having one conversion box doesn't work unless you either limit high power devices to low power or you bolster all low power devices to handle high power.
+SpacePak by Mike Ridolfi You are just wrong; what is with all of the wannabe electrical engineers in these comments? So let's start off with the AC power moving into your house. Generally your circuit breaker will be fed 240V at 15-20 amps. This is enough to power a lot of shit. So now let's convert that to DC power and disperse it accordingly around the house (no different than what they do with AC so far). You know your laptop charger? Throw it out, it no longer works. Your laptop charger is a power delivery and a converter, but we no longer need a converter. Do you think your phone is getting 100% of the current from the wall? No fucking way; it gets converted in a rectifier and then moves through a power delivery so that the DC current is actually usable and safe for the device to use. See, these power deliveries exist in every circuit board on earth to some fundamental degree. They contain MOSFETs, resistors and capacitors. Instead of having converters/rectifiers in every charging brick we would just have one big rectifier in the house, and each device would control the amount of DC current it needs. Which we are fundamentally doing already. So your argument is completely invalid.
TekSupport I don't buy your answer. How would the wall DC power know how many volts to supply? And no, I'm not an electrical engineer, I'm a mechanical engineer. I don't know much about EE, but I know enough to see holes in this method.
+TekSupport Spacepak is more correct than you are. Rectification is easy - it's DC to DC conversion that's hard and wastes power. Having a single rectifier in your house converting to some completely arbitrary and ultimately insufficient DC voltage is stupid - every device will still have to do the DC to DC conversion and that's where power is wasted. This video is just ridiculously wrong.
+Ted Middleton Changing to DC would at least solve the issue of impedance. And today DC to DC is not so much of a big deal anymore.
***** Yeah I get that, but still, there are a lot of different voltage required. Each device would need a step up or step down right?
I am 14 years old and watching this video literally made me dumber. I learned about AC and DC power a year ago, that was in grade 8.
yah, i feel stupid too and I'm old. almost 45 didn't understand anything. sad.
Lol
your 18 years old now
You're 19 now. See you when you're 25. They grow up so fast.
you're 19 now and do you understand yet?
I'll fight you
Nikola Tesla is better than Edison
No fight here hahah
Leafy Underwood thats just true
Leafy Underwood yeah Tesla was more innovative than Edison, that's kind of fact
FITE ME BICH
Leafy Underwood Edison is from shitty General Electric, right? That scammer?
In addition to all said, the pictured light bulb appears to be an LED light bulb... which would work better with DC in fact :D
I noticed that as well. :D This is not a top quality video..
It's pretty sad, because with this video, which is about something I guess many people wonder about, they could've got many more hits than the lousy ~300k, had it actually had some content :D
It has to work with DC, it has a rectifier circuit in the bulb
-"Why can't I just power my laptop with AC power?"
-"It works for stuff like hairdryers, but electronics are a little more complicated, so you will probably see a pop of blue smoke."
....................That moment when you want to do a triple facepalm, but can't react adequately to the level of fail because you only have two hands..............
I'm tempted to use "because electronics are complicated" as my go-to explanation for every question from now on. It it's good enough for Vox...
Well, judging by the title, I was expecting some scientific answer. Instead, we get a conversation over AC vs DC. A bit misleading, but still informative. Seems more like an environmentalism ploy than any kind of explanation.
I know something about science behind that. If you want more scientific answer then here it goes:
In most chargers you have a few elements that are helpful when converting AC to DC. The first one increases the frequency of the AC from 50 Hz (or 60 Hz) to a higher one (AC to AC conversion, the first place you get heat from). That allows everything after it to be smaller and therefore have less power loss.
After that there is a transformer that changes the voltage of the AC from 230V (±325V at peaks) to 5V or 12V or 20V AC. Voltage manipulation is very easy (cheap components) in AC and has high efficiency. However, many chargers are low quality, so the energy loss here can be higher.
The third element is the actual AC to DC converter. It consists of 4 diodes. Those diodes allow current to go in one direction easily, but not the other. Each diode outputs heat when current tries to go through it the wrong way (but since there are 4, most current goes somewhere else), and a bit of heat when going the right way.
The fourth element is a stabilizer. DC after those diodes goes in one direction, but not at a constant rate. The stabilizer fixes that using a big capacitor (to smooth down the "bumps") and another (different) diode to cut down excessive voltage (e.g. you want 5V, the capacitor gives 5.2V, the diode cuts it to about 5.1V). The capacitor has some energy losses, but the diode converts everything above what you want into heat.
And that is what happens in every charger inside your home. So the question now is - why don't we have one, expensive, but really efficient one? (answer: profit)
Having a DC power system in the house would still require changing voltage, but in DC it can be done more efficiently - you need one huge capacitor and one transistor. Say you have 24V DC and want 5V DC. What you do is charge the capacitor until it reaches 5.1V, then unplug it and wait until it goes down to 4.9V, then plug it in again, and so on. The transistor does that plug/unplug thing. Fast, easy, and very energy efficient.
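The plug/unplug scheme described above is essentially how a switching (buck) regulator works; real ones add an inductor to smooth the current. In the ideal case the average output is just the input times the fraction of time the switch is on. A minimal sketch with assumed example voltages:

```python
# In an ideal buck converter, the average output voltage equals the
# input voltage times the duty cycle (the fraction of time the switch
# is closed). Real converters add an inductor/capacitor filter.

def buck_duty_cycle(v_out, v_in):
    """Ideal buck converter duty cycle D = Vout / Vin."""
    return v_out / v_in

d = buck_duty_cycle(5.0, 24.0)
print(f"24 V -> 5 V needs a duty cycle of about {d:.1%}")
```

Because the switch is either fully on or fully off, almost no power is burned in it, which is the efficiency advantage over a linear regulator.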
As an electrical engineering student I was so disappointed in what they said, but that's because they didn't go into the math behind it... Lol
Devin Alexander
I hope I didn't screw up anything in my explanation.
These people make it sound like DC rocks. But in reality DC has tons of disadvantages which outweigh all the advantages of DC ten times over.
en.wikipedia.org/wiki/High-voltage_direct_current
"The US has one of the best, most reliable power grids in the world" 2:10
Texas: Am I a joke to you?
The funny thing is Texas runs on its own separate power grid
There's a reason electricity is transmitted at such high voltages. And that is to reduce power losses in transmission. The wires delivering electricity have some resistance. The power loss is proportional to the resistance and current. So, if the voltage is high, then the current can be less to deliver the same amount of power. So, if you just distribute 5V DC (for example) for the electronic devices, the losses would be enormous
Power loss in a transmission line is actually proportional to the current squared and to the transmission line resistance (or impedance in the case of AC).
P = I²R and P = I²Z
All of your ideas are right, I just felt like nit-picking :)
better explanation than this video
And there is also the fact that in AC it is easier to change voltage than in DC (unless you count capacitive converters, but those don't work so well for higher currents). So AC is made in the power plant, goes up, travels to you, goes down a few steps, and you have your 230V (or 110V).
However - would the losses really be so significant if we distributed, say, 24V to a whole building, or at least a flat?
+Miku MichDem about a hundred times?
24V 10A DC with 1mm-diameter copper wire would lose you 8% of the voltage per 5 meters (and obviously you need much more than 5 meters to wire a flat).
240V 1A AC with the same cable: 0.088%.
Not to mention that even with 24V you'd still need a power brick to step down to at least 19V for notebooks and 5V for USB-charging things.
niter43
Of course 24V will have more losses than 240V, but to get an accurate comparison the DC and AC voltages need to be the same. And keeping in mind the telegrapher's equations, DC has an advantage over AC.
The advantage would be in having one converter. The difference between an 80%+ and a 50% efficient power supply would overcome the wire length.
As for 24V, IMO that's a good voltage. It's easier to step down than to step up DC-DC.
And conversion is not a problem. The LM2596, for instance, can provide up to 3A between 3.2V and 35V, or the more expensive XL4005E1 up to 5A between 0.8V and 30V.
On the other hand, step-up is less powerful. The U3V12F12 (similar price range to the other two), for instance, can provide 1.4A at 12V - much less than the LM2596.
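The 8% and 0.088% figures quoted earlier in this thread can be reproduced with a short calculation. Assumptions: 1 mm diameter copper, a 5 m run counted out-and-back, and a typical room-temperature copper resistivity of 1.68e-8 Ω·m:

```python
# Checking the wire-drop numbers from the thread: a 5 m run (10 m of
# conductor including the return path), 1 mm diameter copper wire.
import math

RHO_COPPER = 1.68e-8  # ohm*m, typical value at room temperature

def drop_percent(volts, amps, length_m, diameter_m):
    """Voltage drop (as a percent of supply) over an out-and-back run."""
    area = math.pi * (diameter_m / 2) ** 2
    resistance = RHO_COPPER * (2 * length_m) / area  # out and back
    return 100 * amps * resistance / volts

print(f"24 V, 10 A: {drop_percent(24, 10, 5, 1e-3):.1f}% drop")
print(f"240 V, 1 A: {drop_percent(240, 1, 5, 1e-3):.3f}% drop")
```

The results land at roughly 8.9% and 0.089%, close to the figures niter43 gave, so the hundred-fold difference between the two scenarios checks out.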
AC or DC?
LETS USE ACDC!
IM ON THE CURRENT TO HELL!
Hold the L proudly
I hate you and your family for that comment
Poatatasium Poatatogen PO2 How about D4C? (It's a JoJo and musical reference, to a band that creates good music and an anime/manga that has Jesus Christ and a depressed cripple who uses Jesus Christ's corpse to kill the president of the United States)
Make Jojo Great Again 😐😫🔫
Conversion boxes don't just convert AC to DC, and that's not even the part that generates most of the heat; most of the heat is generated by converting the voltage and current to different levels.
I've heard that the American power grid is old and not reliable at all!
It depends on what part of the country you live in.
+Vladislav Dracula Okay then. How's the electricity grid in Transylvania, Drac?
LARSFSO I haven't installed a grid in Transylvania yet. Although it would be cool to impale people with electrified iron poles.
+Vladislav Dracula Sounds like a fun time! ^_^
It is not really energy efficient. The higher the voltage, the lower the losses are. In Europe we have 230V; in the US it's 110V. Furthermore, in Europe electricity is carried by 3 cables to lower the losses, and by only 2 in the US (three-phase current is more energy efficient than single-phase).
There is also the issue of high-voltage transmission, but I don't know what voltages they use in the US. In Europe, we have three different voltages (the current is converted twice before getting to your home).
2:07 You Americans sure are funny, always thinking you are the best at everything.
+1
+That Guy You're the best at fucking things up, that's for sure ;)
but the rest, not really...
+wrathofkawn
If you want facts, have a read here: www.ibtimes.com/aging-us-power-grid-blacks-out-more-any-other-developed-nation-1631086
BLACKS OUT MORE THAN ANY OTHER DEVELOPED NATION.
There's your facts.
+fickdich google Agreed. The only reason for America's claim to advancements is that America sucks up the world's best and brightest brains, or tries to... but they themselves lag behind the world in many things significantly... except for ignorance, which they excel in beyond any nation that has ever existed on the face of the Earth, combined ;_)... and other very displeasing things, but we haven't got time for that. I don't think the American readers would make it this far.
***** Some countries in Europe would put the American power grid to shame, so no.
Tesla wanted to make electricity free and wireless. Edison wanted to have more power stations so he could make more money.
+Joseph Ang maybe hypercheap electricity
Oh yeah, hypercheap electricity, where can I get this?
+Joseph Ang Edison was evil. He demonized Tesla's work (to the point where he killed an elephant to try to prove he was right) and even passed off others' inventions as his own.
No energy is free, but Tesla was still a good scientist.
Did she just say that the US has one of the best most reliable power grids on earth. Hah that's fucking ridiculous.
+Anders Bisi-Veerkamp why do you say that. so many people say america is a horrible country. its not. its one of the most advanced countries out there. i live there so i would know. so, whats your reason for saying that?
Yeah! The most reliable power grids are in China, Kenya, and India. Third world countries have the best infrastructure, unlike America, the richest country in the world!
Ovenchicken www.ibtimes.com/aging-us-power-grid-blacks-out-more-any-other-developed-nation-1631086
Ovenchicken You shouldn't compare countries like China, Kenya and India with the US; you should compare similarly wealthy countries like Japan, Germany and the Netherlands.
+Anders Bisi-Veerkamp You should come to India and China and take a look at the power grid... powering a billion people isn't easy. I agree they black out for an hour or two every day in villages and backward towns, but that's the challenge of powering a billion people. Also, the cities get 24/7, 365-days-a-year power. Countries like China and India might not have the most efficient power grids, but they are much more complex and robust than the US's.
How is converting all your power more efficient than just converting what you need?
How do people not know about AC/DC already?
i'm 14 and i actually didn't know this
XxAznDarknezZxX
I just got it!
+Cherry Berry you're the typical product of a US education. Pretty sad how little US high school and even college graduates know about the world.
+Juan Alvarez your so stupid
Werlin gonzalez If you're going to insult someone, maybe learn to spell first. Otherwise you end up looking like a dumbass.
A lot of the information in this video is false or at the very least misleading. Honestly, you should take it down or re-do it.
Misleading video; almost everything is wrong.
My favorite thing about this is that the lightbulb they show at 1 minute, and say handles AC just fine, is an LED model. Inside it there's a little step-down AC-DC converter, and it is probably the only device that actually presents a decent reason to have a centralized step-down converter: instead of tossing a converter each time your LED bulb goes out, you'd just chuck the bit that lights up.
yeaah dude..... pass the joint.
The full bridge rectifier doesn't generate heat, the voltage stepdown does.
A return to the ad hoc voiceovers, poor form.
Why, if I may ask?
Film News Report I'd say half of their videos have voicing that sounds too forced. This is one of those examples, although recently they've been sounding more casual which is better.
Vox : Why Your Laptop Charger Is So Hot
Me : bEkAuJ iT iSn't CoLd
Tesla > Edison
This video: "The US has one of the best, most reliable power grids in the world"
Texas in February 2021: 👁👄👁
Haha a video for infants on electricity.
get a real scientist next time
Why would having one conversion box for your house save energy? You are still converting, and losing the same amount, for devices that need DC, plus some more for lightbulbs that don't need DC. So wouldn't you at best break even (if you use nothing that runs on AC), or lose more energy when using AC appliances?
The converter you'd get would be a lot more efficient and you'd have two networks in your home, one AC and one DC.
Jack Majhand It would be a pain if you want to plug in a DC device but all your DC sockets are full while you have lots of free AC sockets.
Some modern bulbs use DC too, like LEDs. So it really wouldn't be that big of a hassle
***** but AC bulbs are much more common. I know all the bulbs at my house are ac
john li Yeah, but would you rather install special plugs and have a separate circuit just for AC bulbs, or just buy new DC bulbs and be done with it?
alternating current/direct current is my favorite band
Edison was a DC guy and Tesla was a Marvel guy.
AC: current flows in both directions, so if you plugged your laptop straight into AC it would charge and then discharge rapidly, with no net power gain.
DC: flows in one direction. Achieved by passing AC through a diode, which is essentially a one-way street for electrons: it allows current to flow one way but stops it from flowing back. The energy lost by blocking the returning current is converted to heat (or light, if you are using a light-emitting diode, i.e. an LED).
Since the video didn't explain this, I thought I would.
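The one-way-street behaviour the comment above describes can be sketched numerically. A minimal half-wave rectifier model in Python (idealized diode; the 50 Hz / 170 V-peak numbers are just illustrative):

```python
import math

# Idealized half-wave rectification: a perfect diode passes the positive
# half-cycle unchanged and blocks the negative one entirely.

def half_wave_rectify(samples):
    """Clamp negative half-cycles to zero, as an ideal diode would."""
    return [max(0.0, v) for v in samples]

# one full 50 Hz cycle (20 ms) sampled every millisecond, 170 V peak
ac = [170 * math.sin(2 * math.pi * 50 * t / 1000) for t in range(20)]
dc_pulses = half_wave_rectify(ac)

# reverse half-cycle is blocked; forward half-cycle gets through
print(min(dc_pulses), max(dc_pulses) > 0)
```

A real charger uses a full bridge of four diodes (plus smoothing capacitors) so the negative half-cycle is flipped rather than thrown away, but the one-way principle is the same.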
1:41 NO!!!! DC is more efficient at travelling long distances. AC is used because high voltages are more efficient at travelling long distances and AC can be converted to different voltages with simpler circuits than DC
Power = voltage × current: for the same power, the higher the voltage, the smaller the current, and vice versa.
Power loss = current² × resistance. The cables used for long-distance transmission have significant internal resistance, so the only way to lower the power loss is to use a small current.
So using a high voltage with a small current gives a low power loss, no matter whether it is AC or DC.
But the thing is, DC cannot be stepped up or down using induction, while AC can; therefore people use high-voltage AC for transmission, then step it down for us to use.
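The scaling the comment above describes can be put into numbers. A quick sketch (all figures are made up for illustration, not real grid data):

```python
# Why grids step the voltage up: delivering the same power at a higher
# voltage means a lower current, and line loss scales with the square of
# the current (P_loss = I^2 * R). All numbers below are made up.

def line_loss(power_w, voltage_v, resistance_ohm):
    """I^2 * R loss for delivering power_w at voltage_v over one line."""
    current = power_w / voltage_v          # I = P / V
    return current ** 2 * resistance_ohm   # P_loss = I^2 * R

POWER = 1_000_000   # deliver 1 MW (illustrative)
LINE_R = 10         # 10 ohm line resistance (illustrative)

loss_low = line_loss(POWER, 10_000, LINE_R)    # transmit at 10 kV
loss_high = line_loss(POWER, 400_000, LINE_R)  # transmit at 400 kV

print(f"10 kV:  {loss_low:,.0f} W lost")
print(f"400 kV: {loss_high:,.1f} W lost")
```

A 40× increase in voltage cuts the loss by a factor of 1600, which is why transmission lines run at hundreds of kilovolts.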
+sean shapuron If you use AC, you are sending some extra electricity back and forth in the line because there is always some non-zero phase difference. That is called reactive power. (see en.wikipedia.org/wiki/Volt-ampere_reactive)
+Felix Bade I see. Does that mean DC can travel a longer distance than AC (assuming the resistance and length of the transmission cable are the same and they are at the same voltage)?
If so, I'd like to ask how to step DC up/down with maximum efficiency, and why Edison failed the first time. His DC power grid had a severe voltage-drop problem.
+sean shapuron High-voltage DC is used in long-haul electric transmission. But in Edison's and Tesla's day the electronics needed simply hadn't been invented.
+Felix Bade Even today, it is still hard to switch high-voltage, high-current DC on and off. Switching AC on and off is far easier because the voltage passes through zero many times a second (it depends on where you live; in the US the grid runs at 60 Hz).
It gets hot because of Angus Young's guitar solos.......... "You've been... THUNDERSTRUCK! "
So then why can't our electronics be designed to charge on AC?
+Yuphrum AC rapidly alternates the direction of current flow in the wire, meaning the cells in the battery would receive positive and negative current in rapid succession. The charge you just put in would immediately be taken back out on the next half-cycle. The batteries we currently have will only charge on DC, and there isn't really a reason to try to fix what ain't broken at the moment.
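The no-net-charge point in the reply above can be shown numerically: over a full cycle, a sine wave averages to zero. An idealized sketch (real battery chemistry is ignored entirely):

```python
import math

# Over a full cycle the average of a sine wave is zero: whatever charge a
# pure AC waveform pushes in on one half-cycle, it pulls back out on the
# next. Idealized sketch; real battery chemistry is ignored entirely.

samples = [math.sin(2 * math.pi * t / 1000) for t in range(1000)]  # one cycle
net = sum(samples) / len(samples)  # average "charge transferred" per sample

print(abs(net) < 1e-9)  # effectively zero net transfer
```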
+Yuphrum 120 or 240 volts is too much for small electronic gadgets. You don't need that much voltage. Try connecting a small LED directly to even a 9-volt battery and you'll see that it will be dead in seconds.
+Yuphrum show me a battery which is working with AC?!?!?
+Martin Bohl I think you misread my comment. I was asking why it "wasn't" possible to charge electronics with AC, not saying that we "do" charge them with AC.
I understand what you have written, so my question was rhetorical. You can look for an AC battery, but you will not find one.
Additionally, I have to say: the DC vs AC thing is not like left-lane vs right-lane traffic,
where you could choose either for everything and it would work.
The electronic components set the requirements for what they need to work properly.
Just for example:
a capacitor in a DC circuit works like a battery, as a buffer;
in an AC circuit it acts as a low-pass filter.
You also can't just say "OK, WE DO DC now."
Yeah, and the thing is, even if you did, you would still often have to change the voltage, i.e. transform it again.
Open your computer and have a look at the power supply: there are many cables with big labels:
+3.3V; +5V; -5V; +12V; -12V (all DC)
You see, there is no perfect answer.
AC: good for transporting electricity over medium-range distances (a few hundred km).
The rest is just the adjustment that your device/thingy requires.
It's the flow of charge, not the flow of electrons. It's a pretty common misconception, but it's worth knowing the difference.
God the voice of the woman explaining is so annoying.
Really? The heat is the energy not being used, i.e. wasted. The same thing happened with incandescent light bulbs: instead of the energy producing light, most of it was converted to heat, which is not efficient. The energy that comes out of your wall is at too high a voltage for the laptop, so the box steps it down to whatever the laptop needs, and the losses in that step-down are the heat, the wasted energy.
This video could be explained more correctly by an Asian.
The correct answer is that AC beat DC because AC can be transformed: the voltage can be made larger or smaller. Higher voltages are easier to transmit over distance without too much loss of power. This follows from Ohm's law, which states:
The current flowing through a resistor at a constant temperature is directly proportional to the voltage across the resistor.
The conductors (cables) used on a high-tension power grid conduct electricity, but they still have resistance.
Voltage is the electrical equivalent of water pressure in a pipe, and current is the rate at which a quantity of water or electricity flows.
A conductor is to electricity what a pipe is to water.
A transformer is a pair of coils of copper wire that changes the voltage.
The atoms of the metal used in the wires (aluminium) are jostled by the passing current. As they jostle, they both pass the energy along and heat up, just like a person does on a run. What heats them is the current, so if we use a low current they get less hot while still passing the power along.
For a fixed amount of power, current is inversely proportional to voltage. So a higher voltage means a lower current, and a lower current means less power lost to conductor heating.
Power stations produce electricity at around 25,000V. It is then stepped up and sent through the National Grid cables at 400,000V or 275,000V, and stepped down to 132,000V in the local grid. This is then reduced again to domestic voltage, which is about 240V everywhere except the Americas, where it is about 110V.
Your little charger is doing exactly the same thing: it takes 240 or 110V and makes it 6, 12 or 24 volts. It also removes the alternation, using a coil of copper wire and some other electronic components like diodes, capacitors and resistors. This is called rectification: making the current flow in only one direction. That is really complicated to explain unless you have an engineering background.
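The step-down the comment above describes follows the ideal-transformer relations: the voltage scales with the turns ratio, and with losses ignored the power is conserved, so the current scales the opposite way. A sketch with illustrative winding counts (not from any real unit):

```python
# Ideal-transformer sketch: V_sec / V_pri = N_sec / N_pri, and with losses
# ignored the power is conserved, so the current scales inversely.
# Winding counts and voltages below are illustrative, not from a real unit.

def transform(v_primary, i_primary, n_primary, n_secondary):
    ratio = n_secondary / n_primary
    v_secondary = v_primary * ratio   # voltage scales with the turns ratio
    i_secondary = i_primary / ratio   # current scales the opposite way
    return v_secondary, i_secondary

# step 240 V down to about 12 V with a 20:1 winding ratio
v_out, i_out = transform(240, 0.5, 2000, 100)
print(v_out, i_out)  # roughly 12 V at 10 A; 120 W in, 120 W out
```

A real transformer is maybe 90-99% efficient depending on size; the shortfall is the heat the rest of the thread is arguing about.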
This is what happens when news platforms simplify information to the degree that it is completely wrong. Electrical engineering is a lot more complicated than just converting between AC and DC.
This is the most mind-numbing thing I have ever watched. My god, this is hilarious.
If we changed to DC, which makes no sense, you would need far thicker wires, and along with that you would be wasting more energy. The setup we have right now is just about perfect.
when this video made you realize how the band ac/dc got their name
I don't like the conclusion of this video: "If we can reduce the number of conversions from AC to DC power, then we're going to save a lot of energy."
What truly matters is the efficiency of the conversion. A MacBook charger's efficiency is rated at about 90%, meaning 10% is turned into heat and other losses...
If I have a whole-house AC-to-DC converter (known as a rectifier) that is also only 90% efficient, then I've simply moved the losses and heat somewhere else; the overall efficiency is the same.
However, one could invest in a very efficient rectifier for the house and see a net benefit. We have rectifiers in data centers that are 97%+ efficient, so they do exist, but they cost money that nobody is willing to pay, since the wasted electricity costs less than the rectifier plus an electrician to install it.
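The comment's 90%-vs-97% point can be made concrete with a back-of-envelope comparison (the 500 W household DC load is made up for illustration):

```python
# Back-of-envelope comparison using the figures from the comment above:
# scattered 90%-efficient adapters vs one shared 97%-efficient rectifier
# feeding the same DC load. The 500 W load is made up for illustration.

def waste_heat(load_w, efficiency):
    """Power drawn from the wall minus useful output = heat dissipated."""
    return load_w / efficiency - load_w

DC_LOAD = 500  # watts of DC devices in the house (illustrative)

heat_adapters = waste_heat(DC_LOAD, 0.90)  # many small wall warts
heat_shared = waste_heat(DC_LOAD, 0.97)    # one good central rectifier

print(f"adapters: {heat_adapters:.1f} W of heat, shared: {heat_shared:.1f} W")
```

So the saving comes from the better converter, not from centralizing per se; an equally inefficient central box would just relocate the heat.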
Stick a fork in an outlet. Feel the heat. And then your charger won't seem so hot and you'll learn a valuable lesson on electricity and heat.
I use it as a foot warmer... my room is cold lol
I’m no electrical engineer, but the comments section was far better at explaining this than the video. The heat generation, apparently, comes from the voltage step down from the 110-120v coming from a North American outlet to the 5-10v required for most mobile devices, not from the AC-DC conversion.
Thank you, commenters!
2:08 "The US has one of the best, most reliable power grids in the world"
Except Texas's
IKR
How many of you have accidentally shocked yourself on an outlet in your home? It might have hurt, but it wasn't life-threatening. That's because it was AC going into your body. If it was DC, like they are proposing here, that shock could have killed you. Sounds great, doesn't it?
You can touch 120v dc and barely feel it.
“That weird box” mate you don’t have to be an electrician to know it’s a power supply..
ikr
I added a heatsink to one side with two-sided heatsink tape, then sat it on a dual-USB fan so the other fan pulls air through the heatsink. 100 percent effective at keeping the charger box just above room temp.
This felt like watching a children's show. It gets hot from the transformer stepping your wall-outlet power down to the power your computer needs to charge.
People should also know that one of the reasons DC isn't used in power lines is that DC (at low voltage) can't travel long distances without heavy losses; AC, on the other hand, can be stepped up and travel farther. In layman's terms: imagine a long wire into which you pump DC power at one end. If you read the amount of power at the other end, you'll get a lower reading. Make the wire long enough and eventually you won't detect any power. But if you use high-voltage AC, you'll get more power out the other end.
Even before reading the comments I knew something sketchy was going on here; my literal reaction was "that barely answered the question". Now, reading the comments, I realize I had just scratched the surface.
1:00 "Works just fine for lightbulbs" shows an LED lightbulb that has a built-in power converter. Bravo.
This video really doesn't answer much about WHY your laptop charger gets hot, so I'll take it upon myself to explain.
The heat that radiates out from the charger is largely a result of electromagnetic losses when converting the mains voltage (either 110VAC or 240VAC) to low voltage (usually around 19v for laptops). Voltage is converted through Faraday's law which states that any change in the magnetic environment of a coil of wire will cause a voltage to be induced in the coil. Because AC power is constantly changing from positive to negative, its magnetic field is also constantly changing polarity. This being the case, when a coil of wire being charged with 240VAC is placed next to another coil of wire, the constantly changing magnetic field of the first coil causes a voltage to be induced into the second coil. The actual value of this secondary voltage is determined by a number of factors which the manufacturers set to produce the desired output voltage.
The heat losses in this process are mainly electromagnetic losses, namely hysteresis losses and eddy current losses.
Hysteresis loss comes from the lag the core material exhibits in changing its magnetic polarity, and the extra energy required to overcome that lag.
Eddy current losses occur when the changing magnetic field also induces a current into the conductive transformer core which, due to Lenz's law, opposes the main magnetic field of the transformer. If the transformer core were made of a single solid block of conductor, the eddy current losses would make the transformer extremely inefficient, which is why nearly all transformer cores are made of laminated steel stampings to reduce the size of the eddy currents.
Whew, hope that explained it.
"best most reliable power grid in the world"
*looks at Texas*
The heat is generated from the voltage stepping down from 115V/230V to ~15V, not from the rectification ("conversion") from AC to DC.
Having a single large DC rectifier in a house doesn't solve this problem. You still need different DC voltages for different devices, or even multiple different voltages for the same device. Computer power supplies take AC and convert it into 12V, 5V, 3.3V, -12V, etc. All that would do is move the problem out of the power brick and into another cabinet, while simultaneously increasing complexity.
R E S E A R C H
Ask your coworkers about it.
One single AC/DC converter would:
1: not save any power at all; instead of that couple of percent being dissipated by your laptop adapter, it would be dissipated by that converter.
2: not work at all; different devices use different DC voltages, and you can only easily step voltage up/down as AC (with a transformer).
When you accidentally summon every electrical engineer in the world
That adapter heating would not be a problem with iPhone 12. 🤣
Alternating current really does go "back and forth": the flow reverses direction, but it oscillates so quickly that it is unnoticeable to humans. Watch a video of a lamp in VERY slow motion and you will notice the light seems to flicker. The problems with the electricity from the wall are that 1. your computer needs a steady supply of power to keep running, not an oscillating one, and 2. the voltage from the wall is way too high for your computer, and the heat comes from the coils upon coils of wire whose whole job is to step it down to a low voltage, wasting some energy as heat along the way.
This evening I have had three beers, although this video has done more damage. Please refund 134 brain cells.
The comments on this video are more informative than the video itself.
I was taught that AC was safer than DC when it comes to wall plugs. AC is more likely to cause muscle spasms and throw you away from the live wires and plugs, whilst DC has a higher chance of making your muscles tense up and hold you there. So it's more a safety thing than a convenience one.
this happens when you first write the title and make the video later
Putting boxes that convert AC to DC at the house level would not reduce consumption.
Say your house draws 1 kW, and say 500 W of that is converted to DC to run your DC electronics. Putting in this box would just mean the full 1 kW gets converted.
"Why your laptop charger is so hot"
...I've never thought about it, It's just my sexy charger ._.
The moment she called the AC adapter "strange box," I knew we'd have a problem.
Electrician here, the best option for the highest efficiency is to increase the AC mains voltage as much as possible without insulation being a major issue (~1,000 VAC) then run a high efficiency (read expensive) DC rectifier that handles all DC equipment. For compatibility with most electronics without forcing electronics to standardize voltage (increasing price and decreasing efficiency) the DC rectifier would need to output 24, 12, 5 and 3.3 VDC but now you have 5 outlets on every power point (including 1k VAC).
It's debatable whether you would save that much energy, as you still need to do the conversion for the same amount of equipment in the home. So if you add up all those little boxes of heat, you just get one big heat source instead, at that main conversion box.
The other thing is you may end up losing energy (again, as heat) in the wires much more quickly than with AC. Remember, electrons flow like a river, but also bump into obstructions along the way, and it becomes worse as more gadgets are plugged in.
With AC the back-and-forth movement is absolutely minuscule, so less friction (and thus less energy lost as heat) occurs.
You are using DC power because when you send power into an electronic device it goes to the battery, which supplies DC to the device. AC is produced by generators and, as stated in the video, is better suited to travelling long distances.
That said, if you live in a cold country that requires heating, that wasted energy is helping warm your house. So it actually isn't going to waste
This video didn't really explain why. Your wall outlet (in the USA at least) puts out 120V AC, and your laptop needs about 19V DC. Inside that box is a transformer that steps down the voltage and steps up the current in order to power your computer. No conversion is perfectly efficient, so some of the energy ends up as heat.
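The energy budget in the comment above can be sketched with a few lines of arithmetic (illustrative numbers for a roughly 65 W class charger; the 90% efficiency figure is an assumption, not a measurement):

```python
# Energy budget of a laptop brick: stepping 120 V down to 19 V steps the
# current up, and everything short of 100% efficiency leaves the case as
# the heat you feel. All figures below are illustrative assumptions.

def brick(input_v, output_v, output_a, efficiency):
    p_out = output_v * output_a   # power delivered to the laptop
    p_in = p_out / efficiency     # power drawn from the wall
    input_a = p_in / input_v      # current drawn from the mains
    heat_w = p_in - p_out         # dissipated inside the brick
    return p_in, input_a, heat_w

p_in, i_in, heat = brick(120, 19, 3.42, 0.90)
print(f"draws {p_in:.1f} W ({i_in:.2f} A), sheds {heat:.1f} W as heat")
```

A few watts of continuous dissipation inside a small sealed plastic box is exactly why the brick feels warm to the touch.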
The electrons themselves aren't what's moving fast; they drift very slowly (electron drift) compared to the electrical signal, which propagates as a change in the field. Also, there was no elaboration on how energy would be saved with one big rectifier instead of a bunch of little ones; they may be right, but there is no explanation.
So why does my adapter get too hot? 😆 🤣
I hope the heat from the AC/DC Converter isn't as hot as the Highway to hell..
haha..ha..
When you are 4/5 of the way through the video and realize that it does not include any useful information...
me, watching this video: huh thats interesting i never knew that before
the comments: this video burned my crops and killed my father
This went from laptop charger to saving the worlds energy
This glosses over so many details, it is hard to point out all the issues.
AC does not "just work fine" with modern LED light bulbs either; they also rectify the current. For starters, the D in LED stands for diode. Converters are not just about AC/DC; the voltage matters too. Modern converters actually convert AC (60 Hz) to DC, then to AC at a much higher frequency, then back to DC at a safe low voltage.
OKAY, TO EXPLAIN IT: your PC's adapter converts AC to DC using electronics such as a full-bridge rectifier, and some energy is lost during the conversion. That energy becomes heat, and boom: there you go, a hot laptop charger.