Pretty neat animations, and the general gist of graphics cards is there- though I wonder if a collab with a computer tech geek would help make things more clear and understandable. On my Tim Talks Tech channel, this is a topic that I am going to get to at a later date (it’s a very new channel, and I’m still learning lots when it comes to media creation- but I’ve got some graphics cards I will use in a real tear down and will go farther into detail on parts and such)
@Zwenk Wiel eventually I plan on building my own desktop (or at least getting a prebuilt with upgradeability), but right now a laptop works better for me because 1. It's more convenient when I go back to college. 2. It takes up less space. 3. It's easier to move if I use it for apps like Skype.
@@nylesmith6563 Except that's not how most are oriented. GPU fans are pushing air through the heatsink, the main difference you see there is blower style fans which are still pushing air through the heatsink, but are more aimed at pushing the air through the shroud of the GPU and out the back of the case from there. As for your CPU example you just put the fan on a different side from normal, and even on the other side it's doing the same thing, forcing air through the heatsink and in most setups towards the rear exhaust of the case. There's no difference in performance whether it's pushing through the heatsink or pulling through the heatsink since you're dealing with the same stream of air. You just have a different physical reference point of where the air is being pushed/pulled from due to focusing on the fan. The air is forced through the heatsink in both cases. The air flow direction of the fan doesn't matter unless it's actually affecting where it's sourcing the air from. And as for the stock heatsink fans facing the sides of the case rather than front/back, you're only going to have an issue of "circulating hot air in the case endlessly" if you don't have any intake or exhaust fans.
Was looking for that. Also, the other components (the fins) are not for directing airflow; they're for storing heat and exposing a bigger surface to the airflow.
Hey folks! It's Falcon. We just put this out and I noticed around 4:20 (nice) I said "PS4," but it's actually PS3 that has separated RAM - PS4 has 8 GB of GDDR5 that the APU (a combined GPU/CPU architecture) can distribute however it needs to. It was a slip of the tongue and we wanted to make sure people understood the correct information. Thanks everyone, and hope you like the video!
As I remember it was 512 MB of total RAM, split into two separate 256 MB pools (256 MB of XDR for the CPU and 256 MB of GDDR3 for the GPU), and that split was a bottleneck. It also made data compression and programming for the 7 SPEs (kinda like cores, but not really; they broke up tasks such as audio, graphics, etc.) a HUGE PAIN!!! This is why the PS4 has no PS3 backwards compatibility. The system was designed on an outdated, overly (extremely) complicated proprietary IBM PowerPC-type architecture. In comparison, the Xbox 360's Xenon CPU was the first multi-core processor in a console (three cores with two hardware threads each, allowing 6 simultaneous threads) and it was relatively easier to program for, since the memory was never split and the architecture was rather straightforward. This is why multi-platform titles looked better on the Xbox 360 than the PS3; it wasn't worth the extra effort for some companies to deal with it. Thank god AMD got Microsoft and Sony to accept x86 as a viable option!!!! It's made cross-platform development for consoles and even PC a breeze.
@@rehmanarshad1848 never knew how things were with the ps3🤔
At my school I learned development using PlayStation dev kits newer than the PS3, and they prepared us by developing remotely on a Linux machine. The preparation was actually not bad, because the PlayStation seemed very similar to the type of Linux machines we developed for (the Linux machine also used shared memory and an APU).
Wtf do you mean by minimum 12GB of RAM? The minimum tends to be 8GB, and even then 12GB is a weird amount; it tends to be 8 or 16.
@@Glyn-Leine yeah, Sony builds all of their system firmware on BSD, which is also a Unix-like system. You could install Yellow Dog Linux on the PS3 before it was patched out. I always smile when seeing people using CFW PS3 systems and installing pkg files like on any other Linux distro. It warms my heart, even though it's not legal😂😂😂. Using the system the way it was meant to be used.
@@Glyn-Leine What do PS dev kits look like? Are they large and bulky, given that they may have additional hardware for quick and easy resource monitoring and debugging? Or is it just a regular 'unlocked' PS4 with root access and no restrictions? I'm just curious.
My rtx 2080ti is made of my entire bank balance.
Lmao
😂
Took me 4 years of saving money.
I would argue that the balance in your account is just gone, your account is now unbalanced :-P
I convinced my company that I needed it to work on Allen Bradley PLCs so it wouldn't affect my balance ;)
This was more informative and fun to watch than any other video about computers I've watched.
Please do more of these explaining what different components of computers do!
I keep my computer’s side cover off so I can stick my feet in there to keep them warm. No need to wast all that heat.
Creative way to keep your gpu...
*under pressure*
Ah, I see you're a man of culture as well
Your feet didn't get electrocuted?
West*
Its big brain time
Voltage Regulator = Regulates Voltage. Way too much information.
you may benefit from some additional RAM
Andrew Jones yeah, he should probably download more RAM to be able to process that much information.
@@hemlimchhay288 you can't download RAM, it's hardware not software...
@@daemoniumvenator4155 r/wooosh
Wow thats useful
Blood, that's what's inside.
IT IS ALIVE!!!!!!!!!
Actually, your computer runs on smoke, when you see it escape, that's when it will stop working.
Due to the cost of new cards you may not be wrong on that one.
Kristopher Kartchner I don’t mean to sound like a smart ass but it’s actually vapor.
White blood
“We electrocuted rocks into doing math”
I've always wondered how they were able to do this
@@whenthethebeansstrikeback6728 computing is torturing rocks into hard work
Perfect description of computers!
My 980ti is composed of my blood, sweat, and tears... and soon my 2080ti will be composed of my kidney and perhaps a small portion of my liver. LOL
And if you over clock it... A lot of 🔥🔥💥
wait till the rtx super cards come out ma dude~!
ima keep the likes at 69..nice
if you are still on a budget I recommend buying a GTX 1080 Ti today, in 2020
@@braixenfirefox1119 I might. Waiting to see the prices on the 30 series. If I sell my eyes I might just be able to afford them. hehehe. Cheers mate
The GeForce FX 5800 is powered by the screams of banshees.
Someone watches teen wolf?? Xd
Ah yes, my favorite car in GTA.
The computer-in-a-computer analogy is great. In essence a normal computer is mostly concerned with handling many different things sequentially, whereas a GPU is optimised for parallelisation (doing the same thing many times at once) and therefore has many cores that all do the same thing at the same time. This is why GPUs are actually used in many sciences for non-graphical parallel computing.
The reason this parallelisation works for graphics is that the general process of real-time rendering has been relatively standardised for a long time. Renderers always go through the same couple of steps, each of which is designed to allow many independent parallel calculations. The minimum is: 1. transform every vertex according to the camera, 2. rasterise to see what is "behind" each pixel, 3. calculate the lighting for each pixel to get its colour. In the end it becomes possible to apply lighting (even shadows) to every pixel/light-source combination without knowing the rest of the scene, so the lighting can be entirely parallelised, with each pixel's calculation independent of the others.
For a long time engine developers could barely even program the GPU. The pipeline was simply preset and they could only change some parameters around. These days GPUs are a lot more like CPUs and allow much more customised code.
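To make step 3 concrete: here's a hypothetical NumPy toy (not real GPU code; the function name and shapes are made up for illustration) showing why per-pixel lighting parallelises so well: each pixel's shade depends only on its own normal and the light direction, with no data shared between pixels.

```python
import numpy as np

def lambert_shade(normals, light_dir, albedo=0.8):
    """Diffuse (Lambert) lighting, one value per pixel.
    normals:   (H, W, 3) unit surface normals, one per pixel.
    light_dir: (3,) unit vector pointing toward the light.
    Returns an (H, W) brightness map in [0, albedo]."""
    # Per-pixel dot product of normal and light direction; every pixel
    # is computed independently, which is what a GPU exploits.
    intensity = np.einsum('hwc,c->hw', normals, light_dir)
    # Surfaces facing away from the light are simply dark, not negative.
    return albedo * np.clip(intensity, 0.0, None)

# Toy 2x2 "image" whose normals all face the light head-on:
normals = np.zeros((2, 2, 3))
normals[..., 2] = 1.0                 # all normals point along +z
light = np.array([0.0, 0.0, 1.0])     # light also along +z
print(lambert_shade(normals, light))  # every pixel -> 0.8
```

The vectorised call stands in for "one GPU core per pixel": the math per pixel is trivial, there's just a lot of it.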
Back when GPU's were just a pcb and a heatsink
Elias Auret the glory days of PC gaming: when 3DFX was king
Mine is just a PCB with a heatsink...
@ᴋᴇᴢ, wow! Really? What gpu man? I'm honestly curious.
@@brandongonzales9687 the hope hxg-1 potentially future computing power
geforce fx 5500
Your thumbnail makes me think of the pictures of the back of massive server walls, with red cables everywhere. Like The Shining but made of bloody spaghetti.
That's a fun description :P
my 980ti is filled with the tears of orphans
You still stuck with 980 ti lmao
Hmmm...
@@haar637
It's a good card for 1080p still. Any higher though.. eh not really. Could do 1440p if you have a very good CPU.
@@Pand0rasAct0r_ He just thinks a bigger number means better lol
@@thoup
Probably lol. I mean, an OC'd GTX 980 Ti can beat a stock GTX 1070, in some games even with a lot more fps.
It's still a good card even today.
I myself however have the EVGA GTX 1070 SC. That one still does amazingly with my Ryzen 5 2600X at 1440p
Title: What’s inside your graphics card
Me: wHaT aRE YoU tALKinG AbOuT I doN’t HaVe a GraPhiCs caRD
Me too
@@BEASTgamingYT and ur name is *beast* gaming
m e m o r i e s he is using integrated graphics(Please don’t wooosh me)
Not even a Radeon RX 550? You broke asf boi
@@ghostify8515 I mean igpu
This is the most.. casual educational video I ever saw. Didn't know I needed this. The narrator's voice is great.
That Thumbnail is freaking dope!
This guy: what's inside your gpu?
Me: *cries in Intel's integrated graphics*
Intel hd 4000
@@toaster3715 3000*
Intel potato inside
intel inside potato inside.
Lol same intel hd 630
That thumbnail reminds me of mortal kombat
Fatality
oof
oof
In correct spelling its combat
@@hasanravat Don't be disrespectful to the Mortal Kombat franchise!
The correct spelling of "its combat" should be "it's combat". Furthermore, sentences end with a period.
*Sees video titled "What's inside your graphics card"*
Me: "I dunno, graphics?"
Voltage doesn't flow. Current flows. That's like saying when you run your faucet, the pressure is flowing. No.
@ACAB\\ Mela BAKAta No sorry -- then say "power" or "energy". If you can bring up voltage then you can describe it correctly. This is a video called "what's inside your graphics card?"; it's supposed to be technical. Voltage and current are 5th grade science concepts that everyone should know. If it's "easier" for you to not understand how electricity works, then you probably shouldn't be watching this video.
@@NillKitty I agree that if you're going to specifically mention something in a video, it could at least be accurate - Especially on the fundamental points. It is somewhat of a noble pet peeve. But I feel that the context supersedes that - the whole video was topical, general, and (as stated in the video) simple, which I believe is the opposite of technical. It was not meant to be a class and at no point was anyone taking notes on the EXACT way any portion of the graphics card works. Pretty sure the cartoons and phony, under-detailed diagrams gave it away for everyone else already.
You made a fine correction, just lighten up a bit? It's all you're really lacking here. "Probably shouldn't be watching this video?" You should "probably get off your high horse." I'm not an electrician but I still play computer games. Now that I've watched the video I'm more interested in learning the specifics later on, so I say this cartoon video accomplished what it was meant to. ¯\_(ツ)_/¯ Voltage correction ✓ Interest in GPU ✓ Senseless snobbery ✕. ;D
Nill you sound very annoying
yeah the potential difference, but you get it, don't you
As long as the point comes across it doesn't really matter. As long as the other person understands what you meant, communication was successful.
I think all Gameranx videos should be animated.
The hours of work to do that :v is massive
Hmm. I want more animated videos, but not sure if all is a good idea.
Having never touched a gaming PC, this is extremely intriguing and informative. Thanks!
well this is only the surface, explained very briefly. Technically every PC comes with a GPU, but some processors have one built in.
and when you start looking into the actual architectural differences between CPUs and GPUs, and using that to explain how certain workloads can be easier to offload to the GPU or the CPU... the rabbit hole is pretty deep😅
before you know it you're comparing GPUs on their VRM layouts and capacitor quality XD and finding ways to bypass OCP on LN2 XD
(Edit: don't let this scare you, rather let it encourage you! Hardware is awesome! And you don't need to know everything to be able to do anything with it; just learning more is fun! ^-^)
@@Glyn-Leine Okay Steve Jobs sorryyyyyy.......
@@Glyn-Leine Wow, you are smart!
@@ClassicRevive sorry..? why?😅
@@L16htW4rr10r nah the people teaching me are XD
I always assumed magical creatures.
What are you doing here
@@Alexandra8620 commenting apparently
okay now i understand
@@Alexandra8620 Definitely commenting.
Be careful assuming nowadays...
As someone who is trying to learn about computers from very little previous knowledge, please make more of these... it’s very helpful
Gameranx: Whats your gpu?
Me: RX 550 Series
Gameranx: What is inside?
Me: Cough..Cough... we don't talk about that
Mine is a rx 550 too :D
Handy for a 300w psu lmao
👀
@Shreshta Jaiswal rip lol
hope you get a better computer someday
Loving the new animation!
I saw this somewhere once: imagine two classrooms, one with 1000 kids doing lots of simple maths like addition, subtraction, etc., and the other room with 4 or 6 maths teachers with PhDs solving complex tasks like calculus and algebra. The first room is a GPU and the second a CPU, in a nutshell.
I just love the WORLD OF PCs. ❤️🔥
I do remember some article on the internet telling the reason why GPUs tend to have a thousand cores or even more, while CPUs just have 4, 6 or 8.
It gives the analogy of painting something really big: imagine you have the task of painting a football-pitch-sized picture in a limited amount of time. Would you have 4 very professional painters, or 500 average painters?
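The painter analogy can be sketched in code. This is just a hypothetical Python/NumPy toy (the function names are made up, and vectorisation is only standing in for the GPU's many simple workers, not literally GPU execution):

```python
import numpy as np

def paint_sequential(pitch):
    """One 'professional' painter: one stroke (element) at a time."""
    out = np.empty_like(pitch)
    for i in range(pitch.size):        # one pixel per loop iteration
        out.flat[i] = pitch.flat[i] * 2 + 1
    return out

def paint_parallel(pitch):
    """Many 'average' painters: every pixel painted in one vectorised go."""
    return pitch * 2 + 1

pitch = np.arange(12).reshape(3, 4).astype(float)
# Same picture either way; the difference is how the work is divided.
assert np.array_equal(paint_sequential(pitch), paint_parallel(pitch))
```

Each individual operation is trivial (the "average painter"); the win comes entirely from doing thousands of them at once.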
Nah, I don't like this analogy. There are many cores in a GPU, that is right, but they are all very specialised. A CPU core, on the other hand, can do universal calculations.
Everyone needs to see this, there is no video actually explaining what any parts REALLY do and what they're meant for (sorta), definitely should do more of these videos! Keep up the good work!
i'm pretty versed in PC components and how they function but I really enjoyed the diagrams and bits of art
its filled with powerful organs, and its more stronger than human
And your grammar is weaker than a corpse
@@yandhistillnotout3957 that's harsh asf 🤣
@@yandhistillnotout3957 r/rareinsults
Any chance you could do another informative video like this for a motherboard or a CPU? I think that would really be great. I learned more here than I learned in 4 years of a technical high school. Anyways, loved the video 😀
Ishak Smajic what do you mean by “technical high school”?
@@ShroomBois_Inc there is no correct way to translate it; it is basically focused on computer science
Ishak Smajic computer science is more about software than hardware
@@ShroomBois_Inc Excellent point, my mistake, poor choice of words. My school is 95% about hardware 5% software. Computer science was the closest thing I could think of.
Ishak Smajic damn I’ve never heard about a school like that before. So what kind of stuff did you learn at that high school then? I’d assume how to build a computer but I feel like that shouldn’t take four years to learn lol
Wow thanks for the great video. I love working with computers thanks 😊
What a perfect design representation, great job even on small components and zooms
Man your way of explanation is great I have never seen any one explaining a concept this clear....
You would have already known this if you would have watched the verge's PC building guide
Make sure to put thermal paste in your CPU socket for optimal temperatures
I opened my graphic card and I found a dwarf, an elf and two Chinese children.
Taste like a candy bar
I feel a great aura glowing from this and I dont like it
😂
Are- are you telling me to taste like a candy bar?
props to the animator, lovee the style
Hey guys this one is great, I'm working as IT in Norway and this video is perfect to show the less knowledgeable. I'm gonna show this to one of the students we have interning and that's gonna be a great way to learn. Keep up the great work!!
The Thumbnail contains "Graphic" content!!!
C O M E D Y
no u!
Stolen Comment
Pretty simply and nicely explained. Great video! :^)
*3:15** - ANTHEM ... ERROR !!!*
lol
so true
Thanks guys I understood it more than understanding on book. Animation was amazing!!!!👍
that was a surprisingly good explanation, i find it hard to put into words when trying to explain processors and memory to people, but this was worded really well
My graphics card has some dust and some cables
cables?
Weird right?
I actually learned something from this, thanks
I just looked this up: at 5:40 you said, and the graphic shows, that the fans pull air through the heatsink, but what I found said that most of the time video card fans blow air from inside the case onto the heatsink. Good video though; always like to see others and myself learn things I didn't know.
It depends on the card: blower cards, which push air out, are louder, but the other kind pulls.
@@johnschwalb All graphics cards blow air through the heat sink.
Impressive animations and great explanation! Thumb up!
How did I miss this two years ago when it released...lol
Also, most current graphics cards use _GDDR5_ not -GDDR6-
Mine uses GDDR6 am I cool now
New ones use ddr6
Most yeah GDDR5 and GDDR5X
All this talk about the GPU passing data to the computer, when in reality it's the other way around and you didn't even mention HOW the graphics card produces the video signal?
GPU cores such as CUDA cores and the cores in a CPU shouldn't be compared; that's a bit misleading and incorrect. Also, the cooling on GPUs doesn't expel the heat the way you showed, it's reversed: it sucks up cooler air and blows it onto the heatsink and across the PCB (if that specific card doesn't have a thermal plate between the heatsink and PCB). And VRAM isn't quite comparable to system memory; they behave completely differently.
That's for a blower card.
Eh, still gets the point across. And I think the CUDA/CPU comparison was fine.
The CUDA/CPU comparison is technically correct; they're cores in the end, though GPUs do more parallel compute vs CPUs, which do more "singular" compute
I think it was good enough of a description, remember this isn't LTT, it's the best explanation for the average person.
@@randomsomeguy156 Actually no, this has been debated by Tech Jesus long ago; Haz-Matt is right.
Who ever is doing this animation is definitely a KEEPER!
Nice do more of these videos man.
Does that airflow chart bother anyone else but me?
Cynep yes, the animation is poor, but so is the information. It gives the bare minimum and does a bad job
I know it's counter-intuitive, but in some applications the fans can better displace heat by pushing air away from the unit while pulling cool air through the heat sink. I believe that's how all modern CPU and GPU fans are oriented.
For my CPU, for example: If looking at my PC from the left side, with (
Downloading more RGB= PROTEIN
Graphic Card caught fire on VRM. Then Artifact damage on VRAM.
The animation is amazing. And very informative. Thank you
dude i m a embedded system designer and developer and i am at the 60 sec now and it's already so nice. Thank you
He : What's inside your graphic card?
Me: So what exactly is a graphic card?
Just bought a GTX 1660! I can't wait to play RE7 with it.
You guys need to do more of these.
Man i just fkin love the way falcon talks, great stuff.
Best video i have ever seen about tech
Falcon: "This is falcon"
Me: "no you're an owl!"
The PCB is also known as a wafer? I don't think so..
Thank you! That made me cringe so hard.
@@AjayBrahmakshatriya watch out for the bread boards then xD
It actually is. In Indian colleges we call a pcb a wafer in our labs. It's just fun to call it that.
It is nicknamed a wafer. They explained it perfectly. The actual name is a substrate and the nickname is a wafer. It's a common term
*Just call it VRAM.* -Falcon 2019
The illustrations are a great way to explain! It's appealing
Thank you so much! This is so much easier to understand vs just explaining it. I would love to see more videos like these.
...magical graphics elves?
Yesyes
Did Falcon fall in a tub of molten lead?
"....voltage flows towards the...." I hate to be 'that guy', but voltage doesn't flow, current does :> great video
voltage has propagation time as well though, doesn't it?^^
@@johnuferbach9166 Well, it depends on how you are looking at it. Voltage is an instantaneous measurement. You will also hear it referred to as "voltage difference". Everything has a voltage, even the ground; we more or less just call the ground 0 volts. When we measure voltage, it is always with reference to something else. If nothing else is explicitly stated, it is usually assumed to be ground.
Current can be thought of as the "flow of voltage" in a sense. When two objects with different voltage potentials come into contact, the "voltage will flow" from the object with the higher voltage to the object with the lower voltage. This "voltage flow" is current, hence the name (current flows). The greater the voltage difference between the objects, the greater the current flow.
I understand that this seems like semantics, and seems picky. And it probably doesn't matter to say "voltage flow" if it helps others get a better understanding of the world... BUT... if the goal is to be accurate, then current flows, and the voltage difference dictates the amount and direction of the current flow. :>
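The relationship described above is just Ohm's law (I = V / R): the voltage difference across a conductor dictates how much current flows through it, and in which direction. A quick sketch with made-up values:

```python
def current_amps(v_high, v_low, resistance_ohms):
    """Current flowing from the higher-voltage point to the lower one (Ohm's law)."""
    return (v_high - v_low) / resistance_ohms

# Two points at 12 V and 0 V (ground) joined by a 4-ohm resistor:
print(current_amps(12.0, 0.0, 4.0))   # 3.0 A
# Same resistor, smaller voltage difference -> less current:
print(current_amps(5.0, 0.0, 4.0))    # 1.25 A
# Swap the terminals and the sign flips: the current direction reverses.
print(current_amps(0.0, 12.0, 4.0))   # -3.0 A
```

So "voltage flows" is shorthand at best: the voltage difference is the cause, the current is the thing that actually flows.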
Nicely Done!
:-)
I love the whimsical illustrations - esp the Insulating Materials for the PCB - ROFL :-)
I love this type of video. I would love to see a video on a computer as a whole
gameranx: What's inside your graphics card?
gameranx: *Draws an animation instead of showing an actual one in real life.*
Motherboard: Bones
CPU: Brain RIGHT SIDE
Graphics Card: Brain LEFT SIDE
RAM: Brains Activator
Thermaltake: Hearth
Energy: Food
Human: Life
My hearth is literally shaking and crying right now.
Earth*
hotel: trivago
What if a graphics card had a graphics card? 🤔
Hmm... effect
It's called SLI
This is the first video that I've seen from you... I must say it's so fucking well animated!
Pretty neat animations, and the general gist of graphics cards is there- though I wonder if a collab with a computer tech geek would help make things more clear and understandable. On my Tim Talks Tech channel, this is a topic that I am going to get to at a later date (it’s a very new channel, and I’m still learning lots when it comes to media creation- but I’ve got some graphics cards I will use in a real tear down and will go farther into detail on parts and such)
Im a simple guy..
Good gpu.. Good games..
If Gameranx responds to this I’ll buy a damn graphics card boi
Better buy two RTX 2080 TI.
@@jamesisaac7684 RTX isn't worth it
I'm getting a laptop with a GTX 1060 probably.
@@jamesisaac7684 I have two GTX 1080s and I just want to tell you, SLI isn't what it used to be. It's not a great boost in most things.
@Zwenk Wiel eventually I plan on building my own desktop. (Or at least getting a prebuilt with upgradeability) but right now a laptop works better for me because
1. It's more convenient when I go back to college.
2. Takes up Less space.
3. Easier to move if I use it for apps like Skype.
inside is a mini oven
We need more videos like this man great job
I could listen to Falcon talk about anything. His voice is so satisfying lol
5:50 aren't the fans blowing air into the heatsink/card?
I know it's counter-intuitive, but in some applications the fans can better displace heat by pushing air away from the unit, while pulling cool air through the heat sink. I believe that's how all modern CPU and GPU fans are oriented.
For my CPU, for example: If looking at my PC from the left side, with (
@@nylesmith6563 You're talking about case fans and case airflow, that's another thing.
@@nylesmith6563 Except that's not how most are oriented. GPU fans are pushing air through the heatsink, the main difference you see there is blower style fans which are still pushing air through the heatsink, but are more aimed at pushing the air through the shroud of the GPU and out the back of the case from there. As for your CPU example you just put the fan on a different side from normal, and even on the other side it's doing the same thing, forcing air through the heatsink and in most setups towards the rear exhaust of the case. There's no difference in performance whether it's pushing through the heatsink or pulling through the heatsink since you're dealing with the same stream of air. You just have a different physical reference point of where the air is being pushed/pulled from due to focusing on the fan. The air is forced through the heatsink in both cases. The air flow direction of the fan doesn't matter unless it's actually affecting where it's sourcing the air from.
And as for the stock heatsink fans facing the sides of the case rather than front/back, you're only going to have an issue of "circulating hot air in the case endlessly" if you don't have any intake or exhaust fans.
The board looks like an HD 5750 to me
Have one as an HTPC GPU :)
to me it looks like a shitty mock-up of the rev 1070s and 1080s
Joke's on you! I don't have a graphics card.... I sold it
I have a laptop. But my actual garden grown potatoes outperform it in minesweeper.
*clears throat as though preparing to sing* oh yeah, yeah
@@factsandstuff2832 well, facts and stuff!?!!
@@lorenzvo5284 well what?
@@factsandstuff2832 I just liked that your name lined up so well with what you were saying.
Late to the party but this is cool! You should do more of this stuff! 😀
I thought this would be more detailed .... but hey! ... I loved the simplicity and method of narration!
WTF WAS THAT FALCON....?!?!
No hate just holy shit I wasn't expecting that
so are you surprised by his face? how rude
Gameranx has been bought by Disney, probably.
My GTX 1080 consists of one entire 20-hour work week.
You paid $1200 for a 1080?
@@----.__ ngl i fucked up my math there
Drew Nai lol
My mom is inside
It's always interesting to see how people describe these things. Noob or not, whatever you are, you're gonna find these kinds of videos interesting.
I love the animation. It makes it look easy to understand
the only wrong thing is the airflow of the graphics card :| you never pull air through it, unless you like to incinerate your fans.
was looking for that. also, the other components (the fins) are not for airflow direction; they're for absorbing heat and exposing it to the airflow over a bigger surface area
Had to scroll way too far down for this
passively cooled gpu team where u at
My goddamn life is inside
Relatable
What an awesome animation! Nicely done! :D
i love the new animations !