What's Inside Your GRAPHICS CARD?

  • Published: Sep 8, 2024
  • Use a computer? Game on a PC? Ever wonder how those graphics get so pretty? Let's go inside your high-end graphics card with this animation.

Comments • 2K

  • @gameranxTV
    @gameranxTV 5 years ago +1957

    Hey folks! It's Falcon. We just put this out and I noticed around 4:20 (nice) I said "PS4," but it's actually the PS3 that has separated RAM - the PS4 has 8GB of GDDR5 that the APU (combined GPU/CPU architecture) can distribute however it needs to. It was a slip of the tongue and we wanted to make sure people understood the correct information. Thanks everyone, and hope you like the video!

    • @rehmanarshad1848
      @rehmanarshad1848 5 years ago +42

      As I remember it was 512 MB of total RAM, split into two separate 256 MB memory pools, and this was a bottleneck. It also made data compression and programming for the 7 SPEs (kind of like cores, but not really; they broke up tasks such as audio, graphics, etc.) a HUGE PAIN!!! This is why the PS4 has no PS3 backwards compatibility. The system was designed on an outdated, overly (extremely) complicated proprietary IBM PowerPC-type architecture. In comparison, the Xbox 360's Xenon CPU was a triple-core processor (each core ran two hardware threads, allowing 6 simultaneous threads), and it was relatively easier to program for since the memory was never split and the architecture was rather straightforward. This is why multi-platform titles looked better on the Xbox 360 than on the PS3 - it wasn't worth the extra effort for some companies to deal with it. Thank god AMD got Microsoft and Sony to accept x86 as a viable option!!!! It's made cross-platform development for consoles and even PC a breeze.
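The split-pool bottleneck described in the comment above can be sketched as a toy allocator (an illustrative sketch only: pool sizes in MB, function names invented for this example):

```python
# Toy illustration of why a split memory layout (PS3-style: 256 MB system
# + 256 MB video) is harder to use than a unified pool (PS4-style: one big
# pool the APU divides as needed). Sizes are in MB; purely didactic.

def fits_split(cpu_need, gpu_need, cpu_pool=256, gpu_pool=256):
    # Each workload must fit in its own fixed pool.
    return cpu_need <= cpu_pool and gpu_need <= gpu_pool

def fits_unified(cpu_need, gpu_need, pool=8192):
    # Only the total matters; the split is decided at runtime.
    return cpu_need + gpu_need <= pool

# A texture-heavy scene: light CPU data, lots of video memory.
print(fits_split(100, 300))    # False: the GPU side overflows its 256 MB half
print(fits_unified(100, 300))  # True: 400 MB total fits easily
```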

    • @Glyn-Leine
      @Glyn-Leine 5 years ago +7

      @@rehmanarshad1848 Never knew how things were with the PS3 🤔
      At my school I learned development using PlayStation dev kits newer than the PS3, and they prepared us by developing remotely on a Linux machine. The preparation was actually not bad, since the PlayStation seemed very similar to the type of Linux machines we used to develop for (the Linux machine also used shared memory and an APU).

    • @gramthelamb
      @gramthelamb 5 years ago +21

      Wtf did you mean by a minimum of 12GB of RAM? The minimum tends to be 8GB, and even then 12GB of RAM is a weird amount - it tends to be 8 or 16.

    • @rehmanarshad1848
      @rehmanarshad1848 5 years ago

      @@Glyn-Leine Yeah, Sony builds all of their system firmware on BSD, which is also a Unix-based system. You could install Yellow Dog Linux on the PS3 before it was later patched out. I always smile when seeing people using CFW PS3 systems and installing pkg files like any other Linux distro. It warms my heart, even though it's not legal😂😂😂. Using the system the way it was meant to be used.

    • @rehmanarshad1848
      @rehmanarshad1848 5 years ago +1

      @@Glyn-Leine What do PS dev kits look like? Are they large and bulky, given that they may have additional hardware for quick and easy resource monitoring and debugging, or is it just a regular 'unlocked' PS4 with root access and no restrictions? I'm just curious.

  • @ashutoshkumardwivedi253
    @ashutoshkumardwivedi253 5 years ago +2572

    My rtx 2080ti is made of my entire bank balance.

    • @da14a49
      @da14a49 5 years ago +23

      Lmao

    • @FalseNihil
      @FalseNihil 5 years ago +9

      😂

    • @ashutoshkumardwivedi253
      @ashutoshkumardwivedi253 5 years ago +54

      Took me 4 years of saving money.

    • @duramirez
      @duramirez 5 years ago +35

      I would argue that the balance in your account is just gone, your account is now unbalanced :-P

    • @wheelerthree
      @wheelerthree 5 years ago +5

      I convinced my company that I needed it to work on Allen Bradley PLCs so it wouldn't affect my balance ;)

  • @Anthomnia
    @Anthomnia 5 years ago +124

    This was more informative and fun to watch than any other video about computers I've watched.
    Please do more of these explaining what different components of computers do!

  • @KatimaGaming
    @KatimaGaming 5 years ago +662

    My 980ti is composed of my blood, sweat, and tears... and soon my 2080ti will be composed of my kidney and perhaps a small portion of my liver. LOL

    • @SytanOfficial
      @SytanOfficial 5 years ago +10

      And if you overclock it... A lot of 🔥🔥💥

    • @Quaaylude
      @Quaaylude 5 years ago +7

      wait till the rtx super cards come out ma dude~!

    • @yannys7594
      @yannys7594 5 years ago +1

      ima keep the likes at 69..nice

    • @braixenfirefox1119
      @braixenfirefox1119 4 years ago +7

      If you are still on a budget, I recommend buying a GTX 1080 Ti today in 2020

    • @KatimaGaming
      @KatimaGaming 4 years ago +1

      @@braixenfirefox1119 I might. Waiting to see the prices on the 30 series. If I sell my eyes I might just be able to afford them. hehehe. Cheers mate

  • @tumblevveed3586
    @tumblevveed3586 5 years ago +377

    I keep my computer’s side cover off so I can stick my feet in there to keep them warm. No need to wast all that heat.

    • @djninjitsuchannel7857
      @djninjitsuchannel7857 4 years ago +67

      Creative way to keep your gpu...
      *under pressure*

    • @peerie
      @peerie 4 years ago +15

      Ah, I see you're a man of culture as well

    • @firz76
      @firz76 4 years ago +2

      Your feet didn't get electrocuted?

    • @adityarajkhowalama
      @adityarajkhowalama 4 years ago +2

      West*

    • @redpanda7071
      @redpanda7071 4 years ago +1

      Its big brain time

  • @ItsYaBoyDanny
    @ItsYaBoyDanny 5 years ago +198

    “We electrocuted rocks into doing math”

  • @kodiakllama3049
    @kodiakllama3049 5 years ago +471

    Title: What’s inside your graphics card
    Me: wHaT aRE YoU tALKinG AbOuT I doN’t HaVe a GraPhiCs caRD

    • @BEASTgamingYT
      @BEASTgamingYT 4 years ago +6

      Me too

    • @IPlayKindred
      @IPlayKindred 4 years ago +15

      @@BEASTgamingYT and ur name is *beast* gaming

    • @landofrabbits
      @landofrabbits 4 years ago +3

      m e m o r i e s he is using integrated graphics(Please don’t wooosh me)

    • @cryo_life
      @cryo_life 4 years ago

      Not even a Radeon RX 550? You broke asf boi

    • @david_4344
      @david_4344 4 years ago

      @@ghostify8515 I mean igpu

  • @lordofchimichangas2302
    @lordofchimichangas2302 5 years ago +1969

    Voltage Regulator = Regulates Voltage. Way too much information.

    • @loveless-savage
      @loveless-savage 5 years ago +33

      you may benefit from some additional RAM

    • @hemlimchhay288
      @hemlimchhay288 5 years ago +24

      Andrew Jones yeah, he should probably download more RAM to be able to process that much information.

    • @daemoniumvenator4155
      @daemoniumvenator4155 5 years ago +7

      @@hemlimchhay288 You can't download RAM, it's hardware not software...

    • @dreadfulman5191
      @dreadfulman5191 5 years ago +28

      @@daemoniumvenator4155 r/wooosh

    • @araigumakiruno
      @araigumakiruno 5 years ago +4

      Wow, that's useful

  • @AntiHamster500
    @AntiHamster500 5 years ago +354

    The GeForce FX 5800 is powered by the screams of banshees.

  • @T33K3SS3LCH3N
    @T33K3SS3LCH3N 5 years ago +92

    The computer-in-a-computer analogy is great. In essence, a normal computer is mostly concerned with handling many different things sequentially, whereas a GPU is optimised for parallelisation (doing the same thing many times at once) and therefore has many cores that all do the same thing at the same time. This is why GPUs are actually used in many sciences for non-graphical, parallel-computing purposes.
    The reason this parallelisation works for graphics is that the general process of real-time rendering has been relatively standardised for a long time. It always goes through the same couple of steps, each of which is designed to allow many independent parallel calculations. The minimum is: 1. deform every vertex according to the camera, 2. put a raster on top to see what is "behind" each pixel, 3. calculate the lighting for each pixel to get the colour. In the end it becomes possible to apply even shadows to every combination of pixel/light source without knowing the rest of the world, so the lighting can be entirely parallelised, with each pixel calculation being independent of the others.
    For a long time engine developers could barely even program the GPU. The process was simply preset and they could only change some parameters around. These days GPUs are a lot more like CPUs and allow much more customised code.
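The "same thing many times at once" idea above can be sketched in plain Python: each pixel's lighting below depends only on that pixel, so the loop could run on thousands of GPU cores at once (a toy Lambert-style shade; function and variable names are invented for illustration):

```python
# Toy per-pixel lighting: brightness = max(0, n . l), computed independently
# for every pixel. Because no pixel reads another pixel's result, the work
# is "embarrassingly parallel" -- exactly what a GPU's many cores exploit.

def shade(normal, light=(0.0, 0.0, 1.0)):
    dot = sum(n * l for n, l in zip(normal, light))
    return max(0.0, dot)

# A tiny 2x2 "image" of surface normals (already normalized).
normals = [
    (0.0, 0.0, 1.0),   # facing the light -> full brightness
    (1.0, 0.0, 0.0),   # perpendicular   -> dark
    (0.0, 0.0, -1.0),  # facing away     -> clamped to 0
    (0.6, 0.0, 0.8),   # tilted          -> partial
]

# map() over independent pixels; on a GPU this would be one thread per pixel.
image = list(map(shade, normals))
print(image)  # [1.0, 0.0, 0.0, 0.8]
```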

  • @BraxtonKovary
    @BraxtonKovary 5 years ago +1186

    Blood, that's what's inside.

    • @tardelius5778
      @tardelius5778 5 years ago +19

      IT IS ALIVE!!!!!!!!!

    • @kristopherkartchner1856
      @kristopherkartchner1856 5 years ago +53

      Actually, your computer runs on smoke; when you see it escape, that's when it will stop working.

    • @marine606
      @marine606 5 years ago +2

      Due to the cost of new cards you may not be wrong on that one.

    • @renegadebond6268
      @renegadebond6268 5 years ago +2

      Kristopher Kartchner I don’t mean to sound like a smart ass but it’s actually vapor.

    • @lovestoned6139
      @lovestoned6139 5 years ago +1

      White blood

  • @Anophis
    @Anophis 5 years ago +327

    Your thumbnail makes me think of the pictures of the back of massive server walls, with red cables everywhere. Like The Shining but made of bloody spaghetti.

  • @roowut
    @roowut 5 years ago +206

    Back when GPUs were just a PCB and a heatsink

    • @disappointednep-nep2430
      @disappointednep-nep2430 5 years ago +2

      Elias Auret the glory days of PC gaming: when 3DFX was king

    • @brandongonzales9687
      @brandongonzales9687 5 years ago +4

      Mine is just a PCB with a heatsink...

    • @aseshbasu4816
      @aseshbasu4816 5 years ago

      @ᴋᴇᴢ, wow! Really? What gpu man? I'm honestly curious.

    • @nichsa8984
      @nichsa8984 4 years ago

      @@brandongonzales9687 the hope hxg-1 potentially future computing power

    • @alexspata
      @alexspata 4 years ago

      geforce fx 5500

  • @greko4849
    @greko4849 4 years ago +134

    This guy: what's inside your gpu?
    Me: *cries in Intel's integrated graphics*

  • @marbleswan6664
    @marbleswan6664 5 years ago +76

    *Sees video titled "What's inside your graphics card"*
    Me: "I dunno, graphics?"

  • @BigBismark
    @BigBismark 5 years ago +802

    my 980ti is filled with the tears of orphans

    • @haar637
      @haar637 5 years ago +25

      You're still stuck with a 980 Ti lmao

    • @Tetoredux
      @Tetoredux 5 years ago +4

      Hmmm...

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ 5 years ago +72

      @@haar637
      It's a good card for 1080p still. Any higher though.. eh not really. Could do 1440p if you have a very good CPU.

    • @thoup
      @thoup 5 years ago +75

      @@Pand0rasAct0r_ He just thinks a bigger number means better lol

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ 5 years ago +42

      @@thoup
      Probably lol. I mean, an OC'd GTX 980 Ti can beat a stock GTX 1070, in some games even with a lot more FPS.
      It's still a good card even today.
      I myself however have the EVGA GTX 1070 SC. That one still does amazingly with my Ryzen 5 2600X at 1440p.

  • @elmntz1435
    @elmntz1435 4 years ago +42

    Gameranx: What's your GPU?
    Me: RX 550 Series
    Gameranx: What is inside?
    Me: Cough... cough... we don't talk about that

    • @CosmoBubblegum
      @CosmoBubblegum 4 years ago +2

      Mine is a rx 550 too :D
      Handy for a 300w psu lmao

    • @Ciniqs
      @Ciniqs 4 years ago +2

      👀

    • @CosmoBubblegum
      @CosmoBubblegum 4 years ago +1

      @Shreshta Jaiswal rip lol
      hope you get a better computer someday

  • @TheKillerham5ter
    @TheKillerham5ter 5 years ago +51

    That Thumbnail is freaking dope!

  • @NillKitty
    @NillKitty 5 years ago +213

    Voltage doesn't flow. Current flows. That's like saying when you run your faucet, the pressure is flowing. No.

    • @NillKitty
      @NillKitty 5 years ago +9

      @ACAB\\ Mela BAKAta No, sorry - then say "power" or "energy". If you can bring up voltage then you can describe it correctly. This is a video called "what's inside your graphics card?"; it's supposed to be technical. Voltage and current are 5th grade science concepts that everyone should know. If it's "easier" for you to not understand how electricity works, then you probably shouldn't be watching this video.

    • @nylesmith6563
      @nylesmith6563 5 years ago +9

      ​@@NillKitty I agree that if you're going to specifically mention something in a video, it could at least be accurate - Especially on the fundamental points. It is somewhat of a noble pet peeve. But I feel that the context supersedes that - the whole video was topical, general, and (as stated in the video) simple, which I believe is the opposite of technical. It was not meant to be a class and at no point was anyone taking notes on the EXACT way any portion of the graphics card works. Pretty sure the cartoons and phony, under-detailed diagrams gave it away for everyone else already.
      You made a fine correction, just lighten up a bit? It's all you're really lacking here. "Probably shouldn't be watching this video?" You should "probably get off your high horse." I'm not an electrician but I still play computer games. Now that I've watched the video I'm more interested in learning the specifics later on, so I say this cartoon video accomplished what it was meant to. ¯\_(ツ)_/¯ Voltage correction ✓ Interest in GPU ✓ Senseless snobbery ✕. ;D

    • @m.s.a.s9194
      @m.s.a.s9194 5 years ago +8

      Nill you sound very annoying

    • @qalbi-s_Ahnfy2095
      @qalbi-s_Ahnfy2095 5 years ago +2

      yeah the potential difference, but you get it, don't you

    • @rubababdullah2540
      @rubababdullah2540 5 years ago +1

      As long as the point comes across it doesn't really matter. As long as the other person understands what you meant, communication was successful.

  • @omarabdelkadereldarir7458
    @omarabdelkadereldarir7458 4 years ago +3

    This is the most.. casual educational video I ever saw. Didn't know I needed this. The narrator's voice is great.

  • @thewanderer5506
    @thewanderer5506 5 years ago +151

    Having never touched a gaming PC, this is extremely intriguing and informative. Thanks!

    • @Glyn-Leine
      @Glyn-Leine 5 years ago +19

      Well, this is only the surface explained very briefly. Technically every PC comes with a GPU, but some processors have them built in.
      And when you start looking into the actual architectural differences between CPUs and GPUs, and using that to explain how certain workloads are easier to offload to the GPU or CPU... the rabbit hole is pretty deep😅
      Before you know it you're comparing GPUs on their VRM layouts and capacitor quality XD and finding ways to bypass OCP on LN2 XD
      (Edit: don't let this scare you, rather let it encourage you! Hardware is awesome! And you don't need to know everything to be able to do anything with it; just learning more is fun! ^-^)

    • @ClassicRevive
      @ClassicRevive 5 years ago +3

      @@Glyn-Leine Okay Steve Jobs sorryyyyyy.......

    • @L16htW4rr10r
      @L16htW4rr10r 5 years ago +2

      @@Glyn-Leine Wow, you are smart!

    • @Glyn-Leine
      @Glyn-Leine 5 years ago +4

      @@ClassicRevive sorry..? why?😅

    • @Glyn-Leine
      @Glyn-Leine 5 years ago +7

      @@L16htW4rr10r nah the people teaching me are XD

  • @josephkeen7224
    @josephkeen7224 5 years ago +112

    I think all Gameranx videos should be animated.

    • @yourallygod8261
      @yourallygod8261 5 years ago +5

      The hours of work to do that :v is massive

    • @factsandstuff2832
      @factsandstuff2832 5 years ago +3

      Hmm. I want more animated videos, but not sure if all is a good idea.

  • @Ali107
    @Ali107 5 years ago +390

    That thumbnail reminds me of Mortal Kombat

    • @Rayan-kf2yf
      @Rayan-kf2yf 5 years ago +3

      Fatality

    • @stepanuttendorfsky9710
      @stepanuttendorfsky9710 5 years ago

      oof

    • @kimfattyiii5911
      @kimfattyiii5911 5 years ago

      oof

    • @hasanravat
      @hasanravat 5 years ago

      In correct spelling its combat

    • @chromefirefox8896
      @chromefirefox8896 5 years ago +1

      @@hasanravat Don't be disrespectful to the Mortal Kombat franchise!
      The correct spelling of "its combat" should be "it's combat". Furthermore, sentences end with a period.

  • @Tyretes
    @Tyretes 5 years ago +28

    It's filled with powerful organs, and it's stronger than a human

  • @baldwinivofjerusalem47
    @baldwinivofjerusalem47 5 years ago +8

    I saw this somewhere once: imagine two classrooms, one with 1000s of kids doing lots of simple maths like addition, subtraction, etc., and the other room with 4 or 6 maths teachers with PhDs solving complex tasks like calculus and algebra. The first room is a GPU and the second a CPU, in a nutshell.
    I just love the WORLD OF PCs. ❤️🔥

  • @TheOrangeSunsetGaming
    @TheOrangeSunsetGaming 5 years ago +152

    I always assumed magical creatures.

  • @Engiction
    @Engiction 5 years ago +7

    I remember some article on the internet explaining why GPUs tend to have a thousand cores or even more, where CPUs have just 4, 6 or 8.
    It gives the analogy of painting something really big: imagine you have the task of painting a football-pitch-sized picture in a limited amount of time - would you rather have 4 very professional painters or 500 average painters?
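The painter analogy above reduces to simple throughput arithmetic; a rough sketch (all numbers and names invented for illustration):

```python
# Throughput sketch of the painter analogy: a few fast workers vs many slow
# ones. For a huge job that splits perfectly, total throughput
# (workers * speed) is what matters -- the GPU's 500 "average painters" win.

def time_to_finish(area, workers, speed_per_worker):
    # Perfectly parallel job: total speed is workers * speed_per_worker.
    return area / (workers * speed_per_worker)

area = 100_000  # "square meters" of pitch to paint

cpu_like = time_to_finish(area, workers=4,   speed_per_worker=50)  # 4 pros
gpu_like = time_to_finish(area, workers=500, speed_per_worker=5)   # 500 average

print(cpu_like)  # 500.0 time units
print(gpu_like)  # 40.0 time units -- far faster despite slower workers
```

The flip side (which the reply below this comment gestures at) is that a job which cannot be split stays slow on the 500 slow painters, which is why CPUs keep their few fast cores.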

    • @stencilman5030
      @stencilman5030 5 years ago +1

      Nah, I don't like this analogy. There are many cores in a GPU, that is right, but they are all very specialised. On the other hand, a CPU can do general-purpose calculations.

  • @gyoptic
    @gyoptic 5 years ago +15

    You would have already known this if you had watched The Verge's PC building guide

    • @thugsonjugs1810
      @thugsonjugs1810 5 years ago +2

      Make sure to put thermal paste in your CPU socket for optimal temperatures

  • @senalagatta5300
    @senalagatta5300 5 years ago +6

    I'm pretty well versed in PC components and how they function, but I really enjoyed the diagrams and bits of art

  • @ArjunGupta-vd7ze
    @ArjunGupta-vd7ze 5 years ago +32

    The Thumbnail contains "Graphic" content!!!

  • @ishaksmajic418
    @ishaksmajic418 5 years ago +42

    Any chance you could do another informative video like this for a motherboard or a CPU? I think that would really be great. I learned more here than I learned in 4 years of a technical high school. Anyways, loved the video 😀

    • @ShroomBois_Inc
      @ShroomBois_Inc 5 years ago +1

      Ishak Smajic what do you mean by “technical high school”?

    • @ishaksmajic418
      @ishaksmajic418 5 years ago +1

      @@ShroomBois_Inc There is no correct way to translate it; it is basically focused on computer science

    • @ShroomBois_Inc
      @ShroomBois_Inc 5 years ago +2

      Ishak Smajic computer science is more about software than hardware

    • @ishaksmajic418
      @ishaksmajic418 5 years ago +1

      @@ShroomBois_Inc Excellent point, my mistake, poor choice of words. My school is 95% about hardware, 5% software. Computer science was the closest thing I could think of.

    • @ShroomBois_Inc
      @ShroomBois_Inc 5 years ago +2

      Ishak Smajic damn I’ve never heard about a school like that before. So what kind of stuff did you learn at that high school then? I’d assume how to build a computer but I feel like that shouldn’t take four years to learn lol

  • @RatedMforMatt-ure
    @RatedMforMatt-ure 5 years ago +152

    GPU cores such as CUDA cores and the cores in a CPU shouldn't be compared; that's a bit misleading and incorrect. Also, the cooling on GPUs doesn't expel the heat the way you showed - it's reversed: it sucks up cooler air and blows it onto the heatsink and across the PCB (if that specific card doesn't have a thermal plate between the heatsink and PCB). And VRAM isn't quite comparable to system memory; they behave completely differently.

    • @johnschwalb
      @johnschwalb 5 years ago +8

      That's for a blower card.

    • @TheTombot
      @TheTombot 5 years ago +10

      Eh, still gets the point across. And I think the CUDA/CPU comparison was fine.

    • @randomsomeguy156
      @randomsomeguy156 5 years ago +8

      The CUDA/CPU comparison is technically correct; they're cores in the end, though GPUs do more parallel compute vs CPUs, which do more "singular" compute

    • @rehmanarshad1848
      @rehmanarshad1848 5 years ago +10

      I think it was good enough of a description, remember this isn't LTT, it's the best explanation for the average person.

    • @MarceloTezza
      @MarceloTezza 5 years ago +4

      @@randomsomeguy156 Actually no, this has been debated by Tech Jesus long ago; Haz-Matt is right.

  • @vlweb3d
    @vlweb3d 5 years ago +12

    *3:15 - ANTHEM ... ERROR !!!*
    lol
    so true

  • @akarshkarmahapatra4901
    @akarshkarmahapatra4901 4 years ago +4

    He: What's inside your graphics card?
    Me: So what exactly is a graphics card?

  • @nockieboy
    @nockieboy 5 years ago +9

    All this talk about the GPU passing data to the computer, when in reality it's the other way around - and you didn't even mention HOW the graphics card produces the video signal?

  • @KnightSlasher
    @KnightSlasher 5 years ago +59

    Taste like a candy bar

    • @phantomphool
      @phantomphool 5 years ago

      I feel a great aura glowing from this and I don't like it

    • @PudWhacker
      @PudWhacker 5 years ago

      😂

    • @h20wizard57
      @h20wizard57 5 years ago

      Are- are you telling me to taste like a candy bar?

  • @mbrutout
    @mbrutout 5 years ago +20

    Loving the new animation!

  • @romanbonifield2404
    @romanbonifield2404 5 years ago +9

    Wow thanks for the great video. I love working with computers thanks 😊

  • @sumalx
    @sumalx 3 years ago +3

    I opened my graphics card and I found a dwarf, an elf and two Chinese children.

  • @ODZ2174
    @ODZ2174 5 years ago +6

    Falcon: "This is falcon"
    Me: "no you're an owl!"

  • @snazzysalamander1572
    @snazzysalamander1572 5 years ago +4

    I just looked this up: at 5:40 you said (and the graphic shows) that the fans pull air through the heatsink, but what I found said that most of the time video card fans blow air from inside the case onto the heatsink. Good video though - always like to see others and myself learn things I didn't know.

    • @johnschwalb
      @johnschwalb 5 years ago

      It depends on the card; blower cards, which push air out, are louder, but the other kind pulls

    • @itIsI988
      @itIsI988 5 years ago +1

      @@johnschwalb All graphics cards blow air through the heat sink.

  • @peter-op1lj
    @peter-op1lj 5 years ago +16

    My graphics card has some dust and some cables

  • @holo6883
    @holo6883 5 years ago +11

    Downloading more RGB= PROTEIN

  • @DhMrfuun
    @DhMrfuun 5 years ago +2

    So this means you have a nano computer inside a mini computer. *COMPUTERCEPTION*

  • @markcollins5901
    @markcollins5901 5 years ago +1

    A PCB is not called a "wafer." On the other hand, integrated circuits, such as the GPU die itself, are etched on a silicon wafer (substrate) before being cut (from all the other dies on the same wafer) and packaged. The package is then soldered onto the video card's PCB. The point here is that the silicon wafer is a completely different thing from the PCB.

  • @craftnut
    @craftnut 5 years ago +11

    Also, most current graphics cards use GDDR5, not GDDR6

    • @Gortosan
      @Gortosan 5 years ago +4

      Mine uses GDDR6 am I cool now

    • @matt5898
      @matt5898 5 years ago +3

      New ones use GDDR6

    • @MMA_BEASTHUB
      @MMA_BEASTHUB 5 years ago

      Most, yeah - GDDR5 and GDDR5X

  • @adekxii
    @adekxii 5 years ago +21

    The PCB is also known as a wafer? I don't think so..

    • @AjayBrahmakshatriya
      @AjayBrahmakshatriya 5 years ago +6

      Thank you! That made me cringe so hard.

    • @em0_tion
      @em0_tion 5 years ago

      @@AjayBrahmakshatriya watch out for the bread boards then xD

    • @aseshbasu4816
      @aseshbasu4816 5 years ago +1

      It actually is. In Indian colleges we call a PCB a wafer in our labs. It's just fun to call it that.

    • @mememaster8368
      @mememaster8368 3 years ago

      It is nicknamed a wafer. They explained it perfectly. The actual name is a substrate and the nickname is a wafer. It's a common term.

  • @Im_The_Slep
    @Im_The_Slep 5 years ago +10

    Does that airflow chart bother anyone else but me?

    • @tonnentonie2767
      @tonnentonie2767 5 years ago +2

      Cynep yes, the animation is poor, but so is the information. It gives the bare minimum and does a bad job

    • @nylesmith6563
      @nylesmith6563 5 years ago

      I know it's counter-intuitive, but in some applications the fans can better displace heat by pushing air away from the unit, while pulling cool through the heat sink. I believe that's how all modern CPU and GPU fans are oriented.
      For my CPU, for example: If looking at my PC from the left side, with (

  • @solaire7046
    @solaire7046 4 years ago +1

    Props to the animator, love the style

  • @c1nnami
    @c1nnami 5 years ago

    Everyone needs to see this - there is no other video actually explaining what any of the parts REALLY do and what they're meant for (sorta). Definitely should do more of these videos! Keep up the good work!

  • @clericofchaos1
    @clericofchaos1 5 years ago +15

    ...magical graphics elves?

  • @NOELQUEZON
    @NOELQUEZON 5 years ago +8

    Graphics card caught fire at the VRM. Then artifact damage from the VRAM.

  • @HollowexOkay
    @HollowexOkay 5 years ago +5

    Pretty simply and nicely explained. Great video! :^)

  • @sahilchourasiya4555
    @sahilchourasiya4555 4 years ago

    Man, your way of explaining is great. I have never seen anyone explain a concept this clearly....

  • @mattcarlson6047
    @mattcarlson6047 5 years ago

    As someone who is trying to learn about computers from very little previous knowledge, please make more of these... it’s very helpful

  • @MetalLunar
    @MetalLunar 5 years ago +3

    Just bought a GTX 1660! I can't wait to play RE7 with it.

  • @noyomi_hikari
    @noyomi_hikari 5 years ago +5

    *Just call it VRAM.* -Falcon 2019

  • @JakesYourUncle
    @JakesYourUncle 5 years ago +3

    I actually learned something from this, thanks

  • @gdat5838
    @gdat5838 4 years ago +2

    "What's inside your graphics card?"
    Oh, a PCI interpreter which funnels into the task on-load memory, which feeds the GPU, which smashes that data with render data, which then throws it into a compiler, which then encodes it for your display at the proper v&h sync rates + resolution....
    Oh, a couple hundred million times a second, too..

  • @t.k9203
    @t.k9203 5 years ago

    Hey guys, this one is great. I work in IT in Norway, and this video is perfect to show the less knowledgeable. I'm gonna show this to one of the students we have interning, and that's gonna be a great way to learn. Keep up the great work!!

  • @HyperionZero
    @HyperionZero 5 years ago +5

    gameranx: What's inside your graphics card?
    gameranx: *Draws an animation instead of showing an actual one in real life.*

  • @PaszerDye
    @PaszerDye 5 years ago +5

    You guys need to do more of these.

  • @nothanks9989
    @nothanks9989 5 years ago +26

    Motherboard: Bones
    CPU: Brain RIGHT SIDE
    Graphics Card: Brain LEFT SIDE
    RAM: Brain's Activator
    Thermaltake: Heart
    Energy: Food
    Human: Life

  • @indylockheart3082
    @indylockheart3082 3 years ago +1

    How did I miss this two years ago when it released...lol

  • @tealckree1240
    @tealckree1240 5 years ago +2

    Nice, do more of these videos, man.

  • @logancapes
    @logancapes 5 years ago +3

    "....voltage flows towards the...." I hate to be 'that guy', but voltage doesn't flow, current does :> great video

    • @johnuferbach9166
      @johnuferbach9166 5 years ago

      voltage has propagation time as well though, doesn't it?^^

    • @logancapes
      @logancapes 5 years ago

      @@johnuferbach9166 Well, it depends on how you are looking at it. Voltage is an instantaneous measurement. You will also hear it referred to as "voltage difference". Everything has a voltage, even the ground. We more or less just call the ground 0 volts. When we measure voltage, it is always with reference to something else. If nothing else is explicitly stated, then it is usually assumed to be ground.
      Current can be thought of as the "flow of voltage" in a sense. When 2 objects with different voltage potentials come into contact, then the "voltage will flow" from the object with a higher voltage to the object with lower voltage. This "voltage flow" is current, hence the name (current flows). The greater the voltage difference between the objects, the greater the current flow.
      I understand that this seems like semantics, and seems picky. And it probably doesn't matter to say "voltage flow" if it helps others get a better understanding of the world... BUT... If the goal is to be accurate, then current flows and voltage difference dictates the amount and direction of the current flow. :>
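The distinction explained in this reply can be pinned down with Ohm's law: current is what flows, and its size and direction are set by the voltage difference. A minimal sketch (function name invented for illustration):

```python
# Ohm's law sketch: current (what actually flows) is driven by the
# voltage *difference* across a resistance: I = (V_a - V_b) / R.

def current(v_a, v_b, resistance):
    # Positive result: conventional current flows from a to b.
    return (v_a - v_b) / resistance

# 12 V rail to ground (0 V) through 6 ohms -> 2 A of current.
print(current(12.0, 0.0, 6.0))  # 2.0

# Equal potentials -> no voltage difference -> nothing flows.
print(current(5.0, 5.0, 6.0))   # 0.0
```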

  • @ThePredator21
    @ThePredator21 5 years ago +5

    5:50 Aren't the fans blowing air into the heatsink/card?

    • @nylesmith6563
      @nylesmith6563 5 years ago

      I know it's counter-intuitive, but in some applications the fans can better displace heat by pushing air away from the unit, while pulling cool through the heat sink. I believe that's how all modern CPU and GPU fans are oriented.
      For my CPU, for example: If looking at my PC from the left side, with (

    • @croft-tom1631
      @croft-tom1631 4 years ago

      @@nylesmith6563 You're talking about case fans and case airflow, that's another thing.

    • @OfficialMageHD
      @OfficialMageHD 4 years ago

      @@nylesmith6563 Except that's not how most are oriented. GPU fans are pushing air through the heatsink, the main difference you see there is blower style fans which are still pushing air through the heatsink, but are more aimed at pushing the air through the shroud of the GPU and out the back of the case from there. As for your CPU example you just put the fan on a different side from normal, and even on the other side it's doing the same thing, forcing air through the heatsink and in most setups towards the rear exhaust of the case. There's no difference in performance whether it's pushing through the heatsink or pulling through the heatsink since you're dealing with the same stream of air. You just have a different physical reference point of where the air is being pushed/pulled from due to focusing on the fan. The air is forced through the heatsink in both cases. The air flow direction of the fan doesn't matter unless it's actually affecting where it's sourcing the air from.
      And as for the stock heatsink fans facing the sides of the case rather than front/back, you're only going to have an issue of "circulating hot air in the case endlessly" if you don't have any intake or exhaust fans.

  • @ABRBD
    @ABRBD 5 years ago +9

    What if a graphics card had a graphics card? 🤔

  • @Bloody_DE
    @Bloody_DE 5 years ago +1

    This is the first video that I've seen from you... I must say, it's so fucking well animated!

  • @furkanayas3339
    @furkanayas3339 4 years ago

    Dude, I'm an embedded systems designer and developer, I'm only 60 seconds in, and it's already so nice. Thank you.

  • @buildawall5803
    @buildawall5803 5 years ago +3

    Did Falcon fall into a tub of molten lead?

  • @greenspittgames7374
    @greenspittgames7374 5 years ago +19

    If Gameranx responds to this I’ll buy a damn graphics card boi

    • @jamesisaac7684
      @jamesisaac7684 5 years ago +2

      Better buy two RTX 2080 Tis.

    • @Vinylkk
      @Vinylkk 5 years ago +2

      @@jamesisaac7684 RTX isn't worth it

    • @factsandstuff2832
      @factsandstuff2832 5 years ago

      I'm getting a laptop with a GTX 1060 probably.

    • @johnschwalb
      @johnschwalb 5 years ago +1

      @@jamesisaac7684 I have two GTX 1080s and I just want to tell you, SLI isn't what it used to be. It's not a great boost in most things.

    • @factsandstuff2832
      @factsandstuff2832 5 years ago

      @Zwenk Wiel eventually I plan on building my own desktop (or at least getting a prebuilt with upgradeability), but right now a laptop works better for me because:
      1. It's more convenient when I go back to college.
      2. It takes up less space.
      3. It's easier to move if I use it for apps like Skype.

  • @histrigaming7344
    @histrigaming7344 5 years ago +4

    The board looks like an HD 5750 to me.
    I have one as an HTPC GPU :)

    • @ethanwiebe6483
      @ethanwiebe6483 5 years ago

      To me it looks like a shitty mock-up of the rev 1070s and 1080s.

  • @dzuul
    @dzuul 4 years ago +1

    Random information: you can't use 2 different GPUs at the same time.
    Me: GPU IS GPU

  • @grokwhy
    @grokwhy 5 years ago

    A circuit board is not made from a wafer. The GPU and other ICs are made from a wafer, which is sliced from a silicon ingot. The PCB is made of stacked layers of metal and insulator; the metal is etched away, leaving traces which form the electrical interconnect between the components mounted on the circuit board.

  • @cheesesteak5689
    @cheesesteak5689 5 years ago +16

    Joke's on you! I don't have a graphics card... I sold it

    • @factsandstuff2832
      @factsandstuff2832 5 years ago +1

      I have a laptop. But my actual garden-grown potatoes outperform it in Minesweeper.

    • @phantomphool
      @phantomphool 5 years ago

      *clears throat as though preparing to sing* oh yeah, yeah

    • @lorenzvo5284
      @lorenzvo5284 5 years ago

      @@factsandstuff2832 well, facts and stuff!?!!

    • @factsandstuff2832
      @factsandstuff2832 5 years ago

      @@lorenzvo5284 well what?

    • @lorenzvo5284
      @lorenzvo5284 5 years ago +1

      @@factsandstuff2832 I just liked that your name lined up so well with what you were saying.

  • @R9A9V2
    @R9A9V2 5 years ago +4

    I'm a simple guy...
    Good GPU... good games...

  • @redmoon383
    @redmoon383 5 years ago +22

    WTF WAS THAT FALCON....?!?!

    • @redmoon383
      @redmoon383 5 years ago +5

      No hate just holy shit I wasn't expecting that

    • @user-qp3cx6rt8w
      @user-qp3cx6rt8w 5 years ago +3

      so are you surprised by his face? how rude

    • @Housdart
      @Housdart 5 years ago

      Gameranx has been bought by Disney, probably.

  • @decothegeco
    @decothegeco 5 years ago

    DRAM and SRAM: SRAM uses a design called a flip-flop to store data (one bit at a time) until another bit of data is shifted into it. DRAM uses capacitive cells that hold charges representing the bits of data. The capacitive cells normally don't hold their charge for long, but they allow more data to be stored on one die than a combination of flip-flops connected together.
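    The DRAM/SRAM distinction described in this comment can be sketched as a toy Python model (the leak rate and read threshold are made-up illustrative values, not real hardware behavior): a DRAM cell's charge decays unless it's periodically refreshed, while an SRAM flip-flop keeps its bit for as long as it's powered.

    ```python
    # Toy model of the DRAM-vs-SRAM distinction described above.
    # Numbers (leak rate, threshold) are made up for illustration.

    class SramCell:
        """A flip-flop holds its bit indefinitely while powered."""
        def __init__(self):
            self.bit = 0
        def write(self, bit):
            self.bit = bit
        def read(self):
            return self.bit

    class DramCell:
        """A capacitor's charge leaks away; it must be refreshed periodically."""
        LEAK_PER_TICK = 0.1   # fraction of charge lost each tick (assumed)
        THRESHOLD = 0.5       # charge above this reads back as a 1

        def __init__(self):
            self.charge = 0.0
        def write(self, bit):
            self.charge = 1.0 if bit else 0.0
        def tick(self):
            # Simulate charge leaking out of the capacitive cell.
            self.charge *= (1.0 - self.LEAK_PER_TICK)
        def read(self):
            return 1 if self.charge > self.THRESHOLD else 0
        def refresh(self):
            # Read the bit and rewrite it at full charge.
            self.write(self.read())

    sram, dram = SramCell(), DramCell()
    sram.write(1)
    dram.write(1)

    for t in range(20):
        dram.tick()
        # Without this periodic refresh, the DRAM bit would decay to 0.
        if t % 5 == 4:
            dram.refresh()

    print(sram.read(), dram.read())  # both still hold 1 thanks to refresh
    ```

    Dropping the `refresh()` call lets the same cell decay below the threshold, which is exactly why real DRAM controllers dedicate time to refresh cycles.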

  • @patmelsen
    @patmelsen 5 years ago

    A PCB isn't printed on a wafer; it's printed and laminated on a non-conductive substrate. ICs are "printed" (in a way...) on a silicon wafer, which is a semiconductor.

  • @drewnai3873
    @drewnai3873 5 years ago +4

    My GTX 1080 cost an entire 20-hour work week.

    • @----.__
      @----.__ 5 years ago

      You paid $1200 for a 1080?

    • @drewnai3873
      @drewnai3873 5 years ago

      @@----.__ ngl, I fucked up my math there

    • @JamesBond77
      @JamesBond77 4 years ago

      Drew Nai lol

  • @Coco-by7sz
    @Coco-by7sz 5 years ago +22

    My goddamn life is inside

  • @karlchristianbognot1842
    @karlchristianbognot1842 5 years ago +3

    inside is a mini oven

  • @fayzan1055
    @fayzan1055 5 years ago

    Gameranx makes a better IT teacher than an actual IT teacher

  • @RogerBarraud
    @RogerBarraud 4 years ago

    Nicely Done!
    :-)
    I love the whimsical illustrations - esp the Insulating Materials for the PCB - ROFL :-)

  • @TheIceThorn
    @TheIceThorn 5 years ago +4

    The only wrong thing is the graphics card's airflow :| You won't pull air. Never. Unless you like incinerating your fans.

    • @musecraft2704
      @musecraft2704 5 years ago +2

      Was looking for that. Also, the other components (the fins) aren't for airflow direction; they're for absorbing heat and exposing it to the airflow over a bigger surface area.

    • @khrosis5587
      @khrosis5587 5 years ago +1

      Had to scroll way too far down for this

  • @erinonfire70
    @erinonfire70 5 years ago +4

    My mom is inside

  • @asneecrabbier3900
    @asneecrabbier3900 5 years ago +8

    passively cooled gpu team where u at

  • @bou222
    @bou222 4 years ago

    You've probably already heard this a bunch... but GPU does stand for Graphics Processing Unit, even though they're not just for graphics anymore. I found out GPUs are also used for general-purpose work, like crunching numbers for mining.

  • @eddievanhorn5497
    @eddievanhorn5497 4 years ago

    Hey, I really don't mean to be rude, just constructive criticism. The "VRMs" you highlighted were actually just the capacitors, which commonly come in round aluminum cans with a bicolor design to denote polarity. Capacitors help keep voltage stable, but they rely on the VRMs, which are actually a separate IC, to regulate the voltage to a set number; the capacitors then charge up to that voltage so they can deliver power if there is a sudden spike. The VRMs set the voltage and regulate it along with the current, and the capacitors just smooth spikes.
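    The smoothing role the comment above attributes to the capacitors can be illustrated with a toy RC low-pass simulation in Python (all component values here are assumptions for illustration, not taken from any real card): the capacitor voltage tracks the noisy regulator output slowly, so fast spikes are averaged out.

    ```python
    # Toy simulation of a capacitor smoothing a noisy supply rail.
    # R, C, dt and the noise amplitude are illustrative assumptions.
    import random

    random.seed(0)
    R = 0.05        # ohms, source impedance (assumed)
    C = 1e-3        # farads, bulk capacitance (assumed)
    dt = 1e-6       # seconds per simulation step
    v_cap = 1.0     # volts currently on the capacitor

    raw, smoothed = [], []
    for _ in range(5000):
        v_in = 1.0 + random.uniform(-0.2, 0.2)   # noisy regulator output
        # RC low-pass: the capacitor charges/discharges toward v_in.
        v_cap += (v_in - v_cap) * dt / (R * C)
        raw.append(v_in)
        smoothed.append(v_cap)

    def peak_to_peak(xs):
        return max(xs) - min(xs)

    print(f"ripple before: {peak_to_peak(raw):.3f} V")
    print(f"ripple after:  {peak_to_peak(smoothed):.3f} V")
    ```

    The peak-to-peak ripple after the capacitor comes out far smaller than the raw ripple, which is the "smoothing spikes" behavior described above; the steady-state target voltage itself is still set by the regulator.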

  • @Leo-zv7qq
    @Leo-zv7qq 5 years ago +8

    Ehhh, the info on here isn't totally correct. Some parts aren't the best, but most people don't care about the truly specific details of most electronics.

  • @satorugojo6921
    @satorugojo6921 5 years ago +6

    RAM, a CPU, and a graphics processor. Am I right? I haven't watched the video yet.
    Edit: I meant what's inside the GPU, not referencing anything else.

  • @BlackBlade
    @BlackBlade 5 years ago +4

    I honestly don't like the "cheap" animation here.
    I prefer the photos, or more fully animated animation.
    At least for me, this new form, with the animation repeating with no change, is honestly boring and a pain to watch. The drawings are lovely, but they're simple and stay on screen too long without changing (like the wafer part, where the guy chews the same piece for like a minute). It's just not engaging.

    • @bogdanmitrovic1180
      @bogdanmitrovic1180 5 years ago

      That is why he is speaking

    • @BlackBlade
      @BlackBlade 5 years ago

      @@bogdanmitrovic1180 Yeah, and how is that relevant to what I said?
      Point is, if you want to go animated, great, go all the way.
      But this feels like something between the "old" format and a "new" one.
      It's not still images, yet it's not fully animated, so to me at least the end result just looks "cheap". I honestly prefer images over this; I prefer stock photos over this.
      There is less to look at; there is just more "random" movement on screen.
      They talked before, and they talk the same now, just with what is, to me at least, an overall lesser experience.

  • @velhodoidowill1549
    @velhodoidowill1549 5 years ago +1

    What a perfect design representation, great job even on small components and zooms

  • @stoicfloor
    @stoicfloor 5 years ago +1

    The animation is amazing. And very informative. Thank you

  • @MixerTrshur
    @MixerTrshur 5 years ago +3

    I don't have anything in my graphics card...
    Because I don't have one...

  • @dcy9846
    @dcy9846 5 years ago +16

    Last

    • @yaaayo2171
      @yaaayo2171 5 years ago

      Oblivious Tomato no me

    • @wagyusus
      @wagyusus 5 years ago

      Wow luck

    • @lemonhaze715
      @lemonhaze715 5 years ago

      Johnny_Haha no u

    • @Albylion
      @Albylion 5 years ago

      Saw five views and was about to comment "third!" but figured someone would beat me to some smartass post. Kudos!

    • @dcy9846
      @dcy9846 5 years ago

      We’re all nerds here, why else would we have clicked on this video lol

  • @lilbill1144
    @lilbill1144 5 years ago +7

    Ligma

  • @bruhbruh8292
    @bruhbruh8292 3 years ago +1

    Finally! My question has been answered: "how does a GPU fit in a laptop?"

  • @barneybarney3982
    @barneybarney3982 5 years ago

    5:52 That's not how it works; the cooler doesn't suck heat up off the card. In such a small solution it's simply more efficient to blow fresh air onto the heatsink.