SLI & Crossfire in Same PC!

  • Published: 30 Sep 2024
  • To enjoy all the best vendor exclusive technologies, we built a PC that runs SLI with two GTX 780 Ti's and Crossfire with two R9 290X's at the same time. If that sounds ridiculous, that's because it is... But not because it didn't work...
    Sponsor link: audible.com/linus
    Pricing & discussion: linustechtips.c...
    Join our community forum: bit.ly/ZkLvE7
    / linustech
    Intro Screen Music Credit: Adhesive Wombat - Check out his channel here: / adhesivewombat
    Outtro Screen Music Credit: Approaching Nirvana - Sugar High / approachingnirvana

Comments • 3.7K

  • @sailingmanuel
    @sailingmanuel 10 лет назад +637

    Still cheaper than the Mac Pro.

    • @Flutters_Shygal
      @Flutters_Shygal 10 лет назад +101

      Only half the price, at least ten times the performance.

    • @MrFuzzemz
      @MrFuzzemz 10 лет назад +107

      66% of Apple's pricing is because of the brand name.

    • @panzershreks3811
      @panzershreks3811 10 лет назад +26

      Gaith Droby don't forget gamers don't have that much cash to waste on something with such low performance

    • @edwardjohnston6286
      @edwardjohnston6286 10 лет назад +5

      ***** This computer is not 10 times as fast as a mac pro you idiot. and who cares, the argument is stupid because mac pro is way way smaller.

    • @edwardjohnston6286
      @edwardjohnston6286 10 лет назад +2

      Mr. Fuzzums do you have proof to show that 66 percent of the pricing is for brand? What about the Nvidia GTX Titan? What about Microsoft Windows? The Windows operating system has the highest profit margins, but all you people buy it just fine. It's stupid, so don't come here and bitch.

  • @Suiax
    @Suiax 8 лет назад +629

    You guys should make a SLI+Crossfire rig with Polaris, and Pascal cards, and call it "The PeePee Rig"

    • @spencerwitte5616
      @spencerwitte5616 8 лет назад +11

      upvote

    • @muttbunch3059
      @muttbunch3059 8 лет назад +3

      This isn't reddit

    • @spencerwitte5616
      @spencerwitte5616 8 лет назад +37

      Mutt Bunch
      You're right
      On reddit I'm looking for other people's opinions

    • @KianGurney
      @KianGurney 8 лет назад +19

      The PeePeeAyePee rig. I have a GTX 1080, I have an RX480, Uh! CrosSLIre.

    • @594-d9b
      @594-d9b 8 лет назад +1

      +James Buck *cringe*

  • @Prydestalker
    @Prydestalker 10 лет назад +630

    How about Nvidia and AMD make a GPU that has SLI-fire and has 16GB memory and It'll be AMDVIDIA RADEON GTX390X has Mantle and PhysX and It'll run every game at 100fps 4k. A DREAM

    • @TurbonicSpyder
      @TurbonicSpyder 10 лет назад +80

      Yeah,the Nvidia-AMD fight could end there.

    • @examiner2946
      @examiner2946 10 лет назад +28

      That sounds awesome ;_;

    • @metalgod6661989
      @metalgod6661989 10 лет назад +71

      AR 1 But then what would we fight about lol

    • @TurbonicSpyder
      @TurbonicSpyder 10 лет назад +70

      *****
      Well,there still exist the fight of the peasants.We could sit back and watch it.

    • @Prydestalker
      @Prydestalker 10 лет назад +18

      and grab some popcorn and enjoy the movie B|

  • @patrickkilduff5272
    @patrickkilduff5272 8 лет назад +275

    We need an underground graphics company to integrate AMD and Nvidias tech on one chip and sell it on the black market...

    • @weasle2904
      @weasle2904 8 лет назад +9

      +Patrick Kilduff Nvidia cards are overall better. Less power consumption. More performance for the same amount of power consumption compared to ANY AMD card. Better drivers, software, surround gaming. They're just amazing. But Nvidia's that one douche company that knows it's going to own the market, so they rack up the prices. That's the only reason why AMD is still going on the market with its GPUs lol.

    • @weasle2904
      @weasle2904 8 лет назад +6

      +Patrick Kilduff Don't get me wrong, I love AMD's prices and I use their cards sometimes in budget PCs. But we all know which company truly has better tech xD

    • @weasle2904
      @weasle2904 8 лет назад

      +-T-X-M- I wouldn't get the GTX 950. I would get the 370. If you're on a budget I'd stick with AMD graphic cards. if you're able to make the stretch the 380 is nice. Beware though, AMD graphic cards are much less efficient, so they'll heat up more. So make sure you don't put a half ass airflow job on your case.

    • @weasle2904
      @weasle2904 8 лет назад

      +-T-X-M- I would recommend buying a new case if that's what you're dealing with something like that lol. The NZXT 210 can be bought online for a good price. ALSO. Try designing your PC on pcpartpicker.com it makes it a lot easier, and gives you an online website retailer with the lowest price for the part. I don't know the wattage numbers off of the top of my head but it sounds like it'd be enough. if you're getting a good power supply that is efficient, like an EVGA or Corsair, then get some extra wattage so you can use it in a new system if you ever upgrade that one or use parts from that in a new build. EVGA are usually cheaper than Corsair for the same if not better quality. I love EVGA lol. Their graphic cards are the best you can get, and their PSUs are really good also. Just a reminder though. AMD cards offer more performance for the price. like the 380 outperforms the 960 in raw performance, but is not as shiny and doesn't have as good computing speeds and efficiency. Basically, Nvidia cards are for who can throw in a bit extra money. Oh and another reminder. I heard that Nvidia is coming out with that new Cuda Core tech Pascal soon in a new series. Might be worth a wait than buying cards now.

    • @weasle2904
      @weasle2904 8 лет назад

      +-T-X-M- oh sorry! lol. I would go for the 370 myself, but the 950's efficiency might be nice. They perform very closely actually. So just get whichever is cheaper. But if you can afford a GTX 960 or r9 380, get the 380.. But it might overheat in that case :/ it's probably worth getting a cheaper lower end GPU for power consumption reasons

  • @markman090
    @markman090 8 лет назад +186

    Had to check the date to make sure it wasn't April 1st

    • @ethanwitmer7307
      @ethanwitmer7307 8 лет назад +1

      Haha just did that XD

    • @Jedcoe
      @Jedcoe 8 лет назад

      same

    • @bmxriderforlife1234
      @bmxriderforlife1234 8 лет назад

      +markman090 imagine they redo this video when the new cpus and gpus launch and just build like a gaming pc that can eat any game

    • @HudsonGTV
      @HudsonGTV 8 лет назад +1

      I actually checked the upload date before reading this comment. lol

    • @BlakeIgnited
      @BlakeIgnited 7 лет назад +1

      I read this on April 1st 2017

  • @KittenoftheBroccoli
    @KittenoftheBroccoli 9 лет назад +355

    Linus, what have you done?

    • @TuAnh-xm3ix
      @TuAnh-xm3ix 9 лет назад +4

      fireclaw316 i have no idea lol

    • @MitgliedT5
      @MitgliedT5 9 лет назад +43

      fireclaw316 kill it with fire !!!! before it lays eggs

    • @barkbent4324
      @barkbent4324 9 лет назад +19

      fireclaw316 Ahh fuck, I can't believe you've done this.

    • @Lauren_C
      @Lauren_C 8 лет назад

      +fireclaw316 Only thing missing is the pentagram and Japanese spirit wards.

    • @MitgliedT5
      @MitgliedT5 8 лет назад +1

      ***** And burn your house to the ground .. so this happened to austin ... naaaa :D

  • @Killershark217
    @Killershark217 10 лет назад +204

    That PSU must be sent by god

    • @KTMinnesotaUS
      @KTMinnesotaUS 10 лет назад +27

      That or he has a personal wizard to produce power and just shove it in the graphics cards...

    • @parky8441
      @parky8441 10 лет назад +2

      KT Minnesota lol

    • @18aidanme
      @18aidanme 10 лет назад +4

      KT Minnesota "Shove it in the graphics card" A MAGICAL POWER PENIS!

    • @legolas0029
      @legolas0029 10 лет назад +3

      Zeus.

    • @luperte7714
      @luperte7714 7 лет назад +1

      Axecution probobly thor

  • @StingingnettIe
    @StingingnettIe 10 лет назад +371

    Now we just need the technology to get an AMD CPU and an Intel CPU in one motherboard....

    • @techiefromcanada
      @techiefromcanada 9 лет назад +40

      No longer a debate since Intel performs so much better than AMD, regardless of the price. Also, Intel already has dual-CPU technologies.

    • @StingingnettIe
      @StingingnettIe 9 лет назад +24

      ***** I love my 8350 though

    • @sportsnut198
      @sportsnut198 9 лет назад +24

      Stinging Nettle An i7 will kick the crap out of an AMD chip, simple as that. In terms of APUs tho, Intel gets their shit tossed.

    • @HillBillyAsian
      @HillBillyAsian 9 лет назад +42

      sportsnut198 of course a cpu that costs 150 dollars more than the lets say an 8350 would beat it #logic.

    • @StingingnettIe
      @StingingnettIe 9 лет назад +16

      Along with what HillBillyAsian Said, It's usually WAY too little of a performance gain to fathom spending $150 more

  • @leighton138
    @leighton138 9 лет назад +307

    2 titan x and 2 fury x :) go for it haha

    • @pierc7490
      @pierc7490 9 лет назад +25

      How about 2 Titan z and 2 r7 240 lol

    • @leighton138
      @leighton138 9 лет назад +1

      +Piervittorio Ciccariello no 4 titan z lol that be sick lol if it worked properly with drivers

    • @Mr19061970
      @Mr19061970 9 лет назад

      +leighton lanzetta gtx titan z in 8 way sli watercooling and a single gt 610
      Piervittorio Ciccariello get rekt i stole u ideaXDDDD

    • @leighton138
      @leighton138 9 лет назад

      lol. i'm kinda in a pickle. i got an asus cu2 r9 390x and it don't get 60 hz at 3800x2100, and i don't know if i should get another or just save for a titan x or 980 ti and then dual one of them. not sure if two 390x will help or not with getting 60 frames. p.s. my tv does have hdmi 2.0 and the amd catalyst does say the tv supports it. the thing is it's hard for me to get money, i take care of my handicapped mother, so either way it will take me a while. i just wanna play all games in 4k at 60 fps and i'll be happy.

    • @leighton138
      @leighton138 9 лет назад

      +Pedro Perdigão. but the card at 3840x2100 don't support more than 30 hz, so won't the screen tearing happen if the frame rate is more than 30 fps, or will 2 cards allow 60 hz?

  • @18T220
    @18T220 10 лет назад +48

    Can I breed a GTX 780ti with a R9 290X?
    I've already got a female R9 290X and I've rented a breeding pen from a farmer.

  • @fred40587
    @fred40587 8 лет назад +96

    amd"s slogan should be AyyyyMD

    • @maX0229567
      @maX0229567 8 лет назад

      ayyy that's my boy

    • @fred40587
      @fred40587 8 лет назад

      maX0229567 ayyyyymen

    • @Ivanzrer
      @Ivanzrer 8 лет назад

      Thats a reddit page for team red.

    • @fred40587
      @fred40587 8 лет назад

      Ivanzrer skrosh people on that sub reddit is to jingoistic

    • @Duke_Chungus
      @Duke_Chungus 8 лет назад +1

      the fuck

  • @cooptheyeti5103
    @cooptheyeti5103 7 лет назад +48

    First NON Racist PC, Nice Job

  • @JustSomeGuyLV
    @JustSomeGuyLV 10 лет назад +75

    You know it's overkill when the case gets as huge as a fridge.

    • @MegaAtHome
      @MegaAtHome 10 лет назад +43

      worst case scenario

    • @MrShoomez
      @MrShoomez 10 лет назад +15

      ***** "CASE" HAHAHAHAHAHAHAHHAH
      Get it? Worst CASE scenario?

  •  8 лет назад +58

    Has science gone too far?

    • @nassimback
      @nassimback 8 лет назад

      +olaolapepsiman pascal gpus may answer that question ;) stay tuned

    • @trgoku1951
      @trgoku1951 5 лет назад

      It's always like this

  • @An_Actual_Ditto
    @An_Actual_Ditto 8 лет назад +39

    All of those DVI ports..

  • @disko6296
    @disko6296 10 лет назад +29

    The Nandvidia Titaedeon 3000 ULTRA XT Platinum Edition?
    Very original, Linus, Very original.

  • @nimoon
    @nimoon 8 лет назад +66

    Next up: AMD Processors and Intel Processors in Same PC!
    Enjoy exclusively cheap and shitty performance from AMD and an unnecessarily expensive processor from Intel!
    For a value price of only $10000.

    • @ilidenstrmrege987
      @ilidenstrmrege987 8 лет назад +28

      FX 9590 and Pentium 4 in the same case = very expensive way to make fire.

    • @KenrickBrown75
      @KenrickBrown75 8 лет назад +10

      Wait for Zen :)

    • @kingarthur83rd
      @kingarthur83rd 7 лет назад

      1080 ACX cooled card.. in for a penny.....

    • @qeedzhr
      @qeedzhr 4 года назад

      @Cuzeg Spiked PSU be like.
      “Please Stop!!!”

  • @Rakoah
    @Rakoah 8 лет назад +65

    Linus you have taught me so much about computers. Thank you sir

    • @snowzZzZz
      @snowzZzZz 7 лет назад +7

      H3 tot m3 w3ll 2

  • @BaconAniimal
    @BaconAniimal 10 лет назад +43

    Oh, I thought it was gonna allow you to use your 780TIs and your R9 290Xs at the same time.

  • @TheLucidDreamer12
    @TheLucidDreamer12 9 лет назад +139

    A pro is someone who's a noob at being a noob

    • @farn0153
      @farn0153 9 лет назад

      :O

    • @kenyonb4449
      @kenyonb4449 8 лет назад

      +TheLucidDreamer Well not really, pros can be new at their professions and just be a newb. "Pro" doesn't really mean anything about someone's skills or experience.

    • @kenyonb4449
      @kenyonb4449 8 лет назад +1

      ***** No... It means you get paid to do something. That's it. Nothing about what you study or anything. You can get a job at McDonald's with no experience or degree, and you're a brand new, uneducated, unexperienced professional. If I play a video game and someone gives me money to do it I'm a pro. Same thing. It doesn't matter how good I am at the game, that has nothing to do with it. Sorry bro.

    • @theplasmapro8343
      @theplasmapro8343 7 лет назад

      Joe Bob
      Professional, Adjective (1): relating to or connected with a profession.
      Professional, Adjective (2): (of a person) engaged in a specified activity as one's main paid occupation rather than as a pastime.
      Professional, Noun: a person engaged or qualified in a profession.

    • @sambaker9217
      @sambaker9217 7 лет назад

      apple agrees
      mac-book pro*
      *not intend for professionals

  • @Nirotix
    @Nirotix 10 лет назад +25

    Kidnap the Developers & CEO's of both Nvidia and ATI (AMD). .. shove them all into a padded 20' x 20' room for a week with only water to drink and no pot to piss in.
    They don't get out until they have 'standardized things' related to graphics.
    My solution may not be legal, but it sure as hell would work.. trust me. :P

    • @Kascaded
      @Kascaded 10 лет назад +6

      But there will no longer be competition, which leads to lazy production and not really much advancement in technology (in terms of graphic card).

  • @MODEPIIC
    @MODEPIIC 10 лет назад +27

    Next up: intel and AMD cpu in same system

  • @Flapfuzzledaddle
    @Flapfuzzledaddle 10 лет назад +31

    But... will it blend?

    • @minartson
      @minartson 10 лет назад +6

      This joke never gets old.

    • @StaticSleet
      @StaticSleet 10 лет назад +2

      For some reason its old to me.

  • @InukCF
    @InukCF 10 лет назад +101

    What about a titan z sli and radeon 295x2 crossfire?

    • @Xarius
      @Xarius 10 лет назад +20

      money...

    • @97miloblack
      @97miloblack 10 лет назад +1

      read the description, it didn't work

    • @VictorDude98
      @VictorDude98 10 лет назад

      milo_black97 It doesn't say that, you are just making stuff up. Liar.

    • @97miloblack
      @97miloblack 10 лет назад

      S4b4t3r To enjoy all the best vendor exclusive technologies, we built a PC that runs SLI with two GTX 780 Ti's and Crossfire with two R9 290X's at the same time. If that sounds ridiculous, that's because it is... But not because it didn't work... Description

    • @Xibyth
      @Xibyth 10 лет назад

      ***** Quad slot MOBO, nough said.

  • @LosAngeles234
    @LosAngeles234 10 лет назад +20

    *only 5000$* pfff kk :D

  • @theroflraptor
    @theroflraptor 10 лет назад +14

    Holy fucking shit. That is all.

  • @ahr332ramvepsen2
    @ahr332ramvepsen2 8 лет назад +31

    only 5 000 dollars. not more? :P

    • @pinecone5129
      @pinecone5129 8 лет назад +1

      RamVepsenVEVO what a deal! 1440p? that's unreal

  • @Fennoman12
    @Fennoman12 10 лет назад +22

    GG Linus, GG.

  • @club4ghz
    @club4ghz 10 лет назад +26

    So now you can mine and play games at the same time lmao

    • @dotamig8601
      @dotamig8601 10 лет назад +3

      mine with amd, and gaming with nvidia maybe?, cause nvidia is worthless for mining(not a hater just saying)

    • @club4ghz
      @club4ghz 10 лет назад

      DotA Mig thanks god i don't mine and amd cards cost as much as nvidia in us so they are worthless for me lol

    • @StealthSecrecy
      @StealthSecrecy 10 лет назад

      DotA Mig definitely, if you have to choose between nvidia or AMD for mining, AMD is always better (right now), and usually 780 ti's beat out 290x's in gaming (without mantle)

    • @EscapeThePie
      @EscapeThePie 10 лет назад

      ***** no they don't lol.... you obviously haven't seen watercooled OC benchmarks.

    • @overclockedamd123
      @overclockedamd123 10 лет назад

      DotA Mig i am making 40 bucks a day mining on my 770 4gb sli.

  • @XavierXonora
    @XavierXonora 9 лет назад +52

    Here's my 2 cents on Mantle: Mantle was a tactic used by AMD to force Microsoft to release DX12. The reason they wanted Microsoft to release DX12 is, as we have seen, that Mantle brings massive gains to low end APU machines, something AMD really wants to push in the future, whereas Intel CPU's don't really experience this processor bottleneck. The Multi-Core leverage utilised by Mantle means that AMD is no longer falling behind in the CPU race (at least in terms of gaming) and their systems will run faster on the new standard, as opposed to Nvidia and Intel who don't experience the same issues with low end hardware. Having built a few systems myself, an AMD APU will run BF4 high details in Mantle at over 2 times the FPS it gets with DirectX 11. The same can be said for DX12 over DX11.
    Tactics people.

    • @OGPatriot03
      @OGPatriot03 9 лет назад +7

      I have an i7-3820 3.60GHz and an R9 290x8gb and Mantle still gives me absurd performance increases, Plus Major FPS drops are a thing of the past.
      yes mantle was designed to get Microsoft's attention, but AMD is not the only company that stands to gain from this.

    • @XavierXonora
      @XavierXonora 9 лет назад +1

      Patriot 03 Yeah, no doubt, but they definitely do gain a lot more than Intel do, especially in the low end market.

    • @theoneyoudontsee3645
      @theoneyoudontsee3645 9 лет назад

      Anthony Paull I thought Mantle was an API made with hardware support only for their new GPUs, and simply doing that saves on how complex the API had to be, making it so much better than DX11. 1. Why would they fix the one thing that makes Mantle run so great? 2. With DX12 on its way, why would they even care to make Mantle work for Nvidia or Intel GPUs? Why give up a vendor exclusive? That is what Intel did with Hyper-Threading: slapped a patent on it so they are the only ones that can use SMT basic designs in their CPUs for a long time.

    • @XavierXonora
      @XavierXonora 9 лет назад +1

      Is not about that. AMD don't want to maintain mantle. It was put out there to say "time for a new api Microsoft"

    • @theoneyoudontsee3645
      @theoneyoudontsee3645 9 лет назад

      Anthony Paull you could be on to something. amd said "here is what we can make for us". why does mantle have 4k playability on a $600-700 pc when dx11 does not?

  • @thesquidleader
    @thesquidleader 10 лет назад +17

    0:38 The guy's face while leaving the shot is priceless XD. You sir get a like!

    • @Pardock97
      @Pardock97 10 лет назад +2

      Yes yes yes, that's what I thought when I saw it xD Perfect expression for the moment XD

  • @Jacob45678Liam
    @Jacob45678Liam 9 лет назад +3

    I feel kinda sorry for AMD, they used to be better than Intel for CPU's around 10 years ago and used to be better than Nvidia around 8 years ago. Then it twisted when they put the prices on their hardware WAY down. I am running an AMD FX-8320 and it runs all of my games just fine, still gets hot, but others that are Intel fanboys just completely bash on AMD and whoever uses their product. What people don't realize is that if Intel and Nvidia are the only two vendors of GPU's and CPU's then they could raise their prices tremendously, and we will have to buy it if we want a gaming PC. So cut AMD some slack, because they aren't just cutting their prices down, but everyone elses' too.

  • @VikingDudee
    @VikingDudee 9 лет назад +28

    I wish both AMD and Nvidia would just kinda get along somewhat. I'd gladly run a 7950 with a GTX760 just to get some things that one card has that the other doesn't, or even run an AMD card with an Nvidia card for physx (without hacked drivers). Nvidia would still sell many more cards if that were possible out of the box.

    • @thelol1759
      @thelol1759 9 лет назад +3

      This. I wish I could smoothly run Nvidia tech and AMD tech at the same time.

    • @OGPatriot03
      @OGPatriot03 9 лет назад +12

      The problem isn't AMD, they're open source. The issue is with Nvidia.

    • @VikingDudee
      @VikingDudee 9 лет назад

      Patriot 03 Yeah, AMD couldn't care less, hell, the Mantle thing, Nvidia should have jumped all up on that. Instead they say DX 11 is only slightly slower, but they fail to understand Mantle is to help lower end CPU's or CPU bottlenecking. Just hope DX 12 is what it's hyped up to be.

    • @OGPatriot03
      @OGPatriot03 9 лет назад

      Viking Dude Mantle helps with more than just low end CPUs, my I7 and my 290x gains 20 FPS on ultra, and I no longer get FPS drops (To me the power of mantle/DX12 is that FPS drops are no longer a thing)

    • @Rockymann27
      @Rockymann27 9 лет назад

      Patriot 03 How do i get mantle?

  • @DJparsons89
    @DJparsons89 10 лет назад +9

    Linus. Get someone to control your teleprompter or whatever you have. Looking for your controller behind the PC is just too distracting.

  • @RealRaynedance
    @RealRaynedance 10 лет назад +22

    Not that it'll happen, but you know what I think would be interesting? If AMD and Nvidia did a joint project for a card. That way it's all in one thing and you have both minds trying to make something.

    • @RealRaynedance
      @RealRaynedance 10 лет назад +8

      ***** I said it would be interesting. I never said it was a good idea.

    • @xXMasterJ360Xx
      @xXMasterJ360Xx 10 лет назад

      That makes more sense than trying to build a $5000 PC that most people can't afford to even use lol. The only thing interesting about it is the fact that we don't have to manually take out our GPU's and uninstall drivers if we switch them. Just b/c we use either AMD or Nvidia doesn't mean we can only use one of them. Hell I got 2 crossfired 7970's both OC, but I still plan to buy 2 780's SLI in the future

    • @JeSsSe66
      @JeSsSe66 10 лет назад +1

      ***** That's an absolutely terrible idea. The general rule is: to keep technology going forward, you almost always need competition so that each company can strive to release better performing products in a shorter amount of time.
      For example, ever since intel officially became the "best" in the cpu market and amd is no longer a threat to intel's business, intel isn't striving to make cpus perform better; now they instead release cpus that aren't as good as they can actually be and they charge you out of the ass for minimal performance gain.
      Anyway, amd and nvidia partnering up is the equivalent of dividing by zero. Nvidia as a company are very selfish and they will try to reap the benefits that both companies do as a group, instead of splitting costs 50/50.
      If those two ever partnered up in the future, i can see nvidia somehow trying to pull some kind of dirty trick on amd.

  • @kobrapromotions
    @kobrapromotions 8 лет назад +83

    All I heard was "star citizen"

    • @genestarwind928
      @genestarwind928 8 лет назад +2

      +Kobra Incorporated scam citizen

    • @genestarwind928
      @genestarwind928 8 лет назад +2

      Kobra Incorporated He just bought a new house.
      Employees also called out the fraud

    • @genestarwind928
      @genestarwind928 8 лет назад +3

      Kobra Incorporated I am going to laugh so hard when this game blows up like the turd it is.

    • @omermor5040
      @omermor5040 8 лет назад

      ll

    • @omermor5040
      @omermor5040 8 лет назад

      llk

  • @Se7enAte
    @Se7enAte 10 лет назад +11

    I think something like this would be best for somebody that wants to edit videos and play games at the same time. Having 2 780ti's for CUDA in editing and 2 290x's for gaming along with having 2 CPU's to govern what programs get which CPU's and losing literally no performance when playing BF4 on ultra while rendering a video would be pretty freaking awesome. Unrealistic, but awesome

    • @plzgodplz
      @plzgodplz 10 лет назад +1

      The energy bill would be insane and your cooling would never measure up to the heat output

    • @Se7enAte
      @Se7enAte 10 лет назад +14

      If you're spending this much on a system, you're going to be a more well off person and won't worry about the energy bill. The cooling could be an entire water loop and fix any heat issues.

    • @ImaTigerwithabeanbag
      @ImaTigerwithabeanbag 10 лет назад +1

      But the two 780Ti's are better for gaming...

    • @Se7enAte
      @Se7enAte 10 лет назад +5

      Canadian Raccoon Conundrum Consider this: on a dual CPU system you're going to use Xeon's. They kinda suck for gaming, and AMD has Mantle. Mantle will offload some CPU bottlenecking and will actually perform better with 290x's than 2 780ti's. With just a 4770k, yes the 780ti's would be better.

    • @ImaTigerwithabeanbag
      @ImaTigerwithabeanbag 10 лет назад

      Se7enAte We don't know if the 290X's would do better yet...

  • @kalvintaur3526
    @kalvintaur3526 10 лет назад +13

    One thing that is a fact is that PC gaming (doesn't matter if it's AMD or Intel/ ATI or Nvidia) is going to always be better than console gaming!!

    • @athosvamberk
      @athosvamberk 10 лет назад

      how many times do pc gamers have to say that? it is so obvious it's better than console!! does it make you feel better saying something that everyone knows?
      i'm sorry if i offended you, but it's just ridiculous, pc gamers saying the same thing over and over again, everyone got it! now stop repeating yourself

    • @1toncheese
      @1toncheese 10 лет назад

      Ansgar The Demon it does get annoying but remember for the longest time consoles were better until 2004ish. also revise your comment a little,the grammar errors make it harder to take your side.

    • @kalvintaur3526
      @kalvintaur3526 10 лет назад

      I know that it's annoying, but this comment is trying to further prove a point to console fanboys. With more likes we can make them open their god damn brains so they can stfu about stuff like "Xbox is better" or "PS3/4 is better" and make them join our side. We can say "Hey you there with the console, stfu, nobody cares about your bullshit 4 generation old technology and come join the awesomeness of PC gaming with true HD gaming with all the perks of AA, FXAA, etc." You know what I mean? Yea I don't really either, it was pretty hard typing this comment. But again PC GAMING DOMINATES CONSOLE GAMING!!!

    • @1toncheese
      @1toncheese 10 лет назад +1

      Kalvin Rispected while your point is valid, your method acts as a deterrent to the very people you're trying to convince, often coming off as conceited and douchy. pc gamers who will outright tell and explain why their preferred method is better are so commonplace on the web that people have taken it as a middle-finger-esque gesture. it's better people stop doing this and instead say "well, it's quite a good thing, you should give it a try." keep it simple and sweet since it's all been heard or read before.

  • @Nostalgia_Realm
    @Nostalgia_Realm 10 месяцев назад +3

    We have FreeSync now, but the G-Sync being proprietary point still stands :(

  • @LegendHidde
    @LegendHidde 10 лет назад +17

    Why doesnt Google start with trying to make GPUs using the awesome stuff AMD and Nvidia use, or they could just buy AMD and Nvidia, and merge them. I dont do economics, so I dont know.

    • @KjeftenLP
      @KjeftenLP 10 лет назад +7

      Yes, I was thinking exactly this, I don't care about monopoly issues! :P

    • @dimitriid
      @dimitriid 10 лет назад +10

      If Android is any indication we would have even more fragmentation with crap like "Yeah this driver will only work with our new generation of GPUs, those old ones? Yeah you can toss em"
      So think about what you're wishing for.

    • @RaphYkun
      @RaphYkun 10 лет назад

      They'd essentially just be making AMD stuff since NVidia locks down all their awesome stuff (PhysX actually hunts out AMD cards and disables itself if it detects one). And yeah, the code for Mantle is open, but it is still designed to run on AMD's architecture which pretty much means NV would have to start making AMD cards if they wanted to use it. ALSO, Google has 0 experience in the commercial hardware domain. They can maybe push others to do it, but like the Nexus line of mobile devices, they usually outsource actual hardware... so who would they have manufacture their cards?
      Also, as can be shown by the fact you even suggested Google, they have a pretty cult following which would just lead to even more fanboyism and friction within the gamer community.

    • @irfaniarief
      @irfaniarief 10 лет назад

      dimitriid yep, that's likely to happen

    • @MoronicAcid1
      @MoronicAcid1 10 лет назад

      RaphYkun Open as in open source?

  • @PetarStamenkovic
    @PetarStamenkovic 9 лет назад +61

    freesync and dx12. problem solved :)

    • @skatertwig26
      @skatertwig26 9 лет назад

      +Petar Stamenkovic Hopefully. If the games are optimized.

    • @matt1267
      @matt1267 9 лет назад

      +Petar Stamenkovic wait I can run my games on dx12 with my nividia....oh wait....

    • @PetarStamenkovic
      @PetarStamenkovic 9 лет назад

      +dinoreah 2 What's wrong with nvidia and dx12?

    • @skatertwig26
      @skatertwig26 9 лет назад +2

      Petar Stamenkovic Right now the performance is a little worse then it was with dx11 with Nvidia cards. AMD cards it got a significant boost. I am sure that will be fixed though.

    • @PetarStamenkovic
      @PetarStamenkovic 9 лет назад

      +skatertwig26 What test/game are you referring to? From what I've read AMD gets bigger boost in dx12 since their dx11 driver is crap, but nvidia is still on top - meaning 980ti is better than fury x overall.

  • @wewillrockyou1986
    @wewillrockyou1986 10 лет назад +14

    Run dual 295x2 and dual Titan Z...

    • @oblivionglitchestutorials9899
      @oblivionglitchestutorials9899 10 лет назад +16

      That would be like running 8 graphic cards, as each card is a dual GPU. Your system would explode. And you'd need around 2250 watts of power.
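      (For scale: the R9 295X2 is rated at roughly 500 W of board power and the Titan Z at roughly 375 W, so 2 x 500 W + 2 x 375 W = 1,750 W for the four cards alone; add a CPU and the rest of the platform and a budget in the 2,000-2,250 W range is about right.)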

    • @wewillrockyou1986
      @wewillrockyou1986 10 лет назад +33

      OblivionGlitchesTutorials Dual PSU

    • @oblivionglitchestutorials9899
      @oblivionglitchestutorials9899 10 лет назад +1

      True.

    • @DaChazIn
      @DaChazIn 10 лет назад +22

      FUCK IT DUAL EVERYTHNIG

    • @EurekaOW
      @EurekaOW 10 лет назад +3

      Trungle Dor There are dual CPU motherboards ^.^ Be the most powerful single computer in the world.

  • @SE09uk
    @SE09uk 10 лет назад +5

    im starting to think it would be good for pc games and gamers if the hardware manufactures could set 3 standards low, mid, high and licence it out
    I know this is going down the amiga or mac road, but would give devs only 3 sets of hardware to hit
    system builders would just have to comply with one of the 3 standards
    and if you made a high end pc, you know all games would run on max settings for 5 years and use all nv and amd gfx goodies
    so you would have a A compliant system and you buy a A compliant game
    that guarantees in will run at 1080p 60 fps on max settings
    of course there would be nothing to stop you building a A+ pc for other things
    just a thought, but this would allow true optimisation like we used to see on consoles before the xb1 came out

  • @ForTheViolence
    @ForTheViolence 10 лет назад +6

    gsync works with every game. so I choose nvidia.

  • @ImTheAssassin
    @ImTheAssassin 10 лет назад +10

    of only 5000 dollars...
    *looks at wallet*

    • @yumri4
      @yumri4 10 лет назад

      sadly $5,000 is kinda cheap for all that is stuffed into that rig .... but realistically, as you will never be using all of it at the same time, you could probably get one of comparatively the same power in games for around 1/5 to 2/5 that cost, with a better CPU, more and/or faster RAM and an SSD or 2.

  • @cad5359
    @cad5359 10 лет назад +13

    If AMD and Nvidia got in bed together and formed a new brand, maybe called Voodoo.

    • @SBG57
      @SBG57 10 лет назад

      lol VooDoo 4

    • @cad5359
      @cad5359 10 лет назад

      yeah thats where i was going

    • @ShadowlessKillClan
      @ShadowlessKillClan 10 лет назад

      Or did Voodoo split into both AMD and Nvidia?

    • @cad5359
      @cad5359 10 лет назад

      I think it should become the opposite, and there will be better shopping.

    • @ThyTrueMaverick
      @ThyTrueMaverick 10 лет назад

      I'm not sure if you are kidding... but you do know that the company was called 3dfx, not "Voodoo", right!?

  • @JerryNeutron
    @JerryNeutron 10 лет назад +4

    I JUST finished the main story in Far Cry 3 literally 15min. ago. Beautiful game

  • @AnonymityIx
    @AnonymityIx 10 лет назад +12

    i can only imagine the power this needs

    • @davidm3630
      @davidm3630 9 лет назад

      More than we'd like to know I'd imagine

  • @MrElitegamer6
    @MrElitegamer6 10 лет назад +5

    Meh. I am not really into getting the best performance money can buy but rather bang for buck mid range cards.

  • @kunven
    @kunven 10 лет назад +5

    $5000 i bought my gtx 660 for $220 i feel poor

  • @KimJongUnTheOneAndOnly
    @KimJongUnTheOneAndOnly 10 лет назад +3

    Amd and Nvidia suck. I'll stick to my hand drawings thank you very much.

  • @leerman22
    @leerman22 10 лет назад +6

    Why can't they just license their technology to the other? Then they can 1up each other in the GPU race.
    I just switched from AMD to NVIDIA and their cards suck at mining cryptocurrency!

    • @DracolegacyOfficial
      @DracolegacyOfficial 10 лет назад +10

      its been known for months that radeon cards were better performers for mining. thats your own damn fault.

    • @stoneyyay4557
      @stoneyyay4557 10 лет назад

      mantle is an open API nvidia is free to implement it themselves. they just need to produce a driver for it, as well as enable support for it on their hardware (firmware update should work for that)

  • @evannolan9031
    @evannolan9031 10 лет назад +4

    I honestly don't think people will buy Gsync monitors if they already own a monitor. It's a technology that only newcomers or enthusiasts will be able to take advantage of unless you have money to blow.

  • @Bang4BuckPCGamer
    @Bang4BuckPCGamer 9 лет назад +3

    SLi and Crossfire in the same PC....All sorts of commandments are being broken here (GabeN looks down and gives disapproving eye) Could you imagine if Red and Green merged? AMD bringing down the quality of Nvidia with their poor drivers, it would be a monstrosity

    • @imjoeking_
      @imjoeking_ 9 лет назад +1

      Looks like the driver thing happened. The current nvidia drivers are horrible.

  • @oriolopocholo
    @oriolopocholo 10 лет назад +4

    Thought I was getting a product review... got an infomercial instead.

  • @Lukeriddoch
    @Lukeriddoch 10 лет назад +4

    4 words. first world fucking problems

  • @--Dawid--
    @--Dawid-- 10 месяцев назад +6

    WAN Show? :)

    • @Youhaveaname
      @Youhaveaname 10 месяцев назад +1

      Wan show. I laughed at the Star Citizen comment.

  • @SuperStareGry
    @SuperStareGry 10 лет назад +4

    Titanedeon! LOL i want one of these!

  • @DGneoseeker1
    @DGneoseeker1 10 лет назад +3

    Did you have water blocks on those graphics cards? They're insanely close together. I have enough heat problems with two HD 6950s with that spacing. I had to downclock one of them.

  • @flubdawub1564
    @flubdawub1564 9 лет назад +3

    Mixing team red and green? This is like mixing coke and pepsi, HE'S GOING TO DESTROY US ALL!

  • @VilleF1N
    @VilleF1N 10 лет назад +6

    Why can't AMD and nVidia marry each other and make sweet CPU and GPU babies for us?

    • @RWoody1995
      @RWoody1995 10 лет назад

      no competition = no innovation.
      that's why for example intel processors haven't gotten any faster since 2nd generation and even 5th gen won't have a performance improvement apart from the 10% they make each time as a minimum... 30% improvement in 3 generations is pretty pathetic... but they get away with it because AMD hasn't improved performance since the PhenomII X6, which still pretty much equals an FX8350 in anything that can use only 6 cores max. the only thing to make an FX8350 better than a PhenomII is mantle since that allows all 8 to be used lol
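      (Quick check on that compounding: three back-to-back gains of ~10% give 1.10^3 ≈ 1.33, so the quoted per-generation figure does work out to roughly 30-33% cumulative over three generations.)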

    • @VilleF1N
      @VilleF1N 10 лет назад

      megaspeed2v2 I know, but consider the kind of card they could make together.

    • @RWoody1995
      @RWoody1995 10 лет назад

      VilleF1N
      problem with if amd and nvidia tried to work together on one joint graphics card they both use completely different methods, amd gpus work by having thousands of very small cores which have raw compute power and nvidia makes their gpu cores specialised specifically for gaming, hence why amd cards are epic for litecoin mining and nvidia cards are pathetic for it, its also why nvidia cards are more power efficient, since they dont use raw compute power and just make cores which are specifically good at 3d model rendering they can have the same game performance for less power.
      because of this difference in their design philosophy they wouldn't be able to work as a team very effectively unless they both decided to work from scratch with a more united design philosophy but that would take massive time and effort.

    • @VilleF1N
      @VilleF1N 10 лет назад

      megaspeed2v2 And that would be exactly my point. If they would design a GPU from ground up which had that raw compute performance and 3d model rendering capabilities on 1 chip together. That would be a sight worth seeing. But that all ofc is just a wet dream that will most likely never happen. :/

    • @RWoody1995
      @RWoody1995 10 лет назад

      VilleF1N no my point is its not as simple as that, its not like amd can teach nvidia to improve compute performance and nvidia teach amd to improve efficiency, each design type is there because of limitations of cooling and power delivery, its simply not physically possible right now to add both more compute performance and specialisation at the same time, thats why the performance of a GTX780/Titan and R9 290x are so close together, they both are pretty much at the physical limits of 28nm silicon and its not amd or nvidia who will create the methods for creating 22nm and smaller silicon its the silicon fabricators they use.
      its best off to keep the two companies competing, even better still lets hope maybe intel joins the graphics market too, if we get three companies competing with eachother more ferociously then graphics power will increase much faster than if either company doubled in size and got twice as many smarter engineers

  • @larskrau6966
    @larskrau6966 10 лет назад +6

    so deciding between AMD and Nvidia in a few years will be like deciding between xbox and playstation right now ...

    • @peterei3
      @peterei3 10 лет назад

      Does that mean that everyone will buy Nvidia cards!?

    • @larskrau6966
      @larskrau6966 10 лет назад +2

      peterei3 No, it means you choose the card manufacturer depending on the games you wanna play. The order i wrote that in is totally random

    • @unitedstatessc
      @unitedstatessc 10 лет назад +1

      peterei3 Both those consoles are AMD/ATI and support Mantle. It's a good bet that if you want the best experience with ports, you want to go with that card. Nonetheless, because of bitcoin the demand is really high, forcing the price up.

    • @AnthonyBrusca
      @AnthonyBrusca 10 лет назад

      Ragh D Consoles DO NOT have mantle, but they do use a low level API that takes advantage of GCN architecture, but encoding is not the same.

    • @stoneyyay4557
      @stoneyyay4557 10 лет назад

      Ragh D "What Mantle creates for the PC is a development environment that’s *similar* to the consoles, which already offer low-level APIs, close-to-metal programming, easier development and more (vs. the complicated PC environment). By creating a more console-like developer environment, Mantle: improves time to market; reduces development costs; and allows for considerably more efficient rendering, improving performance for gamers. The console connection is made because next-gen uses Radeon, so much of the programming they’re doing for the consoles are already well-suited to a modern Radeon architecture on the desktop; that continuum is what allows Mantle to exist" Quoted from AMD
      in plain english, nextgen consoles already use low level APIs due to their operating systems. hence why u can get such awesome graphics out of relatively little processing power

  • @KingCorny93
    @KingCorny93 8 лет назад +2

    What's the purpose of having 2 different brands in one PC if you can only run one at a time? If I pay 5000$ for a PC I want to be able to use both brands at once. Like having AMD shadows and Nvidias MSAA.

  • @tacticaladvance
    @tacticaladvance 10 лет назад +11

    Noooooooo

    • @darkkiller1588
      @darkkiller1588 10 лет назад

      Another competitor for the organization xD

    • @tacticaladvance
      @tacticaladvance 10 лет назад +2

      ThePvPCreator I am not competing just better (-:

    • @GFORSE
      @GFORSE 10 лет назад +1

      #TheTAArmy
      The thing is, is that eventually there will be so many people in the Linus Org that it will just be a complete clusterfuck, but while in TA...there may be a limit, so at the end of the day...
      Organisation>Clusterfuck

    • @tacticaladvance
      @tacticaladvance 10 лет назад +3

      George Forse Well said George Forse

    • @GFORSE
      @GFORSE 10 лет назад

      Thanks

  • @Gwalsby
    @Gwalsby 10 лет назад +3

    Well I already have an amd machine I just need to build an nvidia one then Ill be set

  • @RealGengarTV
    @RealGengarTV 10 лет назад +11

    "It's all about the games not the hardware you choose" .. Witch i agree with and is one of the reasons why i bought XBOne ^^,

    • @omicronpvp5656
      @omicronpvp5656 10 лет назад +58

      Peasant.

    • @StewieGriffin
      @StewieGriffin 10 лет назад +3

      when i play games i like mouse and keyboard because of the skill you need to play mmorpgs:
      more memorizing
      faster reactions

    • @armysuper3025
      @armysuper3025 10 лет назад +25

      Xbox one is shit

    • @janzigii
      @janzigii 10 лет назад

      Army SUPER hmmm why?

    • @armysuper3025
      @armysuper3025 10 лет назад +25

      janzigii Overpriced, weak hardware, many problems, Useless features.

  • @vonSaufenberg
    @vonSaufenberg 10 лет назад +4

    I think you should make a video that states what the fuck each and every technology does or, if you already made a video about it, post the link. I also believe that if the two companies worked together to make a super awesome GPU, graphics could be taken to a new level.

    • @xFlRSTx
      @xFlRSTx 10 лет назад +2

      no graphics would not be taken to a new level, don't you know anything about the purpose of competition in a market.

    • @vonSaufenberg
      @vonSaufenberg 10 лет назад

      no i don't

  • @Cthulhu91
    @Cthulhu91 10 лет назад +142

    Dat bastard PC, next time Linus please, don't pollute a Gaming pc with those shitty AMD

    • @gingerflakes101
      @gingerflakes101 10 лет назад +25

      Yeah fuck AMD

    • @DAVEDAMRON
      @DAVEDAMRON 10 лет назад +101

      gingerflakes101 bull.... AMD is fantastic, nothing against Nvidia but they suck up to Intel way too much, and Intel is the PC version of Apple: overpriced (to pay for marketing)

    • @Cthulhu91
      @Cthulhu91 10 лет назад +23

      Quality has a price, shit is free... Intel + nVidia = like a god, Amd + Amd = uhmmm 2•Amd

    • @DAVEDAMRON
      @DAVEDAMRON 10 лет назад +42

      Quality does have a price, but over-hyped quality is just that: when 30% of your markup is due to marketing, not materials, there is a problem, but the average person buys it, which is why it is done. TV and internet ads work... trust me, I make a living at it. Again, I am not a hater or a fanboy of either, I just know what I need and at what price it should cost.

    • @Cthulhu91
      @Cthulhu91 10 лет назад +7

      DAVE DAMRON Intel produces very high quality and enthusiast CPUs. You can choose top or cheaper CPUs in terms of materials and performance; the first example is the pipelines, less time for a 1x check on x86 or 2x on x64. Intel produces the first choice of chipsets, the best BIOS tech. The same goes for nVidia (CUDA, PhysX, G-Sync, TXAA etc etc), btw we can get more fps than AMD.
      This is not bullshit man. Check YT benchmarks

  • @cracklingice
    @cracklingice 8 лет назад +2

    Wonder how this would play with multi-gpu dx12 and its multi-vendor support.
    Maybe even see dual die cards from each team playing together?
    Oh no, red and green make brown don't they? Team Brown doesn't sound all that great.
    Oh wait maybe we'll go with additive color mixing. Team Yellow sounds better. Go bananas!

  • @alan_harake
    @alan_harake 9 лет назад +16

    I couldn't give a crap about gsync to be honest, but mantle sounds great because it will actually boost the performance of the games I'm playing.

    • @YourPalHDee
      @YourPalHDee 9 лет назад

      Alan Harake I'm somewhat confused about the implications of G-Sync to be honest, and as much as I've enjoyed Mantle, DX12 is set to make it basically irrelevant.

    • @joseluislopes3956
      @joseluislopes3956 9 лет назад +4

      Alan Harake AMD will disable (or already did) all support and development of Mantle because of DX12; game developers won't support both APIs since only one will work on both brands

    • @b_j_t
      @b_j_t 9 лет назад +4

      José Luís Lopes YourPalHDee Have you guys not heard of Vulkan? That's what Mantle is now. It's meant to be the successor to OpenGL, and it's in the same class as D3D 12 (Low level API). The world doesn't revolve around Microsoft's platforms and its games. "Discontinuation" of Mantle doesn't mean anything. Nor does D3D 12 make anything irrelevant. Source engine is OpenGL. Metro's 4A engine supports OpenGL. Mobile devices has to support OpenGL. So yes, there will be game developers that support both APIs, but that depends on whether they want to jump the cross-platform bandwagon or not

    • @joseluislopes3956
      @joseluislopes3956 9 лет назад

      alexyan981 as long as we are talking about power hungry graphics intensive games, d3d will be the way to go because it's the one best supported by both gpu companies. half a dozen games in hundreds doesn't prove adoption of opengl. mantle will disappear, vulkan will stay for specific uses, but if you think most game studios will take on the work of making their games in both d3d and opengl you are mistaken

    • @b_j_t
      @b_j_t 9 лет назад

      Yes D3D is more popular for games nowadays, but OpenGL is just as capable of graphics intensive games as D3D is. They're just APIs. They talk to the same hardware that does the same things,
      I said: "there will be game developers that support both APIs, but that depends on whether they want to jump the cross-platform bandwagon or not". It doesn't mean "OOOH SHIT MAN VULKAN WILL KILL MS DX!!!"... All I'm saying is D3D 12 doesn't make Mantle/Vulkan irrelevant in any way. So no I'm not mistaken because I didn't even say the things you thought I said...
      The only reason D3D is more popular than OpenGL is because of momentum. OpenGL vs D3D was more of a debate in the earlier days of this century. D3D was able to gain the upper hand from 2006-2010 because OpenGL's advancement was pretty much stalled during that time. Not only did they not add many features considered modern at the time, they deprecated a lot too. Whereas MS deprecated a lot, but added more to replace them. Although I did say that they both talk to the same hardware that does the same things, but at this point in time, some features just weren't possible with OpenGL. So for at least 4 years, Choosing D3D as the rendering API was a no brainer because it's hands down more modern.
      But by now, both sides have pretty much closed the gaps. OpenGL is pretty much the equivalent of D3D. There is nothing you can do with D3D that you can't do with OpenGL. But like I said before. Momentum. DX engines accumulated too many tools over those years to be ditched for OpenGL/Vulkan unless you were jumping the cross-platform bandwagon.

  • @rochr4
    @rochr4 8 лет назад +3

    G-sync is dead you muppet ;)

  • @SaakeBotha
    @SaakeBotha 10 лет назад +2

    Buy nvidia if you want awesome driver and game support. Buy AMD if you want to feel exclusive. At many LAN's people kinda freak out when they hear I have an AMD card...

  • @williamschaffer2858
    @williamschaffer2858 9 лет назад +25

    Is it just me or from 2:04 to 2:20 was the Nvidia one brighter and had more colors?

    • @christi198281
      @christi198281 9 лет назад +8

      The Nvidia also had very little tearing and smoother motion.

    • @AK123456
      @AK123456 9 лет назад +4

      Maybe because of the difference in zoom ... maybe

    • @dirk41nowitzki41fan
      @dirk41nowitzki41fan 9 лет назад

      I noticed that too. AMD always seems to have a washed-out look, even on consoles. I wonder if there is a color editing fix.

    • @hooplan77
      @hooplan77 9 лет назад +1

      I think it was just the area they were in, in the game.

    • @joeymt9215
      @joeymt9215 9 лет назад +11

      ***** is that really civil?

  • @alikhoobiary6595
    @alikhoobiary6595 8 лет назад +3

    Linus... for the love of all that's holy.... put down... THAT MUTANT! **picks up pitch fork**

  • @hoffyc.h393
    @hoffyc.h393 10 лет назад +2

    I have an R9 290 card but i have no idea how i can get Mantle on it. So please can someone tell me how i get Mantle?

  • @Koolkid736
    @Koolkid736 8 лет назад +4

    crossli or SLIfire

  • @kleinkevindd
    @kleinkevindd 9 лет назад +4

    only 5000 dollars lol

  • @criznittle968
    @criznittle968 2 года назад +2

    Came here from WAN show and expected a much more elaborate video with benchmarks etc, we didn’t get a whole lot from this.

  • @PyroFire-Firework_is_a_passion
    @PyroFire-Firework_is_a_passion 7 лет назад +3

    1:30 it was at that moment that i checked that the date of the video wasn't April 1st

  • @spartan23456789
    @spartan23456789 9 лет назад +4

    I play a lot of different games [about 250 games on steam] and im honestly thinking of doing this with an R9 295x2 and a Titan X...

    • @mobertlawl
      @mobertlawl 9 лет назад +1

      You're going to need a 1000w psu for a 295x2 alone xD

  • @Jakewake52
    @Jakewake52 2 года назад +2

    Came here from the WAN show, good lord is that a densely packed pc - the fact this works so well is so dumb in the best way, and while the issues this was made to counter may have died down a bit, so many points ring true now

  • @CubesTheGamer
    @CubesTheGamer 10 лет назад +4

    Well, too bad we can't be running the two brands at the same time so we could have like Gsync and mantle AT THE SAME TIME IN THE SAME GAME.

    • @XxShindouxX
      @XxShindouxX 10 лет назад

      Stop...you don't know what you're really asking for? Do you want to get rid of all that is known as "real life"?

    • @no-turdburglars_inc
      @no-turdburglars_inc 10 лет назад

      Just quit. so you aren't really talking about anything in english? Speak english.

    • @CubesTheGamer
      @CubesTheGamer 10 лет назад

      I was speaking English. We need to be able to have, as Hannah said, the best of both worlds.

    • @stoneyyay4557
      @stoneyyay4557 10 лет назад

      CubesTheGamer if nvidia implements mantle we will have the best of both worlds in the nvidia option. BUT we may have to wait and see what all this hubbub is about with freesync, and how well it performs

    • @CubesTheGamer
      @CubesTheGamer 10 лет назад

      Well if they get Mantle...nah it's not worth Gync. I'd rather have mantle than Gsync

  • @Cad_yellow_on_the_1_inch_brush
    @Cad_yellow_on_the_1_inch_brush 10 лет назад +7

    I have one EVGA GTX770 4GB, an i7, and 16gb of ram and never go below 60fps on Far Cry 3 Ultra settings. But AMD cards in CrossFire were only getting 40 to 50 fps?! Am I missing something here or does AMD just suck?

    • @frendiom
      @frendiom 10 лет назад +8

      Are you running far cry 3 in 1440p? (also nvidia got better drivers)

    • @Kaasgeelheid
      @Kaasgeelheid 10 лет назад +6

      I think you are missing 1440p

    • @Cad_yellow_on_the_1_inch_brush
      @Cad_yellow_on_the_1_inch_brush 10 лет назад +2

      KSGH Yeah I run it at 1080p, so I was missing something xD

    • @MrHENRYTHEGAME
      @MrHENRYTHEGAME 10 лет назад +2

      frendiom that nvidia got better drivers statement is dead, can't believe people still rely on that.

    • @MaxxxMaxMaxx
      @MaxxxMaxMaxx 10 лет назад

      X i got fooled by the video just like you ^^ and i got a 650 ti boost and a i5 4670k, 55-60 fps mostly in far cry 3 ultra settings :3

  • @Aeturnalis
    @Aeturnalis 2 года назад +2

    Only a few years after this, the consumer choice between hardware vendors no longer has anything to do with what technologies they prefer, but which card is available for less than the cost of a used car.

  • @lammatt
    @lammatt 10 лет назад +10

    why is mantle even in the discussion?
    directx is still the way to go for PC games, isn't it?

    • @billiampotter
      @billiampotter 10 лет назад +25

      No, DirectX isn't going to go much further, what Mantle does is take load off of the cpu and puts it onto the graphics card which helps a lot on cpu intensive games

    • @makerstories4008
      @makerstories4008 10 лет назад +3

      I'm not sure honestly, DirectX is old, Mantle is exclusive, OpenGL is being updated soon but it has awkward extension support.

    • @makerstories4008
      @makerstories4008 10 лет назад +5

      Will Lovrak actually it's the other way around. the cpu is processing certain graphics calculations to relieve stress on the gpu...

    • @lammatt
      @lammatt 10 лет назад

      ***** true
      this is why mantle is only beneficial to lower end graphic cards (aka garbage) or APU builtin graphic

    • @bigboi9611
      @bigboi9611 10 лет назад +3

      matt lam it is beneficial to low end cpus so they can run with better gpus

  • @jedidethfreak
    @jedidethfreak 10 лет назад +6

    While I respect your opinion, Linus, and agree to a great extent, here's the question -
    what are we supposed to DO about it?
    As crazy as it is to expect people to get a pair of 780ti's AND a pair of 290x's in one computer, it's just as crazy - if not moreso - to expect AMD and nVidia to trade their proprietary tech.

  • @Surms41
    @Surms41 9 лет назад +2

    The fanboys in these comments are real.

  • @jefflum4040
    @jefflum4040 8 лет назад +3

    it took two of the 80-series graphics cards to play games at 1440p in 2013-14, that sucks

    • @pinecone5129
      @pinecone5129 8 лет назад +1

      HD PRO lol I can do 4k with a 1070

    • @ariellaantinero2777
      @ariellaantinero2777 7 лет назад

      I found this awesome all in one cheat for Crossfire :) twitter.com/8951b998a93885e75/status/754954572104212480 SLI Crossfire in Same PC

  • @touchainzentertainment
    @touchainzentertainment 10 лет назад +5

    Use AMD for the gaming, then Nvidia for the powerful PhysX

    • @kunnorinno
      @kunnorinno 10 лет назад +6

      lmaoooo

    • @touchainzentertainment
      @touchainzentertainment 10 лет назад +1

      its possible, ive done it before. you just need a hack to enable physX

    • @patrickh92able
      @patrickh92able 10 лет назад

      TouChing Thao pathetic

    • @touchainzentertainment
      @touchainzentertainment 10 лет назад

      PC_GameWorld its not pathetic. its the only reason you should use it in this state

    • @kunnorinno
      @kunnorinno 10 лет назад +2

      "powerful PhysX" dude, do you even know what it is ?

  • @saltyreviews341
    @saltyreviews341 3 года назад +2

    Sli and crossfire will be obsolete, Boooy.

  • @haitianxu
    @haitianxu 10 лет назад +4

    The solution is simple:
    CONSOLES!
    Console peasants assemble!

    • @haitianxu
      @haitianxu 10 лет назад +1

      crazyp2jimmy We are legion. We cannot be stopped. Accept

    • @Ashquacks
      @Ashquacks 10 лет назад +1

      ***** declined.

    • @haitianxu
      @haitianxu 10 лет назад

      Brian Nguyen Horan Your words are as empty as your future. We will be your demise. You cannot escape fate.

    • @Ashquacks
      @Ashquacks 10 лет назад +3

      ***** The element of escape shall be with me for a lifetime. In fact, I drive the hybrid version.

    • @IrishPlaysPC
      @IrishPlaysPC 10 лет назад +3

      Please tell me you're being satirical. If not, stop trying to cause arguments and get off Linus' peaceful channel.

  • @Hobbes4ever
    @Hobbes4ever 8 лет назад +3

    more than two years later a single gtx 1070 would probably outperform this POS at a much, much lower price

    • @Fleetfoot
      @Fleetfoot 8 лет назад +1

      I have always preferred nVidia, my first GPU ever was a GT8600, then a GTX550Ti and I have been saving money for a good while now and a friend of mine offered me his GTX1080 (he's upgrading to the Titan Z) for a cheaper price which I'm going for without doubt.
      I just hope my i5 3570k won't bottleneck it, cause I really have no money for new cpu or even motherboard for that matter xD

    • @Hobbes4ever
      @Hobbes4ever 8 лет назад

      lucky you! I wish I have rich friends who would ditch a 1080 for the new Titan. My current gtx 970 is my first Nvidia card, used to use ATI only but when I was looking to replace my old cheapo ati HD 6750 1gb over a yr ago I learned how crappy the older gen AMD card was especially when it comes to efficiency. I am just glad that AMD finally woke up and released the RX 400 series. too bad that they are currently not offering any cards that can match the performance of gtx 1070 and 1080.
      I think it really depends on what games you want to play. Most fps should be fine with your i5 3570k and maybe even the new Tomb Raider and fallout 4 if you overclock it. My i5 4430 is probably worse than yours but still manage to get an average of 45 fps while running Rise of TR with everything set to high and very high at 1080p.

    • @Fleetfoot
      @Fleetfoot 8 лет назад

      Yu Tub
      Oh yes it's a friend I made at college, he has rich family, drives a Mitsubishi Evolution and all, he sold me the gtx1080 ZOTAC founders edition for half the price (yaay life is good sometimes)
      Me on the other hand lack funds and I am still stuck with the i5 3570k, 8gb ram 1666mhz and no SSD yet only a seagate e.c 3.5 (3TB) also my monitor can only do 1080p so no ultra or anything for me yet.
      But that's the only purpose I'm at college, so in the future I can have the best high end gear xD

    • @Hobbes4ever
      @Hobbes4ever 8 лет назад

      ssd doesnt give you more fps but would definitely make your games load much faster AND its totally silent which i really appreciate. Your current rig is good enough for 1080p gaming especially the more demanding games like Rise of the Tomb Raider. I wonder how many more fps you would get running that game vs my i5 + gtx 970.
      yeah a college degree is nice and all. especially if you want to move to another country like im trying to

    • @Fleetfoot
      @Fleetfoot 8 лет назад

      Yu Tub
      Oh trust me I am not happy with my country either, I wish I was from Canada like Linus here, but oh well... Also I installed it today and it's working like a charm, I can finally delete LoL and play some funnier games xD

  • @tmhedgehog7813
    @tmhedgehog7813 9 лет назад +1

    I'd love to see this tested with a DX12 demo/game that supports vendor mixing!
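
    (DX12's "explicit multi-adapter" does in fact let one application enumerate and drive GPUs from different vendors side by side; how the rendering work gets split between them is entirely up to the engine. Below is a minimal sketch of just the discovery step in C++, assuming the standard DXGI/D3D12 headers and linking against dxgi.lib and d3d12.lib. It is illustrative only, not something shown in the video.)

    #include <dxgi1_4.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    #include <vector>

    using Microsoft::WRL::ComPtr;

    int main() {
        // Enumerate every hardware adapter the OS exposes (e.g. a GeForce and a Radeon).
        ComPtr<IDXGIFactory4> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

        std::vector<ComPtr<ID3D12Device>> devices;
        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC1 desc{};
            adapter->GetDesc1(&desc);
            if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip WARP/software adapters

            // Each vendor's GPU gets its own independent D3D12 device.
            ComPtr<ID3D12Device> device;
            if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                            IID_PPV_ARGS(&device)))) {
                wprintf(L"D3D12 device created on: %s (vendor 0x%04X)\n",
                        desc.Description, desc.VendorId);
                devices.push_back(device);
            }
        }
        // A renderer could now, for example, rasterize on one device and run
        // post-processing on the other, copying resources between them.
        return 0;
    }

    (The API only makes both adapters visible in one process; whether a given game actually splits its frame across the two vendors is up to that game's engine.)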

  • @robertgonzalez6046
    @robertgonzalez6046 9 лет назад +3

    can i switch from amd to nvidia without problems?

    • @robertgonzalez6046
      @robertgonzalez6046 9 лет назад +2

      ludovico are you sure? i've heard that you'll get driver errors?!

    • @zigmanist
      @zigmanist 9 лет назад

      Download nvidia drivers

    • @danshady09
      @danshady09 9 лет назад

      What if his motherboard doesn't support nvidia?

    • @zigmanist
      @zigmanist 9 лет назад +1

      Every motherboard supports nvidia (unless it has the wrong pcie version)

    • @danshady09
      @danshady09 9 лет назад

      Sorry I was thinking that his motherboard might not support a certain cpu, but yeah you can switch between amd and nvidia

  • @marekvrbka
    @marekvrbka 10 лет назад +1

    Nvidia and AMD, I know you are tough competitors, but please, make at the very least BASIC SUPPORT for this multi-manufacturer GPU configuration

  • @zhenwang6609
    @zhenwang6609 10 лет назад +1

    How is $5000 a "value" price? Sure, the system is good, but average people, even hardcore gamers, don't do that.

  • @anzekranjc3312
    @anzekranjc3312 9 месяцев назад +1

    My guy made a top-of-the-line quad GPU system for the cost of a single GPU system today

  • @HashtagPULSE
    @HashtagPULSE 9 лет назад +1

    Am I okay putting both my 760OC and 7770 into my pc? :P

  • @deagt2695
    @deagt2695 7 лет назад +2

    Waste of money...

  • @gabrielayumi92
    @gabrielayumi92 7 лет назад +1

    I did it with dual 1080s and a Fury X
    I fried the power cables of my house