I don't think it's brand loyalty as much as them simply having a better product, and most people only care about that. AMD just disappoints over and over again. Just like with Vega 64, you'd need to be an absolute AMD shill and fanboy to buy it over a GTX 1080 considering everything. I wish AMD stepped up their game just like they did with their Ryzen CPUs, but... I'm ordering an RTX 2080 Ti soon.
@@skuyzy198 I have a Vega 64. My Asus ROG Strix 1080 went into my son's computer. The difference between the 2 is negligible at best, at least in the games we play; we're talking under 5 fps, if any difference at all. In some games in DX12 or Vulkan, the Vega actually beats our 1080. Are you thinking of the 1080 Ti????
@@skuyzy198 I totally agree that currently, and if I'm honest ever since the 900 series, Nvidia has had better GPUs than AMD in most segments. That isn't what I am arguing about. My statement is that I wish people desired AMD to do well as a benefit to the community in general rather than just a means to reduce the price of Nvidia products.
Now's a great time to pick up a GTX 1060 my dude. Huge upgrade from the 7950 and it just took a massive price drop. It's priced almost as low as the 1050 was 6 months ago.
It's kinda funny that people with 0 clue are talking about AMD drivers. The 5700 and 5700 XT haven't had issues for months. It was just a small period when they were released, and people are still complaining about it, but they never really had those cards in their PCs.
Radeon VII works fine for me, esp with the better drivers now (no more crashes) - However there's no ray tracing. I've never had the GPU max out on me, and it plays fine with Star Citizen 3.5 / 3.6 - The fans are quiet (I have an open frame Thermaltake) for most games, and you can adjust the curves (as normal). I think the price / performance is good on this card
Just a suggestion on displaying graphs... I think it would be best to keep the product in question (Radeon VII in this case) displayed at the top of the graph and immediately below it are the other comparison products. This way, the viewer won't need to find where the product being reviewed is for every slide. Even if in this case, the Radeon VII is highlighted, I needed to wander about and pause the video to see the comparison. Having the product always be at the top is a quick way to deliver performance comparison info.
SC2 and LoL are both still DX9 too IIRC. I think the base version of DOTA is too, but the new version uses vulkan when available (and therefore probably something newer).
0:20 Seriously though, I listen with headphones all the time. You guys, and everyone else, on videos, live streams, WAN shows- whatever the hell those are, are TOTALLY good! It doesn't hurt in the slightest. In fact, I use headphones so that random loudness doesn't bother my roommates via speakers. How is this not a meme?
6:33 Based on GamersNexus' (or rather Buildzoid's) video, the power delivery on the Vega is overkill. Power delivery is NOT a limiting factor on the Radeon VII
AMD isn’t a charity. They don’t care about you. Making the choice to buy them anyways is fanboyish and allows them to get away with NOT being competitive, if they know fanboys will buy it anyways.
AMD fanboys kept buying stuff and AMD is going up. So your assumption is wrong. If the stuff were terrible, even fanboys would not buy it. In this case, the GPUs are very similar for the same money, so whether my money goes to Nvidia or AMD, either way it supports competition, not a monopoly.
@@bitkarek No, AMD releases competitive things on their CPU side - that's why they're doing so well. Their GPU side has been disappointing. "Wait for Polaris" (only really saved through mining). "Wait for Vega." "Wait for VII." "Wait for Navi." No, it's not very similar. It uses a lot more power and they essentially released a card that would have been competitive two years ago.
if nobody went with AMD in past years, they would have gone bankrupt, not come up with Ryzen. If you have only 2 companies and you buy only from the better one, the second one would be gone fast. They would not have money for development and bye bye.
@@bitkarek What are you even trying to say. AMD happened to survive because they released semi-competitive products. Anemic, but semi-competitive. You don't buy from one company if they've wasted years and money just to keep them afloat enough so that competition exists. You let them die and let anti-trust legislation settle it. You don't reward them for lack of innovation and bad products. AMD has shot themselves in the foot for years.
At least they listened to one thing and uncapped more FP64 performance, now at 1/4 rate rather than 1/8th. Though that also shows it's entirely a software limit and it could have been uncapped to full rate like the MI50, but ah well. I'll be curious, if the iMac Pro is refreshed with a chip like this soon, what its FP64 rate will be, as Apple co-writes the driver.
These are just really low binned MI50 machine learning accelerators that AMD launched in what, October of 2018? 7nm dies are expensive. It makes sense to bin this way.
How is the Radeon 7 a failure? It offers decent gaming but wipes the floor with the 2080 Ti in professional applications. People just didn't use it for what it was good at. The Radeon 7 and Vega GPUs have always been as close to a workstation GPU as you could get on a budget. I think I already commented here
"You were supposed to Destroy Nvidias Price tag not join them!!!!"
Bring balance to the market!
Not leave it overpriced!
@@fallshimjager1 XD
Yup, I think we were all hoping for the R7 to be the chosen one.
sell or sell not there is no try
Actually their profit margin on that card is almost non-existent; do you know how much an Ngreedia 16 GB HBM2 card would cost???
Guys, can you keep the GPU positions fixed in the charts, no matter which GPU wins? It's easier to read quickly when you change a variable into a constant...
Hardware Unboxed changes them each slide based on ranking and it doesn't bother me. LTT just shows it too fast.
Go to Digital Foundry for serious review
@@Malus1531 LTT's slides are way way way too fast.
@flycast @Malus You are supposed to pause video while looking at them ;)
@@foxbondpl That doesn't work on mobile, the play button obscures half the screen.
At 0:36 you can clearly hear linus drop it and casually keep talking like he didn't just do that.
He dropped the box
Yeah, you can also clearly see that he risks a drop at 3:40 :P
don't forget right here 5:16
NightVision Official g
@@MrDerpy-ns6sy yeah, Linus is trying his best not to let things fall off xD
Damn, who made the chart? Is it so hard to make Nvidia green and AMD red...
That would be even more confusing for color blind people.
Blue is intel cpu, red is AMD cpu...
seaners24 no.......
yeah like 3:39 and next one.
Someone put in effort to make charts/listings harder to comprehend intuitively and fast.
@@Seanidor Wait - there were different colors?
So, those graphs... are terribly labeled... and orange is the Ryzen benches, blue the Intel PC benches
Jarrod from Jarrod's Tech did an excellent graph. Easy to read, highlighted nicely, and the colors make sense.
ruclips.net/video/CKmExYwcHhY/видео.html
I agree, for a channel with such high production value the graphs are practically illegible, sort it out LTT.
Totally agree, garbage graph design.
I was about to say, I have no idea what the graphs mean with the different colors. (granted I'm not focusing on the video that much)
I felt dumb when you pointed out the different systems, I saw that part and it should have clicked, but it didn't lol.
@@mattsmechanicalssi5833 that guy is just on the up and up I think.
0:36 Linus dropped the Radeon.
That's exactly what I said to myself 😂
nobody:
AMD: 16gb VRAM
Amd gpus have a shit ton of vram
It needs the vram
Fast forward 2 months: Shadow of the Tomb Raider used 11GB of VRAM on the 2080 Ti when RTX is enabled at 1080p. Not sure if it would have used more, since the VRAM was already maxed on that card.
Justin friendo, I think you're only looking at gamers. Because FAR from nobody is excited about the 16 GB frame buffer on this bad boy.
Classic Linus dropped the VII in the background at 0:37
muhammad fahmi lol
makes overclocking easier
Lmao
loled hard
I believe that was the bottom of the box.
0:37 that's the gpu falling from Linus' hands
Lmaoooo this boi wild
Linus needs to use "the stick" that wide receivers use to keep from dropping everything. I'm dead :'D
that's probably the box getting tossed out of view
That leaves a dent on the brushed aluminum cooling cover
It sounded more like the clear gpu stand being dropped onto the table after being removed from the box
0:54 I see he didn’t forget to add “And it’s got RGB”
_Professionals have standards_
They also kill french spies.
I see you everywhere I go
I reply here so someone can see this and tell me why the hell DX12 doesn't outperform DX11 years after launch, and Vulkan?
be polite
Be efficient
It’s a good productivity card because of how much ram it has and the ram bandwidth
What a lame comment, it's built for games and has a $700 price tag, AMD is just shit
@@PutinIsKing it is a repurposed productivity card... How is this comment lame?
StgY Cobalt Me chillin with 2080ti be like: 👁👄👁
@@mitzey4545 2080ti is a waste lol
@@PutinIsKing The Radeon 7 is actually a server card. That's why it has that much high-bandwidth VRAM. Moreover, the card is very fast in compute tasks but not so fast in games.
Better graphs next time, please.
These are so confusing!
Terrible design.
Even when paused I can't figure out which test is which in the Blender slide. Even the colors in the key at the top are confusing; it makes it look like there are 2 results for each test and they are put on separate graphs? I think? Anyone else confused?
Look at gamers nexus's review if you want to know about accurate numbers rather than being spoonfed opinion pieces
What? Can't you see the red on red highlighting?
I found them rather easy to understand. Maybe this is a you problem, and not an oversight on their part?
I wish they had arrows pointing left or right to indicate which direction is better.
The only thing I learned from this video: *Jensen Huang is Lisa Su's uncle*
That isn't true
her grandfather is his uncle. so not exactly uncle and niece but close.
They're more like 1st cousins once removed.
Wow I know this since when I was inside of mom.
@@Interestingworld4567 hol'up
0:20 Thank you thank you thank you LMG for having Riley on the team. Seriously makes my day every time.
Drivers issues
Best part of the video.....
Riley is the best.
0:25 Even AMD's CPU Division are impressed! =)
He's going apeshit
WHY ARE THERE 8 GRAPHS IN ONE?! Maybe hire a statistician?
Yeah having that many just makes it confusing and I keep having to pause to find the ones I’m looking for
I'm an equities trader and it looked fine to me. Git gud?
@@manictiger this just proves his point. not everyone is an equities trader.
@@rubenlungu5763
r/whoosh
@@manictiger ok, I'll take that. Well played.
I do not take sides with either competitor, but I fell in love with the looks of it. The design is not only awesome to look at but also more practical from a manufacturing perspective. What someone should attempt is a minimalistic case on this theme that complements its looks. Linus is right on the cost; they should have released it for a bit less or made an 8GB variant. In any case, we do have a choice. We can opt to be ripped off by either Nvidia or AMD.
Eh, I dislike the look of this card the most.
@@crylune Yeah I personally like the turing cards
Yeah the card is sexy, kinda like the b2tf DeLorean of GPUs. Unlike most other cards with the shitty plastic designs that stereotypically resembles someone who wears XL monster drink hoodies and doesn't remember the color of their desk. Titan RTX is classy as well
What awful graphs.
yup they are gross, really gross.
0:34, that classic sound of something being dropped by Linus...
Here from the future!
My Radeon VII has aged beautifully compared to the 2080.
1: I still have enough VRAM to run games at max textures at 1440p (something the 2080 can no longer do with new titles)
2: Ray Tracing on the 20 series ended up being a joke. Most users still used rasterized lighting due to the hit on performance.
3: My card supports FSR 2 (technically, so can the 2080) which runs much better than DLSS1 and honestly on par with DLSS2.
Much happier with my purchase these days seeing that the card has aged like a fine wine in comparison to the 20 series equivalent
I'm planning to upgrade to this card (currently running a 1660 Super), would you recommend it as a viable upgrade? (The card price is around $250 USD). Thanks in advance for your time and attention.
@metalchungus6241 Yes, it is a viable upgrade, and the price you've found it at is very low. Though, honestly, it's sort of more of a collectors card and is starting to show its age in newer games for me.
If you were to upgrade, I would highly recommend something newer just so you get more time out of it. Don't get me wrong, I can still run everything at 1440p 60fps max textures, but there is no Ray Tracing and the card is already 4 years old.
@@metalchungus6241 careful, they tend to randomly die; get a 1080 Ti over this, it beats it
How in the world has the Radeon VII aged better than the 2080??? That's not true. I bought a 5700XT when it came out and it beat the Radeon VII for a fraction of the price. It still beats the Radeon VII and costs around $200 used. The 5700XT is around 2070 Super performance, the Radeon VII has around 2060 Super performance. Unless of course what you're doing benefits from its RAM. You really should upgrade, I went for a 3090 for $750. Next year prices will drop even more. Radeon VII is too hot for the little performance it gives compared to cards still a fraction of its price used. I suppose there's no harm in still using it, but I would never recommend anyone buy one today.
@@metalchungus6241 The 5700XT costs around $200 used and vastly outperforms the Radeon VII in gaming. Literally zero reason to buy this card, especially used. Zero reason, you will be left with a card with no warranty, that runs hot, and is a terrible value especially in 2023. I hope you didn't buy one....
It's a conspiracy as the CEO of amd is the niece of the CEO of Nvidia
The legend has started
Justin Y. are you running
This bot is too op for captchas even
No they’re cousins :P
@@kiyoponnn some people
This vid didn't age well lol, already discontinued, less than 10 months after launch
so they just couldn't sell off their MI50 so they decided to make a quick buck
AMD with vram: I neeed iiiitt
It's still a beast for video editing.
It was never intended to be a large volume product; it was basically MI60 cores that didn't make the grade, so they were utilised for a gaming GPU.
@@gabrielwhite3890 Why not? It's money sitting there waiting to be snapped up.
No. thank you, I'm good with Intel HD Graphics 620
@@kiyoponnn you already said that
Wow u r a legend
@@PaneledPear fuck off dumbass
Amateur... My laptop is running HD4400
Papa Linus will not want to see us fight.
Can you use lighter colors for your graphs so the numbers in black are more visible and use a solid color background? These graphs are bad.
but dark mode
people these days be using blackout curtains and are in a lightless basement.
Linus, you missed the killer must-have feature.
RTX2080 - 8GB - Supports over 450 chrome tabs before memory runs out.
Radeon VII - 16GB - Supports over 1000 chrome tabs before memory runs out!!
Win!
Isn't that managed by system memory(ram) and not video memory?
could be a joke, but some noobs could really believe you
@@expertojordigg Both are required, technically. Back when I had a lot of stock charts going on (Flash or HTML5 graphing stuff), Chrome (and sometimes my computer) would crash when I hit a certain amount of tabs. It was beneficial to leave them up so that I could quickly flip between them and compare. After jumping from a 2GB card to a 6GB card, I could roughly double the amount. (Probably closer to a triple, but who knows - I was eagle eyeing it.) After the upgrade I could immediately add hundreds more to my saved session without chrome and my computer crashing. System memory was 32GB in both cases.
Your GPU is now used to accelerate all sorts of web stuff, including just compositing the page, drawing fonts/characters, etc.; even drawing a white square in the background is likely now handled by the GPU rather than the CPU, at least for many web browsers on some OS's.
I'm wrong about the amount of tabs, though. In my own testing, 6GB cards support about 800 before windows gets crazy unstable. In theory 16GB cards could approach the 2000 tab level. At that point Chrome will be gobbling closer to 64GB of system RAM. Just 600-800 tabs already consumes well over 20GB in my experience.
HBM2 and 4 stacks of 4GB give the user twice the bandwidth no matter how many GB he is using. It's not about how much memory, it's about doubling the speed it reads and writes. Like people who want a CPU that clocks 5GHz or RAM that clocks 3466MHz. Like people who pay a premium for quad channel memory. I don't run 4x8GB sticks of RAM because I need more than 16GB of total memory; it's because quad channel is faster than dual channel memory. Radeon VII memory: YouTube reviewers cover up the advantage of 16GB of HBM2 by talking only about how much RAM you need. You're letting them think for you. A year after 6 core Ryzen CPUs launched and Intel had 6/8 core CPUs, YouTube reviewers mentioned a 6 core Ryzen stomps an Intel 4 core at gaming if you stream at the same time. Hide the feature by hammering on something else.
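To put rough numbers on that bandwidth point: each HBM2 stack has its own 1024-bit interface, so bandwidth scales with the number of stacks regardless of how much of the capacity you actually use. A quick back-of-the-envelope sketch (assuming ~2.0 Gbps per pin on the Radeon VII and ~1.89 Gbps on Vega 64, which is roughly what the advertised 1 TB/s and ~484 GB/s figures imply):

```python
# Peak HBM2 bandwidth in GB/s = stacks * bus width per stack (bits) * pin rate (Gbps) / 8
def hbm2_bandwidth_gbs(stacks, pin_rate_gbps, bus_bits_per_stack=1024):
    return stacks * bus_bits_per_stack * pin_rate_gbps / 8

print(hbm2_bandwidth_gbs(4, 2.0))    # Radeon VII, 4 stacks      -> 1024 GB/s (~1 TB/s)
print(hbm2_bandwidth_gbs(3, 2.0))    # hypothetical 12GB config  -> 768 GB/s
print(hbm2_bandwidth_gbs(2, 1.89))   # Vega 64, 2 stacks         -> ~484 GB/s
```

That coupling of capacity and bandwidth is also why the 12GB idea discussed further down lands at 768 GB/s: dropping a stack drops a quarter of the bus with it.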
Holy crap, imagine the quantity of porn you can see at the same time!
@BikeHelmet say that to titan rtx 24gb vram
Family reunion be like:
Jensen: Radeon VII reviews are out, it's really underwhelming.
Lisa: I hear your RTX cards aren't selling that well.
RTX cards are actually selling a lot
@@christophegroulx8187 And failing a lot. Ram running too hot and failing. What a way to fuck up a great card!
Christophe Groulx no they’re not 😂😭
Christophe Groulx yeah that’s just not true lol.
Common Dirtbagz it happened for a while with Micron memory batches. Stop denying it. I like RTX and all, but Tech Yes City has a vid of a GB RTX doing that.
Your graphs are unreadable.
How so?
Totally readable for me. Maybe you should stop watching in 240p
the numbers Linus What DO THEY MEAN!!
Designed for PCs, not phones.
check an eye doctor.
Give Lisa time! The woman took the company from near bankruptcy to scaring Intel, and now she's scaring Nvidia
Philip Quaglino this aged well
Aged like fine wine
@@GamingWithSpeed how did it age well? I'm not really informed on AMD stuff
The white hawk AMD hasn't had anything really competitive in the GPU market ever since Nvidia released their Super cards. The only benefits most of the new AMD cards have are PCIe Gen 4 and more memory (which can be useful), but these GPUs still have less raw horsepower than Nvidia's Supers. Plus, the Radeon VII was put in end of life only six months after it released.
@@GamingWithSpeed thank you
It has two 8 pin connectors because it barely draws 30 watts from the PCIe slot. It gets basically all its power from the connectors, which also gives room for extra power needs from overclocking
At $500 with 12GB of memory they would have taken the mid-range and enthusiast market
Newegg already selling the card for $600.
It was $699, then went out of stock, then was taken down, then came back up, still out of stock, at $599. Pretty sure it's a bug, although 3rd parties are usually cheaper than the initial FE cards. So we'll see @@Pwnstared
@@Dante_S550_Turbo Has to be $500 or less. Without ray tracing and with 1080s coming down in price that's the sweet spot.
@@Snipermac99 this shits on a 1080. Especially in content creation.
I'm sorry but 99.9% of the people buying beefy GPU don't give a damn about content creation.
I am glad you didn't literally DROP it this time
Rishiraj Rajkhowa I bet he still did. They just did not leave it in the video
He did drop his wallet though. ;)
@@Hsiss god i feel so sorry for the card
Major spoiler here :(
the Radeon RX 480 reveal was just freaking hilarious LOL HAHA
Doesn't change the fact that the AMD CEO is the niece of Nvidia's CEO.
Nani?
i love meth.
Nani?
Oh shet its actually true. Some game of thrones type shiz
Nani the fuck?
Hol up warthunder sponsored this. Noice
Nice to see you here hehehe
Cheeky bustards
Nooice!
)))
They have too much money
Linus, you're drunk. The human eye can't even SEE 7nm. 🤦🤦🤦🤦🤦🤦😑
Would you recommend using a pound of cooling paste or only a half a pound of cooling paste for a Ryzen 3 processor Etienne?
The human ears can only hear 64gb of ram
@@AstroKitty16 I in fact use high quality thermal paste for my PC builds. Otherwise I would fry my components. It was just a reference to the Verge video with Etienne. He pretty much did everything wrong in his PC build video, jizzed the whole CPU with thermal paste, and put two "fast" DDR4 RAM sticks in single channel, stating that it wouldn't matter which slot, or as he says "bracket", you put them in. Just search for reaction videos from different PC builders on RUclips and prepare for the cringe of your life. Because it's really bad.... Hence the "cooling paste" reference. Enjoy! XD
@@Eltonbang69 OMG... XD. That'll be a "handful" of work for two days.
Imagine the smell emitting from the PC.
The human nipple can only smell 8GB of Cuda Cores
0:25 Me when I run out if chicken nuggies
4:15 the 2080 does better than the 2080ti?
lol
mexicanmanjohn smh
that can’t be right
Smh
smh
Meanwhile in March... What Radeon VII?
Can't find one anywhere.
Anon Mason yeah man it sucks
I wonder why? XD Seriously though, what an ugly looking card. I know performance is a key value, but come on, even Intel's sneak peek at Xe graphics cards looks nicer than that 90's looking brick.
RTX 2080 Trio in my opinion has to be the best looking card though
@@galaxyeyesphotondragon8191 in what way does it look bad? looks fine to me
@@sway4808 compare it side by side to the RTX 2080 Trio. The way the Radeon 7 looks is plain and blocky; it's fine if you don't want to show off your computer, but comparing them side by side it just looks hideous compared to the competition. Functionality wise? It seems good enough.
@@galaxyeyesphotondragon8191 Are u a kid? Only kids prefer style over substance
AMD still makes boss integrated graphics, we need a Vega GPD Win 3
I tweeted them and asked if they would have the 15watt Ryzen with Vega chip and they had a one word response saying "will". So, I'm thinking they will. lol
@@edgeofsins proof?
@@dedgect
tinyurl.com/y8674ngy
I already work at the Krusty Krab
You can't even buy a potato pc lol
Thx for telling me to get a job. I needed that.
Whoever makes your benchmark charts needs a new job. Those charts could be used as a reference for everything not to do when making bar graphs.
Nobody really gives a shit about no Ray tracing. Most people prefer getting a stable 60fps or more .....
You make good points on features though. 12GB(or less) of memory on this card would have made more sense to keep costs competitive
The initial release of RTX cards had unstable performance when doing ray tracing. They have since released a driver and it now performs pretty well.
Absolutely. Ray Tracing is so not ready for prime time yet.
M. K yeah plenty of people actually care about it.
@@fancyslimoshady Tanking fps for a huge price increase is not worth the visual improvement RELATIVELY speaking. Puddles will look pretty. Neat. I'm looking at said puddles for less than 1% of the game. I will enjoy an extra 50% fps 100% of the time.
Sure but RTX and DLSS features are still features. They are available options, which are not available in AMD cards.
There's no missing features if you get an Nvidia card instead of AMD. Some performance hits, sure but not incompatible.
For $700, this card is nothing. 7nm, 0.7nm, 0.00007nm, still doesn't matter for end users. If anything, it's weird how much power it consumes for just 7nm.
7:11 Amd has ReLive? It's just as good as ShadowPlay, and frankly I prefer the ReLive interface.
Did he say it didn't?
I have had both AMD and Nvidia cards from their recent generations, and ReLive videos just look worse at the same bitrate. #Fact
@@SrtRacerBoy lol, thinking if he do a #Fact means it's true....
@@wabachi Lol, implying he is wrong when he isn't. ReLive is most certainly not better.
As a video editor, I prefer ReLive because it records in a constant frame rate, as opposed to ShadowPlay with a variable frame rate. People who use Premiere for editing know it's a pain with VFR. I hate having to convert the videos to CFR with Handbrake because they take so long. While there is a difference in visual quality between the two services, the difference is small.
@@kristanbirbalsingh8161 what are you talking about. Shadowplay records in a constant frame rate.... Seriously, ur a clown
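Side note on the VFR pain mentioned above: if Handbrake feels too slow, that kind of conversion can also be scripted. A minimal sketch that just shells out to ffmpeg; the filenames and the 60 fps target are placeholders for the example, not anything from the video:

```python
import subprocess

# Re-encode a variable-frame-rate capture to a constant 60 fps so editors
# like Premiere keep audio and video in sync. Filenames are hypothetical.
subprocess.run([
    "ffmpeg",
    "-i", "capture_vfr.mp4",  # VFR recording from ReLive/ShadowPlay
    "-vsync", "cfr",          # force constant-frame-rate output
    "-r", "60",               # target frame rate
    "-c:a", "copy",           # leave the audio stream untouched
    "capture_cfr.mp4",
], check=True)
```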
0:23 me on the toilet after a night of taco bell
Lmao
ROFL I'm on the toilet from spicy food watching this rn.
hahaah good one
must be on fire
Should be seeing a doctor if eating a fast food taco is doing that to you.
rx 5000 series: allow us to introduce ourselves
Wait for 2030
@@colemin2 what??
RAD 7 - Who are you?
RX 5700 XT - I'm you but better and cheaper
@RectalDiscourse me with 1 gb
@RectalDiscourse Ummmm never seen a game use 8gb of vram but ok
I expected more from that GPU... What a disappointment...
Btw, the graphs are kind of a mess, too fast, and too busy.
G Dante You should not express your opinion when you know nothing. You are clearly some fat dude in his grandmas basement who has amd stickers all over the room. Fanboys are the worst. PEACE.
@@GreatCakesable I think it's a decent deal to get RTX 2080 performance for $700. You get futureproofed 16GB of HBM2 and also save yourself $100. It's not that bad of a card...
You shouldn't have expected anything the second you heard Vega, 699$, and 300W
@@hambopro4221 And when would you start utilizing the 16GB VRAM though? Around 10 - 12 years? I dunno, but something tells me Radeon VII would not be able to keep up with games by that time. They should have limited it to like, 10GB or something. That way, they can sell the card for much cheaper and be a great deal. But now, I'll just stick with 2080.
@@GreatCakesable why so fucking toxic jesus christ
Wait no ray tracing?! How am I going to cripple my performance for *slightly* better reflections then?! SMH AMD, SMH
And in ONE GAME as well!
Truly baffable, what a disgrace
@@Icybubba You guys don't get it, the frames drop to 24 to make it more cinematic!
OMGSMHWTFBBQ
@@CaveyMoth LOLROFLRIP
Amd: our cpu are good though..
Linus:get a job....
Yes
Big Oof
well, I don't get it, someone pls explain?
@@jcgongavoe337 yikes
their CPUs are still shit. CPUs at the same prices as Intel are worse in gaming and only perform similarly in workstation benchmarks. So then, why get it? Cuz hurr durr some reviewers said Intel killer hurr durr.
Well, all benchmarks tend to show Intel still winning, especially at lower resolutions. So unless you game at 4K, you shouldn't buy Ryzen shit. And even then, why settle for less? Same performance at 4K.
I just bought the compute version with 32 gb of HBM2 on ebay for $250. We'll see if I can get it working for machine learning. And maybe I can run Steam on Linux and use it for games. It's interesting that the game version was $700 new, I think the compute version was more like $5000, where the only difference was that double precision arithmetic wasn't gimped, it only has one video out, a double memory version is available, and the drivers don't even support windows (or even most PCs).
CPU division: 💰💰💰💰💵💵💵
GPU division:
It's really quite amazing that their continued insistence on selling people on ram and buffer that you won't utilize in gaming, actually makes it impossible for them to compete on price now. That's a ferociously stupid business decision that required general ignorance from their customer base, and that has finally caught up to them.
I don't dislike AMD on the whole but I have always disliked their ability to project themselves as the people's champion while selling snake oil.
@@RJT80 They couldn't use 3 stacks of HBM and with 2 it would be bandwidth starved. So there was really no choice but 16GB.
Lolled9991 doesn't really matter how good amd is if people keep buying nvidia products.
wait for navi
@Lolled9991 They're clearly not using their superior architecture to beat Nvidia. Not in gaming, at least. And even in serious workstation loads most are going with Nvidia. There may be some all-around edge cases of people who do a lot of both workloads, but there aren't enough people out there with those specific cases to make AMD a threat currently. Hopefully someday they get it together and can compete again in graphics and drive down prices, but at this point it's hard to justify investing in an AMD card, since if I'm doing those kinds of workloads I can either A, afford two specialized systems, or B, have a workstation at the office to deal with those other workloads, and if I need to do something on mine at home I can afford the extra time given that it's not the primary use of my machine anyways. Idk, it's a confusing play by AMD, maybe just trying to get within punching distance, but they're not there yet, and I'm willing to bet Nvidia, even if you took away their head start of releasing months ago, will far and away outsell them with just 2080s vs the Radeon VII from today forward, and that obviously doesn't include all the other cards in their lineups.
Nothing beats a new GPU review! We only get these videos a few times a year!
facts I get really excited for new gpus
AMD has a Shadowplay competitor software with
Radeon™ ReLive Linus...
Didn't say they don't. GeForce experience is more than shadowplay. And NVIDIA's hardware H.264 encoder being superior has nothing to do with relive...
@@LinusTechTips Are you high? AMD's software and drivers are much more reliable than Nvidia's. GeForce experience is trash and only provides stupid "optimisation" that nobody uses + sharing to Facebook/Twitter.... Also two more words.
Linux support.
@@LinusTechTips Ok, first off, you guys actually read this here. And I just really wanted to write this comment since it sounded like AMD has no alternatives for anything Nvidia does. AMD has the Advanced Media Framework encoder as their version of NVENC, and they actually have something that NVIDIA still doesn't have: being able to tune/change your color, saturation, contrast and so on in-game, any way you want. I just find it frustrating to compare things to NVIDIA "bulletpoints"; it doesn't really represent the actual product if the AMD side of features (even if not superior) isn't even mentioned. Also, personally speaking, I had only problems using H.264 on my 1050 Ti OC + self OC. So it might be cool for the high end but not so much for the lower consumer grade products, even while pushed to their outer limits. But still, thanks for taking your time replying to my comment. I appreciate that.
@@hambopro4221 No need to be rude man/woman or other.
@@Nexipal I'm on your side, apache btw
6:49 - Lmao, Jensen Huang and Lisa Su aren't related, Lisa Su denied it. It's just a stupid rumour that got floated around the internet.
then why do they wear the same jackets?😏
@@saptarshibhattacharya2505 because they had a quickie in the limo and switched clothes all the time
In Canada, the RVII cost 929$ across the board for any models while the lowest 2080 RTX is 969$.
Basically, the RVII is at 700$ US while the cheapest 2080 is at 730$ US.
i'd still rather spend the extra $30 and get two free games as well as more features.
That $30 will eventually be eaten up in power consumption. The AMD card is more expensive in the long run.
@@CalculatedRiskAK Oh snap... you spit hot fyia!
The R7 is currently being offered at £650 in the UK; with 16GB of RAM it's a great card for content creators like myself who will actually use the RAM.
in EU you can find a 2080 at about 670 euros already... AMD has lost the international game before it begun...
The 16 GB of memory may not mean much to gamers, but it means a lot to deep learning researchers, and it is probably the only sub-$2k card with this feature. Meanwhile, the 1080 Ti went from costing $700 to now twice that.
Sure, but as a DL guy myself, you have the *sliiiiiiiiight* problem that every goddamn thing in DL is CUDA-only.
Eldritch Not so much these days: github.com/ROCmSoftwarePlatform/tensorflow-upstream/issues/173 and rocm.github.io/pytorch.html
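For anyone curious what those ROCm links mean in practice: on a ROCm build of PyTorch, the AMD card is exposed through the usual torch.cuda API, so typical CUDA-style code largely runs unchanged. A minimal sketch (assumes a ROCm-enabled PyTorch install; the tensor sizes are arbitrary):

```python
import torch

# On ROCm builds of PyTorch, the AMD GPU shows up via the regular
# torch.cuda interface, so "cuda" maps to the Radeon card here.
device = "cuda" if torch.cuda.is_available() else "cpu"
print("Using device:", device)

a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
print((a @ b).sum().item())  # small matmul just to exercise the GPU
```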
What's funny is now we want this kind of vram in gaming gpus
3:40 Failed Linus drop...
(The card starts shaking)
also 5:18
7:55 the issue with GCN is the scaling of shaders, adding 10 more compute units will not help the card at all.
If I remember correctly GCN is also limited to 64 CUs max due to architectural design.
@benjib26 you are correct. The architecture cannot support any additional shaders.
7:12 I actually prefer the adrenaline drivers over GeForce Experience.
same, amd relive is great
yeah, I had to buy a 970 over a 480 a couple years ago because amd raptr was complete trash but for my next upgrade I went with a vega 56 and was REALLY impressed by AMD drivers
Geforce Experience is god awful..
a lot of people do but you get these nvidia paid reviewers like linus who don't even acknowledge the existence of relive or adrenaline and claim ge is the best thing since sliced bread
AMD really streamlined the experience with adrenaline+relive. Objectively superior user experience at least with vega 64.
Dudes... Radeon VII is said to cost about 3000PLN in my country while 2080 costs 4000PLN... I say that's a great deal lol
Yep idk why in Poland those cards don't have same price xd
In your situation hell yes it is
Lucky.
Same in Australia too
HaydosaXD you can get the MSI reference one for 1169 AUD and the XFX for 1099, which may only be 100 cheaper than a 2080, but I still see that as decent value
But Can it run *Minesweeper On Full Screen*
*On Full Screen*
*4K Minesweeper*
Not with ray tracing.
I bought one when it came out not knowing anything about it except that it was the best AMD card. Well, I still have it; it's never done me wrong in any game I've played and I don't ever mess with the clocks.
0:23 The warning is well appreciated
I wish they coulda done 12GB of HBM2, but that would limit the memory speed to 768GB/s instead of 1TB/s. Woulda made it cheaper. Also, XFX is already selling their Radeon VII on Newegg for $600
768GB/s is still blazing fast considering the 2080 doesn't come anywhere close to that...
the thing is that the vega architecture only performs well with the bandwidth. If they did one with 12gb of HBM2 then it would perform considerably worse, making it probably just something like 10% faster than a Vega64. Not worth it
Vega VII is barely matching the 2080 at 1TB/s; what do you think will happen if it goes down to 768? @@Chrinik
@@nicane-9966 Nothing because you clearly don't know how bandwidth works...
@@Chrinik Well said. Lots of ignorance in this comment thread.
Some people can get jobs
But not us
Not us
Tf you talking about?
@@BrandonBeanland watch "Avengers - End Game TV Spot" and you'll know,, 😂
@@BrandonBeanland some people get jobs. but we only game
they took our jobs
I have a full-time job, 40hrs/week, and I can only buy this card if I save a third of my salary for an entire year. The prices are just fucking frustrating. Earlier I could buy an entire gaming PC within a year of savings.
Wow, they should have just put in half the memory and dropped the cost $150 and they'd probably have had a hit.
the problem is that Vega is extremely memory bandwidth hungry. It wouldn't perform a lot better than the Vega 64 with 8GB of HBM = 512GB/s
Saboth it literally would have cost more for them to do that than not to. If you didn't know, the HBM stacks are integrated on the GPU package itself, and the GPU actually needs all 4 stacks populated with HBM in order to even function, because of the memory bus design. A GPU with 2 HBM stacks would have required a hefty redesign which would have cost a lot in R&D, so the cost savings would have been minimal to nonexistent and you'd be releasing a lower capability card for the same price.
The reason AMD couldn't have designed it for 8GB of HBM2 is that it's just a low binning Instinct GPU in the first place. Radeon VII is a low cost Radeon Instinct that didn't bin well, and because Instinct is a creator card it needs the 16GB of HBM2. Radeon VII is AMD saying fuck it and finding at least some way to recoup the lost profit from the few Instinct GPUs that didn't bin high enough to make it into pro cards. This was never meant to be a real gaming card or a competitor to the RTX; it's just a stopgap for Navi.
Well, 16GB is the only selling factor for professional use. If it was just 8GB then who would have bought it?
You should be amazed that despite doubling the memory bandwidth, the TDP is less than that of the 8GB of HBM2 on Vega 64 (because of the 7nm, of course)
that’s not possible on VEGA
Can you just give us 1 or 2 mins of Review about just for Video Editors who are interested to use this card or RTX 2080 for 4k or 8k video editing? which one is better?
I would assume the Radeon VII is, because of its extreme memory.
Get a Titan RTX
It's better
For a mere $1400 you can beat a $700 Radeon VII. WOW, who would have guessed that?
@@breadbuttrjam1604 its Time Now to go with RTX 3090 or 3080
@@nexttvc yeah
Does anyone actually care about Ray Tracing?
Not much, it's giving up performance for graphics. I know Nvidia relies on graphics, but even they don't understand PC users.
We will 10 years from now.
@@cyphaborg6598 pretty much every graphics option involves "giving up performance for graphics".
VFX artists. Doesn't really make sense for games.
Not sure, maybe it's too early; I really couldn't care less right now. Maybe in the years to come...
I might have to get down to business and choose between a "Ray Tracing" GPU and a "7nm Gaming" GPU.
Yeah same here...
Or do what AMD fans do best and wait for nVidias 7nm GPU line later this year and get both 7nm and RTX.
7nm should not really mean anything to a user who is just trying to game. The 7nm GPU just helped the card crank the performance without significant power use increase. Ray Tracing might be something used in the future, but it comes at a premium. This is a very hard choice and will depend on many factors.
@@seventyseven8076 Well AMD fans have no choice but to buy Radeon VII or wait another 8 months for Navi to answer the already 24 month old performance of the GTX 1080 Ti.
I felt for ya at the end of this video! AMD is so close yet so far away....
Anyways, for those of you who want to save a few bucks, the Vega-64 is good enough for MOST AAA games if you buy TWO of them and cross link them! It's been in the Radeon software since Version 17.9.2 where you can PAIR UP two Vega cards!
Go on eBay and take the chance. I'm seeing them as low as $260 US ($320 CAN) per card, so for $520 US ($640 CAN) you can get all the GPU cores (4096 cores per card) and a total of 16 gigs of VRAM for your gaming AND your work-related vlog editing tasks! AND I would definitely think about getting that ultra-widescreen 2.5K display though! It REALLY makes your system enjoyable!
Amazon is now selling that LG 34WK500-P 21:9 Ultrawide Full HD IPS 34" LED screen (2560 x 1080) with AMD FreeSync, which would make your gaming and video playback/editing REALLY SWEET! And it's only $270 US ($355 CAN)
Cross firing a 300w tdp card is a bad idea.
Also the radeon vii is around 540-550 on Amazon rn so
0:22 makes my day
Oh Absolutely.
👏👏 Radeon Review
*Meme Review*
*claps 2 times* - Radeon 7 dead
@@sujandhar6613 more like THANOS snaps radeon dead
AMD did an oopsie.
It was worth re-watching this video just for Riley's GPU rage fit :D
I hate the framing of "Will AMD be good enough to lower Nvidia's prices?" that everyone seems to be doing lately. Why can't it be "Will AMD finally be competitive in the high end again?" The former implies that their only purpose is to force Nvidia to make GPUs at a reasonable price so that everyone can buy their RGB-laced Nvidia crack to appease their senseless brand loyalty to their favorite dealer, while the latter says hey look, I hope this GPU beats what's on the market now so that video card performance continues to tick up instead of stagnating like the 2060-2080 did.
That has been AMD's game for a long time, that is their role. AMD is a fraction of the size of Nvidia, with a fraction of the resources. The fact that they compete at the level they do, at all, is already pretty impressive.
AMD will never win the war against Nvidia. Nvidia is too far ahead software-wise. You are a gamer? Shadowplay is an absolute must. You are a programmer? Tensor cores are an APSOOOLUUTE MUST. Better-optimized drivers and better community support, as it's 20x bigger.
I don't think it's brand loyalty as much as them simply having a better product, and most people only care about that. AMD just disappoints over and over again. Just like with the Vega 64: you'd need to be an absolute AMD shill and fanboy to buy it over the GTX 1080, considering everything.
I wish AMD stepped up their game just like they did with their Ryzen CPUs, but... I'm ordering an RTX 2080 Ti soon.
@@skuyzy198 I have a Vega 64. My Asus ROG Strix 1080 went into my son's computer. The difference between the 2 is negligible at best, at least in the games we play; we're talking under 5 fps, if any difference at all. In some games on DX12 or Vulkan, the Vega actually beats our 1080. Are you thinking of the 1080 Ti????
@@skuyzy198 I totally agree that currently, and if I'm honest ever since the 900 series, Nvidia has had better GPUs than AMD in most segments; that isn't what I am arguing about. My point is that I wish people wanted AMD to do well as a benefit to the community in general, rather than just as a means to reduce the price of Nvidia products.
NO RILEYS WERE HARMED IN THE MAKING OF THIS VIDEO
This card is a sweet spot between productivity and gaming.
Maybe we should wait for driver updates.
still have my Radeon HD 7950. 😂
still have my Radeon HD 6370M
280x here
it's still a beast tbh
i have two still kicking in crossfire in my second pc
Now's a great time to pick up a GTX 1060 my dude. Huge upgrade from the 7950 and it just took a massive price drop. It's priced almost as low as the 1050 was 6 months ago.
Except the 2080 is still $100 more than the Radeon VII...
Depends on which version; the sick-looking RGB tricked-out ones are, but the standard ones aren't.
The rtx founder cards are much cheaper compared to other companies like gigabyte
Joshua Johnson No it's not
Joshua Johnson the Radeon VII can be found for $550
Water sheep’s Child So can the 2080
Finally the War Thunder marketing team got to work
There's a hole in your left cooling fan!
doesn't make the game better
@@loc1123 it does: more players means less waiting time!
@@loc1123 it's already a really nice game :#, atleast the planes are, idk about the tanks, but the plane's are extremely realistic.
@@mataskart9894 tanks are great, super realistic, trust me ))
0:22 this entire scene is priceless
Indeed it is.
Ah! Screw it. I'm getting an Rx 5700XT.
Wise choice bro
@@radityabagus426 nt a wise choice as it is plague with driver issues. Imma get the rtx2070 super instead with rtx and also dlss enabled
@@potatoking9945 up to you, juat sayin my thought
It's kinda funny that people with 0 clue are talking about AMD drivers. The 5700 and 5700 XT haven't had issues for months. It was just a small period when they were released, and people are still complaining about it, but they never really had those cards in their PCs.
Larine really dude, a 1070 to a 5700 XT? I know the 5700 XT is a good card, but the 1070 is still a capable card in 2020.
The Radeon VII works fine for me, esp with the better drivers now (no more crashes) - however, there's no ray tracing. I've never had the GPU max out on me, and it plays fine with Star Citizen 3.5 / 3.6 - the fans are quiet (I have an open-frame Thermaltake) for most games, and you can adjust the curves (as normal). I think the price/performance is good on this card.
I’m Still playing on the “NEW” Intel HD 4000
Get 1050 ti
Adam's Tricks I'm kidding, I have an RTX 2070 with an AMD Ryzen 2700X
I'll sell my GTX 1050 Ti SC for $90+ shipping
PriGO FCD weird flex?
@@liquidrime i feel that now you are better than me
Quite the powerhouse in OpenCL, otherwise just another 2080 as expected.
Probably more staying power, but that's it.
Yeah but it costs 400 dollars less
@@cyberbitzz8178 It does not? The 2080 Ti costs over 1k. The normal 2080 is in the same range.
Omg a War Thunder sponsor!!!! Linus ily! It's incredible to see a company like that reach out to you.
Just a suggestion on displaying graphs... I think it would be best to keep the product in question (the Radeon VII in this case) displayed at the top of the graph, with the other comparison products immediately below it. This way, the viewer won't need to find the product being reviewed on every slide. Even though the Radeon VII is highlighted in this case, I had to hunt around and pause the video to see the comparison. Having the product always at the top is a quick way to deliver performance-comparison info.
You probably want to say
"Sleepers by day, content makers by night" xd
Really didn't expect a war thunder ad, that's pretty cool.
Funny how he's comparing performance in all these modern DirectX 11 and 12 games and CSGO is sitting there still on DirectX 9.0c lol.
just valve things
SC2 and LoL are both still DX9 too IIRC. I think the base version of DOTA is too, but the new version uses vulkan when available (and therefore probably something newer).
0:20 Seriously though, I listen with headphones all the time. You guys, and everyone else, on videos, live streams, WAN shows- whatever the hell those are, are TOTALLY good! It doesn't hurt in the slightest. In fact, I use headphones so that random loudness doesn't bother my roommates via speakers.
How is this not a meme?
legit watched Riley flipping out like 5 times xD
YTMND
i'd love to get a job but i guess i'm just unhireable...
*where is my parents' credit card*
In your anus
Dipqi Ghozali Under the cigarettes
I bet my old drug addict friend Mark will be stealing his mom's credit card, POS dude.
@@DimitriosChannel ya need to tell him that
They can get hash rates over 100MH/s! A beast 2 years ago and a performance workhorse today that beats the 1080 Ti at mining.
6:33 Based on GamersNexus' (or rather Buildzoid's) video, the power delivery on the Vega is overkill. Power delivery is NOT a limiting factor on the Vega VII.
Actually it could be, but only in LN2 scenarios
5:18 Linus loves hitting/dropping stuff.
I would buy AMD anyway, just because I want to support them in getting more competitive.
AMD isn’t a charity. They don’t care about you. Making the choice to buy them anyways is fanboyish and allows them to get away with NOT being competitive, if they know fanboys will buy it anyways.
AMD fanboys kept buying stuff and AMD is going up, so your assumption is wrong. If the stuff were terrible, even fanboys would not buy it. In this case, the GPUs are very similar for the same money, so my money can go to either Nvidia or AMD, and choosing AMD supports competition, not the monopoly.
@@bitkarek No, AMD releases competitive things on their CPU side - that's why they're doing so well. Their GPU side has been disappointing. "Wait for Polaris" (only really saved through mining), "Wait for Vega", "Wait for VII", "Wait for Navi." No, it's not very similar. It uses a lot more power, and they essentially released a card that would have been competitive two years ago.
If nobody had gone with AMD in past years, they would have gone bankrupt, not come up with Ryzen. If you have only 2 companies and you buy only from the better one, the second one would be gone fast. They would not have money for development, and bye bye.
@@bitkarek What are you even trying to say? AMD happened to survive because they released semi-competitive products. Anemic, but semi-competitive. You don't buy from one company, when they've wasted years and money, just to keep them afloat enough so that competition exists. You let them die and let anti-trust legislation settle it. You don't reward them for lack of innovation and bad products. AMD has shot themselves in the foot for years.
0:20
Man, this gold will be relevant for many years to come
At least they listened to one thing and uncapped more FP64 performance, now at 1/4 rate rather than 1/8. Though that also shows it's entirely a software limit, and it could have been uncapped to the full rate of the MI50, but ah well. I'll be curious, if the iMac Pro is refreshed with a chip like this soon, to see in testing what its FP64 rate will be, as Apple co-writes the driver.
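To put the 1/8 vs 1/4 ratio in rough numbers, here is a minimal sketch; the ~13.8 TFLOPS FP32 peak (3840 shaders x 2 ops/clock x ~1.8 GHz boost) and the 1/2 rate exposed on the Instinct MI50 are my assumptions, not figures from the video:

```python
# Rough sketch of what the FP64 rate cap means in peak-throughput terms.
# Assumed FP32 peak for Vega 20 at boost: 3840 shaders * 2 ops/clock * ~1.8 GHz.

PEAK_FP32_TFLOPS = 3840 * 2 * 1.8e9 / 1e12   # ~13.8 TFLOPS (assumed)

rates = {
    "1/8 rate (originally announced cap)": 8,
    "1/4 rate (Radeon VII as shipped)": 4,
    "1/2 rate (assumed Instinct MI50 rate)": 2,
}

for label, divisor in rates.items():
    print(f"{label}: ~{PEAK_FP32_TFLOPS / divisor:.2f} TFLOPS FP64")
# -> roughly 1.7, 3.5 and 6.9 TFLOPS respectively
```

Since the cap is just a divisor applied in software, the jump from 1/8 to 1/4 roughly doubles usable FP64 throughput on the same silicon.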
This thing should be priced $100 less. THAT would make it instantly competitive.
Meh, it's just the reference card, wait for the partner boards
They aren't making any $$$ off it even as it is, what the hell???
They probably will not be able to make enough profit from that, especially selling at high end, which is a small market space
These are just really low-binned MI50 machine learning accelerators that AMD launched in what, October of 2018?
7nm dies are expensive. It makes sense to bin this way.
They're not making very many of these... literally Europe is getting less than a thousand as of today... it's limited for reasons
I started watching this video and the subtitles were in Hindi.
...I don't ever remember turning that on.
How is the Radeon 7 a failure? It offers decent gaming but mops the floor with the 2080 Ti in professional applications. People just didn't use it for what it was good at. The Radeon 7 and Vega GPUs have always been as close to a workstation GPU as you could get on a budget.
I think i already commented here