Not going to lie - I encounter that GPU Render error (and now it's turned to outright driver crashing) multiple times every day, (and this is on 11GB 1080ti) so you saying you can avoid it on Radeon VII is enough for me to start turning on email notifications for when they come into stock. Thanks for covering this.
And u have got it ? I got mine week ago and is stunning, after bios and drivers update works like a charm, and will be only better :) 24 % increased performance comparing to previous drivers plus lower power consumption :)
@@Dar1usz RX6000... are here.... this is why i stick with console and laptop... it not long about money but i can not had endless cpu and gpu... for all this endless bulshit.
@@calidude1114 I was just replying to this comment when I noticed that you were the person I just responded to on another post lol. What's your PC's build? 128 GB of RAM & a Titan RTX?! What do you do for a living? lol Also, what's the point of all that RAM? I'm guessing you use it for something other than gaming?
Watched pretty much every youtube review from GN to LTT to HC, etc etc. DF is the only one with an even distribution of gaming and production performance and explanation of how this card aligns and competes in the gpu space.
It's refreshing to see, as many 'gaming tech' channels focus souly on gaming performance and poo-poo 16GB HBM2 as pointless, when clearly it isn't. As game texture resolutions increase, more VRam and high bandwidth make a large difference. Many seem to think its gaming performance is poor also, but considering is can nearly match 2080 performance (and actually sold ((all so briefly)) at MSRP), it's pretty good value for money! As long as AMD remains dedicated to driver support, the VII should gain some more performance and should be a good buy, even if it is just a stop-gap until Navi. I'd like to see a board-partner (Sapphire particularly) do a custom cooling solution though, as noise seems like the biggest down-side.
@@Ben-Rogue I doubt drivers are going to make a big difference here. Whatever optimizations AMD needed to make with Vega they likely already have. In this respect Radeon 7 is a known commodity. Secondly, the VRAM doesn't make much sense with regards to gaming. The card's rasterization performance will be tapped out long before its VRAM. By the time the 16GB allows you to retain a higher texture setting, you'll be compromising in so many areas that you'll likely have upgraded. That's why Richard focused on productivity workloads in terms of VRAM. If anything, the Turing cards are likely to age better than this Radeon 7, given how they seem to scale with newer game engines and/or features (as seen in games like Wolfenstein). People tend to focus on RTX and DLSS with regards to Turing's future, but don't pay enough attention to the architectural improvements that will apply across all games on newer engines.
@@Ben-Rogue radeon 7 has very very little architectural differences when compared to vega, in fact as more games using low level apis come, we will likely see turing pull even further ahead as its better in those areas.
@ntodek get a job to fill your free time that you would use to actually game. I make 100k+ a year, I still can't talk myself into getting any of these cards when a $200 card can allow me to play the games I want. I'm kind of sad to see that the jump in power wasn't as big as the time between releases. I was expecting more from mid range cards, but it looks like I can hold off from up grading my Rx 580 8gb or my daughter's 1060 6gb for quite a while. Still not seeing the consistency at 4k that I want. Maybe the rtx 3060 will finally entice me, or even 3070. But 350 for the 2060, I'm not seeing the price to performance. Even the 1660ti is not what I was hoping for.
I've already read and even seen a few reviews and i'm not even that interested (1080) - but I'll ALWAYS come around and enjoy your reviews and info pieces. It's not just curiosity, or entertainment - but pure enjoyment. I really, really like your content ;D Thank you for this & auf wiedersehen ^-^'
yeah the best part about 4k is that you need less than half the power to keep playing in 1080p. can't wait till people are struggling to run 8k so i can buy a garbage laptop with onboard gpu and still play games fast in 720p
@@jjnet123 my laptop has a 1080p screen but i don't mind playing at non-native resolutions in it. i just go for 720p if the situation calls for it. i mean, it's mostly a work laptop now, so the screen can't be that shitty.
@@GraveUypo I was only mentioning it because my laptop was 1366x768 and was a great laptop for its time still does play somethings well but it probably would melt on battlefield v or any of the tomb raider games post 2013 😂
@@jjnet123 mine's a not-so-good 3rd gen dual core i5 that boosts to 3.4ghz with a 2gb (ddr3) 650m and 8gb of ram. if i overclock the hell out of that gpu and lower the resoltion way down, it still runs modern stuff. of course i have to add an external suction fan for it not to overheat, but i have one that works wonders, so it actually runs cooler than stock without it. forza horizon 4 runs like 30~50fps in it on the lowest settings in 1024x768 so lower than 720p. it's the lowest res the game allows, otherwise i'd go down to 800x480 which is widescreen and probably get 60fps. heck i'd gladly go to 640x400 if i had to. i started my pc carreer in the 90's so i remember when that was the upper bound for gaming resolutions. duke3d looked so pretty in SVGA :P i haven't tried many other modern games in it, but i suppose the dual core cpu may be also a bottleneck nowadays. welp, i only do light gaming in it anyway, i have an xbox one x and a gaming desktop for actual gaming.
Yeah, the press driver is absolutely abysmal from looking at Gamer's Nexus's review of the Radeon VII. Hell, you can't really even overclock the thing atm because the drivers are so unstable.
A mild undervolt on the Radeon 7 gives a pretty good performance boost, at least that's what I read on reddit, would love it if digital foundry looks into this matter.....
@@SconVideos Vega runs into a MAX POWER limit VERY quickly. Even at +50%. If you let it run at whatever voltage it wants; it gets to a certain clock frequency (not very high) and hits the power limit (315W in my case). If you under volt it; you can get much higher clock speed before it hits the same amount of power. It seems strange considering you usually need raise voltage to support higher clocks, but vega seems to be really over volted out of the box.
stock voltage is there for a reason, not all chips will allow undervolting and be 100% stable. who the fuck buys a gpu to undervolt anyway. would you also underclock it?
Nvidia Titan X 12GB Frequency 1.25GHz (10Gbps effective) 1.25GHz (10Gbps effective) Interface 384-bit 256-bit Bandwidth 420GB/sec... he meant the pascal titan x
I swear, this the most informative review I saw on this card. Is it really that hard to do a production scenario where this card shines? There are people whit certain workloads that would benefit from this card and this might be the best thing for them. Great stuff DF, as always.
This card has much potential, hope they get some of the drivers and optimization fixed. Hope to see more testing and content with this card. Great video guys, thanks for the awesome content.
Because they do not know anything about video editing or video formats. You can use proxy and edit 16k video footage on a 256mb video card, it would just take 2 months to export it.
It has nothing to do with their speech, it's just that sometimes you have to watch videos without audio. But I saw that the automatic captions got added ^^ And yes, 99% of the time Automatic captions are accurate
Very very helpful review again from your channel. My conclusion is, that: let's see where AMD's Radeon 7 is going in future in regards of design. At least AMD is going to be within upcoming consoles of PS5 and XB2, so I am very curious how that design is going to look like (suppose it will be based upon this Radeon 7 design) but on the other side they may have to work out a lot of issues - for me, the 375 W average power required (!) is really crazy (Nvidia 2080 is around 250 Watt).
This is why I enjoy the DF reviews. They cover all basis and set about pointing out things others wouldn't notice, while doing so in a relaxed setting. Very nice!
That is not happening. The radeon 7 is vega which was released 2 years ago. Optimisation for vega are already done, there may be some further refinements but dint expect a noticeable difference
@calistorich no it does not. It is vega. If it did have some changes they would not recall it vega and they would also state the changes they have done to the architecture.
It is Vega but it's not exactly the same, they said in the pre release info that there were a few differences, You need to remember that the 7nm Vega's were designed for the pro market, not for the gaming market, what they've done is repurpose MI50 chips & it's behaving differently to Vega 64 which in part is because they didn't get the software ready in time for release, Things will improve, maybe not by much but they will improve. We just need updated tools.
@@davideverett2 the obly difference between the m150 is that it has better fp64 compute which even that was cut down for the radeon vii. It is literally the exact same gpu just built in 7nm. It is not going to improve. Even if it is slightly different it is inexcusable that after 2 years of vega release they haven't optimised a very very similar architecture in 2 yrs time.
I find it curious that years ago, when the 1080 ti debuted, the Witcher 3 benchmark as shown here didn’t drop below 60 fps in 4k with an i7 6700k in DF’s own review of the card. Now suddenly it struggles to maintain 60 fps with a faster processor 🤔😒
I will give you a compelling reason -> modded texture packs! Especially in VR these things chew your VRAM like nothing else, and as PC gaming goes we are always slapping visual mods on the games, more VRAM means more stable frame rates in mods and less stutter. Simple as that
Liked the video....one thing though... the factory OC on the 2080 was mentioned a lot. You'll struggle to find a 2080 that doesn't have this OC or higher.... And with GPU boost they all boost up to around the same level without any manual tweaking anyway. As well as when you're comparing architectures from different companies I really don't think it's relevant.
Jack _ If you upscale 1080p to 4K, then RUclips would preserve more bitrate, and videos look better. Perhaps everybody should upscale to 14K before uploading?
A lot of new 8k TVs came out at CES 2019 so its not far away. TV manufacturers want you to upgrade and buy a new TV, now that most enthusiasts have a 4k TV they can only sell them a new TV if it is better, hence 8K TVs.
For gaming a second hand 1080Ti for about $500 is a winner for price/performance. For productivity the Radeon 7 is indeed a nice card, specially if you use DaVinci Resolve which benefits way more then Adobe Premiere.
lol competition where? It barely beats out the 1080ti that's at this point approaching 2 years of age, and at the same pricepoint of the 2080 without extra features it's just a waste of money. Care to remind me when AMD last tried to compete at the VERY TOP of GPU power? Maybe the 7970 back in the day...
@@HLPC_2 ya I agree. I was hoping for some actual competition. if they made radeon 7 like 20-30% cheaper than the 2080, then it would be something to look at. but as it stands... meh.
@@HLPC_2 What percentage of people buy 2080Ti's or 1080Ti's, yes Nvidia has the fastest cards but it doesn't bring in the $$$. What brings in Nvidias money is their mid tier cards, which AMD competes with.
FourEyedGeek Doesn't really matter if you control all Gpu market. You can charge whatever you want and people will buy anyway, because of the strong brand.
Have you tried exporting the audio only and the video as png sequence, then joining them up again for a video render? This time the video render will be from the png sequence and won’t have to deal with the fancy transitions. Definitely takes more time and adds another layer of work, but safer and error proof.
@@grimfist79 Yeah and Nvidias promotion says that RTX is not worthless garbage. The only question left is who exactly is so naive that they take anything said in adverts seriously.
Great video as always. You were the channel that convinced me to get an RX470 back in the day. Have 1070 now. Game at 4k 60fps. Not sure whether to go 1080ti or 2080 or radeon vii or keep waiitng.... Probably latter!
9:58 - "300$ more on ur new gpu". I think you've missed the real deals, 2080 Ti comes for minimum 1300$, more like 1500-1600$ average. Or do u mean VII will go for 1200$?
Richard = Number 1 Reviewer! Interesting idea for you Richard - how about you conduct a review (using your current review style) but imagine you're still the editor of the official Sega Saturn magazine!
@@zybch it is expensive due to the massive hbm2. I envisage the navi won't use hmb2, thd architecture will be more powerful and energy efficient, and easier to cool. Because it won't be an overclocked vega like this. Therefore should be cheaper.
It's like being stuck in the Groundhog Day movie. 2017 over and over and over again. This will probably be the longest cycle I have ever had a video card GTX 1080 TI still no compelling reason to upgrade either performance-wise or money-wise
Same here....bought the 1080 TI in April 2017 and it's still a great card in comparison to what's on the market, even the price/performance didn't change one bit. I guess we'll have to stay tuned for the next generations of graphics cards, something along the 3080 TI to get a meaningful leap in performance. The 2080 TI doesn't really seem like a suitable choice to me given how much it costs for what appears to be about 25% performance leap for a whooping 1300€ over here.
Indeed. I got my 1080Ti exactly last year this month for $849 while the mining craze was in effect. I am still happy with it. Since then, I upgraded to that ridiculous 49inch Super Ultrawide monitor 144hz. It's between 1440p and 1080p. Which I think it's the sweet spot for 1080Ti with 144hz monitor.
I'm currently using a 6G GTX1060 and I've been waiting quite a while for something from AMD. I went ahead and ordered the card since I already have the QHD 144hz monitor on the way and it comes with 3 games I was going to buy anyway. Heres hoping the undervolting works and the driver support is good. AMD has had a good track record recently.
@@madfinntech just shows that they are out of their comfort zone it seems to be the case on every pc video having a lot of stuff they saying being wrong or not the full story and talking about stuff they do not fully understand which comes out as just wrong or again not the full story if u want stuff on pc this is not the channel for it there are a ton of channels that know what they are talking about best to stick with them on pc stuff
Finally and HONEST zero shill review. No capping fan speeds at 20% like Gamer's Nexus and Paul's Hardware did to reduce performance. On that note, no one playing 1080p should be buying a radeon 7, or 1080ti, or 2080.... its a waste of money.
I think that had more to do with their Drivers being incredibly buggy with their Motherboards. They were having lots of issues with the Card. I wouldn't recommend a card either if the press release drivers I was given resulted in me unable to complete most of my testing. With that aside, I hear a lot of that has subsided and I can't wait to try out my R7 when it arrives wednesday.
AMD Finewine ftw. At least in a few years, i'll start overpassing the 2080 and 1080 Ti more and more as time goes. just the natural order of AMD vs GPU. Invest for 1-2 years, or invest for 4-6 years or more. like the HD 7970. Still highly viable to this day, and came out in 2011, and is comparable to a 1050 Ti
@@StarFox85 Its been that way since at least the 7000 series of AMD GPU's, they work on driver improvements over time. You can either think that they improve their drivers over time and Nvidia doesn't or that Nvidia gets it right from the beginning. Either way, AMD drivers improve performance over time compared to Nvidia.
Vega 56 is a bit overkill for 1080p 60fps but it'll work well enough. If you have RAM available I recommend you turning on HBCC. The card seems to behave better with it enabled.
The thing I really hate about the tensor cores and DLSS in particular is that if this silicon were used to just make a bigger traditional die we wouldn't be talking about spending £1200 for a 1440p card that can fake 4K as long as you don't look too closely at the shimmery edges in a few games that Nvidia have done the heavy lifting server-side. We'd instead have a card that could maintain 4K 60fps with SSAA enabled in every game.
the REALISTIC console specs are : lower end ryzen cpu, 24gb combined memory (16gb system and 8gb vram), the gpu will be a gtx 980 equivalent if you are lucky, and a 1-2tb HDD............for the 400-500$ price point that's only realistic.........
The reviews you do are way more balanced and well rounded than most huge thumbs up watched mainly out of curiosity these cards way out of my price range but well done great video 👍
So the AMD Radeon 7 and a RTX 2080 are around the same price range and performance. From what i can see Radeon 7 offers double the VRAM 16GB well RTX 2080 offers Ray Tracing. Good to have some competition in the GPU market again, but I am not impressed by neither AMD or NVIDIA at this point...
Or you could've had the 1080Ti and enjoy those numbers for the last few years. That's the most disappointing thing about the Radeon 7, 7nm with blazing fast VRAM but slightly beating a 16nm slower VRAM card
@@imo098765 I am not impressed until I see it actually working in actual game with decent enough frame rate. And beleive me, I really want it to be great, I have an RTX card. But facts are facts.
Wow its another world.Still rocking a GTX970 here for 1080P gaming, was thinking of upgrading my graphics card but there seems so many choices these days. Great to see reviews of cards i can only dream about :)
Great review. I run a 144HZ 1440P Gsync monitor and the more GPU power the better. If I see frame drop based stutter, I can use the extra GPU resources to get a fixed frame rate higher than a lower GPU.
Yeah and i love mine and it's shiny gold casing. Got it a month ago and it has run flawlessly destroying benchmarks and putting a smile on my face every time i look through it on my pimped out tempered glass PC case with RGB and water cooling going on. I know I can run everything at max settings!
@Galva Tron - Your wrong, i just sold my nearly 3 year old GTX 1080 Ti for $550 which is about 80% of the $699 price I paid for it. I will do the same with my Titan RTX when the upgrade comes out and continue to have the fastest card possible and selling my old card to schmucks like you on ebay who can only afford yesterday's technology.
For this conundrum, I have one solution: skip this card generation entirely. For these cards' features are not currently supported by the software. It will be paying for something you can't make use of.
@@sev2300 true but RTX may become more prevalent in the future(not only on ray tracing but also on mesh shaders and architecture features) and then you will be stuck with kinda fast old technology.
19:47 radeon 7 reads the junction temperature and sets the fan according to that,but that temperature is 20-25 degrees higher than gpu temperature so fans ramp up quite unnecessarily making huge noisy situation compared to other gpus.
AMD: The Forever Bridesmaid. Always second place (at least for GPUs). Had this card been at least $100 cheaper, it might have made some sense. As is, a GTX 1080ti is still the best buy for "second top tier" performance.
@@patrikmiskovic791 How it slow it performs the same as a Radeon VII or even better at 1440p which is the only resolution I care for, while costing less if you go in to the second hand market! The 1080 ti is still king even after 2 years! I was hoping AMD had something that would best it clearly (Something like 10 to 20% faster than a 2080) and was at the $700 mark but this is not it. Stop being a fanboy!
@@SegaCanuck Sure! But if we are talking price/performance you can do better buying used 1080 ti's right now. However the 2080 is the same price as the Radeon VII and the 2080 still beats it in the majority of games! If the Radeon VII was $600 then it would have been much better buy!
Used 1080 Ti? Yup. New in box 1080 Ti? Good luck. And before someone mentions the benefits of used cards (especially 1080 Ti), used Vegas are also dirt cheap in a lot of countries (including mine - huh, mine is the key word, yet these cards are still at >90% health (fans may fail earlier) if the miner using them wasn't a stupido overvolter).
Still rocking with my RX570 4GB, got it a year ago. Suits my needs, 1080p is all I need and the card delivers stable 60fps in most newer titles with ultra/high settings.
I have an Xbox One X and PS4 pro. People who think that the new consoles are going to do 4K 60 ultra settings are crazy when these cards can't even do that
@@destroyerdestroy87 Most games is medium setting with some detail set to high. But then you get impressive games like Red dead redemption 2 native 4k and looks better than even the mighty witcher 3 maxed out on PC. There is a few exceptions in this case however such as far better foliage and draw distances.
@@jaigray5422 4-5x more powerful is nonsense, no GPU exist today that 4-5x more powerful than xbox one x GPU, even the mighty RTX 2080 ti only 2,5 x more powerful than xbox one x. There's no way next gen consoles GPU will be more powerful than RTX 2080 ti.
@@jaigray5422 it won't be that much more powerful. i it's a win if it's twice as powerful on the gpu side of things, to be honest. cpu should see a significant upgrade though. maybe 3x in the right applications.
I will be using the Radeon VII for GPU rendering in Modo and working in Substance Painter... the 16GB of VRAM will be helpful, but yes, the noise will BUG ME!!! The current rig I have is virtually silent...
Mine undervolted to 0.998mV, and after I configured custom fan curve it went from unusable to almost inaudible over my PSU and case fans. Clocks got better and more consistent too, lol. I do plan to watercool and overclock it though.
been playing games at 5k ultrawide 5120x2160 with radeon 7 i get well over 60 fps on all latest games including bfv at ultra settings and the card is quiet and undervolted to 940mv
So you mean 4K Ultra Wide. I don't get the naming vernacular 1080 is 1080. We don't call it 2K. 1440P is at 2560, but when I game on my 1080P Ultrawide (2560x1080), I don't call it 2.5K UW. 1440p or QHD would be called 2.5K or 3K or whatever. But NOOOO 3560x2160P or whatever is 444kkkkk and if its ultrawide 5555kkkk. Dumb naming standards. I'm not attacking you, I'm sure its called a 5K monitor, I'm just amazed at the lack of uniformity in the naming convention of monitors.
Really nice to see that the frame times are again top of the game! Really a cool and robust entry in to the world of 7nm AMD products. Can't wait to see Ryzen 7nm and Navi Thank you DF p.s: When it comes to Benchmarks i really like to walk from place A to B for about 30sec. But it is nice to have in-game Benchmarks in some titles.
Scorpio dev kit is 24GB of ram with the retail unit coming with 12GB of ram so I cant't see anything less then 24Gb of unified ram from either Sony or Microsoft for next generation.
Considering how high memory prices still are i wouldn't hold your breath, unless they decide to put the cash into memory and use garage processors again in next gen, that would be very disappointing.
I feel like 16Gb's would be more plausible, but I also remember hearing a rumor of them removing the unified RAM too. I guess it would be cheaper to have 8GB of System RAM (DDR4) and then 8GB or whatever the # is of graphics memory
The GPU is so fast at rendering the low resolutions that the CPU can't keep up with how much it's receiving from the GPU, making you CPU Bound. Higher resolutions put more strain on the GPU forcing it to slow down some which lets the CPU keep up.
AMD are moving in the right direction, and although it'll probably be a while before I jump Nvidia's ship, I hope it forces Nvidia to reduce it's crazy pricing.
I hope it doesn't force nvidia to reduce there price, AMD deserve to charge more for also been able to compete when they can, they should like that they have lower there price because there competition can't lower there if anythign they should charge equally and along the performance line. Nvidia's pricing is a result of us all buy there GPU and not taking care of the competition when we could have. FYI i own a 1080TI and i purchased 2x Radeon 7 so am not been biased.
@@DeadNoob451 am afraid not am perfectly thinking it through nice and straight if you can afford it dont buy it save up or get a job. It is business at the end of the day they can charge what ever they want. are you doing to tell me Lamborghini has to lower its price because Ford is sells a Mustang for cheaper or that Bugatti should lower the price of veryon because you can afford it. OKAY then i rest my case so i guess your the delusional one here mate than you verymuch. come back when your able to talk sense.
Amd is back , love amd from ati 9800pro time . Love these nvidia fanboys . Im waitting for navi becouse i got no money ,still using rx580. Great rewiev guys . Radeon 7 gonna be faster even more than 2080 in next month.
Please consider putting more variety to the background music selection. Silence is better than 20 min of the same loop, not to mention it being on multiple (if not all) videos..
and how much is the RTX2080. the 1080ti, 2080, 2080ti even the titan RTX & Volta losses in opencl work load so we could say Nvidia has had more than 2 years to beat AMD in opencl workload but yet they failed and cost way moreee.
@@hexotech5202 yea, were being raped by both ends. My point is, this sucks. I'm starting to feel like the titan x(p) I bought 2.5 years ago for 1200 was a good deal. As far as open CL. I don't really care about that. Gamers don't care about that. If RTX ever can be applied to blender, that I would care about. But gamers are not getting any improvement in value and barely a upgrade in performance
@@SlyNine ofc for gaming this is not the best your can get for open cl this is one of the best for the value, for deep learning it comparable so in general i think it just about fights depending on things.
You guys are like the only ones not shilling for nvidia right now (they are talking up nvidia features that are basically unusable right now). Great review and thanks for being so transparent
Idk how true that is and I think the argument to be made is that, Despite RTX not being useful now, Nvidia is still ahead in performance on average, and RTX even if its a gimmick, is a feature you can try in a few games, and it could be a feature implemented in time, where Radeon 7 is not quite matching the RTX2080 and doesn't have any features period. You could technically tout the 16GBs of HBM2 as a feature, but I'm not so sure. I ordered a Radeon 7 before I'm labeled a Nvidia Fanboi. I made the purchase because HDMI Freesync is the feature that I need and its gonna kick ass @1440P and make use of the 144hz display.
@@thosewhoslaytogetherstayto3412 I have an rtx 2080 and i feel like pretty much any feature they touted back when they announced these cards is basically non existent. DLSS is probably never coming.....just look at multi res shading for pascal 1 real game supported. They make money on selling hardware, and i dont think theyll give anyone any "free" performance
PS5 and xbox two will have some slow ass gpus if is this the best what can AMD put out. 4K30 with checkerboard rendering will be still a thing in the next generation.
@EJSC yeah and the highest Navi will offer is gtx1080 performance for around $500. Look at AMDs roadmap, it's what's after Navi is what people need to look forward to. The fact Nvidia is 2 generations ahead of AMD right now doesnt look good for the next consoles.
So a card that’s 20-30% slower than the 2080ti at $700 compared to the 2080ti at $1200+ is awful? It’s the 2nd fastest consumer gpu out with buggy press drivers. It’s not the best but it’s a really good card.
LinusMiningTips , "PS5 and xbox two will have some slow ass gpus if is this the best what can AMD put out." Only a fool would expect a $400 to $500 console to run like a $1500 to $2500 PC.
Not going to lie - I encounter that GPU Render error (and now it's turned to outright driver crashing) multiple times every day, (and this is on 11GB 1080ti) so you saying you can avoid it on Radeon VII is enough for me to start turning on email notifications for when they come into stock. Thanks for covering this.
And u have got it ? I got mine week ago and is stunning, after bios and drivers update works like a charm, and will be only better :) 24 % increased performance comparing to previous drivers plus lower power consumption :)
@@Dar1usz RX6000... are here.... this is why i stick with console and laptop... it not long about money but i can not had endless cpu and gpu... for all this endless bulshit.
When your GPU has more Ram than your System...damn
@Subi_fan what GPU are you running and CPU? my kind sir or madame
when your priorities are in the wrong place. damn.
Damn I never tought the day we would see a 700 dollar gpu with the same amount of vram as the current amount of ram i have would come so soon
That makes no sense, system should always have more RAM. I run a Titan RTX with 24GB of VRAM and my PC has 128GB of RAM.
@@calidude1114 I was just replying to this comment when I noticed that you were the person I just responded to on another post lol. What's your PC's build? 128 GB of RAM & a Titan RTX?! What do you do for a living? lol
Also, what's the point of all that RAM? I'm guessing you use it for something other than gaming?
From all the reviews I've seen on the VII this has to be the most unbiased of them all.
Check hardwere chunks..this guy recommend amd for stability.. Hahahha..hahha.sorry can't control it.
Watched pretty much every youtube review from GN to LTT to HC, etc etc. DF is the only one with an even distribution of gaming and production performance and explanation of how this card aligns and competes in the gpu space.
It's refreshing to see, as many 'gaming tech' channels focus souly on gaming performance and poo-poo 16GB HBM2 as pointless, when clearly it isn't. As game texture resolutions increase, more VRam and high bandwidth make a large difference. Many seem to think its gaming performance is poor also, but considering is can nearly match 2080 performance (and actually sold ((all so briefly)) at MSRP), it's pretty good value for money! As long as AMD remains dedicated to driver support, the VII should gain some more performance and should be a good buy, even if it is just a stop-gap until Navi. I'd like to see a board-partner (Sapphire particularly) do a custom cooling solution though, as noise seems like the biggest down-side.
@@Ben-Rogue I doubt drivers are going to make a big difference here. Whatever optimizations AMD needed to make with Vega they likely already have. In this respect Radeon 7 is a known commodity. Secondly, the VRAM doesn't make much sense with regards to gaming. The card's rasterization performance will be tapped out long before its VRAM. By the time the 16GB allows you to retain a higher texture setting, you'll be compromising in so many areas that you'll likely have upgraded. That's why Richard focused on productivity workloads in terms of VRAM. If anything, the Turing cards are likely to age better than this Radeon 7, given how they seem to scale with newer game engines and/or features (as seen in games like Wolfenstein). People tend to focus on RTX and DLSS with regards to Turing's future, but don't pay enough attention to the architectural improvements that will apply across all games on newer engines.
@@Ben-Rogue nice try justifying to buy radeon 7 lol. As the guy above me said, the gpu power would be maxed out long before the vram's would
@@Ben-Rogue radeon 7 has very very little architectural differences when compared to vega, in fact as more games using low level apis come, we will likely see turing pull even further ahead as its better in those areas.
@Digital Foundry.
Man I love your channel. Always very honest and not biased at all. Also Rich should be on more often :) Thx for all the work guys.
This is the best and most unbiased Radeon VII review I've seen all day. Thank you so much for your professionalism.
I don’t know why I watch stuff like this that I can’t even afford
Oh well GO AMD!
People STILL keep buying them even though they now cost as much as an entire mid-tier PC.
Waste of money IMO
@ntodek get a job to fill your free time that you would use to actually game. I make 100k+ a year, I still can't talk myself into getting any of these cards when a $200 card can allow me to play the games I want. I'm kind of sad to see that the jump in power wasn't as big as the time between releases. I was expecting more from mid range cards, but it looks like I can hold off from up grading my Rx 580 8gb or my daughter's 1060 6gb for quite a while. Still not seeing the consistency at 4k that I want. Maybe the rtx 3060 will finally entice me, or even 3070. But 350 for the 2060, I'm not seeing the price to performance. Even the 1660ti is not what I was hoping for.
@ntodek you dont even know how old he/ she is
"Get a job? "
Get some sense dude
@ntodek facepalm
@@cosmicgamingunlimited I'm on 30k and I bought a £500 card.
Get it spent. You can't take it with you lol.
I've already read and even seen a few reviews and i'm not even that interested (1080) - but I'll ALWAYS come around and enjoy your reviews and info pieces. It's not just curiosity, or entertainment - but pure enjoyment. I really, really like your content ;D Thank you for this & auf wiedersehen ^-^'
Modded Skyrim se will love the 16gb vram of the Radeon 7
Jesus, current GPU prices are crazy good thing I'm happy gaming at 60 FPS at 1080p with a gtx 1060.
yeah the best part about 4k is that you need less than half the power to keep playing in 1080p. can't wait till people are struggling to run 8k so i can buy a garbage laptop with onboard gpu and still play games fast in 720p
@@GraveUypo nah youd be playing in 1366x768 :P
@@jjnet123 my laptop has a 1080p screen but i don't mind playing at non-native resolutions in it. i just go for 720p if the situation calls for it. i mean, it's mostly a work laptop now, so the screen can't be that shitty.
@@GraveUypo I was only mentioning it because my laptop was 1366x768 and was a great laptop for its time still does play somethings well but it probably would melt on battlefield v or any of the tomb raider games post 2013 😂
@@jjnet123 mine's a not-so-good 3rd gen dual core i5 that boosts to 3.4ghz with a 2gb (ddr3) 650m and 8gb of ram.
if i overclock the hell out of that gpu and lower the resoltion way down, it still runs modern stuff. of course i have to add an external suction fan for it not to overheat, but i have one that works wonders, so it actually runs cooler than stock without it.
forza horizon 4 runs like 30~50fps in it on the lowest settings in 1024x768 so lower than 720p. it's the lowest res the game allows, otherwise i'd go down to 800x480 which is widescreen and probably get 60fps. heck i'd gladly go to 640x400 if i had to. i started my pc carreer in the 90's so i remember when that was the upper bound for gaming resolutions. duke3d looked so pretty in SVGA :P
i haven't tried many other modern games in it, but i suppose the dual core cpu may be also a bottleneck nowadays. welp, i only do light gaming in it anyway, i have an xbox one x and a gaming desktop for actual gaming.
More or less what I expected to see, better hope AMD reduces the driver bottleneck.
The only driver available at the moment is a pre release driver i'ed be expecting one soon.
@@TheCooperman666 a new driver literally came out this morning....
Yeah, the press driver is absolutely abysmal from looking at Gamer's Nexus's review of the Radeon VII. Hell, you can't really even overclock the thing atm because the drivers are so unstable.
on dx11 ???? never... they are trying to reduce it for years. Ok, maybe they will reduce for another 3%
Considering how fantastic the Vega 64 drivers have been the past few months, I'm sure they'll iron out the driver issues.
A mild undervolt on the Radeon 7 gives a pretty good performance boost, at least that's what I read on reddit, would love it if digital foundry looks into this matter.....
made a big difference with VEGA
I though it gave a large performance/ watt, not necessarily performance.
@@SconVideos Vega runs into a MAX POWER limit VERY quickly. Even at +50%. If you let it run at whatever voltage it wants; it gets to a certain clock frequency (not very high) and hits the power limit (315W in my case). If you under volt it; you can get much higher clock speed before it hits the same amount of power. It seems strange considering you usually need raise voltage to support higher clocks, but vega seems to be really over volted out of the box.
stock voltage is there for a reason, not all chips will allow undervolting and be 100% stable. who the fuck buys a gpu to undervolt anyway. would you also underclock it?
dr. whet farts: No one buys a GPU to undervolt it, but you have to undervolt Vega in order to overclock it.
Yes, content creators are people to ! Finally some one give us more Vram for fair enough price. Pro cards ridiculously priced
A titan RTX is some radically expensive that you can build an entire rig and still have enough for 2 or 3 more rigs.
Digital Foundry makes the best GPU reviews no boring PowerPoint graphs to look at lol.
7:35 , you said titan rtx has 12 GB ram allocation. It actually have 24 GB RAM
my thoughts exactly
Maybe he was thinking ok Titan V
Are there not different versions of that card with different VRAM capacity installed?
Nvidia Titan X 12GB
Frequency 1.25GHz (10Gbps effective) 1.25GHz (10Gbps effective)
Interface 384-bit 256-bit
Bandwidth 420GB/sec... he meant the pascal titan x
He was probably up all night making this video
I swear, this the most informative review I saw on this card. Is it really that hard to do a production scenario where this card shines? There are people whit certain workloads that would benefit from this card and this might be the best thing for them. Great stuff DF, as always.
This card has much potential, hope they get some of the drivers and optimization fixed. Hope to see more testing and content with this card. Great video guys, thanks for the awesome content.
Thank you for making the data so transparent and clear! My God. So easy to digest your infographics. Thumbs upped!
Why is your captured footage in interlaced?
Because they do not know anything about video editing or video formats.
You can use proxy and edit 16k video footage on a 256mb video card, it would just take 2 months to export it.
Somebody ticked the wrong box.
@@SMGJohn If they don't know anything about video editing, then how have they been making great videos for years now?
@@SolarMechanic And I'm ticked about it. It looks like a scanline overlay!
huh? They must have either reuploaded, or it's something else going on, the video is fine.
Would really like if you guys enable captions on your videos
auto captions are pretty good in english tbh
I’m Russian. So I can say, that Richard has the purest English I’ve ever heard, even if regular Russian “intermediate” can understand his speech
It has nothing to do with their speech, it's just that sometimes you have to watch videos without audio.
But I saw that the automatic captions got added ^^
And yes, 99% of the time Automatic captions are accurate
Never had to watch videos without audio lol....still there are "old school" reviews for that .
Does anyone else see his face glitching out around 7:06?
"Welcome to my world"
*starts glitching out of existence*
@@clessili "Richard from digital foundry here" droning voice "Prepare to die"
Lmao
Oh, good, it wasn't just me. I was like "funny how that happens as he is talking about exporting stuff"
How did I miss that?
Very very helpful review again from your channel. My conclusion is, that: let's see where AMD's Radeon 7 is going in future in regards of design. At least AMD is going to be within upcoming consoles of PS5 and XB2, so I am very curious how that design is going to look like (suppose it will be based upon this Radeon 7 design) but on the other side they may have to work out a lot of issues - for me, the 375 W average power required (!) is really crazy (Nvidia 2080 is around 250 Watt).
Amd is coming. Cmon Lisa, you're close
More like they came prematurely. Like AMD always does...
alintro amd will always suck with gpu like rx Vega too!
@@Weirdounknown598 the bait is real
It's like the whole AMD community is helping AMD (Lisa the Great) give birth 😄
AMD is coming over 10 years
This is why I enjoy the DF reviews. They cover all basis and set about pointing out things others wouldn't notice, while doing so in a relaxed setting. Very nice!
"Who's editing 8k video?"
*Linus Sebastian has entered the chat*
@digital foundry Thanks for taking the time to make a proper review, couldn't have said it better myself. cheers mate!
Hopefully a few driver updates will sort things out, That's what usually happens with Radeon GPU's. A great video with some good info, Thankyou.
That is not happening. The radeon 7 is vega which was released 2 years ago. Optimisation for vega are already done, there may be some further refinements but dint expect a noticeable difference
@calistorich no it does not. It is vega. If it did have some changes they would not recall it vega and they would also state the changes they have done to the architecture.
Nope, it's the exactly same architecture, so, no performance boost by drivers..
It is Vega but it's not exactly the same, they said in the pre release info that there were a few differences, You need to remember that the 7nm Vega's were designed for the pro market, not for the gaming market, what they've done is repurpose MI50 chips & it's behaving differently to Vega 64 which in part is because they didn't get the software ready in time for release, Things will improve, maybe not by much but they will improve. We just need updated tools.
@@davideverett2 the obly difference between the m150 is that it has better fp64 compute which even that was cut down for the radeon vii. It is literally the exact same gpu just built in 7nm. It is not going to improve. Even if it is slightly different it is inexcusable that after 2 years of vega release they haven't optimised a very very similar architecture in 2 yrs time.
I find it curious that years ago, when the 1080 ti debuted, the Witcher 3 benchmark as shown here didn’t drop below 60 fps in 4k with an i7 6700k in DF’s own review of the card. Now suddenly it struggles to maintain 60 fps with a faster processor 🤔😒
I will give you a compelling reason -> modded texture packs! Especially in VR these things chew your VRAM like nothing else, and as PC gaming goes we are always slapping visual mods on the games, more VRAM means more stable frame rates in mods and less stutter. Simple as that
Liked the video....one thing though... the factory OC on the 2080 was mentioned a lot. You'll struggle to find a 2080 that doesn't have this OC or higher.... And with GPU boost they all boost up to around the same level without any manual tweaking anyway. As well as when you're comparing architectures from different companies I really don't think it's relevant.
"Who edits 8k video?" Linus. Because of course he does.
And even more people in the future.
RUclips actually supports up to 14K.
I am a bit disappointed that Linus doesn’t upload 8K.
@@allansh828 he has a 8k camera amd all
Jack _
If you upscale 1080p to 4K, then RUclips would preserve more bitrate, and videos look better.
Perhaps everybody should upscale to 14K before uploading?
A lot of new 8k TVs came out at CES 2019 so its not far away. TV manufacturers want you to upgrade and buy a new TV, now that most enthusiasts have a 4k TV they can only sell them a new TV if it is better, hence 8K TVs.
For gaming a second hand 1080Ti for about $500 is a winner for price/performance.
For productivity the Radeon 7 is indeed a nice card, specially if you use DaVinci Resolve which benefits way more then Adobe Premiere.
Cool that there's more competition once again from AMD.
lol competition where? It barely beats out the 1080ti that's at this point approaching 2 years of age, and at the same pricepoint of the 2080 without extra features it's just a waste of money.
Care to remind me when AMD last tried to compete at the VERY TOP of GPU power? Maybe the 7970 back in the day...
@@HLPC_2 ya I agree. I was hoping for some actual competition. if they made radeon 7 like 20-30% cheaper than the 2080, then it would be something to look at. but as it stands... meh.
@UHD Gaming PC Then expect 1k dollar mid-range 7nm Nvidia GPU. :D
@@HLPC_2 What percentage of people buy 2080Ti's or 1080Ti's, yes Nvidia has the fastest cards but it doesn't bring in the $$$. What brings in Nvidias money is their mid tier cards, which AMD competes with.
FourEyedGeek Doesn't really matter if you control all Gpu market. You can charge whatever you want and people will buy anyway, because of the strong brand.
This is by far the most detailed GPU review I've watched in my life. My brain was seeing double towards the end. Great video!
Have you considered doing Premiere/AE/Cinema 4D benchmarking?
I was thinking the same thing
Have you tried exporting the audio only and the video as png sequence, then joining them up again for a video render? This time the video render will be from the png sequence and won’t have to deal with the fancy transitions. Definitely takes more time and adds another layer of work, but safer and error proof.
Why isn't there at least 2-3 VR games? I really hoped to see some comparsion there if the Redeon 7 handles vr games better than a 2080/2080ti.
Have always been able to count on Digital Foundry for a solid and thorough review of Graphics Card. Keep up the good work!
the GPU video editors have been waiting for... AMDs core target group
This card is targeted to profesional encoders and data centers...
crookim sure it is
grimfist79 i think you might be onto something here...
I guess they made Radeon 7 a Swiss army knife of GPUs
@@grimfist79 Yeah and Nvidias promotion says that RTX is not worthless garbage.
The only question left is who exactly is so naive that they take anything said in adverts seriously.
Great video as always. You were the channel that convinced me to get an RX470 back in the day. Have 1070 now. Game at 4k 60fps. Not sure whether to go 1080ti or 2080 or radeon vii or keep waiitng.... Probably latter!
Was the same 970 v 290x & 1060 v rx480.... Look at them cards now
My 290x is still chugging along
Amazing in-depth review, you guys are leagues ahead of the rest of the competition when it comes to the quality of your videos.
9:58 - "300$ more on ur new gpu". I think you've missed the real deals, 2080 Ti comes for minimum 1300$, more like 1500-1600$ average.
Or do u mean VII will go for 1200$?
Yup, should be "About $600 more on a new GPU".
Richard = Number 1 Reviewer! Interesting idea for you Richard - how about you conduct a review (using your current review style) but imagine you're still the editor of the official Sega Saturn magazine!
This is clearly a radeon machine learning cpu excess being binned and rebadged as gaming cards. September will see the proper navi gpus
Nothing wrong with that. Its kept both companies more profitable to be able to use excess and 'faulty' processors in lower end products.
@@zybch it is expensive due to the massive hbm2. I envisage the navi won't use hmb2, thd architecture will be more powerful and energy efficient, and easier to cool. Because it won't be an overclocked vega like this.
Therefore should be cheaper.
the info on exporting ram needs was just freaking awesome!!!! thanks
It's like being stuck in the Groundhog Day movie. 2017 over and over and over again. This will probably be the longest cycle I have ever had a video card GTX 1080 TI still no compelling reason to upgrade either performance-wise or money-wise
Same here....bought the 1080 TI in April 2017 and it's still a great card in comparison to what's on the market, even the price/performance didn't change one bit. I guess we'll have to stay tuned for the next generations of graphics cards, something along the 3080 TI to get a meaningful leap in performance. The 2080 TI doesn't really seem like a suitable choice to me given how much it costs for what appears to be about 25% performance leap for a whooping 1300€ over here.
Indeed. I got my 1080Ti exactly last year this month for $849 while the mining craze was in effect. I am still happy with it. Since then, I upgraded to that ridiculous 49inch Super Ultrawide monitor 144hz. It's between 1440p and 1080p. Which I think it's the sweet spot for 1080Ti with 144hz monitor.
I'm currently using a 6G GTX1060 and I've been waiting quite a while for something from AMD. I went ahead and ordered the card since I already have the QHD 144hz monitor on the way and it comes with 3 games I was going to buy anyway. Heres hoping the undervolting works and the driver support is good. AMD has had a good track record recently.
@@gameurai5701 the 2080 TI is hardly faster than a 1080 TI (both at full OC)
Love the way you made the comparison.
The comparison between 1080p-1440p-4k is quite nice.
Absolutely agree with you on the CPU bottlenecking problem.
7:37 I'm pretty sure RTX Titan has 24 gig of Vram.
Me too. Perplexes me that he actually made an error like that.
@@madfinntech just shows that they are out of their comfort zone it seems to be the case on every pc video having a lot of stuff they saying being wrong or not the full story and talking about stuff they do not fully understand which comes out as just wrong or again not the full story if u want stuff on pc this is not the channel for it there are a ton of channels that know what they are talking about best to stick with them on pc stuff
didn't he say the Titian XP and previous pascel cards.. not the titian Turing card...
@@MadDoggyca correct
He said he uses a titan x pascal everyday.
I always enjoy Richard's presentations! Great stuff as always!
Finally and HONEST zero shill review. No capping fan speeds at 20% like Gamer's Nexus and Paul's Hardware did to reduce performance.
On that note, no one playing 1080p should be buying a radeon 7, or 1080ti, or 2080.... its a waste of money.
GN ran the fans at 100% and even changed the thermal pad to paste for better cooling, how are they shills?
I think that had more to do with their Drivers being incredibly buggy with their Motherboards. They were having lots of issues with the Card. I wouldn't recommend a card either if the press release drivers I was given resulted in me unable to complete most of my testing.
With that aside, I hear a lot of that has subsided and I can't wait to try out my R7 when it arrives wednesday.
You ever heard of 120-240 hz screens? Or unoptimized/very demanding games?
Thank you for a great in depth review!
The Radeon VII will most likely get faster when drivers mature.
AMD Finewine ftw. At least in a few years, i'll start overpassing the 2080 and 1080 Ti more and more as time goes. just the natural order of AMD vs GPU. Invest for 1-2 years, or invest for 4-6 years or more. like the HD 7970. Still highly viable to this day, and came out in 2011, and is comparable to a 1050 Ti
No doubt, I expect at least 10% from drivers in the next 6 months.
@@FinnLovesFP Yeah, that's how it pretty much always happens with AMD's stuff.
beautiful dream
@@StarFox85 Its been that way since at least the 7000 series of AMD GPU's, they work on driver improvements over time. You can either think that they improve their drivers over time and Nvidia doesn't or that Nvidia gets it right from the beginning. Either way, AMD drivers improve performance over time compared to Nvidia.
Finally!!! A review i was looking for concerning GB usage and performance!!!
So if I wanna go with 1080p 60fps gaming my best option for AMD is what, Vega 56?
U do realise all of the above gameplays were recorded on 4k right ?
Vega 64 is priced good right now
You can do that with a rx480 lol
Yes vega 56 (or 1070ti if it is cheaper enough than vega 56)
Vega 56 is a bit overkill for 1080p 60fps but it'll work well enough.
If you have RAM available I recommend you turning on HBCC. The card seems to behave better with it enabled.
The thing I really hate about the tensor cores and DLSS in particular is that if this silicon were used to just make a bigger traditional die we wouldn't be talking about spending £1200 for a 1440p card that can fake 4K as long as you don't look too closely at the shimmery edges in a few games that Nvidia have done the heavy lifting server-side. We'd instead have a card that could maintain 4K 60fps with SSAA enabled in every game.
THE RTX 2080 TI SHOULD HAVE THE RTX TITAN SPECS
THE PRICE GOES UP 50% AND THE VRAM STAYS EVEN........ALL THOSE USELESS FEATURES COST MONEY HEHE
What's an RTX 1080? EDIT: title is fixed :)
Also, looks like there are some rendering problems at 7:05. This video seems rushed AF.
"nothing to see here, move along... move along"
really a good review !!
you just missed a clear sum up slide at the end
I HIGHLY DOUBT that consoles will see 24GB of VRAM. 16GB for sure, but 24 would be expensive as hell.
the REALISTIC console specs are :
lower end ryzen cpu, 24gb combined memory (16gb system and 8gb vram), the gpu will be a gtx 980 equivalent if you are lucky, and a 1-2tb HDD............for the 400-500$ price point that's only realistic.........
my titan Xp has 12gb of vram
no way in sam fuck a not even 1/2 as powerful GPU needs more on a limited platform.........
The reviews you do are way more balanced and well rounded than most huge thumbs up watched mainly out of curiosity these cards way out of my price range but well done great video 👍
RTX 1080? Sign me right up!
Baldurus shotgun
Why you need 1080?
What you use now?
I found this review on the second day, but it is hands down the best. Very thorough analyzes on resolution scaling.
So the AMD Radeon 7 and a RTX 2080 are around the same price range and performance. From what i can see Radeon 7 offers double the VRAM 16GB well RTX 2080 offers Ray Tracing. Good to have some competition in the GPU market again, but I am not impressed by neither AMD or NVIDIA at this point...
Or you could've had the 1080Ti and enjoy those numbers for the last few years. That's the most disappointing thing about the Radeon 7, 7nm with blazing fast VRAM but slightly beating a 16nm slower VRAM card
@@imo098765 Agree
"I am not impressed by neither AMD or NVIDIA at this point..." - complete summary of this gen GPU releases.
@@yorhaunit8s You are not impressed by the huge leain Ray/Path Tracing having the technology done in real time, with a built in filter
@@imo098765 I am not impressed until I see it actually working in actual game with decent enough frame rate. And beleive me, I really want it to be great, I have an RTX card. But facts are facts.
Wow its another world.Still rocking a GTX970 here for 1080P gaming, was thinking of upgrading my graphics card but there seems so many choices these days. Great to see reviews of cards i can only dream about :)
Do AMD vs NVIDIA colors quality comparison!!
Great review. I run a 144HZ 1440P Gsync monitor and the more GPU power the better. If I see frame drop based stutter, I can use the extra GPU resources to get a fixed frame rate higher than a lower GPU.
Correction: Titan RTX has 24GB of VRAM
@HB C $2400
Yeah and i love mine and it's shiny gold casing. Got it a month ago and it has run flawlessly destroying benchmarks and putting a smile on my face every time i look through it on my pimped out tempered glass PC case with RGB and water cooling going on. I know I can run everything at max settings!
Corrections music please
Pretty happy with my 2 RTX 2080s NVLinked and their shiny silver casing + i9-9900K that destroy a single Titan RTX for less money.
@Galva Tron - Your wrong, i just sold my nearly 3 year old GTX 1080 Ti for $550 which is about 80% of the $699 price I paid for it. I will do the same with my Titan RTX when the upgrade comes out and continue to have the fastest card possible and selling my old card to schmucks like you on ebay who can only afford yesterday's technology.
Great review. Loving my Evga RTX 2080XC.
7:05 Intentional graphical glitch to go with what you're talking? Or rendering error?
For this conundrum, I have one solution: skip this card generation entirely. For these cards' features are not currently supported by the software. It will be paying for something you can't make use of.
The best decision.
Although I agree, the fact is that my GTX 970 isn't keeping up anymore, especially in VR. I'm not sure what card I should go for though. Maybe a 2060.
secret333 I’d go for 1080 ti.
@@sev2300 true but RTX may become more prevalent in the future(not only on ray tracing but also on mesh shaders and architecture features) and then you will be stuck with kinda fast old technology.
19:47 radeon 7 reads the junction temperature and sets the fan according to that,but that temperature is 20-25 degrees higher than gpu temperature so fans ramp up quite unnecessarily making huge noisy situation compared to other gpus.
AMD: The Forever Bridesmaid. Always second place (at least for GPUs).
Had this card been at least $100 cheaper, it might have made some sense. As is, a GTX 1080ti is still the best buy for "second top tier" performance.
😂😂😂 nice joke 1080ti slow and old ,cost more . Love this nvidia fanboys
@@patrikmiskovic791 How it slow it performs the same as a Radeon VII or even better at 1440p which is the only resolution I care for, while costing less if you go in to the second hand market! The 1080 ti is still king even after 2 years! I was hoping AMD had something that would best it clearly (Something like 10 to 20% faster than a 2080) and was at the $700 mark but this is not it. Stop being a fanboy!
@@DeAtHvAnGeL you cant compare pricing using used prices for one and not the other.
@@SegaCanuck Sure! But if we are talking price/performance you can do better buying used 1080 ti's right now. However the 2080 is the same price as the Radeon VII and the 2080 still beats it in the majority of games! If the Radeon VII was $600 then it would have been much better buy!
Used 1080 Ti? Yup.
New in box 1080 Ti? Good luck.
And before someone mentions the benefits of used cards (especially 1080 Ti), used Vegas are also dirt cheap in a lot of countries (including mine - huh, mine is the key word, yet these cards are still at >90% health (fans may fail earlier) if the miner using them wasn't a stupido overvolter).
Love the 1080p comparision bits. It may be a low resolution, but I still play on 1080p @ 144hz so this information is really valuable to me
*best Radeon VII coverage by far*
Still rocking with my RX570 4GB, got it a year ago. Suits my needs, 1080p is all I need and the card delivers stable 60fps in most newer titles with ultra/high settings.
The take away...7nm Vega is pretty decent....and 7nm Ryzen 2 is going to be amazing.
7nm Ryzen is Ryzen 3 aka Zen2. Still amazing though
@@DeadNoob451 Ryzen 3rd Gen
We wouldn't want to mistake Ryzen 3 with Ryzen 3 if you catch my drift.
Probably the best review I've heard all day...I subscribed
You don't get it do you. Future proof (next 5 years):
RTX2080: NO
R7: YES
Don't believe it huh? You'll see ;-)
lol yeah keep that same card for 5 years while i have 12k with my 300hz and real time ai learning systems gtfo bruh lol
I dont see how,
Excellent review Richard!
I have an Xbox One X and PS4 pro. People who think that the new consoles are going to do 4K 60 ultra settings are crazy when these cards can't even do that
The xbox One X has hit native 4k quite often not as far fetched as you might think when the next console should be 4-5x more powerful than a X.
@@jaigray5422 but not ultra settings tho.
@@destroyerdestroy87 Most games is medium setting with some detail set to high. But then you get impressive games like Red dead redemption 2 native 4k and looks better than even the mighty witcher 3 maxed out on PC. There is a few exceptions in this case however such as far better foliage and draw distances.
@@jaigray5422 4-5x more powerful is nonsense, no GPU exist today that 4-5x more powerful than xbox one x GPU, even the mighty RTX 2080 ti only 2,5 x more powerful than xbox one x. There's no way next gen consoles GPU will be more powerful than RTX 2080 ti.
@@jaigray5422 it won't be that much more powerful. i it's a win if it's twice as powerful on the gpu side of things, to be honest. cpu should see a significant upgrade though. maybe 3x in the right applications.
I will be using the Radeon VII for GPU rendering in Modo and working in Substance Painter... the 16GB of VRAM will be helpful, but yes, the noise will BUG ME!!! The current rig I have is virtually silent...
Mine undervolted to 0.998mV, and after I configured custom fan curve it went from unusable to almost inaudible over my PSU and case fans. Clocks got better and more consistent too, lol. I do plan to watercool and overclock it though.
been playing games at 5k ultrawide 5120x2160 with radeon 7 i get well over 60 fps on all latest games including bfv at ultra settings and the card is quiet and undervolted to 940mv
So you mean 4K Ultra Wide.
I don't get the naming vernacular
1080 is 1080. We don't call it 2K.
1440P is at 2560, but when I game on my 1080P Ultrawide (2560x1080), I don't call it 2.5K UW. 1440p or QHD would be called 2.5K or 3K or whatever. But NOOOO 3560x2160P or whatever is 444kkkkk and if its ultrawide 5555kkkk.
Dumb naming standards. I'm not attacking you, I'm sure its called a 5K monitor, I'm just amazed at the lack of uniformity in the naming convention of monitors.
Really nice to see that the frame times are again top of the game!
Really a cool and robust entry in to the world of 7nm AMD products. Can't wait to see Ryzen 7nm and Navi
Thank you DF
p.s: When it comes to Benchmarks i really like to walk from place A to B for about 30sec. But it is nice to have in-game Benchmarks in some titles.
Title says rtx 1080
They changed it
Amazing analysis Wichad!
24 gigs for the nxt console? Did l hear right? Cool.
Scorpio dev kit is 24GB of ram with the retail unit coming with 12GB of ram so I cant't see anything less then 24Gb of unified ram from either Sony or Microsoft for next generation.
I'm sure they're playing with prototypes now. We'll see.
I've heard they will ditch unified memory beacuse of memory bandwidth bottleneck. @@ZeitGeist_TV
Considering how high memory prices still are i wouldn't hold your breath, unless they decide to put the cash into memory and use garage processors again in next gen, that would be very disappointing.
I feel like 16Gb's would be more plausible, but I also remember hearing a rumor of them removing the unified RAM too. I guess it would be cheaper to have 8GB of System RAM (DDR4) and then 8GB or whatever the # is of graphics memory
17:00 why would playing at a higher resolution make the framerates less erattic? I don’t understand.
The GPU is so fast at rendering the low resolutions that the CPU can't keep up with how much it's receiving from the GPU, making you CPU Bound. Higher resolutions put more strain on the GPU forcing it to slow down some which lets the CPU keep up.
AMD are moving in the right direction, and although it'll probably be a while before I jump Nvidia's ship, I hope it forces Nvidia to reduce it's crazy pricing.
I hope it doesn't force nvidia to reduce there price, AMD deserve to charge more for also been able to compete when they can, they should like that they have lower there price because there competition can't lower there if anythign they should charge equally and along the performance line. Nvidia's pricing is a result of us all buy there GPU and not taking care of the competition when we could have. FYI i own a 1080TI and i purchased 2x Radeon 7 so am not been biased.
@@hexotech5202 You are quite delusional.
@@DeadNoob451 am afraid not am perfectly thinking it through nice and straight if you can afford it dont buy it save up or get a job.
It is business at the end of the day they can charge what ever they want. are you doing to tell me Lamborghini has to lower its price because Ford is sells a Mustang for cheaper or that Bugatti should lower the price of veryon because you can afford it. OKAY then i rest my case so i guess your the delusional one here mate than you verymuch. come back when your able to talk sense.
You could have done a noise comparasion but great content!
Amd is back , love amd from ati 9800pro time . Love these nvidia fanboys . Im waitting for navi becouse i got no money ,still using rx580. Great rewiev guys . Radeon 7 gonna be faster even more than 2080 in next month.
just like Vega 64 was faster vs gtx 1080 !! or not never
@@Mr11ESSE111
Cool story
Please consider putting more variety to the background music selection. Silence is better than 20 min of the same loop, not to mention it being on multiple (if not all) videos..
$700 and better than the 1080ti. Wow that's amazing. How much was the 1080ti 2 years ago....
and how much is the RTX2080. the 1080ti, 2080, 2080ti even the titan RTX & Volta losses in opencl work load so we could say Nvidia has had more than 2 years to beat AMD in opencl workload but yet they failed and cost way moreee.
@@hexotech5202 yea, were being raped by both ends. My point is, this sucks. I'm starting to feel like the titan x(p) I bought 2.5 years ago for 1200 was a good deal.
As far as open CL. I don't really care about that. Gamers don't care about that. If RTX ever can be applied to blender, that I would care about.
But gamers are not getting any improvement in value and barely a upgrade in performance
@@SlyNine ofc for gaming this is not the best your can get for open cl this is one of the best for the value, for deep learning it comparable so in general i think it just about fights depending on things.
I got my 2080 Ti for $999 from EVGA. Manually overclocked to 1980 MHz. Very impressed.
As the AMD drivers begin to dim, we'll see the real power of this board.
Yeah like always... Not really :D.
You guys are like the only ones not shilling for nvidia right now (they are talking up nvidia features that are basically unusable right now). Great review and thanks for being so transparent
Idk how true that is and I think the argument to be made is that, Despite RTX not being useful now, Nvidia is still ahead in performance on average, and RTX even if its a gimmick, is a feature you can try in a few games, and it could be a feature implemented in time, where Radeon 7 is not quite matching the RTX2080 and doesn't have any features period.
You could technically tout the 16GBs of HBM2 as a feature, but I'm not so sure.
I ordered a Radeon 7 before I'm labeled a Nvidia Fanboi. I made the purchase because HDMI Freesync is the feature that I need and its gonna kick ass @1440P and make use of the 144hz display.
@@thosewhoslaytogetherstayto3412 I have an rtx 2080 and i feel like pretty much any feature they touted back when they announced these cards is basically non existent. DLSS is probably never coming.....just look at multi res shading for pascal 1 real game supported. They make money on selling hardware, and i dont think theyll give anyone any "free" performance
PS5 and xbox two will have some slow ass gpus if is this the best what can AMD put out.
4K30 with checkerboard rendering will be still a thing in the next generation.
@EJSC yeah and the highest Navi will offer is gtx1080 performance for around $500. Look at AMDs roadmap, it's what's after Navi is what people need to look forward to. The fact Nvidia is 2 generations ahead of AMD right now doesnt look good for the next consoles.
@EJSC this, without the dumb Game Motion Plus idioticy
@EJSC any kind of frame interpolation is bad, no matter how much they market it to "gamers". I can't believe you even said that.
So a card that’s 20-30% slower than the 2080ti at $700 compared to the 2080ti at $1200+ is awful? It’s the 2nd fastest consumer gpu out with buggy press drivers. It’s not the best but it’s a really good card.
LinusMiningTips
, "PS5 and xbox two will have some slow ass gpus if is this the best what can AMD put out."
Only a fool would expect a $400 to $500 console to run like a $1500 to $2500 PC.
Great video, thanks especially for covering the creator side of things. I would love to see more :-)
RTX 1080? You might need to edit the title.
Proud to be a 1080ti owner, running at 2050mhz (OC), almost 2 years old now and runs most games 4k 60 or 3440x1440 at 80-120fps. Great investment.
Um. Mistitle