The 8-pin CPU is a “standard” connector for GPUs in the enterprise space, which is exactly what the A6000 is aimed at and why it has that connector instead of the PCIe connectors. No adapter needed when used in a server as intended.
1:50 the most premium cooler I've ever felt in my life is the reference Radeon 6800XT. Beats the crap outta the Red Devil 6900XT cooler that I have in terms of quality (though not in performance, obviously). Seriously: it's all metal, heavy AF, hides the PCB completely, looks absolutely gorgeous and DOESN'T SAG. It's almost witchcraft. I actually felt bad about sending that card to the mines after getting the 6900XT, it's just too beautiful.
As the owner of a 6800 XT I can say I am extremely satisfied with this card. I have the Aorus card running at a 2650 MHz OC with max memory speed and timings, and it barely hits 60°C at load.
In South Africa, I got a 3080 Ti for less than what I could get a 3080 for. Well, I couldn't actually get a 3080, so I kinda took what I could get and went with whatever GPU would perform best in the games I play, namely racing sims with triple 1440p monitors, and in that use case the Nvidia cards destroy the AMD cards. In mining I have managed to reduce power draw without lowering the hash rate: I'm getting 63 MH/s at a 55% power limit, so it ends up pulling 215 W to achieve that hash rate and the card runs super cool, under 50°C. Fully overclocked, the highest it gets is 68°C. The card is a Palit GameRock 3080 Ti OC, very happy with its performance, first time using the brand.
And who is the A6000 for? ... Me. And no, I'm not a gamer, and yes, we also exist. The reasons are probably irrelevant for you, but for me they are everything: being able to get support from Nvidia engineers directly, which I need often. The stability and peace of mind when working with clients over my shoulder. Being able to sleep well while my A6000 is rendering all through the night without a glitch. And the fact that even though she lives in my super overpacked workstation, she is still friendly with my other high-TDP cards (video, 2nd GPU, Fibre Channel, quad NVMes, etc.).
I know, right? I can't believe he just assumed all Quadro owners play games on them. There are probably more Quadro owners who never game than do. And the ones that do... likely have a dedicated gaming PC for that.
RTX 3090 Suprim X - idles at less than 27 W while watching this video. The cooler is very good too. I have undervolted a little and it boosts higher - 2000 MHz at times.
They should come out with a GA102 with 16 gigs of GDDR6, maybe slightly cut down cores from the 3080, and far lower power limit. It might be efficient, question mark?
Let me explain his explanation: one memory chip takes 32 bits of bus width. GA102 has a 384-bit bus (if not cut down) and 384/32 = 12, so 12 memory chips (or 24 in clamshell mode, two per channel, which is how the 3090 gets 24 GB). GDDR6 and GDDR6X chips come in 1 GB and 2 GB densities, but all chips have to be the same size. So on a 384-bit bus you can have 12 GB, 24 GB or 48 GB total. On a 320-bit bus you can have 10 GB, 20 GB or 40 GB total. You can do the math for the rest :P
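A quick way to sanity-check those numbers (just a throwaway Python sketch; the bus widths and the 1 GB / 2 GB chip densities come from the comment above, and the clamshell factor simply doubles the chip count):

```python
# Possible VRAM capacities for a GDDR6/GDDR6X card: each chip sits on a 32-bit channel,
# so chip count = bus_width / 32, and capacity = chips * chip density (x2 in clamshell mode).
def vram_options(bus_width_bits, densities_gb=(1, 2), clamshell_factors=(1, 2)):
    chips = bus_width_bits // 32
    return sorted({chips * d * c for d in densities_gb for c in clamshell_factors})

for bus in (384, 320, 256, 192):
    print(f"{bus}-bit bus -> {vram_options(bus)} GB")
# 384-bit bus -> [12, 24, 48] GB
# 320-bit bus -> [10, 20, 40] GB
# 256-bit bus -> [8, 16, 32] GB
# 192-bit bus -> [6, 12, 24] GB
```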
Not really. It was a hollow product that isn't useful for creators, gamers or miners. They just planned the cash grab out, since they didn't want to waste dies on cheaper models; Nvidia makes more money on mobile GPUs than on desktop GPUs anyway.
Ever watched Hardware Unboxed? Cause they do: without RT, the 6900 XT is on top (the unlocked water-cooled behemoth), and with RT (& DLSS) Nvidia is at the top of the charts until you get down to around the RTX 3080. Of course that was before FSR was out, but Godfall was the only FSR title in the games they tested, so I doubt it would make a difference yet.
Seeing as I had my friends PC crash just yesterday due to overheating during this horrible summer, I find it funny how some people still excuse high power usage components. As far as I'm concerned, power usage is the second most important metric after the performance by a country mile. It affects heat output and noise levels directly and those are quite crucial unless you have your PC hidden in the other room, lol.
@@juliusfucik4011 yes indeed, and if you have the panels open, what's the point of the case? Also, you will heat up your room that way, which is fun during winter and hell during summer. I had to remove the panels from my Fractal Define R5 case since it was suffocating an older mid-range Vega 56 and i7-4790K, lol. You couldn't get good airflow cases five years back, but luckily today you can. I would strongly recommend getting something with proper airflow.
@@juliusfucik4011 It doesn't matter if you leave one panel off or water cool the entire build with top of the line radiators and fans. The components will still output the same amount of heat and the same amount of heat will be exhausted into your surroundings. Some people care about the heat/power consumption and some don't that's fair. My game room is rather small so I can feel the temperature change when I play something demanding on my 5900x/6800xt which are both water cooled in an open air case. If I was running them air cooled in an enclosed case I'd still be dumping the same amount of heat in the room though because the components will generate the same level of heat regardless of how efficient my cooling solutions are.
In that case, don't buy any current gen desktop part since power consumption shot up to >200w on both vendors. Maybe Intel will provide you with low power consumption GPUs?
@@MLWJ1993 both Ampere and RDNA 2 are quite efficient architectures, and even if NVIDIA is busy gouging their customers' wallets, their achievements on Samsung's inferior node compared to what AMD is using at TSMC are nothing short of impressive. The current gen just cannot be overclocked a lot, if at all, and if you do, you lose any semblance of the inherent efficiency the architecture had to start with. I would gladly get a 3060 Ti, 3070 or 6700 XT at MSRP, but that's just not going to happen anytime soon, and in a few months' time I don't want to buy old hardware and will be looking forward to next gen. The current market conditions made this gen dead to me.
Respect everything this man says. However, I do have to say, I have a Zotac Trinity 3080 Ti: 330 W at load, 21.0 W at idle... Performance is superb! (despite Zotac being perceived as *trash*)
I'm kinda annoyed about how bizarre the memory configurations are for Nvidia GPUs this generation. I'm looking to move up from 8 GB VRAM for Blender workloads (it's optimized for OptiX/CUDA, so an AMD GPU loses too much performance) and it's just bad choices all around. A 12/16 GB GDDR6 3070 would be really ideal over a 2080 Ti, but instead we got an 8 GB 3070 Ti. Sigh.
Yeah, I'm in the same boat. No good options for (high VRAM) Blender this generation except the 3090. I decided to get the 3090 (MSI Suprim) -- at 60% power cap it uses 15% less power (accounting for longer total render time + including CPU usage) for 4% longer render time. Haven't played with voltages yet. Idles at 40W which is fairly high IMO.
@@MooresLawIsDead Undervolting a 3090 can save as much as 30% on compute applications. Pity that's Windows-only, and with the chapstick-grade thermal pads many AIBs seem to use, the memory's operating spec is exceeded at 60-70% of TDP. Some shocking engineering shortcuts.
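For anyone who wants to script that kind of power cap instead of dragging sliders, here's a minimal sketch using the nvidia-ml-py (pynvml) bindings. It's purely illustrative: the 60% figure is just the number mentioned above, setting the limit needs admin/root, and a proper voltage/frequency-curve undervolt still needs a tool like Afterburner.

```python
# Cap a GPU's board power to a fraction of its default limit via NVML (pynvml).
# Note: this is a power cap, not a true voltage/frequency-curve undervolt.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

target_mw = max(min_mw, int(default_mw * 0.60))  # e.g. 60% of the stock limit
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # requires admin/root

print(f"Power limit set to {target_mw / 1000:.0f} W "
      f"(default {default_mw / 1000:.0f} W, floor {min_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()
```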
Reminds me of a vid, kinda, where I made a comment about VRAM size in general, can't remember where. I once said how they've got no problem doubling sizes on the datacenter side but struggle to give us anything worth the price. Honestly, if they keep this up I'm retiring from PC gaming altogether; one frosty day I'll get locked into console world once more.
I managed to get a 3070 back in November when prices were significantly better. I was still on the lookout for a higher end card but as you've pointed out in this video to be quite honest I haven't felt like I've been getting a 'compromised' gaming experience so what's the point? I like the fact it's using less energy than the top cards and with all the upscaling technologies emerging it feels a bit pointless having a screaming top end card. I do wish it had more Vram but I tend to upgrade my hardware semi regularly so it's less of a concern in that regard. Great video as always!
this was such a great review, one of the best reviews of anything I have ever seen, You should do more of these lol! This was very comprehensive, took into account the market conditions, the competition and the real world experience difference for the target audience of the channel. TomScore: 10/10
Yesterday my 80 Ti FE came. At first I kinda felt bad while watching this video, but then realized that I've been hunting a GPU for approximately 8 months. All 3070s (what I was originally going to buy) are still more expensive than the 80 Ti FE. So that's my answer to your question "who is this card for". It's for people who perhaps wanted to buy a 3070, 3080 or 6800 XT, but couldn't get an FE or AMD's reference model and just so happened to get an 80 Ti. I mean, most 3070s in my country used to be around 2000 USD, so a much better card is a no-brainer for 1440 USD (3080 Ti FE w/ EU taxes).
Back when I had a GV100, I could play Flight Simulator on a Win10 box over RDP, from a Windows 8.1 client, and the performance was absolutely flawless. Blew me away. And an idling GV100 didn't really add anything extra to power consumption at the wall socket; it wasn't really any different from any other newish consumer card.
There are a lot of countries in the world where creators are not as wealthy as in the West, and they need as many CUDA cores as they can get for reasonable money. It is highly improbable that I could earn enough extra money with an A6000 to make it the obvious choice compared to a 3080 Ti. That said, the power consumption is bothering me, because pushing such a card to its limits for long periods of time is not what I expected to do on my current setup with a 750 W power supply.
This video is on point. I had a 3070, but gaming on my 1440p ultrawide it really struggled with some heavy titles like Cyberpunk. So I bought a 3090 FE, not because it was the card I wanted, but because it was the only card I could get in December without paying scalper prices. The card gets 60 to 70 FPS in Cyberpunk with DLSS Quality and mostly top settings. Where I have a problem with the 3090 is the noise. Playing Control with an undervolt, the fans hit 2400 rpm and HWiNFO is reporting a 106°C peak memory junction temp while the GPU is at 61°C. This is simply not good enough for a £1350 card. I know there is an issue with the thermal pads on the early models, so I'm going to have to sort it. But you would think they could have just spent a few dollars extra on better pads in the first place considering the massive margin they must have on this card. Anyway, rant over, keep the content coming 🙂.
About NVIDIA and memory: they slow-drip capacity. If the 3080 Ti were 24 GB, they'd tie their hands for their next round and anything below that would be a downgrade. They give you just enough RAM (which in a sense is technology they can't control) and push their proprietary stuff.
FSR is going to start biting Nvidia in the ass as it gets updated over time. Hopefully studios will put it in most new games (hopefully some old games too) as it's so easy to drop into the rendering pipeline and tweak.
Good video, but I think you miss the point of the undervolting argument. I was able to drop the total package power of my 3090 down to 320 watts while overclocking the core another 85 MHz. (And yes, the core was actually boosting higher after the undervolt.) Nvidia just pushed the total package power way higher than they should have, so the cards would boost stupidly and perform at the highest clocks for reviews. Also, I have been mining on my 3090 so long that it paid itself off, something that would take way longer on the 6900 XT because of its poor hashrate. Although I do agree the 3080 Ti is a joke while the 3090 exists on the market.
My 3080 paid for my 3080Ti lol. Also, I can confirm that the 3080Ti idles at ~110Watts, although I do have mine OC'd.
You basically got a 3080 sold it on eBay and got a tie right ?
This is the way. Used my 3080 for 7 months, sold it, got a 3080Ti+$200cash :)
I personally don't think it's worth it. I recently bought the 3090 FE (idles at 20 W, lol) and gladly paid the $300 premium for an unlocked hash rate and 24 GB of VRAM. I think 12 GB of VRAM is ridiculous for a $1200 card with capped mining performance on top. Not to mention the much smaller cooler; they should have used the 3-slot one from the 3090. I really feel the 3080 Ti is an Nvidia money grab, like Gamers Nexus stated. If I wanted to save some money I would rather buy the 3080, much better value overall. Also sold my 2080 Ti and 3080 at a profit, btw, got lucky it was just before the Bitcoin crash. Prices are much lower now.
What energy management settings are you using in Nvidia Control Panel?
I have read that when you switch it from "max performance" to "normal" that it will drop to 10-20 watts.
@@LiveForDaKill Funny how people find it ok to use all kinds of workarounds with Ampere, while the response to any workaround with RX 5000 cards was: "I'm paying $400, it needs to just work, it's not my job to fix AMD's mess!!!1!!!1!"
Not sure if red eyes in the thumbnail were accidental or intentional, but I’m here for it 😂
The difference between GDDR6 and GDDR6X is smaller than I expected, and the power usage of G6X is insane!
Marketing. G6X is better than G6 right...or it sounds like it should be...
infinity cache FTW !
@@VoldoronGaming I think people would've preferred 16 GB of GDDR6 vs the 3080's 10 GB of GDDR6X
@@WayStedYou Of course they would, but then how could Nvidia upsell you a 3090?
Insane.. till you compare hashrates, and realize gddr6x makes a huge difference
LOOOLLL I was just listening to your podcast with Colin Moriarty where you said I'm not gonna give this a good review loll..
That is because Nvidia manipulated the launch reviews by sending people that stupid PCAT thing. It uses a slow-as-balls ADC, so any power measurement will miss the power spikes. Igor's Lab used a real oscilloscope and measured almost 500 W spikes on the 3080. But most reviewers won't spend the $400-600 on an entry-level scope, so by controlling the gear reviewers used to test the cards they could claim lower power usage. Believe me, Nvidia would NEVER use a $40 meter to test their GPUs when even a base-model multimeter costs double that. PCAT would be useless for any real measurements, and any maker should have known that immediately.
Yeah, but any properly designed circuit with a slow ADC should have a low-pass filter on its input, so unless they omitted that, the spikes should not matter much for average power consumption.
@@TheBackyardChemist but your PSU still has to handle power spikes like that, so it could potentially be a problem if your PSU's wattage is a bit too low.
@@TheBackyardChemist That doesn't solve the issue; the spike will cause heat one way or the other somewhere inside your system, which will dissipate into your room (or somewhere else if by chance you have some kind of external radiator :-)
Yep. RTX 30 series are a big failure in terms of power consumption. That's why I have my 3090s undervolted and I suggest everybody else do that too if you know how to do it (it's not that hard really if you are tech-savvy). Now they run at ~280-290W under full load with minimal to no performance loss compared to the stock 350-380W.
@@bgtubber The fact is you can do that on RX 6000 cards too
So the race between the RX 6800XT and the RTX 3080Ti is a TIE! 😆
@Gareth Tucker it was a pun. But the performance was about equal between the two
I laughed so hard when you pointed out the 3060 has the same amount of V Ram as a 3080 Ti.
Maybe 3060 should have launched with just 6 gig vram, or the rest of the 3070-3080 should have had 16-20 gig of ram.
@@mtunayucer No, 3060 is fine with 12GB. Please don't give Nvidia any ideas.
@@bgtubber lmao
3060 = 12GB of slower ram, with less bandwidth. They are not the same.
@@gsrcrxsi GDDR6 is still plenty fast. It's not like it's 5-year-old technology or something; the first cards to come out with GDDR6 were the previous-gen RTX 20 series. Besides, you can always OC the VRAM if you think you need more raw bandwidth. I would personally take 12 GB of GDDR6 over 6-8 GB of GDDR6X if I had to choose.
I definitely enjoy the qualitative analysis of these reviews. This is something more reviews need, was it actually noticeable when being used? Or is it just better based on number fetish?
great work tom ! looking forward to seeing more content
Went from a 400 w 3080 to a 6900 xt. I much prefer the raw rasterization perf, lower power draw, reduced CPU driver overhead, and 6gb extra vram.
RDNA2 definitely wins this generation. Nvidia mindshare is still insane though.
I’d like Ampere cards more if they were priced competitively or at least had more vram for longevity.
When I saw the RTX 3080 only had 10 GB, I laughed, and it immediately reminded me of Kepler cards. Some games already consume almost 10 GB when playing at 4K. I don't think this card will last 3 years at 4K.
Did you notice less heat being generated in the room with the 6900xt?
I’ve been saying this since I picked up my 6900xt at launch and I sold my 3080 a week later. Finally folks are staring to notice.
@@buckaroobonzai9847 yes I have a small room that my pc is in
This proves that amd was actually right by not going g6x and instead going infinity cache
Tom's opinion proves that? How? I have 2 3090s running 100% mining load 24/7 in my bedroom. Do u guys not have air conditioning? Amd peasants
GDDR6X is currently an exclusive deal with Nvidia right now - from Micron, was it?
@@MrWarface1 lmao, using an 800-watt PC for mining, then having a 600-watt air conditioner running inside the room to counterbalance it. You're part of the problem with the current climate crisis we are in. Disgusting
@@wile123456 Well said.
@@wile123456 first off... I have central air. Second, my power cost went up a little more than a dollar a day for the $15 a day they are making mining. Imagine being such an idiot, you try to make a point about stuff you know nothing about cause you are such a fanboi. Lmao
Glad to see your community providing you with hardware so you can keep up with the bigger channels.
I feel like the purpose of the 3080ti is so Nvidia can discontinue the normal 3080 and sell the same die for almost twice as much
No doubt. Yields improved as manufacturing matured and they saw all the money they were leaving on the table.
I think you are dead on Wei.
Don't believe everything you hear. We're living in a world of propaganda of huge proportions. Nvidia is evil, sure, that's a fact, and so is AMD. And the FBI and the Obamas and Bidens and Clintons, etc.
@@tacticalcenter8658 but conveniently the corrupt Trump family never does anything wrong? Lmao
@@wile123456 show me evidence. So far its just been lies from a criminal organization. Again... Show me truth and facts that aren't made up by the communist. You can't do it. Cause you've been so brainwashed you believe things people tell you without proof or manufactured proof.
Don't forget about Nvidia's software scheduler that creates a pretty noticeable additional load on the CPU, making it eat more power and produce more heat. So the 3080 TIE! is 400 W itself plus some amount of parasitic power load - based on HUB's overhead investigation, around 10-30 watts.
Exactly! Not only do Nvidia's GDDR6X cards consume an insane amount of power for the performance, they also offload part of their work to the CPU!
@@tetrasnak7556 What??? This is insane! How can Nvidia be so careless...
@@powerdude_dk they weren't careless, they had to do it to stay ahead of AMD.
@@powerdude_dk In the pre-DX12/Vulkan days, the graphics pipeline with DirectX and OpenGL was much more CPU-limited and also very difficult to parallelize across multiple threads. Nvidia restructured their driver to offload some of the work to an extra thread to increase performance if you had a multicore CPU, and at the time that made sense, as many games didn't use the extra core(s) much. This did give Nvidia a performance advantage at the time.
But come DX12/Vulkan and AMD has a better starting position because all the optimizations Nvidia did in the past don't do anything and their software + hardware architecture causes increased CPU load in their driver.
If you think about it, the 3090 is sort of a dual rank memory GA102 while 3080ti is single rank. If they interleave between the chips, yeah you have more chips but each chip will be running fewer cycles.
Techpowerup found the 3090 used less power in multimonitor vs 3080ti.
The 3080 Ti only idles at 100 W if you have the power management setting in NVCP set to max performance. This keeps the card at boost clocks, so of course it uses more power. Change it to default/balanced and the card drops to the P8 power state and something like 10 W. That's a bingo.
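If you want to check that for yourself rather than trust an overlay, NVML exposes both the current P-state and board power. A small sketch, assuming the nvidia-ml-py (pynvml) bindings are installed:

```python
# Print the current performance state and board power draw of each GPU via NVML.
# At true idle (default power management) the card should sit in P8 at a low wattage.
import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    pstate = pynvml.nvmlDeviceGetPerformanceState(handle)    # 0 = P0 (max), 8 = P8 (idle)
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # NVML reports milliwatts
    print(f"GPU {i}: P{pstate}, {power_w:.1f} W")
pynvml.nvmlShutdown()
```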
Yeah I did wonder this, considering my 3080ti FE doesn't even spin the fans up when doing most basic non-gaming tasks.
It's almost like he cherry picked a series of situations which would make the 6800XT look better.
This is like all the guys setting the power limit to 300 W in the BIOS and then complaining that their 11900K is actually drawing 300 W :D
And in true narcissistic tech-tuber fashion, he won’t reply when he’s shown to be wrong or coming to the wrong conclusion based on wrong/incomplete/biased information without verifying anything. Haters gonna hate. Shills gonna shill.
I have a 6900 XT and paid retail for it. I can mine ETH at full hashrate and didn't get butt hurt by Nvidia's inflated prices. I upgraded from a 1080 Ti, so I am not a fanboy. You guys enjoy your overpriced product that's limited in what you're allowed to do with it. I'm not going back to a company that only cares how much money it can make and limits you in its uses.
@@pilotstiles the 6900XT is already hash limited from AMD, they just don't advertise it like nvidia does. you think an unrestrained 6900XT only does 65MH/s?, just barely more than a 5700XT lol. it should be a LOT faster.
Remember when we hoped the 3080 Tie was gonna be on TSMC 7nm? Pepperidge Farm remembers.
I remember when Nvidia went with Samsung 8nm and beat TSMC's 7nm. I also remember the many generations Intel was destroying TSMC 7nm with 14nm.
@@MrWarface1 Yea, they beat them at how power hungry, hot and massive in size (due to needing giant coolers) a product can be. 😂 I would take the product that's a few % slower, but much more power efficient, cooler and compact any day. BTW, what do you mean by "many generations Intel was destroying tsmc 7nm with 14nm"? As far as I know, there are just 2 generations of AMD CPUs on 7nm - Ryzen 3000 and Ryzen 5000. Intel didn't "destroy" any of those. Like WUT?? In the case with Ryzen 3000, Intel was 10% faster on average in games and much slower in productivity software. So I wouldn't use the term "destroy" in this case. Not to mention, the HEDT Ryzen 3000 chips (Threadripper) obliterated any Intel HEDT CPU at that time (and still does). And in the case of Ryzen 5000, AMD is a tad faster in games and, again, much faster in productivity workloads. So that makes only 1 generation of 7nm AMD CPUs (Ryzen 3000) in which Intel had any dominance and it was only in gaming. So overexaggerating much?
@@bgtubber that's your argument? They are bigger and hotter.. lmao who cares what you would rather have. You aren't an enthusiast. Also, keep lying to yourself about ryzen. Userbenchmark says it all.
@@MrWarface1 I'm not an enthusiast? Even though I have two RTX 3090s and a 32-core Threadripper 3970X with 64 GB of B-die RAM OC'ed to 3800 MHz CL14 with tightened secondary and tertiary timings. Yep, I guess I'm not an enthusiast. 😂 And Userbenchmark? Really? Thanks for the laugh. Imagine using Userbenchmark to prove your point. UseLESSbenchmark is the laughingstock of the PC community. All reputable hardware reviewers have accused them of intentionally tailoring their benchmarking software in such a way that it artificially boosts Intel's numbers. Literally ANY other benchmarking software paints a vastly different picture in terms of gaming and productivity performance for Intel vs AMD.
@@bgtubber lmao, you're flexing to the wrong guy, scrub. I will assume you're lying about your imaginary setup. If not, I'll post another vid of the real goods. ruclips.net/video/YBdnB3BARTI/видео.html Imagine claiming a benchmarking site is a shill because the product you fanboi for gets exposed. They are real-world benchmarks, scrub. Boilerplate benchmarking metrics. Unfortunately "influencers" don't control the environment, so people can actually record their system's performance with proper cooling and overclocks.
It seems like AMD really nailed it by increasing effective bandwidth by using cache.
Optimizations vs brute force. The smarter way to do things.
But AMD charges the same for less. AMD doesn't do CUDA, doesn't have tensor cores for the apps that utilize them, and doesn't have the mining capabilities (yes, those have been removed from newer Nvidia cards), etc. AMD is charging way too much for their gaming-only product. That's a fact. Either bake in more functionality or reduce prices. The two brands are completely different products this generation, but AMD decided to offer less and keep the same price, and fanboys don't see this because of the faults of the Nvidia cards and the evil brand practices.
Not having CUDA (there are quite a few fewer apps that rely on OpenCL, or they don't get great performance, probably because Nvidia has a longer history in those apps) nor the NVENC encoder (streaming and certain video editing) is a problem for a certain number of graphics apps (work). But I'm all for cards at a good price for graphics work, whatever the brand. As in... if you give me a card that's not as efficient in apps A and B due to missing features... but at a price so good that it costs like a lower Nvidia tier... the AMD one might win by brute force (as happened with the Radeon VII)... But what I am seeing is that both brands are still way too overpriced.
@@3polygons That's because both brands cannot saturate the market and are using this to get more revenue; nothing wrong with that, it's just business.
Nvidia sets the price and AMD follows (for the most part)
Knowing this, We the consumers need to stop buying new GPU's if we want prices to come down.
GPU's have received some sort of a luxury status faaaar beyond their value and consumerism has fucked us all in the end.
I'm to blame as well, I guess, since I did buy an Asus G17 laptop with a 5900HX+RTX 3070 combo, but that was after my GTX 1070 died on me and RTX 3070s were going for half the price of my G17. (Made quite a bit back by mining on the side though.)
But yeah this consumerism and hype to literally sleep at retailers has gotten out of hand, don't get me wrong it's a fun experience with friends in a way I guess, but we are sending the likes of Nvidia and AMD the wrong signals here.
Just my 2 cents
@@dawienel1142 agreed. I bought a 1080ti for $300... What it was truly worth in 2020
That F bomb was ripe lmaooooo. This was goood after a long day, Mr. Tom!
IMO the big selling point for GDDR6X was that it was supposed to give the 3090 "1TB/s bandwidth".
i.e. 936.2 GB/s / 19.5 Gbps × 21 Gbps ≈ 1008 GB/s.
Obviously due to the horrible power and heat draw they underclocked it to 19.5gbps so it ended up with 936.2GB/s, at the cost of all the heat and power.
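For reference, that figure is just bus width times per-pin data rate; a few lines of Python to sanity-check the numbers in this thread:

```python
# Memory bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps).
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(384, 19.5))  # 3090, 19.5 Gbps GDDR6X -> 936 GB/s
print(bandwidth_gbs(384, 21.0))  # at the originally touted 21 Gbps -> 1008 GB/s
print(bandwidth_gbs(256, 16.0))  # 6900 XT, 16 Gbps GDDR6 -> 512 GB/s (plus Infinity Cache)
```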
Should be fitted with HBM2E if they want all the bandwidth though.
@@gamtax Yeah, but Nvidia aren't going to use VRAM as expensive as HBM2E
@@nathangamble125 Not that it would make any difference for a halo product like the 3090. You don't buy a Titan class card for price/performance but for performance.
Finally someone in my feed that is telling the truth about GDDR6X!
What do you mean? Pretty much everybody knows that GDDR6X is blazing hot and cr@p at power consumption.
@@bgtubber Pretty much everyone who's really into PC hardware knows it, but that's definitely not the majority of people.
GDDR6X is the swan song of the GDDRx standard. The 32-bit-per-chip channel is simply too limiting; you have to clock to the moon to match bandwidth that HBMx can easily obtain. HBM is the future for ultra-high bandwidth and memory capacity, however neither is really needed for gaming GPUs, which is why HBM is only found in professional-tier SKUs that demand both. Fiji was the proof of concept, while Vega 56/64 was stuck in the unenviable position of trying to serve the professional and gaming markets at once, while AMD didn't have enough capital to make separate designs at the time.
More like gddrFailure
@@Krogoth512 I bet engineers could make GDDR6X work efficiently, just saying. Otherwise, indeed, the way it's implemented currently is a failure. For Nvidia, it's marketing: X is always better and more expensive.
Norwegian ❄️ Winters
Dad - Chop some wood, son. For the Hearth.
Son - Nah Da, I put a 3080 Tie in there. We're set.
Dad - It's good to see my son spend my money for the good of the family.
What do they use in summer? A 1030?
;)
There it is, the actual Titan Ampere + pro drivers. Ohh, and a 2-4x price increase, of course.
8-pin CPU port is likely for rack server chassis (as is the blower style cooler).
Much easier to manage one cable instead of 2 in a server.
"This ten thousand, seven hundred and fifty two..." DOLLAR? "...CUDA core..." Oh. Well, wait until it hits ebay.
this is truly the worst timeline
I play on an LG OLED 48 inch. That's where my Sapphire Nitro really shines in 4k.
Sapphire is a damn good company.
Same here, brother. Nitro+ 6900 XT OC. I'm usually always pegging that 120 Hz at 4K in just about any game.
Your camera is incredible. The video quality is so great.
Thank you so much!
Really enjoyed this review, was a unique angle and it's near impossible to disagree with.
my exact thoughts as well!
The 3090's GDDR6X also has lots of cooling problems. Any workload that heavily tasks the RAM will get it to the throttling temp of 110°C. So the 3090 with all that RAM is pointless when it is constantly throttling whenever you try to use it. Almost every 3090 owner I've seen has had to resort to hacky cooling and underclocking. Otherwise, the performance is on par with the 3080, and the memory chips will probably fail in a year or so running at 110°C. AMD has none of those problems!
This is causing problems in some professional workloads too, where I often get computation errors and crashes that I believe are due to memory errors (it happens a lot less with a powerful fan pointed at the back of the card). I assumed the AIB model I had was just horribly built, so I got the SUPRIM one, but no - same temps, same issue. And the A-series cards are not an option at all; the A6000 starts at $7000 USD here.
@@TheAmmoniacal MSI cards, including Suprim, all have terrible thermal pads. I replaced mine on a 3080 ventus. VRAM went down by 20°C
Memory temp problems are a disgrace; it's pretty annoying to pay top-dollar prices and have to fix such extreme temperatures. My 3090 FE reached 110°C memory junction while the core stayed in the 60s °C under demanding games (4K 120 Hz). It's a stupid and severe design flaw (and it happens with almost all versions out there). I was going to watercool it anyway, so I bought a block and problem solved; the memory stays in the 70s °C now. But many people will not be able to do this or even be aware of the memory temp problem. It's absolutely unacceptable that GPUs are being sold with a ticking time bomb in their memory - they will degrade and there will be problems. There should be a collective claim against Nvidia for selling cards like this.
Water cooling is the answer; it allows me to max out the memory overclock (+1500). Memory-intensive applications still have it running in the mid-90s, but only because the backside is still undercooled. Corsair makes a block that cools the rear as well if that's still too hot for you. It's just a shame you need to go to this expense.
Nonsense, my 3090's VRAM never goes above 80°C under load. Even mining, it stays at 102°C or below. It's all about how it's cooled.
I think Ruthless Looking Corporate Henchman bellowing "THE THIRTY EIGHTY TIE!" will haunt The Nvious One for a long time! :P
"Nvious" nice
"Wrong Polygons" spoken like a true tech enthusiast who is very knowledgable about "architectures" and not someone from the car industry feigning a higher level of tech literacy to grow a following.
hm?
Well Hardware Unboxed that you referenced says the 6800XT *does not* perform like a 3080Ti in 4K. On average ~12% worse. That's around 10FPS in a lot of games at that resolution and it's definitely a noticeable difference. You can take any sample of 4/5 games to paint the narrative that you want. It's when you look at larger sample sizes like HU does that the truth comes out.
Add to that:
- better ray tracing performance (I know RDNA2 is not as bad as it looked at release in that department but it's still worse, sometimes *a lot* worse)
- DLSS (even if FSR is great, right now DLSS is simply the better option because of adoption, and while Nvidia cards will get access to both, AMD cards will be FSR only)
- better for productivity
And suddenly the case for the 3080 Ti doesn't look so bad. You said undervolting meant lowering its performance, but that's not even true; you can absolutely undervolt while keeping the same (or sometimes a bit better!) performance than stock.
I don't think it's a good buy don't get me wrong, the 3080 is such a better price/perf, but imo you're not being fair. You're ignoring or downplaying what makes the Nvidia cards strong and laser focusing on their weak points (like power consumption).
There's also an argument to be made that HW Unboxed swapped out some games that ran better on Nvidia hardware for games like Godfall and Dirt, which run like 30% slower on Nvidia, like wtf?
I've noticed this channel really loves chewing on Nvidia.
Even HW is a bit Nvidia-hating, since when you go back over the benchmarks, Nvidia cards actually ran better at 1440p and 1080p!
When looking at multiple sites that are genuinely considered good (so sites like the one that loves Intel's stuff aren't included) and piling them into an average (like a meta-analysis would do), I came to the conclusion that Ampere runs better even at 1440p and 1080p (6800 XT versus 3080 and 6700 XT versus 3070). There is of course the argument of driver overhead, but I've seen plenty of people with, say, a 5600X or 10th-gen-and-above Intel CPUs, which isn't all that uncommon to pair with Ampere cards and should be what you'd pair with them.
Nvidia have done some terrible things and will bend over games where they can (tbh any company in this position would, imo; even the "best" giant companies turn around if they can), but they do have the superior cards if you have a decent CPU and power supply.
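That "pile them into an average" step is best done with a geometric mean, since relative FPS results are ratios. A rough sketch with made-up placeholder numbers, not real review data:

```python
# Combine per-site relative performance ratios (card A fps / card B fps) with a geometric mean.
# The ratios below are placeholders, NOT real review results.
from math import prod

site_ratios = {
    "site_a": 0.97,
    "site_b": 1.02,
    "site_c": 0.99,
}

geomean = prod(site_ratios.values()) ** (1 / len(site_ratios))
print(f"Combined ratio: {geomean:.3f}")  # > 1.0 would favour card A
```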
There might be an issue with the 3080 Ti... My 3090 consumes ~35 W at idle according to HWiNFO64. My entire system idles at around 90 W, so a 3080 Ti single-handedly using 110 W is very odd.
I don't believe it either; he didn't show any proof of it.
That reflective shroud is gorgeous.
I like your review data delivery. I like how you conclude with the general user experience between the card and its contemporaries. Values on a bar chart only go so far after a while.
Agreed. I think real world differences need to be explored more unless you are a mega channel that can afford to run through tons of data.
As an A6000 owner, we split the GPU into VMs for work. It was actually easier to get, and because we can split it between VMs it was a no-brainer: 16 GB for each of 3 users.
I liked the A6000 a lot. But I think an A4000 is the perfect one for me...
What software is required for it, and which hypervisor do you use?
Does this require a license for Nvidia grid or something?
@@RamkrishanYT VMware and a license from Nvidia. All of which I wouldn't normally do, but prices were so high and VRAM on the cards is so low. We use Omniverse for work and 12 GB is standard for us.
@@twinsim do you have any link to a youtube tutorial for it?
@@RamkrishanYT umm no, there are what, 100 A6000s in the world? I would get like five views.
6:00 was the most legendary part of the review xD always great work keep it up!
When a 580 uses less power while mining than an idle 3080 Ti, and only has a 25% lower hash rate.
That's not true. I have two 3090s which are even more power hungry than a 3080 Ti and they both idle at 20-30W. I think that Tom used an OC profile with fixed frequency and voltage. My 3090s idle at ~100W only if I overclock them with fixed frequency which doesn't allow them to downclock and go into lower power states.
@@bgtubber That actually sounds more realistic. Still, it's weird that a 6800 XT is basically the same as a 3080 Ti :/ Guess it's RTX 3060 or AMD if you want more performance now. But I would be totally fine with a 3060 if I could get one for MSRP :D
Something is wrong with your 3080 Ti; it should not draw more than 20 watts at idle.
GJ Tom, this is what a small(-ish) independent review should look like. You point out the good and the bad, you show real-world scenarios and a lot of useful info on the side too, not just some graphs, and you test against the competition. A lot of YT channels, big and small, should do this.
Even though HU and GN are the benchmarks for reviews, you school a lot of others on how it should be done (looking at Linus, Jayz2c, Joker, even RedGamingTech and others).
I'm also glad you don't shill for any company.
And yet Linus schools even them in other aspects. His insight into the market and company is something a lot of other channels don't have. (I'm far from a Linus fanboy, I watch more technical channels).
The thumbnail makes me feel like I've just walked down a dark alley and met a black market GPU dealer.
What do you mean? Isn't this how people have been getting their GPUs for the past few months?
Bannerlord benchmarks! Good man! =]
10:19 Sweet Warband mod pogchamp
I love the rasterization perf AMD has and the lower wattage. But NVENC encoding for OBS streaming on the Nvidia cards will also be the win for so many people.
That A6000 is a wet dream for someone working with Blender... (I do, and I render on the GPU, which is a ton faster than on the CPU, so that crazy amount of VRAM... yay). Not that I'd hit the memory limit with it (would be an interesting experiment... how to fill up 48GB for a render :D :D). And given how heavily DaVinci Resolve Studio (not the free one, the 300 bucks one) uses the GPU... that'd be AMAZING for even 8K projects.
24GB of the 3090 was already crazy...
Got my hands on the EVGA 3080 Ti FTW3 Ultra, and yeah, this thing is smoking HOT. If I let it run at 100% power it easily hits 80+°C even on the lightest of tasks, without overclocking.
After this generation, I wonder if the big halo products will use the current flavor of HBM
Pro cards only and only if large customers demand it.
Awesome vid. I love those cuts... The "TY" edit was epic...Shit was cool😎.
Underclock the G6X to 16 Gbps and re-run. Would be a cool idea.
The memory signalling already works differently from standard GDDR6 (PAM4 instead of NRZ). Anyway, nice to see.
To demonstrate how GDDR6X is useless even at same transfer speed?
@@lupintheiii3055 Yeah, I wanted to see a timing difference basically.
@@bigcazza5260 Considering how bitchy GDDR6X is, it will probably crash or trigger ECC and make the experiment null.
@@lupintheiii3055 Stupid invention when Samsung, i.e. the dudes who fab GA102, have 18 Gbps GDDR6. Micron must give Nvidia one hell of a deal, either that or Huang picks favorites.
Very interesting. Missed the info about idle Temps before. That's insane
Damn son, you did it this time.
I've been waiting all week for this!!!!
I was really surprised when I saw the thumbnail; looking forward to this video.
Tom, I liked your take on reviews. Great job!
Ever since that one Broken Silicon I keep noticing him saying GDR6(X) instead of GDDR6(X).
shorter name, and people still understand it...
The 8-pin CPU is a “standard” connector for GPUs in the enterprise space, which is exactly what the A6000 is aimed at and why it has that connector instead of the PCIe connectors. No adapter needed when used in a server as intended.
Another great review!
You and Dan are doing great work! Keep it up.
Power draw figures from TechPowerUp are interesting. Idle: 3080 Ti 16 W, 6800 XT 29 W. Idle with V-sync 60 Hz: 3080 Ti 110 W, 6800 XT 146 W.
1:50 the most premium cooler I've ever felt in my life is the reference Radeon 6800XT. Beats the crap outta the Red Devil 6900XT cooler that I have in terms of quality (though not in performance, obviously). Seriously: it's all metal, heavy AF, hides the PCB completely, looks absolutely gorgeous and DOESN'T SAG. It's almost witchcraft. I actually felt bad about sending that card to the mines after getting the 6900XT, it's just too beautiful.
As the owner of a 6800xt I can say I am extremely satisfied with this card. I have the aorus card running at a 2650Mhz O.C. with max memory speed and timings, and it barely hits 60° at load.
Even the baseline 6800 with 16GB is a steal for MSRP and will age nicely.
In South Africa I got a 3080 Ti for less than what I could get a 3080 for. Well, I couldn't get a 3080, so I kinda took what I could get. I went with whatever GPU would perform best in the games I play, namely racing sims with triple 1440p monitors, and in that use case the Nvidia cards destroy the AMD cards. In mining I've managed to reduce power draw without lowering the hash rate: I'm getting 63 MH/s at a 55% power limit, so it ends up pulling 215 W to achieve that hash rate and the card runs super cool, under 50°C. Fully overclocked, the highest it gets to is 68°C. The card is a Palit GameRock 3080 Ti OC. Very happy with its performance, first time using the brand.
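For anyone curious how those numbers stack up as efficiency, here's a tiny Python sketch that just re-does the arithmetic from the comment above (treat the figures as rough quoted inputs, not verified measurements):

```python
# Minimal sketch: turn a quoted hashrate and board power into efficiency
# and daily energy use. Figures are the rough ones quoted above, not verified.

def mining_efficiency(hashrate_mhs: float, power_w: float) -> float:
    """Hashrate per watt (MH/s per W); higher is better."""
    return hashrate_mhs / power_w

def daily_energy_kwh(power_w: float) -> float:
    """Energy used per 24 hours of mining, in kWh."""
    return power_w * 24 / 1000

print(f"{mining_efficiency(63, 215):.2f} MH/s per W")  # ~0.29
print(f"{daily_energy_kwh(215):.2f} kWh per day")      # ~5.16
```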
That A6000 is the most beautiful card I have ever seen.
And who is the A6000 for? ... Me.
And no, I'm not a gamer, and yes, we also exist.
The reasons are probably irrelevant for you, but for me they are everything. Being able to get support from Nvidia engineers directly, which I need often. The stability and peace of mind when working with clients looking over my shoulder. Being able to sleep well while my A6000 is rendering all through the night without a glitch. And the fact that even though she lives in my super overpacked workstation, she is still friendly with my other high-TDP cards (video, 2nd GPU, fibre channel, quad NVMes, etc.).
Not for everyone
It's for Big Chungus to brag on Reddit
I know, right? I can't believe he just assumed all Quadro owners play games on them. There are probably more Quadro owners who never game than do. And the ones that do... likely have a dedicated gaming PC for that.
I just loved your review, it is so much simpler and easier to understand than others. Please keep doing it!!!
Thank you! Will do!
The 3080Ti is for Eskimos, Alaskans and Siberians
The 3080 Ti is meant to be used in Siberia.
Free heating is magic
Siberians now understand what heat means...
I'm buying the 6600 XT sometime later, but I really hope the 7700 XT performs as well as the current 6800 XT/6900 XT.
Upgrading again next year then.
Tom is expecting anywhere from a 20-60% improvement over RDNA 2, so I think it's absolutely possible.
RTX 3090 Suprim X - idles at less than 27 watts while watching this video. The cooler is very good too. I have undervolted it a little and it boosts higher - 2000 MHz at times.
They should come out with a GA102 with 16 gigs of GDDR6, maybe slightly cut down cores from the 3080, and far lower power limit. It might be efficient, question mark?
You need a 256-bit bus or a 512-bit one for 16 GB, something GA102 doesn't have.
Let me explain his explanation: one memory module takes 32 bits of bus width. GA102 has a 384-bit bus (if not cut down), and 384/32 = 12, so 12 memory modules. GDDR6 and GDDR6X are available in 1 GB and 2 GB modules, and you can double up with two modules per channel (clamshell mode), but all modules have to be the same size. So on a 384-bit bus you can have 12 GB, 24 GB or 48 GB total. On a 320-bit bus you can have 10 GB, 20 GB or 40 GB total. You can do the math for the rest :P
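To make that arithmetic concrete, here's a minimal Python sketch of the same calculation (the function name and the explicit clamshell option are my additions, not something from the comment above):

```python
# Minimal sketch: enumerate the VRAM totals a given bus width allows,
# assuming 32-bit channels, 1 GB / 2 GB module densities, and optional
# clamshell mode (two modules per channel) for the doubled capacities.

def vram_options(bus_width_bits, module_sizes_gb=(1, 2), channel_bits=32):
    channels = bus_width_bits // channel_bits      # e.g. 384 // 32 = 12
    totals = set()
    for size in module_sizes_gb:
        totals.add(channels * size)                # one module per channel
        totals.add(channels * size * 2)            # clamshell: two per channel
    return sorted(totals)

print(vram_options(384))  # [12, 24, 48] -> full GA102 bus
print(vram_options(320))  # [10, 20, 40] -> 3080-style cut-down bus
```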
The 3080 Ti is for nVidia of course. They're making bank selling it.
Not really, it was a bad, hollow product that isn't useful for creators, gamers or miners. They just planned the cash grab, since they didn't want to waste dies on cheaper models; Nvidia makes more money on mobile GPUs than on desktop GPUs anyway.
Great job! You always benchmark right!
Thank you, brother, for the honest review.
I have an Nvidia GPU, so much power draw...
Now that AMD has ray tracing, I would love to buy AMD and sell my Nvidia card.
It is very clear for whom the 3080 Ti is for: For NVIDIA to increase their revenue.
Yeah, but could Nvidia control the distribution of regular GDDR6? 😛
Fantastic video in every way! Informative, incisive, and funny.
5:40 This is so good. IDK why reviewers don't do an all-in-one average of FPS for all games tested. It helps a lot with the bigger picture.
Ever watched Hardware Unboxed? Because they do. Without RT games the 6900 XT is on top (the unlocked, water-cooled behemoth); with RT (& DLSS) Nvidia is at the top of the charts until you get down to around the RTX 3080. Of course that was before FSR was out, but Godfall was the only FSR title in the games they tested, so I doubt it would make a difference yet.
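For what it's worth, here's a minimal sketch of how an "all games" average is commonly computed: a geometric mean of per-game FPS ratios, so no single outlier title dominates. The numbers are made up for illustration; this is not the method used in the video or by HU.

```python
# Minimal sketch: summarize many per-game results with a geometric mean
# of FPS ratios, so one outlier game can't skew the overall picture.
from math import prod

def relative_performance(fps_card_a, fps_card_b):
    """Geometric mean of per-game FPS ratios (card A relative to card B)."""
    ratios = [a / b for a, b in zip(fps_card_a, fps_card_b)]
    return prod(ratios) ** (1 / len(ratios))

# Hypothetical FPS numbers for three games:
card_a = [120, 95, 60]
card_b = [110, 100, 48]
print(f"Card A averages {relative_performance(card_a, card_b):.2f}x card B")
```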
Seeing as my friend's PC crashed just yesterday due to overheating during this horrible summer, I find it funny how some people still excuse high-power components. As far as I'm concerned, power usage is the second most important metric after performance, by a country mile. It affects heat output and noise levels directly, and those are quite crucial unless you have your PC hidden in another room, lol.
I have a 3090 and 5950x in a case. Running full load 24/7 and it is not loud or hot. The secret is to leave one panel open on your case 👍
@@juliusfucik4011 Yes indeed, and if you have the panels open, what's the point of the case? Also, you will heat up your room that way, which is fun during winter and hell during summer. I had to remove the panels from my Fractal Define R5 case since it was suffocating an older mid-range Vega 56 and i7-4790K, lol. You couldn't get good airflow cases five years back, but luckily today you can.
@@juliusfucik4011 It doesn't matter if you leave one panel off or water cool the entire build with top-of-the-line radiators and fans. The components will still output the same amount of heat, and the same amount of heat will be exhausted into your surroundings. Some people care about the heat/power consumption and some don't, and that's fair. My game room is rather small, so I can feel the temperature change when I play something demanding on my 5900X/6800 XT, which are both water cooled in an open-air case.
If I was running them air cooled in an enclosed case I'd still be dumping the same amount of heat in the room though because the components will generate the same level of heat regardless of how efficient my cooling solutions are.
In that case, don't buy any current gen desktop part since power consumption shot up to >200w on both vendors. Maybe Intel will provide you with low power consumption GPUs?
@@MLWJ1993 Both Ampere and RDNA 2 are quite efficient architectures, and even if NVIDIA is busy gouging their customers' wallets, what they achieved on Samsung's inferior node compared to what AMD is using at TSMC is nothing short of impressive. The current gen just cannot be overclocked much, if at all, and if you do, you lose any semblance of the inherent efficiency the architecture had to start with. I would gladly get a 3060 Ti, 3070 or 6700 XT at MSRP, but that's just not going to happen anytime soon, and in a few months' time I won't want to buy old hardware and will be looking forward to next gen. The current market conditions made this gen dead to me.
Respect everything this man says. However, I do have to say, I have a Zotac Trinity 3080 Ti: 330 W at load, 21.0 W at idle... Performance is superb! (despite Zotac being perceived as *trash*)
Same card. Seems fine to me.
You know a card consumes a lot when you use the number of rooms you can heat up, even with the AC on, to compare consumption LMAO
I got the 3080 fe. I'm happy with it. No need for a 3080ti.
I have an evga 3070 ti ftw3 ultra and I'm in love with it. Super quiet and very cold. Dope shit
I'm kinda annoyed about how bizarre the memory configurations are for Nvidia GPUs this generation. I'm looking to move up from 8 GB of VRAM for Blender workloads (it's optimized for OptiX/CUDA, so an AMD GPU loses too much performance) and it's just bad choices all around. A 12/16 GB GDDR6 3070 would be really ideal over a 2080 Ti, but instead we got an 8 GB 3070 Ti. Sigh.
The 3070 should have been at least 10gb out of the gate. Nvidia penny pinching again.
@@Mopantsu That would make it the ideal budget 2080Ti alternative and would be a great deal but Nvidia just had to gank everyone's wallets...
Yeah, I'm in the same boat. No good options for (high-VRAM) Blender this generation except the 3090. I decided to get the 3090 (MSI Suprim) -- at a 60% power cap it uses 15% less energy per render (accounting for the longer total render time and including CPU usage) for a 4% longer render time. Haven't played with voltages yet. Idles at 40W, which is fairly high IMO.
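For anyone weighing the same power-cap trade-off, here's a rough Python sketch of the energy-per-render math; the wattages and render times below are hypothetical placeholders, not the commenter's measurements:

```python
# Minimal sketch (hypothetical numbers): compare total wall energy per render
# at stock vs. a reduced power cap, i.e. the trade-off described above.

def energy_per_render_wh(gpu_watts, cpu_watts, render_seconds):
    """Total energy for one render in watt-hours."""
    return (gpu_watts + cpu_watts) * render_seconds / 3600

# Placeholder figures for a 3090-class card:
stock  = energy_per_render_wh(gpu_watts=350, cpu_watts=60, render_seconds=600)
capped = energy_per_render_wh(gpu_watts=270, cpu_watts=60, render_seconds=624)  # ~4% longer

print(f"stock:  {stock:.1f} Wh")
print(f"capped: {capped:.1f} Wh ({100 * (1 - capped / stock):.0f}% less energy)")
```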
@@Mopantsu Not really. Just clever marketing in the market segmentation.
Part of the higher efficiency of the A6000 compared to the Ti might also be binning 🤔
Not this much of a difference - and remember the A6000 is the full die.
@@MooresLawIsDead Undervolting a 3090 can save as much as 30% on compute applications. Pity that's Windows-only, and with the chapstick thermal pads many AIOs seem to use, the operating spec is exceeded at 60-70% of TDP. Some shocking engineering shortcuts.
Reminds me of a video, kinda, where I made a comment about VRAM size in general, can't remember where. I once said how they have no problem doubling sizes on the datacenter side but struggle to even give us anything worth the price. Honestly, if they keep this up I'm retiring from PC gaming altogether; one frosty day I'll get locked into console world once more.
I managed to get a 3070 back in November when prices were significantly better. I was still on the lookout for a higher-end card, but as you've pointed out in this video, to be quite honest I haven't felt like I've been getting a 'compromised' gaming experience, so what's the point? I like the fact it's using less energy than the top cards, and with all the upscaling technologies emerging it feels a bit pointless having a screaming top-end card. I do wish it had more VRAM, but I tend to upgrade my hardware semi-regularly so it's less of a concern in that regard. Great video as always!
I like the new offset camera angle 🙏❤️🙏
They want the 10 GB cards to cap out soon.
The real question is: when will sub-$200-300 cards come out?
They already did… the GT 1030!
:)
this was such a great review, one of the best reviews of anything I have ever seen, You should do more of these lol!
This was very comprehensive, took into account the market conditions, the competition and the real world experience difference for the target audience of the channel.
TomScore: 10/10
Yesterday my 80 Ti FE came. At first I kinda felt bad while watching this video, but then I realized that I've been hunting for a GPU for approximately 8 months. All 3070s (what I was originally going to buy) are still more expensive than the 80 Ti FE. So that's my answer to your question "who is this card for": it's for people who perhaps wanted to buy a 3070, 3080 or 6800 XT but couldn't get an FE or AMD's reference model, and just so happened to get an 80 Ti. I mean, most 3070s in my country used to be around 2000 USD, so a much better card is a no-brainer for 1440 USD (3080 Ti FE with EU taxes).
Back when I had a GV100, I could play Flight Simulator on a Win10 box over RDP, from a Windows 8.1 client, and the performance was absolutely flawless. Blew me away.
As for idle, an idling GV100 didn't really add anything extra to power consumption at the wall socket. It wasn't really any different from any other newish consumer card.
Awesome video. Just FYI, your 3070 was artefacting at 13:17… think you may have pushed the OC too high. EDIT: no it wasn't!
That made me panic. It had me thinking "It's not my 970's time! It's too early to die!"
As someone who has actually played Metro, that is the effect of being in a radioactive area. Unless I'm just missing something.
@@jmporkbob ah then disregard....It really looked like it!
There are a lot of countries in the world where creators are not as wealthy as in the West, and they need as many CUDA cores as they can get for reasonable money. It is highly improbable that I could earn enough extra money with an A6000 to make it the obvious choice compared to the 3080 Ti. Despite that, the power consumption is bothering me, because pushing such a card to its limits for long periods of time is not what I expected to be doing on my current setup with a 750 W power supply.
Perhaps the 6800XT is the way to go then?
@CaptainMcShotgun For me the price of 3090 is unacceptable
@@xlinnaeus Fine, but no CUDA cores. I was a fan of Radeon once, but let's face the truth: these cards are for gaming or mining.
This review is definitely the best hot take on the 3080 Ti!
Got a 3090 hybrid. Nice for keeping some heat out of my case, but 4K gaming is now a winter sport.
This video is on point. I had a 3070, but gaming on my 1440p ultrawide it really struggled with some heavy titles like Cyberpunk. So I bought a 3090 FE, not because it was the card I wanted, but because it was the only card I could get in December without paying scalper prices. The card gets 60 to 70 FPS in Cyberpunk with DLSS Quality and mostly top settings. Where I have a problem with the 3090 is the noise. Playing Control with an undervolt, the fans hit 2400 rpm and HWiNFO is reporting a 106°C peak junction temp while the GPU is at 61°C. This is simply not good enough for a £1350 card. I know there is an issue with the thermal pads on the early models, so I'm going to have to sort it. But you would think they could have just spent a few dollars extra on better pads in the first place, considering the massive margin they must have on this card. Anyway, rant over, keep the content coming 🙂.
About NVIDIA and memory: they slow-drip capacity. If the 3080 Ti were 24 GB, they'd tie their hands for the next round, since anything below that would be a downgrade. They give you just enough VRAM (which in a sense is technology they can't control) and push their proprietary stuff.
FSR is going to start biting Nvidia in the ass as it gets updated over time. Hopefully studios will put it in most new games (hopefully some old games too) as it's so easy to drop into the rendering pipeline and tweak.
Somebody already put it in GTA 5.
@@VoldoronGaming share link pls I wanna see 🤤
I guess Nvidia will enjoy this situation for the time being: as an Nvidia RTX customer you get access to both DLSS and FSR.
Good vid, keep up the good work!
During winter I use my GPU as a room heater.
Good video, but I think you miss the point of the undervolting argument.
I was able to drop the total package power of my 3090 down to 320 watts while overclocking the core another 85 MHz (and yes, the core was actually boosting higher after the undervolt).
Nvidia just pushed the total package power way higher than they should have, so the cards would boost stupidly and perform at the highest clocks for reviews.
Also, I have been mining on my 3090 for so long that it paid itself off, something that would take way longer on the 6900 XT because of its poor hashrate.
Although I do agree the 3080 Ti is a joke while the 3090 exists on the market.
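On the power side of that argument, a board power cap is the scriptable cousin of the manual undervolt described above (a true undervolt needs a voltage/frequency-curve tool such as Afterburner). A minimal sketch, assuming an NVIDIA driver with nvidia-smi on the PATH and admin rights; the ~320 W target simply mirrors the figure mentioned above:

```python
# Minimal sketch: cap and read back GPU board power via nvidia-smi.
# This is a power limit, not a true V/F-curve undervolt like the one above.
import subprocess

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Set the board power limit in watts (needs admin/root privileges)."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

def read_power_draw(gpu_index: int = 0) -> float:
    """Return the current board power draw in watts."""
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        check=True, capture_output=True, text=True,
    )
    return float(out.stdout.strip())

if __name__ == "__main__":
    set_power_limit(320)   # roughly the package power target mentioned above
    print(read_power_draw(), "W")
```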
Can't believe you're justifying the 3090.
@@morpheus_9 Well, I snagged one at MSRP, and I have already broken even on the card with mining. I don't know how you can't justify a free 3090.
Great vid, bro.