The RX 9070 being 16GB will mean it's the new RX 580. 16GB of VRAM at ~$400, when Nvidia will most likely be offering 8GB with the 5060 at ~$400, is really good.
Until you notice that the Nvidia card delivers more performance in both raw raster and AI than the AMD GPU, and that neural rendering uses less VRAM, in which case the lower VRAM on the Nvidia card won't be an issue.
@chrisking6695 Lol, you've been fully hypnotised by the Nvidia marketing. The lowering of the VRAM was like 400MB in that video they showed. Yes, that was like a 70% reduction, but that's in a benchmark built around the rendering reducing the VRAM. In real games that isn't gonna make up the difference of 16GB vs 8GB. Also, the 5060 won't be better than the RX 9070; it will be on par at best. MFG is also just bad: 75% of your fps will be "generated", and even if the FPS number looks big when you have an fps counter enabled, the latency will still be based on the "real" frames rendered. Thus the game will still feel bad if you enable it at 30fps and MFG shows 150fps.
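The "75% generated" point in that comment pencils out with some simple arithmetic. Here's a rough sketch (the `mfg_stats` helper is made up for this illustration; the numbers are illustrative, not measured):

```python
# Back-of-the-envelope math for 4x multi-frame generation (MFG):
# for every rendered ("real") frame, 3 more are AI-generated.
# The displayed FPS quadruples, but input latency still tracks the
# real frame rate, which is why high counter numbers can feel sluggish.

def mfg_stats(real_fps: float, factor: int = 4):
    """Return (displayed_fps, generated_share, ms_per_real_frame)."""
    displayed = real_fps * factor
    generated_share = (factor - 1) / factor   # fraction of frames that are AI-generated
    latency_ms = 1000.0 / real_fps            # latency still bound by real frames
    return displayed, generated_share, latency_ms

displayed, share, latency = mfg_stats(30, factor=4)
print(f"{displayed:.0f} fps shown, {share:.0%} generated, ~{latency:.0f} ms per real frame")
# 30 real fps -> 120 fps on the counter, 75% generated, but still ~33 ms
# between real frames, so it can feel like 30 fps.
```

With a 4x factor, 30 real fps shows 120 on the counter (the comment's "150" would need a 5x factor), yet the time between real frames stays ~33 ms.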
@@chrisking6695 A good friend of mine has a 4070 Super 12GB, and in Indiana Jones and The Great Circle, frame generation (FG) uses more VRAM than running the game without FG. This results in lower FPS when FG is enabled, as the GPU is constrained by its 12GB VRAM. Jensen mentioned in their demo video that out of 33 million pixels, only about 2 million were actually calculated; the rest were AI-generated garbage. Well, he didn't use the word "garbage." I suspect that the 5070 won't be powerful enough to fully utilize the new technology, and expect that the 5090 will require new technologies like DLSS 4 frame generation to play upcoming AAA games at 4K at a reasonable framerate.
Here's the kicker though, and this is a rumor: I think AMD caught wind of the pricing for the 5000 cards from Nvidia before their keynote and realized their new cards weren't as competitively priced. So they opted not to show them. Meanwhile, they blabbered on and on about AI. So exciting!
@@xyr3s ain't nobody gonna buy it at 500 if they can find the 5070 at 550, which will probably perform the same at raster and be better at everything else besides vram. although even then they got gddr7 and the neural thing, which i got no idea if it makes a difference or not practically, but it will still convince most people to not care as much about the vram skimping nvidia does. amd has tried this tactic before and it has always failed, i don't see why it's gonna be any different now
@metalface_villain yea but we don't know how good or bad fsr4 is. We don't know if they got other stuff going on. I didn't know about multi frame gen or transformers until ces. I didn't see any of those things in the leaks. And they say the 9070 is on par with the 7900 gre which is on par with the 4070 ti i think? That's decent performance for 500 or at the least 400. I don't think it's going below 400 tho. 9070xt for 500 would be massive i think. Probably going to be like 600 or 650 maybe.
The AMD presentation was incredibly poor from a plain old professional communications perspective. You'd think a multi-billion dollar company would do much better. They can ask me for help if they want!
I think Jack had some anxiety on that stage, so I'm trying to refrain from judging too harshly. They had some good content, it just wasn't delivered well. They obviously forgot they were supposed to announce new GPUs when the team put the presentation together, though.
@@CareBear-Killer Fair point about anxiety, but I'd have done the whole thing differently and played to the strengths of people, rather than mis-casting them. Even from a pure content perspective it was much less impressive than it could have been.
AMD should be hammering the press with these cards, if they can offer new products at a fair price. Here in Australia the RTX 5090 is going to be over $4000 (yes, that's a 4), so any viable options are desperately needed.
Yep, that's why I'm so interested in the 9070 series AMD cards. They are likely to be cheaper than the 5070s, so as long as they're close in performance, especially in native rendering, I'll definitely go that over Nvidia.
Paul! Can you ask them why GPUs have moved to being longer and longer? Why not introduce a slightly thicker/wider card that is less than 300mm and just two large fans?
Mobo and case design would be the answer. They don't want to block another slot on the mobo, and depending on the case, the width of the card may cause issues. Cases have been getting wider lately in response to cards, however; this year's CES has some regular cases that actually have a bit of dual-chamber room in back so the cards aren't pressed up against the glass in the fish-tank designs.
That Hellhound card looks so damn clean, that would be a nice aesthetic transition from my current 5700XT THICC II - c'mooooon nice test results and an even nicer price tag... And TWO SLOT IS BACK BABY, WOOOOO!!!
He's not, he's just back in Australia re-running benchmarks for their 5000 and 9000 card data, since HWU is only a 3-man team. Linus has hundreds of people, and Steve from GN has like 7 people to do stuff.
Omg! This man helped me build my first pc when he made videos for some other company or newegg or something back in 2017 or so with an am4 and a 1080! Glad I found him again for my upcoming build!😮😊
Currently running a 5700XT, and I will be upgrading this cycle for one reason - two-slot cards. I have been extremely tempted to get one of the 6xxx or 7xxx series cards at the sale prices because they're pretty well priced, but I can't deal with the massive size of these things and, by extension, how hot they run. Good to see both AMD and Nvidia return to reasonably sized cards.
I've had nothing but good luck with PowerColor over the years. My first was an ATI X800GT way back in the day. Glad to see them still in the market. I bought and sold a Red Devil model this past year and it was a really nice card... I liked the design and look even better than my personal MSI gaming X trio.
If the pricing they had intended to set for the 9070 and 9070 XT was competitive with Nvidia, it seems like there would have been no reason to cancel the announcement. Most people thought the Nvidia cards would be a lot more expensive, meaning AMD had probably intended to price the 9070 XT higher than $550. There were some rumors it would cost $600 or $650, and this seems to suggest those leaks/rumors were quite likely true. I guess they are not happy with pricing the 9070 XT at $500 or lower, so I guess so much for people expecting the 9070 XT to cost $450 or less.
They're caught between a rock and a hard place, with Intel's offering/pricing and Nvidia's. About the only thing most people remember from Nvidia's keynote is "5070 = 4090", and seeing that AMD changed its naming to make the positioning clear, they now have a pricing problem. Mid-range buyers will gladly pay a bit more and go for a 5070, while entry-level buyers have the B580 and perhaps(?) the B770. So now AMD will be forced, IMO, to go for that $400 spot, and they're most probably not happy about it.
@@duikmans Intel cards are sooooo bad though, no entry-level gamer should be buying those. You'd be better off buying older used AMD/Nvidia cards, as sad as that is.
@@Giliver I don't think you've been watching the reviews and benchmarks of the Battlemage GPUs over the past month. They aren't "bad" by any stretch of the imagination, unless you're an AMD/Nvidia shill.
@@BrianWelch-vc7xy I still wouldn't buy one. I'd rather buy used Nvidia or AMD. It's like the Qualcomm laptop chips emulating Windows. You never want to be an early adopter on that kind of thing.
@@BrianWelch-vc7xy They are really bad with older CPUs like the 3600X, 3800X, and to some degree Zen 3. Those are all still very capable CPUs, especially in single-player games, and then the whole budget-option B580 becomes obsolete. No one will pair that card with a 7800X3D or 9800X3D.
To be fair if AMD were to "launch" first and release pricing and performance data, they'd get destroyed because they always aim way too high on pricing at launch. I have yet to see them truly understand that performance alone doesn't govern price. We live in the real world where demand is a thing and honestly the primary factor (when you're talking about market share). If realistically only 1 in 10 buyers is interested in your product to begin with, you have to price accordingly, and I've never seen them do that at launch. A year or two after when the accountants remind them they need to show some kind of profit from the GPU side of the store? Absolutely. At launch? It's just never happened. They always basically price to be just a bit below Nvidia and hope that's enough, and it never actually is.
So AMD has the 8060 integrated GPU and does not show it off... I swear we saw Linus playing Black Myth on it at 60 fps high on a tablet. That's more mind-blowing than anything anyone has presented in all of CES.
@SkylineLofe yes, but their marketing team is composed of walking vegetable brains, apparently. Imagine they had just presented numbers that those of us with a normally functioning brain can interpret, and not just mumbo jumbo AI this, AI that. We'd like their AI a lot more if we just saw raw numbers. PC enthusiasts actually did pretty well in maths, usually at least.
A Red Devil 9070 XT with 3x 8-pin even though it is not a high-end card? Seems a bit overkill to me and is a disadvantage for looks and cabling; I hope it doesn't consume 300W+. At least for me, that's definitely a consideration when choosing a specific model.
Red Devil is and always has been overkill. I'm sure 95% of models will be 2x 8-pin, and we may even see some single 8-pin models. That third one is purely for LN2 overclocking! But most likely even one 8-pin would be enough, at least on the 9070 non-XT versions.
Thanks for this. I'm trying to pick my 1st GPU & build; this helps me out a lot. Just gotta wait for the price announcement & hope I don't get stung too hard.
I love that the new cards are mostly 2-slot size (except for the big boi Red Devil flagship card). I hope more companies do that on their versions. Can't wait to see more info on these and someday get the performance info, lol.
All AMD has to do is include something like "real performance, real frames" into the marketing of these GPUs and then deliver on that and they will mop the floor with Nvidia
@@metalface_villain fk if I know, I'm no engineer. I'm just saying, if they can deliver on "more raw rasterization power" then they'll sell more units. If they decide to head down the AI route along with Nvidia, then nobody wins and it's status quo 🤷‍♂️
@@OldBuford they had slightly more raster and slightly lower prices before and no one bought their gpus. 3 fake frames is a little crazy, but ai in general is a good thing for gpus and it's proven to work, tbh it's one of the few places we need ai for
They have to make them look like super cars if they are going to charge us super car luxury prices. They look that way to convince you to overpay for it. It's the opposite of what you are thinking. $2 in lights and colored plastic mark it up 200%. The boxes and packaging costs more than the cosmetics you speak of.
@@sujimayne They could sell their consumer focused, top of the line performance for $300 and still make a good profit. The chip and card manufacturers operate on a strategy of artificial scarcity and douche marketing.
AMD: "we didn't have enough time in the keynote to talk about GPUs". Also AMD: proceeds yapping about AI for about 30 minutes straight with a guy on stage with as much charisma as a piece of wood. That keynote was absolutely horrendous. And WTF is that naming scheme?!?! 385 AI MAX + PRO SKIBIDI TOILET SIGMA OHIO GYATT EDITION Feels like AMD is just run by shareholders at this point.
My latest GPU was a Red Devil 6700 XT. I really liked it until I got a small bump up. (Thanks, Paul 👍) Those new flow-through designs look terrific 👌. So the Reaper replaced the Fighter?
Caveat that I run a 6800 XT, but I just want to say that if you think all real-time rendering isn't wholly done with "smoke and mirrors", then boy do I have news for you. AI is just a new shade of smoke.
As an AMD user, Nvidia has the better product and it's why they are the market leader. You can't call Nvidia smoke and mirrors when AMD utilizes FSR and AFMF 2 to achieve the same thing.
@@Hitthegasmedia If you are talking about the 4090 or the new 5090 then sure... But let's be honest - many, many people don't buy those. They got the 3070, 4060 or 4070 instead and then think that's the best there is on offer for the money. Outside of maybe the 4070 Ti Super, the reviewers said don't buy those. People bought them anyway, because they want the 720p upscaled RT experience or they're suddenly all streamers. Then they turn off RT anyway, because there are only a few games it even makes sense to use it with. Instead of getting something like a 6700/7700 or 6800XT/7800XT, they are now probably thinking about upgrading again, only to see the same memory configuration on the next gen. Will they just go up a tier, from 4060 to 5070, or 4070 to 5070 Ti? Who knows, but most of them will probably have this exact upgrade path, just like Nvidia wanted. I'm not sure that's really having a better product, or just dumb customers.
Omg dual slot cards! You're telling me I might actually be able to fit a 9070 XT in my NZXT H210 case? Good thing I waited to upgrade! I hope there will be more dual slot cards!
I like how GPU releases are just Next-Next Gen releases these days. These cards will be nice in a few years. Why? Because that's when they'll be sold for MSRP, so that's when people should start buying them.
lmao you might be right, what do you think would be a good price for a 9070xt that has 4070ti level of performance? i really want a good gpu in rasterization for 350-400$
@@gelul12 no you don't. I've been seeing 7000 series cards for sale for the last couple of months for close to, at, or sometimes lower than MSRP. It's just a matter of not being dumb enough to pay 4x what something is worth.
Sapphire does offer a two-fan model, but I don't know if it's the 9070 or a lower model. My guess is that the Pulse 9070XT will have three fans and the 9070 will have two.
Really disappointed with the design of the PowerColor cards; their 6950 XT was beautiful and their 7900 XTX was nice-looking, but this doesn't have any of that charm.
The 9070 XT was probably priced at $499 or even $549, but then they saw NVIDIA's pricing. The 9070 XT should be $399, but it will likely be $449 at best.
That's reasonable actually; AMD had a slide where they aligned the 9070/XT with the Nvidia products. Now they can adjust prices to appeal to some not-too-hardcore green fans.
Yes, PowerColor. The only partner to do long-and-thin for normal office PC cases. I am building a sleeper build, and that limits width to the standard PCIe width. Would love to know how long they are, though, as if over 350mm some inner case chopping may be required. But you can't chop outside the case.
It's a bit disappointing to not get more details from AMD, but I still appreciate these overviews of partner cards. Hope we do get good value, as you said. Thanks, Paul!
At least we got VRAM specs. It's basically a 7900 XT with better RT and a more efficient die. The only thing is that the 7900 XT had 20GB of VRAM and this only has 16.
Please be Ti Super - 4080 performance for $499. AMD, you need this. The chart could be sandbagging to make Nvidia think that there isn't as much competition in the midrange so they price high. Of course, they were lower than most people expected for the midrange.
@@darkkingastos4369 Oh yes my 16gb 40 series GPU has me begging for something better from the likes of amd. This is for the market.. for gamers.. not for me.
I live in the North, and it frequently get very cold outside. My desk sits right next to a window, with my PC basically being right in front of it. As long as it's not precipitating, is it safe to open my window and let the cold air blow through my PC? Do you think this will help lower temps?
Scalpers have been making pricing not matter since 2018 with the RTX 2080. I'm not even going to pretend I could buy a 5070 for $549, setting my expectations for at least $849, especially when online retailers get into it now.
If the 9070s cost $400-500 tops with better wattage than my 7900 XT while getting 4080-class ray tracing, the cards will sell. I'm very happy with my 7900 XT from November 2023 for $650. Upgraded from the GOAT GTX 1080 Ti from 2017 for $529.99.
I'm keeping an eye on these as I want a solid upgrade with 16GB of VRAM for hopefully no more than $600. Intel and AMD are pretty much my only hope left.
Wccftech are claiming a bench run in Call of Duty: Black Ops 6 at 99 FPS at 4K Extreme settings without FSR, and that is apparently with the nerfed driver, so it's looking like a bit of a midrange monster. They are likely just waiting to see where they are going to price them...
@Vartazian360 true but 99fps is 7900xtx territory and that's supposedly with the nerfed driver. If that is indeed the performance (remains to be seen) and they have improved RT performance they have a fairly powerful GPU on their hands. It's all up to the pricing though. If it does go up against the 5070 in that regard that would be very interesting!
Gr8 video commentary, Paul. I prefer the simpler 9070 XT Reaper. Not too fussed about RGB on cards, as my fans have it. 2x 8-pin is nice to see. I've never owned a PowerColor card, though I prefer a 3-year warranty over 2 years tbh. Nice cards for sure. Interesting times ahead, roll on the 1440p benchmarks.
They need to be less than $400 to even be considered, honestly… I NEED AMD to actually compete, and Intel to compete as well, so we can bring true competition back…
No need. The 5070 is $550… if this is $499 then it is OK. Both GPUs are about 4070 Ti level… so speed is about the same. Nvidia has a software advantage. AMD has more VRAM. So -$50 is enough. If AMD is really aggressive, then $450 is possible. But I don't believe AMD will go that low!
AMD doesn't have any real driver issues anymore. That's a belief that's from things that happened over a decade ago and people just refuse to move past it.
I've had an AMD card for the last 8 years… no driver issues! Had an Nvidia 1070 for 6 years before that… no driver issues there either. AMD does not like to be installed over Nvidia drivers for some reason, so better clean those up very well before going to AMD. But that is the only issue I am familiar with. Also, testers like Gamers Nexus and Hardware Unboxed haven't had issues with AMD drivers… but they do a clean Windows install for each test, so no wonder they don't have issues.
I'll be delighted if the RX 9070 XT has 16GB VRAM and can run off a 650W PSU. I'm currently happy with my 3-year-old 6700 XT, but it's starting to make a hell of a noise when pushed to its limits.
AMD, "Rule #1 about our new GPUs.... We don't talk about our new GPUs!"
AMD never misses an opportunity to miss an opportunity.
Fight Club vibes lmao XD
This time is different, this generation Nvidia will completely destroy AMD share in the GPU market.
@@GM-xk1nw This generation Nvidia is a scam lol.. and i'm a Nvidia user.. but the 50 series are a joke.
@@GM-xk1nw Bullshit !
Powercolor, Sapphire and XFX at CES: "So, uh, hey we uh, we're... here?"
Powercolor and Sapphire are supposedly great companies to work with. I hope AMD comes through with a nice chip after leaving them twiddling their thumbs at CES
Only ASRock was smart enough! Just because they are the BEST in making motherboards & GPUs!
where is Diamond Brand and Club3d?
@@gertjanvandermeij4265 ASrock is still the king for budget AMD gpus
@@silentlamb2077 so it's not worth it to get a Sapphire?
Steve being banned from CES because he kept walking around upside down. Can't be breaking physics
You are probably mistaking him for Beve Sturke.
You just can't Climb to the top of Booths and start shouting "WITNESS ME! 16GB SHOULD BE A MINIMUM FOR A MODERN GRAPHICS CARD"
I was literally about to ask why he's banned. Surely Tim (and Balan if he went) also walks upside down too?
XD
xD
Wait, what?
This gen's cards are surprisingly slim compared to last gen, both for team green and team red.
Indeed. Just moved my build to a bigger case last month, in anticipation that my next graphics card will be a 2.5 or 3 slot card ... so of course they start going back to 2 slot cards. ;)
@@deeber3960 third-party 5090s are absolutely going to be 2.5+ slots. AMD just isn't really trying to compete at the high end this time around.
Certain models. There were definitely a lot of chonker RTX 5080s and 5090s at CES. Like huge chonkers.
@@coreyisapushover I know there will be, but a good number of them haven't been, is what I mean. And it's really surprising. I was fully expecting to see nothing BUT chonkers.
SFF is becoming quite popular. Can't wait to see the temps on that 5090 FE compared to some of the partner models.
I'm starting to believe Steve is actually banned from CES lol.
For real cuz WTF 😂
He was touching those GPUs inappropriately
It is boring. There are lots of other people that do CES.
He is lol
He says he hates it. It’s just boring shit like cell phone releases and phone cases.
Not enough time? BS. They ran on and on about AI in their presentation, just like Nvidia did in its keynote, and could have easily talked about what people came there to hear about instead. Paul has it right: AMD just wanted to wait for Jensen's keynote to release any specifics.
Positioning and naming used to be reliable ways to predict performance and price, but not anymore. People, stop paying attention to naming/numbering schemes. That's what they are -- schemes! And stop basing your buying decision on the color of the box. Go by specs, performance and price. If the card that meets your needs and price range is from AMD and is called the X-Factor 2000 LE, buy it. If it comes from Nvidia and is called the 5070, but you only ever needed an xx60, buy it. If your ego makes you buy an xx80, buy it. If you can only see green, buy green.
Summary: buying a GPU based on its name or the color of its box is foolhardy. But do what you must do. The 5090 is just $500 + 1500 AI units, right?
You're the man for this
AI is why Nvidia is up 2100% last 5 years. It would be a fire-able offense for AMD not to focus on AI, gaming GPUs are a small part of their revenue.
@@jo_magpie Yes, in some venues. But this was a presentation at the Consumer Electronics Show. CONSUMER.
why are you saying that someone would get a 5080 just because of ego?
@rangersmith4652 Nvidia has way more consumers than just gamers, dude.
Companies throwing their money at nvidia are customers too, you know?
I wonder what 16G means... Maybe it's talking about gravitational force o: 6:00
The GPU chip itself from AMD must weigh 16 grams. Or maybe the video memory chips weigh 16 grams. Something like this
I think it means it’s rated for up to 16G.
16x framegen probaby 🤔
16 times the performance:price ratio of all competition
Might mean it's gonna be 16 grand.
I spent some time last year slowly ordering the parts for a new gaming PC, then finishing it off with a Micro Center bundle in August. I decided among the things to keep from the old PC was my 6700 XT. I have been waiting ever since then for the next gen of graphics cards to come out, and this system also has a 7800X3D in it, so it's basically an all-AMD build. I've been wanting to keep it like that and have been pumped for the 9070 series, looking at the leaks and everything for months now. I was super disappointed by the lack of mention at the press conference, but this video from Paul has kept me hyped up a bit. I saw the tweets from PowerColor about the Red Devil with the Hellstone before... and man, I think that's the one I want to get. I just hope it isn't absurdly more expensive than the Reaper. Love the content, Paul! Thanks!
I'm keeping my eye on these too. I used AMD (ATi before) on my home machines from 1995 through 2017 before the 1080 was just too good to ignore. Since then I've been looking for an excuse to go back, but the 6000 series had supply issues, and when Nvidia made the 40 series pricing a joke AMD responded by... making the launch prices of their 7000 series a joke, and clowning themselves by implying they were competing at the 90 series level. A used card won last gen for me, but let's hope AMD comes through this time.
I built a 7800X3D + 6700 XT rig last year, and it absolutely screams. (Still on 1920x1080.) I honestly don't need anything more right now - it's going to take a lot to make me upgrade from the 6700 XT, it's an incredible card!
@@BrokenKanuck you're going to run out of vram in a year. I just ditched my 6700xt due to micro-stuttering
@@BrokenKanuck upgraded the CPU from a 2700X to a 5700X3D with a 6700 XT at 1080p. I'm going to try and make it last another three years, then do a new build. I have no problem with lower settings.
@@elitebigs2010 u can go 1440p no prob with this setup depending on your game titles, i have the same build, horizon zero dawn at 1440p all maxed settings pure raster gave me 65+ fps, ~40 fps 1% lows, all day, every day, in a mATX case
Thanks Paul, and Joe. Please send my thanks to PowerColor for making the 9070 XT Reaper, specifically because (unlike NV's garbage proprietary "non-SFF-ready" spec) the 9070 XT Reaper is a PROPER 2-SLOT, STANDARD-HEIGHT video card.
Keeping a design within those specifications should be applauded and encouraged.
Love how clean the shroud/backplate designs look, too.
Why are you so bent out of shape about the slot size and height? This was not a problem with the last gen and they sold boatloads.
AMD needs to invest in their GPUs and stop using CUs to do ray tracing.
Also, did you see Nvidia did release their SFF-ready list for the new GPUs? I'm not sure what you're all up in arms about when they have an entire page on their website listing which cards are SFF-ready, with their exact dimensions/specs 🤣
@@zappulla4092 Many people, me included, have SFF and MFF cases that don't really work with the large overhangs GPUs have these days. We want to use cases that rely on GPUs actually following the standard for PCIe card dimensions, especially the 111mm height spec, since once you add cables into the mix they barely fit in 150mm-wide cases (common for older MFF cases). It absolutely was a problem last generation, as pretty much only reference cards fit, with maybe a couple of exceptions (some of the Sapphire cards come to mind). Hell, even my GTX 1080 from MSI doesn't fit in my case in a normal orientation due to such overhangs, which is frankly absurd and not something I ever want to experience again when I upgrade.
In terms of slot height, I think it's becoming less of an issue as motherboards move more PCIe lanes to M.2 use, leaving more open slots under the primary PCIe slot. However, some people still need certain PCIe expansion cards alongside their GPU, and that's before taking into account how much strain 3-4 slot GPUs put on the slots and mounting hardware. Cards should stay within the spec size where possible, especially since recent developments in flow-through GPU coolers make that possible even for these high-wattage cards.
@@zappulla4092 Because different people care about different things, surprising as it may be.
That first card was definitely giving me XFX 200 series vibes with its clean black shroud
4:26 It's likely 2.1 as RDNA 3 had DP 2.1 connectors. At this point 1.4 is ditched in favor of 2.1
Hahaha, it's very funny that the YouTube community is spreading Steve's CES ban meme like wildfire, but Steve is actually slaving away in his dungeon doing massive tests for the 5090/5080/5070/5070 Ti cards he has on hand, benchmarking around the clock 😅 He probably also has both the RX 9070 XT and non-XT in house, queued for benchmarking. Steve is the actual troll, making Tim touch all the video cards at CES when he already has the same shiny new GPUs back in Australia...
Steve is an influencer pretending to be a reviewer
@@kuyache2 but is the ban actually legit or just a meme? A bit out of the loop here.
The RX 9070 being 16GB will mean it's the new RX 580. 16GB of VRAM at ~$400, when Nvidia will (most likely) be offering 8GB with the 5060 at ~$400, is really good.
Until you notice that the Nvidia card delivers more raw and AI performance than the AMD GPU, and that neural rendering uses less VRAM, in which case the lower VRAM on the Nvidia card won't be an issue.
@chrisking6695 Lol, you've been fully hypnotized by the Nvidia marketing. The VRAM reduction was like 400MB in that video they showed. Yes, that was something like a 70% reduction, but that's in a benchmark built around the rendering technique reducing VRAM; in real games that isn't going to make up the difference between 16GB and 8GB. Also, the 5060 won't be better than the RX 9070; it will be on par at best. MFG is also just bad: 75% of your FPS will be "generated", and even if the FPS number looks big with a counter enabled, the latency will still be based on the "real" frames rendered. So the game will still feel bad if you enable it at 30fps and MFG shows 150fps.
@@chrisking6695 8GB of VRAM is certainly an issue... for example, if you want to play games on that crap
@@sjoerdtheemperor5229 Frame gen is a neat tech, but yeah, they're not actually useful frames. Just lipstick on a pig.
@@chrisking6695 A good friend of mine has a 4070 Super 12GB, and in Indiana Jones and the Great Circle, frame generation (FG) uses more VRAM than running the game without it. This results in lower FPS when FG is enabled, because the GPU is constrained by its 12GB of VRAM.
Jensen mentioned in their demo video that out of 33 million pixels, only about 2 million were actually calculated; the rest were AI-generated garbage. Well, he didn't use the word "garbage".
I suspect that the 5070 won’t be powerful enough to fully utilize the new technology and expect that the 5090 will require new technologies like DLSS 4 frame generation to play upcoming AAA games at 4K at a reasonable framerate.
If the 5070 is $550, I can see AMD selling the 9070 XT for $500. If they want to actually move units and take market share, they need it to be $450 or less.
I think at 500 they would move units and gain market share, at 450 or less would be a miracle lol.
Here's the kicker, though it's a rumor: I think AMD caught wind of Nvidia's 5000-series pricing before their keynote and realized their new cards weren't as competitively priced, so they opted not to show it. Meanwhile, they blabbered on and on about AI. So exciting!
when you game at 4k with upscaling 9070xt on par with 3080 and 5070 on par with 4090
@@xyr3s Ain't nobody gonna buy it at $500 if they can find the 5070 at $550, which will probably perform the same at raster and be better at everything else besides VRAM. Even then they've got GDDR7 and the neural thing (no idea if it makes a practical difference or not), which will still convince most people not to care as much about Nvidia's VRAM skimping. AMD has tried this tactic before and it has always failed; I don't see why it's gonna be any different now.
@metalface_villain Yeah, but we don't know how good or bad FSR 4 is. We don't know if they've got other stuff going on. I didn't know about multi frame gen or transformers until CES; I didn't see any of those things in the leaks. And they say the 9070 is on par with the 7900 GRE, which is on par with the 4070 Ti, I think? That's decent performance for $500, or at the very least $400. I don't think it's going below $400, though. A 9070 XT for $500 would be massive, I think. It's probably going to be like $600, or maybe $650.
It is feeling eerily like the Vega and 10 series competitive launch between them.
The AMD presentation was incredibly poor from a plain old professional communications perspective. You'd think a multi-billion dollar company would do much better. They can ask me for help if they want!
The same multi billion dollar company that can't figure out how to name a GPU consistently across generations?
Goes to show you how little AMD cares about the GPU space
I think Jack had some anxiety on that stage, so I'm trying to refrain from judging too harshly. They had some good content; it just wasn't delivered well. They obviously forgot they were supposed to announce new GPUs when the team put the presentation together, though.
@@CareBear-Killer Fair point about anxiety, but I'd have done the whole thing differently and played to the strengths of people, rather than mis-casting them. Even from a pure content perspective it was much less impressive than it could have been.
@@Desturel They wanted to get on the POWER LEVELS OVER 9000 train
I may or may not have upvoted this video, can't disclose until I get my NDA together.
AMD should be hammering the press with these cards, if they can offer new products at a fair price. Here in Australia the RTX 5090 is going to be over $4000 (yes, that's a 4), so any viable options are desperately needed.
Yep, that's why I'm so interested in AMD's 9070 cards. They'll likely be cheaper than the 5070s, so as long as they're close in performance, especially good native performance, I'll definitely go with them over Nvidia.
damn you guys ain't even getting the rtx version :P
Are you suggesting that besides the 5090 there are no other options? Because that's a dumb thing to say
@@iamspencerx no he means just in general NVIDIA cards here are super expensive compared to amd
Why does everything in Australia cost so much
Would love to see sapphire nitro+ series
I really wish Sapphire got more attention.
Yep. The only brand I touch for AMD GPU's.
My 6950xt is a Sapphire model probably the best card I’ve had.
Agreed! Either Sapphire or XFX. I loved my nitro 5700xt. Couldn't find a good price for the 6950xt variant so I went with XFX.
Nitro life!
I like the 2-slot clean design of the 9070XT Reaper, hoping they price it correctly.
id like them as thick as possible. no noise plz
ooh that 9070 xt reaper is looking good! very much appreciate 2 slot designs
Fighter but cleaner?
I feel annoyed that the Hellhound has an LED switch but the Red Devil apparently doesn't.
My last Red Devil card was ARGB, so it was software controlled.
Paul! Can you ask them why GPUs have moved to being longer and longer? Why not introduce a slightly thicker/wider card that is less than 300mm and just two large fans?
Mobo and case design would be the answer. They don't want to block another slot on the mobo, and depending on the case, the width of the card may cause issues. Cases have been getting wider lately in response to cards, though; this year's CES had some regular cases with a bit of dual-chamber room in back so the cards aren't pressed up against the glass in the fish-tank designs.
That Hellhound card looks so damn clean, that would be a nice aesthetic transition from my current 5700XT THICC II - c'mooooon nice test results and an even nicer price tag...
And TWO SLOT IS BACK BABY, WOOOOO!!!
i heard a lot of bad things about the thicc series
I still like the 8-pins more than the Nvidia connectors!
2:05 I love the "Activate Windows" watermark in the corner
Why is Steve from Hardware Unboxed banned from CES?
He's not; he's just back in Australia re-running benchmarks for their 5000 and 9000 series data, since HWU is only a 3-man team.
Linus has hundreds of people and Steve from GN has like 7 people to do stuff
Omg! This man helped me build my first pc when he made videos for some other company or newegg or something back in 2017 or so with an am4 and a 1080! Glad I found him again for my upcoming build!😮😊
16GB on both versions, if you didn't catch the first part. 7:12 shows the XT, and he says a few seconds later that it's 16GB.
Currently running a 5700 XT, and I will be upgrading this cycle for one reason: two-slot cards. I have been extremely tempted to get one of the 6xxx or 7xxx series cards at the sale prices, because they're pretty well priced, but I can't deal with the massive size of these things and, by extension, how hot they run. Good to see both AMD and Nvidia return to reasonably sized cards.
What is the length of the Red Devil 9070XT?
Edit: wrong card name, they are too similar now
WHAT.
@@soccerboss7924 fixed
I've had nothing but good luck with PowerColor over the years. My first was an ATI X800GT way back in the day. Glad to see them still in the market. I bought and sold a Red Devil model this past year and it was a really nice card... I liked the design and look even better than my personal MSI gaming X trio.
1:10 - WTF???
They simply have nothing to show, or they're just waiting to see what prices Nvidia announces.
AMD knows the cards are utter jank this time around.
i love the fact that gpus are coming back to 2 slots form factor, my Lian Li x Dan A3 will be happy.
If the pricing they had intended for the 9070 and 9070 XT was competitive with Nvidia, there would have been no reason to cancel the announcement. Most people thought the Nvidia cards would be a lot more expensive, meaning AMD had probably intended to price the 9070 XT higher than $550. There were rumors it would cost $600 or $650, and this suggests those leaks were quite likely true. They're evidently not happy pricing the 9070 XT at $500 or lower, so so much for people expecting it to cost $450 or less.
They're caught between a rock and a hard place, with Intel's offering/pricing on one side and Nvidia's on the other. About the only thing most people remember from Nvidia's keynote is "5070 = 4090", and seeing that AMD changed its naming to make the positioning clear, they now have a pricing problem. Mid-range buyers will gladly pay a bit more and go for a 5070, while entry-level buyers have the B580 and perhaps(?) the B770.
So now AMD will be forced, IMO, to go for that $400 spot, and they're most probably not happy about it.
@@duikmans Intel cards are sooooo bad though; no entry-level gamer should be buying those.
You'd be better off buying older used AMD/Nvidia cards as sad as that is.
@@Giliver I don't think you've been watching the reviews and benchmarks of the Battlemage GPUs over the past month. They aren't "bad" by any stretch of the imagination, unless you're an AMD/Nvidia shill.
@@BrianWelch-vc7xy I still wouldn't buy one. I'd rather buy used Nvidia or AMD. It's like the Qualcomm laptop chips emulating Windows; you never want to be an early adopter on that kind of thing.
@@BrianWelch-vc7xy They are really bad with older CPUs like the 3600X and 3800X, and to some degree Zen 3. Those are all still very capable CPUs, especially in single-player games, and then the whole budget appeal of the B580 falls apart. No one will pair that card with a 7800X3D or 9800X3D.
Very Nice Looking GPU's Well Done Paul & Power Color! Prayers Up & Fingers Crossed for 'Affordable Pricing' in Canada???
To be fair if AMD were to "launch" first and release pricing and performance data, they'd get destroyed because they always aim way too high on pricing at launch. I have yet to see them truly understand that performance alone doesn't govern price. We live in the real world where demand is a thing and honestly the primary factor (when you're talking about market share). If realistically only 1 in 10 buyers is interested in your product to begin with, you have to price accordingly, and I've never seen them do that at launch. A year or two after when the accountants remind them they need to show some kind of profit from the GPU side of the store? Absolutely. At launch? It's just never happened. They always basically price to be just a bit below Nvidia and hope that's enough, and it never actually is.
Would temps go down further by removing the LED piece on the backplate that blocks half of the air passthrough?
So AMD has the 8060 integrated GPU and doesn't show it off...
I swear we saw Linus playing Black Myth on it at 60 fps on high, on a tablet.
That's more mind-blowing than anything anyone has presented in all of CES.
On high settings, mind you. That's seriously impressive.
@@SkylineLofereally?
Oh wow that is pretty good!
@SkylineLofe Yes, but their marketing team is apparently made up of walking vegetable brains.
Imagine they had just presented numbers that those of us with normally functioning brains can interpret, and not just mumbo-jumbo AI this, AI that. We'd like their AI a lot more if we just saw raw numbers. PC enthusiasts actually did pretty well in math, usually at least.
@@Giliver yes
A Red Devil 9070 XT with 3x 8-pin even though it's not a high-end card? Seems a bit overkill to me, and it's a disadvantage for looks and cabling. I hope it doesn't consume 300W+. At least for me, that's definitely a consideration when choosing a specific model.
Red Devil is and always has been overkill. I'm sure that 95% are 3x 8-pin, and we may even see some single 8-pin models.
That third one is purely for LN2 overclocking!
But most likely even one 8-pin would be enough, at least on the 9070 non-XT versions.
Reaper 9070 XT - 260W, Red Devil 9070 XT - 330W+
Red devil is always their overkill card.
Retailers 100% have these. Just got a box from Scan with my new mobo and 9800X3D... the box has RX9070 16G-L/OC on the sticker, so they're reusing boxes!
Thank you. I am very interested in these. Subbed for updates.
Didn't have enough time? That's crazy AMD...
Thanks for this. I'm trying to pick my first GPU & build, and this helps me out a lot. Just gotta wait for the price announcement & hope I don't get stung too hard.
I keep forgetting this, but why did Steve get banned for CES
fr is paul trolling? google only shows the infamous nvidia ban
He is running Nvidia 50 series benchmarks… no time to go to CES!
😂
He didn't; he's just at home benchmarking. He sent Tim and Balan. They don't have huge teams like Linus does.
I love that the new cards are mostly 2-slot size (except for the big boi Red Devil flagship card). I hope more companies do that on their versions. Can't wait to see more info on these and someday get the performance info, lol.
All AMD has to do is work something like "real performance, real frames" into the marketing of these GPUs, then deliver on it, and they will mop the floor with Nvidia.
Said by someone who has never used DLSS.
@bigturkey1 wtf are you talking about? Of course I've used DLSS 🙄
how will they do that if they are using the same technologies themselves to reach decent fps though?
@@metalface_villain fk if i know, im no engineer.
I'm just saying, if they can deliver on "more raw rasterization power", then they'll sell more units.
If they decide to head down the AI route along with Nvidia, then nobody wins and it's status quo 🤷♂
@@OldBuford They had slightly more raster and slightly lower prices before, and no one bought their GPUs. Three fake frames is a little crazy, but AI in general is a good thing for GPUs and it's proven to work; tbh it's one of the few places we actually need AI.
Awesome shots, those cards look super slick, thanks guys!
I hate how much emphasis is placed on how the cards look. The RGB glowie crowd just increases costs for the rest of us.
Yep. Increases the cost... by €5-10.
They have to make them look like super cars if they are going to charge us super car luxury prices. They look that way to convince you to overpay for it. It's the opposite of what you are thinking. $2 in lights and colored plastic mark it up 200%. The boxes and packaging costs more than the cosmetics you speak of.
@@sujimayne They could sell their consumer focused, top of the line performance for $300 and still make a good profit. The chip and card manufacturers operate on a strategy of artificial scarcity and douche marketing.
Prices? Vram? Any word on better drivers for professional work?
AMD: "we didn't have enough time in the keynote to talk about GPUs".
Also AMD: proceeds yapping about AI for about 30 minutes straight with a guy on stage with as much charisma as a piece of wood.
That keynote was absolutely horrendous. And WTF is that naming scheme?!?! 385 AI MAX + PRO SKIBIDI TOILET SIGMA OHIO GYATT EDITION
Feels like AMD is just run by shareholders at this point.
My latest GPU was a Red Devil 6700XT. I really liked it until I got a small bump up. (Thanks, Paul 👍)
Those new flow-through designs look terrific 👌.
So the Reaper replaced the Fighter?
1:58 thank you for not playing the minus-one-dollar game with the pricing
One dollar, don't be ridiculous!!!
It's one cent. ._.
Nice video. It's too bad that there's no 24GB card. Looks like I'll be keeping my MSI Radeon RX 7900 XTX Gaming Trio Classic for a while longer.
The enthusiasm is so overwhelming. AMD makes cards too. Nvidia has a monopoly on people's minds, and I don't play games with smoke and mirrors.
Caveat that I run a 6800 XT, but I just want to say that if you think real-time rendering isn't wholly done with "smoke and mirrors", then boy do I have news for you. AI is just a new shade of smoke.
As an AMD user, Nvidia has the better product and it's why they are the market leader. You can't call Nvidia smoke and mirrors when AMD utilizes FSR and AFMF 2 to achieve the same thing.
Intel too
@@Hitthegasmedia If you are talking about 4090 or the new 5090 then sure...
But let's be honest: many, many people don't buy those. They got the 3070, 4060, or 4070 instead and then think that's the best there is to offer for the money. Outside of maybe the 4070 Ti Super, the reviewers said don't buy those. People bought them anyway, because they want the 720p-upscaled RT experience or they're suddenly all streamers. Then they turn RT off anyway, because there are only a few games it even makes sense to use it in.
Instead of getting something like a 6700/7700 or 6800 XT/7800 XT, they're now probably thinking about upgrading again, only to see the same memory configuration on the next gen. Will they just move up a tier, going from 4060 to 5070, or 4070 to 5070 Ti? Who knows, but most of them will probably take exactly that upgrade path, just like Nvidia wanted.
I'm not sure whether that's really a better product or just dumb customers.
@@Hitthegasmedia Never implied that. Rasterization is king. They all use this stuff (smoke and mirrors) but it's meaningless to most ppl.
Omg dual slot cards! You're telling me I might actually be able to fit a 9070 XT in my NZXT H210 case? Good thing I waited to upgrade! I hope there will be more dual slot cards!
If you're the underdog, why the hell would you want to go first? LOL. Take your time AMD, get the marketing right.
Do you think we can get benchmarks for these cards before they hit the market?
i really hope so
I like how GPU releases are just Next-Next Gen releases these days. These cards will be nice in a few years.
Why? Because that's when they'll be sold for MSRP, so that's when people should start buying them.
And by then there won't be any left, so you have to buy the new gen at artificially inflated prices.
lmao you might be right. What do you think would be a good price for a 9070 XT that has 4070 Ti levels of performance? I really want a GPU with good rasterization for $350-400.
@@gelul12 No you don't. I've been seeing 7000 series cards for sale for the last couple of months for close to, at, or sometimes lower than MSRP. It's just a matter of not being dumb enough to pay 4x what something is worth.
@@nocheckmarkgames for now there is stock. Will be gone soon
@@hyakkimaru1057 the 5070 only has 4070ti perf without dlss 4.0 so 550?
Any chance we'll see a 2-fan RX 9070 (either vanilla or XT) at launch? I can't fit those 3-fan clunkers in my case.
Sapphire does offer a two-fan model, but I don't know if it's the 9070 or a lower model. My guess is that the Pulse 9070XT will have three fans and the 9070 will have two.
Really disappointed with the design of the PowerColor cards; their 6950 XT was beautiful and their 7900 XTX was nice looking, but this doesn't have any of that charm.
It's a cheap solution for a cheap product I'm guessing. However they might be guessing that the card is not worth the time. Who knows...
Because the margin on a 500 dollar card leaves them much less room than a 1000 dollar card
Got my first power color card with my current red dragon 6800xt. Great card, great look. I’ll definitely be looking to them when I upgrade. 😁
I've seen NEITHER Steve at CES...
The Red Devil does look really nice. Do we know if the light on the end is just red, or can it be changed?
The 9070 XT was probably priced at $499 or even $549, but then they saw Nvidia's pricing. The 9070 XT should be $399, but it will likely be $449 at best.
With the risk that a lot of buyers will say "hey, for $100 more I can have a 5070..."
AMD's keynote took place 6+ hours before Nvidia's, though.
that's reasonable actually, AMD had a slide where they aligned the 9070/XT with the NVidia products. now they can adjust prices to appeal to some not-too-hardcore green fans
Why would it need to be 399 when the 5070 is just a 4070ti without dlss 4.0?
@@WayStedYou Because Nvidia has better technology, so AMD has to compete on pricing.
I like the clean backplate on the first 9070xt. Very elegant.
I have a feeling these are going to be power hungry :(
Considering the 7900xt is about 300 watts id expect around that range for the 9070 non xt
Was XFX present? Were/are you able to showcase what they are offering?
Just give me a good card for DCS VR!
SHEEEESH!
Yes, PowerColor! The only partner doing long-and-thin cards for normal office PC cases. I'm building a sleeper build, and that limits the card to the standard PCIe width. Would love to know how long they are though, as if they're over 350mm some inner-case chopping may be required. But you can't chop outside the case.
Why would I pre-order a mystery card product? What’s next, signing up for a dating app where all the profiles are just question marks?
It's better than paying real money for fake frames over at Nvidia lol
I like the simple clean look of the Reaper and its smaller form factor.
It's a bit disappointing to not get more details from AMD, but I still appreciate these overviews of partner cards. Hope we do get good value, as you said. Thanks, Paul!
The "confirmation" of 16 GB on the 9070 was good. The muted banter from off-camera was also amusing.
CES gave us almost no information but pretty graphs and what cards look like. Pretty useless.
At least we got vram specs
It's basically a 7900 XT with better RT and a more efficient die. The only thing is that the 7900 XT had 20GB of VRAM and this only has 16.
PowerColor seems to make the best cards for temps/noise, and I'm looking forward to seeing them in action.
Please be Ti Super to 4080 performance for $499. AMD, you need this. The chart could be sandbagging to make Nvidia think there isn't as much competition in the midrange so they price high. Of course, they came in lower than most people expected for the midrange.
I would be fine with 7900xt level performance as long as the price is right. $399 would make sense to me honestly.
Copium
AMD: best we can do is 4070 performance for $799
@@jimdob6528 Yeah, if it's 4080 Ti Super level for 400 bucks, with better ray tracing.
@@darkkingastos4369 Oh yes my 16gb 40 series GPU has me begging for something better from the likes of amd. This is for the market.. for gamers.. not for me.
I live in the North, and it frequently get very cold outside. My desk sits right next to a window, with my PC basically being right in front of it. As long as it's not precipitating, is it safe to open my window and let the cold air blow through my PC? Do you think this will help lower temps?
Yes
Scalpers have been making pricing not matter since 2018 with the RTX 2080. I'm not even going to pretend I could buy a 5070 for $549, setting my expectations for at least $849, especially when online retailers get into it now.
Thank you for all the coverage
If the 9070s cost $400-500 tops, with better wattage than my 7900 XT while getting 4080-level ray tracing, the cards will sell. I'm very happy with my 7900 XT, bought November 2023 for $650, upgraded from the GOAT GTX 1080 Ti (2017, $529.99).
Pricing on the PowerColor 9070 cards is important too. Had a Hellhound and it was great.
I'm keeping an eye on these as I want a solid upgrade with 16GB of VRAM for hopefully no more than $600. Intel and AMD are pretty much my only hope left.
How long until you think they will announce the MSRP? Or the release date?
Thx for the info,
9070 w/ 16GB VRAM,
9070 XT w/ 16GB VRAM also. Now what will AMD do about pricing?
Nvidia 5070 only 12GB, booooo, hissss
Why aren't they using the single connector?
Wccftech is claiming a bench run in Call of Duty: Black Ops 6 at 99 FPS at 4K extreme settings without FSR, and that is apparently with the nerfed driver, so it's looking like a bit of a midrange monster. They are likely just waiting to see where they are going to price them...
Keep in mind AMD always does really well in the CoD engine. For example, the 7900 XTX is close to 4090 performance in the previous CoD, but overall it's more like a 4080.
@Vartazian360 True, but 99fps is 7900 XTX territory, and that's supposedly with the nerfed driver. If that is indeed the performance (remains to be seen) and they have improved RT, they have a fairly powerful GPU on their hands. It all comes down to pricing though. If it does go up against the 5070 in that regard, that would be very interesting!
Gr8 video commentary, Paul. I prefer the simpler 9070 XT Reaper. Not too fussed about RGB on cards, as my fans have it. 2x 8-pin is nice to see. I've never owned a PowerColor card, though I prefer a 3-year warranty over 2 years tbh. Nice cards for sure. Interesting times ahead; roll on the 1440p benchmarks.
What's the fascination with "hell" themed shit lol
Amd is red. Fire is red. Hell is full of fire. It's edgy I guess?
Yeston probably have a nice Waifu card to cater for different market segments.
That's kinda been PowerColor's theme for as long as I can remember
@@ZachStJohn-vi1kg I always think of Sin City or some other kind of noir stuff. Guess I'm not edgy enough rofl
They enjoy trolling fundamentalist Americans.
3:20 nothing beats a simple design
I am very appreciative that AMD does NOT use the wacky power delivery system Nvidia chose to use
Thanks for sharing this!!! I’m planning on getting the 9070 XT, but I’m wondering how it will look in white colour 🤔
0:38 RDNA 4? you meant FSR 4? 🤣 CES is indeed hard
So with no high end gpu for amd, will Powercolor have a Liquid Devil version this time around?
They need to be less than $400 for them to be considered honestly… I NEED AMD to actually compete and intel to compete as well so we can bring true competition back…
Lol
so that you can buy Nvidia cheaper? LOL
Seriously, if they undercut the market, they can certainly move more volume (if supply can handle it) and win overall.
Maybe you should ask for one for free?
No need. The 5070 is $550… if this is $499 then it is OK. Both GPUs are about 4070 Ti level… so speed is about the same. Nvidia has the software advantage; AMD has more VRAM. So -$50 is enough. If AMD is really aggressive, then $450 is possible. But I don't believe AMD will go that low!
I'm looking to upgrade from my 6700xt but I only have a 650w PSU. Will the new AMD and Nvidia cards need more than that?
I want to support AMD, but their GPUs aren't cheap enough to make up for the lack of features and driver issues.
AMD doesn't have any real driver issues anymore. That's a belief that's from things that happened over a decade ago and people just refuse to move past it.
Have had an AMD card for the last 8 years… no driver issues! Had an Nvidia 1070 for 6 years before that… no driver issues there either.
AMD drivers don't like being installed over Nvidia drivers for some reason, so better clean those out very well before going to AMD. But that is the only issue I am familiar with. Also, all the testers, like Gamers Nexus and Hardware Unboxed, do a clean Windows install for each test, so no wonder they don't have issues with AMD drivers.
I'll be delighted if the RX 9070 XT has 16GB of VRAM and can run off a 650W PSU. I'm currently happy with my 3-year-old 6700 XT, but it's starting to make a hell of a noise when pushed to its limits.
Three 8-pin power connectors, but it isn't as strong as an XTX... sad. I think they were embarrassed by its specs and didn't want to get laughed out of CES.
Embarrassed… because their midrange GPU is almost as fast as their last flagship?
😂
Well…
Fingers crossed for a half decent array of GPU's to choose from.. (those 5090 prices though as always.. damn..)