ruclips.net/video/nuj4JOe7Tak/видео.html
No
@@memerified bro really said "no"
No
Not a rick roll
It's a dual-GPU card. It's not a 4090 competitor; the 4090 is a single chip.
If they could turn a profit from it, they would.
The market size and marketing cost of going against the 4090 are probably the reason AMD doesn't want to compete in that area.
It's a double-die card; it would be double the price of a base one, or they would have to sell it at worse margins.
And double-die GPUs are not well supported in games and can cause frame-time variation and other issues, since they work the same way Crossfire/SLI did.
Companies want to make money, but nobody is omniscient. They THINK selling it wouldn't be a net benefit to them.
They profited; don't forget they have corporate businesses as customers, not just personal plebs.
@@shepardpolska Crossfire allowed you to control how the cards worked together from game to game. While it took a bit of work, it outperformed Nvidia in a wider array of games for a cheaper price. Dropping Crossfire was silly, as it gave them an edge. SLI wasn't designed for gaming; Crossfire was. Late SLI only allowed you to use it in what, 6 games? Crossfire didn't care. For every game, you just went into Catalyst Control Center and told it which mode to run for that game. If AMD didn't have a recommended profile for the game, you just linked an executable and told it what to do. It didn't matter.
That guy had the guts to solder a thing that costs thousands of dollars, all for a YouTube video
To make twice or even triple that amount from that video
This must be your first time seeing one of his videos. He casually does things like delidding a $700 i9-13900K, or disassembling and customizing 4090s to run on risky settings and soldering things to them, or working on other rare, expensive tech, because he knows what he's doing, unlike most tech YT channels.
@@CreeplayEU Workstation cards cost much more than consumer cards; that thing probably costs about twice as much as a 4090.
@@acidtoe1278 Thanks for educating me about the most simple and basic pricing logic, which applies to basically every other industry with products for both consumer and business applications. Otherwise I wouldn't know that, since I've been living without the knowledge of the definition and value of money my whole life.
@@CreeplayEU np buddy, glad I could help
Kinda deceptive to not mention this is a dual chip graphics card.
😢
You can see both chips when he shows it on screen??
It says DUO in the name, and the image of the card shows 2 dies.
PS: they are not for sale today, btw.
No it isn't lol
How is it deceptive?
Ain’t no way you’re real
I mentioned this once in a discord server and some guy called me an apple shill 💀
LOL
ok apple shill
@@endermaster08 and it begins
Discord and Reddit users are generally against most things.
Hate apple, hate intel, hate Nvidia….
@@VeniVidiAjax generally haters… because somehow the competition is waaaaay better 😂
It's similar to all the duo cards from AMD generations ago. Like the fury x or 295x.
But they stopped consumer use of Crossfire, same as Nvidia with SLI.
So it is a special thing. They also have a dual 580 card.
Fury X is not a duo card; I would know, as I own one. You are thinking of the Radeon Pro Duo, which was two Fury X dies, sometimes unofficially called the Fury X2 to follow the naming of the R9 295X2.
Off topic but the r9 290x is my favorite gpu
AMD already claimed they had a card faster than the 7900 XTX to compete with the 4090, but said they didn't want to release it because of the price they would have to sell it at.
Sure sure chinese amd worries
If you think Nvidia doesn't have stronger cards they could dump onto the market, you are highly mistaken.
@@blvckl0tcs750 And btw, I said AMD, not Nvidia.
@@blvckl0tcs750 Both companies do, mate. They don't because they need to make a profit. Mass-producing a card that only a select few consumers will buy is not only a loss in profits but a waste of resources and stock. The 4090 is expensive enough; we don't need a $5000 GPU on the market too.
@@horyzengm The 4090 isn't expensive. Compared to what the Titans cost, which is what the 90-class cards replaced, the current price is a steal for the performance. The rest I can understand, though.
Those video cards are used in workplace workstations. They are meant for video editing, transcoding, rendering, anything other than gaming. The dual GPU helps productivity: you don't have to wait until one GPU finishes its job; you can use the second GPU to move on to the next project.
So basically what a Mac is used for.
It still beats the 4080 in a gaming benchmark tho
@@luke_webster Yeah, but workstation GPUs are not efficient in games; benchmarks are only numbers.
It has lower clocks most of the time, just like Quadros.
It's a workstation dual-die GPU. It's super impractical. If anything, I'm happy that they experimented with it, because it can only lead to good MCM GPUs; multi-die GPUs like this are super wasteful.
I am German, and as a viewer of the German (and main) der8auer channel, I was shocked to find him here.
der8auer is one of the most famous overclockers in the world, and they speak English. You shouldn't be surprised that he is well-known outside of Germany!
@@theftking He hasn't had his English channel as long as the German one, and didn't do English content until a few years ago.
@@theftking but yeah, I agree
By far one of the best at cutting from the end of the short back to the beginning without skipping a beat. Love these shorts!!
Uh, the 7900 XTX does beat the 4080 a little in just about everything excluding ray tracing. But that's cool nonetheless.
this is the reason why competition is good
AMD is just like: nah bro, it's not worth it.
This loops perfectly
I guess we won in editing, even though I’m an ARM Mac user
I wanted a Mac Pro forever for this specific card; I've been a big fan of AMD WX cards.
I love your sarcasm
Why would you spend more money to buy a worse card? This guy knows something we don't.
Dunno why you aren't mentioning it, but that's a dual-GPU card. I suspect plenty of AMD cards, if you ran 2 of them, would be close to or above a 4080…
"Der Bauer" is German for "the farmer" (or in this case, "builder").
Guys, we did it
?
Der Bauer did it 💪
Wondering what drivers they're using for this....
I love how wrongly you pronounce "der8auer"
🤣
Are you German?
@@Redstoneluchs Of course
Nice, synthetic benchmarks are great!
Dude, you said it yourself $5000…
For something that in actual games probably loses to a 3080, or maybe even a 3070
@@XioLoBing not with upgrades
The loop was smooth
Why is his reflection on the table staring at me
My local Walmart doesn't sell 4K computer monitors.
As a gamer on Mac: you can in fact game on a Mac. On the macOS App Store there is Resident Evil Village, and if you're on an Apple Silicon chip, even Stray. You can also get Steam for Mac, though you can't play big-name games like CoD or CS2. But a lot of games made with Unity are available for Mac on Steam. Currently my favorite Steam games to play on my Mac are:
-Isonzo, a WW1 FPS with incredible detail
-Ravenfield, a very fun early-access game with huge modding support, even for Mac through the Steam Workshop
-Muck, a roguelike survival game which is incredibly fun yet challenging
-War Thunder, you all should know War Thunder by now
Also some games I would suggest that work on Mac:
-Deltarune once it releases (and its demo)
-Undertale
-Crab Game
I love your sarcasm
They don't want to, because being the underdog lets strict criticism slide off like water off a duck.
They wouldn't be criticized for having a better card than the RTX 4080; that thing is just so expensive that it only makes sense in a build where all the cheaper, better GPUs are not an alternative.
“They just dont want to” that line has applied to so many things, it’s so sad the way we limit ourselves
Contracts are a thing
I miss dual GPUs, the HD 5970/6990/7990 were really cool cards
On the Mac Pro you could've used Boot Camp Assistant.
I may not have a brain, gentlemen, but I have an idea.
I bet it's some Apple contract where they aren't allowed to build a card for Windows systems better than the one for macOS.
Man, I'm tryna game on my MacBook Pro from 2015...
I'm gonna yoink that thing from my friend
Bro is comparing a multi-GPU card to a single-GPU card
Make a video of your gaming room tour
Behind-the-curtain trusts between Nvidia and AMD confirmed
How times have changed. The 7900xtx makes the 4080s look like a dumb choice now
When AMD first unveiled Ryzen and the Zen architecture, they also showed their upcoming lineup all the way to Zen 4c, which showed they already had the manufacturing process to create those chips. So, realistically speaking, even though they unveiled that back in 2017, the Zen 4 chip already existed; and who knows, maybe they even had the plans worked out for Zen 5 and upwards.
No… it most certainly didn't. They probably had some level of architecture planning, but I assure you they did not have Zen 4 processors just lying around when they announced Ryzen 1. The reason this card performs so well is that it has 2 dies; it's not some secret better GPU AMD is withholding from gamers.
That card is the thing that makes Nvidia’s pricing sound so reasonable.
Der Bauer can do anything
Thought he gon start blasting some people
Had to download the video to watch it
der8auer, the German grinder
“De baueer“💀💀
AMD casually outperforming the highest Nvidia GPU on a fanless laptop,
I'm scared of what AMD can do with a PC
The highest is the 4090, and this can't beat it; plus it would cost more than a 4090.
me: opens Asphalt 9, PUBG, CoD and Overwatch on high graphics
Basically Nvidia SLI, but soldered onto one PCB without a bridge. The concept is old and has been used in GPUs before, but that doesn't hide the fact that it's actually a double GPU, not a single GPU like Nvidia's 30 or 40 series. It may cause problems with gaming, as games struggle with SLI support, as shown by Linus Tech Tips. On the other hand, it's a Mac, and no one buys a Mac for gaming; creative workflow tasks go smoothly.
It's a dual GPU, so it's good for parallel stuff.
I'm pretty sure the Polaris Pro Duo is very similar to it.
Yeah, but this is a dual-GPU setup. If it had one chip, it wouldn't be a fair match lol
I know it's still far from daily-driver ready, but I wonder what Asahi Linux could do with that card (considering it runs on Macs).
meanwhile me on integrated graphics: 😶
How the hell did the guy make drivers for that card?
but what about the W6900X for the Mac Pro? Or the W6800X Duo?
you'd also need a custom driver
"Mac Pro exclusive" are words that should never be combined.
They could, but it depends.
I got a MacBook Pro with Windows installed on it.
Can I game on it?
|
\/
Guessing 3dmark timespy?
What is your keyboard??
It's a dual card. It would be like running cards in SLI (but a little hotter, because they share the same heatsink; there's a reason no one makes dual cards anymore).
It's surprisingly not that hot, because it was heavily undervolted from the factory.
But to achieve that, you need the best binning possible. That's the problem, on top of the Pro-user "tax".
What keyboard do you use
So if they make this work like SLI or Crossfire, it should work in all games?
"Macs cant game."
Me who plays war thunder on medium graphics in MacBook: **visible confusion**
The fact that you are forced to play War Thunder at medium graphics is exactly why Macs can't game. War Thunder isn't even an intensive game and you've got to play it at medium settings, not to mention the small catalog of what you can play is mostly non-taxing games.
My prediction is that SLI/Crossfire technology will be revived, although Nvidia already made NVLink, which is similar to SLI.
It'll be NVLink in GeForce Now, so that Nvidia can fit more people onto one server. SLI/Crossfire died because single GPUs were better while being far cheaper.
Hmm... AMD did comment on this and said: we could make GPUs that would beat the RTX 4090 and put a 600-watt TDP on them. But they didn't, as they knew that most people don't have the budget to buy one component that costs 1600 USD, and they aim at the mid-range market, where sales are the biggest.
I want you to think real hard about how bullshit that is when their current cards are still getting smashed and aren't path tracing. The 40 series isn't mainly about raw power; it's about efficiency. And the cost argument doesn't make sense: it's enthusiast level and still cheaper than what a Titan would cost, which is essentially what the 90-class cards derived from. A $1500 4090 is a steal compared to an almost $3000 Titan.
@@blvckl0tcs750 Bro, not everyone wants to / can spend 1500 USD on just a single component, but if you want to, sure, you can
@@blvckl0tcs750 Only the extremely rich, YouTubers, and big companies buy a top-end GPU
@@blvckl0tcs750 And as far as efficiency is concerned... AMD GPUs usually give more performance per watt
@@blvckl0tcs750 That is exactly why AMD is a better choice for the average gamer
bro the 4090 beats the 4080 💀
How does the RX 7900 XTX not match an RTX 4080???
Time Spy Extreme uses ray tracing.
@@WaterZer0 Ah, like that. I was confused af.
Well, it's a dual-GPU card, so you can basically halve that performance to get a rough idea of what one of the dies outputs.
It's important to understand that if the application doesn't support CrossFireX, you won't get the full performance of the card, just half.
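That halving logic can be sketched with some hypothetical numbers (the function and the 80% scaling figure are illustrative assumptions, not measurements; real multi-GPU scaling varies per title):

```python
# Hypothetical sketch: effective performance of a dual-GPU card,
# depending on whether the app supports multi-GPU (e.g. CrossFireX).
# All numbers are illustrative, not measured.

def effective_fps(single_die_fps: float, multi_gpu_supported: bool,
                  scaling_efficiency: float = 0.8) -> float:
    """Estimate frame rate for a dual-die card.

    Without multi-GPU support only one die does the work; with
    support, the second die adds throughput scaled by an efficiency
    factor (never a perfect 2x).
    """
    if not multi_gpu_supported:
        return single_die_fps          # second die sits idle
    return single_die_fps * (1 + scaling_efficiency)

print(effective_fps(60, False))  # 60.0  - no CrossFireX support, half the card
print(effective_fps(60, True))   # 108.0 - ~1.8x with 80% scaling
```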
Where are the comments?
Only about as fast as a 7900 XTX, assuming your workload supports Crossfire, isn't bottlenecked by PCIe, you can power it, you're okay with passthrough, you can cool over 400 watts of heat, and you can deal with general issues.
What about installing Windows via Boot Camp? What's the card's performance then?
You could also use Linux on it.
You look like Tom Hanks
There's no way you don't have a mac somewhere inside that studio
an average custom keyboard youtuber has the engineering capability to engineer a plane in like a week
"Macs can't be used for gaming." Me playing Minecraft at 17 fps on a 2010s MacBook Air
It’s so unfortunate that on the Mac Pro, macOS bottlenecks it to 3080 speeds
pretty nice loop tho negl
We use those amd in the custom CPU's
AMD said they could make a card to compete with the 4090, but saw no reason to have a $1600 card.
Bro, you can play so many games
I don't think AMD is refusing to sell this card to customers. It's probably an exclusive deal made in partnership with Apple, so that they can increase their market share through Apple.
Blink twice if they have a gun against your balls, blink three times if they have a fat cheque against your bank account
Ok buddy...
UFDtech right now: 👀👀
You know you are definitely getting old when you see that the 7900 XTX is AMD's top-of-the-line card, but then remember that this exact name was also used many, many years ago....
Nvidia, AMD and Intel are in the same family.
Always helping each other.
Even their investors are the same.
The 7900 XTX is already faster than the 4080 lol
Der Bauer, our darling ❤❤
/Linus slaps his desk and points at his screen "I WANT THAT!!!"
Would probably drop it the second he gets it.
It's not like they don't want to; it's more that customers don't want to pay $5000 for it.
As a dual-GPU card, it really wouldn't be the best for gaming. We had a lot of those cards before; similar to SLI, you just end up with microstutters.
Last I saw, when people tested games, performance was massively lower on macOS; but when you installed Windows on the Mac (not a joke), it gave something like 150% more fps.
der8auer does some funny stuff on his main channel xD
He definitely has 5000 bucks
I remember how badly I wanted a Mac just because of this GPU
They can even go for 2nm now on processors, but they don't want to.
I think it's the low yield rate at smaller nm nodes, wasting tons of silicon, that drives up prices. There are also the PCIe limits: 3.0 is old school, 4.0 is somewhat newer, and 5.0 is RTX 4090 territory. My 3060 uses 4.0 and can thus feed more cores. If we put an RTX 4090 on a 4.0 or 3.0 PCIe slot, it runs like a turtle; although still very fast, the cards have data limits due to the architecture of the PCIe versions, so going 2nm is pointless if motherboards don't support it yet. We will need another PCIe jump to surpass the RTX 4090 in a meaningful way, unless the RTX 4090 isn't capping out PCIe 5.0 yet; there may be a few more upgrades to come, and with the known headroom of the RTX 4090 Ti on the way, the ceiling has not been hit for 5.0, that's for sure. Maybe 2nm will arrive around 2030, as we have a long way to go in seeing the potential of 5.0.
I swear, if AMD had a better numbering system for their GPUs, they would be a stronger competitor. Nvidia's 20/30/40 series with the 60/70/80/90 tiers is just so easy for the average person to understand; that ease makes it more accessible.