The $3000 Titan Z 5 years later...
- Published: 21 Aug 2024
- Just what can an Nvidia Titan Z dual GPU from 2014 do in 2019?... Well it's great at 3DMark... Let's take a dive into Nvidia's last dual GPU solution! Huge thanks to Jason for donating this BEAST!!!
Consider Supporting Me On Patreon? / timmyjoe
Shopping on Amazon? Use these links and it supports the channel: www.amazon.com...
Products I recommend or use on the channel:
Ryzen 2600: amzn.to/2GRddrA
Gigabyte B450 Elite: amzn.to/2GV9o4V
Radeon Vii: amzn.to/2DDK3dI
Avermedia 4k Live capture card: amzn.to/2DCXgmN
Panasonic G7 Mirrorless Camera: amzn.to/2DDHCrx
Neewer Tripod: amzn.to/2ZKRclO
Neewer Studio Light: amzn.to/2GR0Sn8
I have a website:
www.timmyjoe.com
Check out the MERCH!!! : www.redbubble....
I have an Instagram:
/ watchtimmyjoe
I have a Twitter:
/ watchtimmyjoe
I have a facebook: who cares about facebook...
-
Intro is Instrumental produced by Chuki, the best beats on youtube!
/ chukimusic
(Other music provided by the Epidemic Sound Library...)
Just so you know, it's not correct to spread thermal paste. You always dollop it, either in a single blob in the middle, or separate blobs for larger dies.
This guy...
Bruhhhhh. I'm laughing. You can spread it. It's a matter of preference.
@@TimmyJoePCTech I was just gonna say "This fuckin' guy again" lmao
???????
And there he is
Good choice, definitely a cool card to look at. I hope some years down the line that a channel like yours or RandomGaminginHD will be able to look at the Titan V CEO Edition.
Was thinking of seeing if I could find one of those for the lulz of experiencing it. You saved me the time and money :D Great vid as always man!
You are such a breath of fresh air my dude! Definitely my favorite pc/tech channel by a long shot. Lots of charisma and energy so don’t change a thing!
I agree, he definitely carves his own path, for sure the most original tech tuber out there, and one of my favourites as well :)
Pretty crazy how it had 12gigs all those years ago, even the 3080 had less
"9900K" "Yep, we have one of these" xDD
@Noah Standish Yeah exactly. I have a 5900x I agree
When you put the 2070S in front of the Titan Z, my whole concept of reality was shaken. I always assumed the Z to be the biggest 'consumer' GPU I could ever dream and then the MSI Card came in.
I worked with a guy who bought one of these during the GPU shortage in 2020 for $2500. I tried to explain how badly he got ripped off but he wouldn't hear it. He also used it to play The Division 2 at 4K with a 5400rpm HDD and not an SSD.
I just had a Radeon HD 5970 on my test bench. Honestly, it wasn't as loud as I was expecting. I don't remember if you played around with one, but it's definitely a step up from the card it replaced, the 4870X2. I think the biggest reason for that was the move from two solid copper finned blocks to a single aluminum finned vapor chamber. It made a pretty significant difference in what I saw. Still, the HD 4870X2 with the custom Arctic cooler does perform better, but that cooler couldn't have been cheap to buy for how large it is.
I've never seen a Titan Z before, it looks pretty awesome! Nvidia's old reference cooler just looks so nice. Even now, my GTX 780s in SLI just look awesome in a case. It makes me wish my HAF XM had a windowed side panel. It's such a premium look. Not that the new reference cooler is bad, it just doesn't look quite as premium. At least to my eye.
Dude, 3000 USD for a wallhack for Fortnite, a complete bang 4 the buck.
That pcb tho, filled to the absolute brim.
Just a little tip. I've owned one of these since 2016. You cannot use a driver higher than 385.69. That is actually why you are getting glitches and artifacting. They changed something in their drivers which causes problems on cards with a PLX chip on them (GTX 690/Titan Z). Nvidia knows there is a problem; they even have it in every release note of their drivers under unresolved issues. Just imagine buying a $3500 GPU in 2015 and 2 years later it's not supported anymore and Nvidia doesn't give a shit.
I heard you can get some amazing results from Vega 56 and 64 to some crazy levels of performance. Great video as always and I hope you can get better SLI testing on your 9900K.
A new month has started... So you can make this dual GPU month then.. if it works with your 9900K when you get the mobo...
Or find an older rig for cheap and use that....
Thanks for the video, Joe. Always enjoy your dual GPU videos! :D
I recently ran a test of a GTX 1070. On a 6700 non K system with an EVGA GTX 1070 SC, I was able to get 312.91 FPS in CS:GO with all the graphics settings set to the highest options. I bought the GTX 1070 used for $192 US shipped off of eBay. So, the Titan Z doesn't perform as well as a $192 graphics card.
I bought three GTX 1070's to put in used PC's I was selling as gaming PCs. All of them were $190-$196 shipped.
I tried 3 way SLI and didn't get impressive results even after the Nvidia Inspector modifications. Nvidia updated the drivers so that you don't need Nvidia Inspector for 3 way SLI in some benchmarks. Those worked better than trying 3 way SLI with games.
I really wish more newer games would support DX12 explicit multi-GPU. That looks like an actually useful technology. You don't even need the exact same GPU. I tried it with a GTX 1080 and GTX 1080 Ti and saw significant improvements in the game that nobody buys to play and everyone uses as a benchmark, Ashes of the Singularity: Escalation.
Well, people also talked shit about the Titan X and Xp... but those cost about half what this card did new!!! So the X and Xp were at least somewhat worth it for the power they gave us back then, and even now...
I've been using the z370 Aorus Gaming 7 with 3 different sets of SLI cards and its been flawless. High-end z370-z390 Gigabyte boards seem to be solid with SLI or CFX. This board has had no problems running 2 GTS 450's, 2 Nitro+ RX 480 8GB OC's in CFX and i even dropped 2 Gigabyte Aorus Extreme 2080's running at 2200MHz each on that board and not one hiccup. Other than with the titles that don't support SLI or CFX
I even threw a 9900K in there and it runs just fine. It has 8 60A Intersil ISL99227B Smart Power Stages for the Vcore and the ISL69138 for the PWM, plus the Cooper Bussmann 76A inductors (chokes). Even though this board is a 4+2 doubled to an 8+2 phase, the doublers are ISL6617A's, which have over-current protection with current balancing, among a plethora of other logic built in.
The z370 Aorus Gaming 7 uses the exact same components as the x299 Aorus Gaming 7 Pro, and of course the z390/x299 eVGA DARK boards.
480A is plenty for the 9900K.
Going with a high-end z370 board like the Aorus Gaming 7 is not a problem what so ever.
Either I hit the VRM lottery or the Micro Center I get my components from has binned Intel i7 8700K's and 9900K's. I got the 8700K for $280 and the 9900K for $400, and bought the Aorus Gaming 7 for $145; it even had bent pins and had been returned. Also got the z390 Designare to go with the 9900K. Just for fun I wanted to test the 9900K on the z370 Aorus Gaming 7 because most people were having VRM throttling issues.
Not me. I did notice that the VRM heatsink had the thermal pad replaced with a Laird thermal pad, the ones Gigabyte uses on all their z390 and x570 boards. I guess that was just a bonus. Running the 8700K at 5.3GHz with 1.380v, reaching 93°c on AIR. Not just air, but with an 11-year-old (new) ZALMAN CNPS 9900MAX.
It's a gaming rig so there's no real need to run it at that speed, so I dropped it back to 5GHz with 1.250v, 75°c under max stress. 1.250v at 5GHz and 1.380v at 5.3GHz. This chip is better than any Silicon Lottery binned chip on the list. I was definitely surprised and glad to get such good chips.
On top of that, none of these CPU's are delidded.
I then dropped in the 9900K, and whattaya know: 5.2GHz not passing 95°c with 1.378v, with the VRM not passing 85°c. This is running Prime with small FFTs, so basically no one's gonna run their gaming PC like this, so I give it a pass. The z390 Designare had no issues with SLI or CFX either. Ever since the dual GPU vids started, I've been collecting GPUs along with a ton of other stuff like motherboards, CPUs, cases and monitors. What else am I gonna spend my money on, especially when I'm getting prices that can't be beat? I just bought 2 RTX 2080 Aorus Extremes for $620 apiece. Those are $800 cards. Also just grabbed a PowerColor Red Devil 5700 XT for $370. I'm very lucky to have a GOLD mine right down the street. They might all be open-box parts, but you can still register every component with the manufacturer's base warranty. People forget that. Anyway, love the content, like always. Keep up with the great vids. I would become a patron but I gotta flip a couple of these PCs first.
WHAT'S GOING ON IT'S TIMMY JOE THE PUTER PARTZ GUY!!!!!!!
I really hope someday we will see some sort of ryzen scaling for GPUs
NVLink sort of does that already on workstation Nvidia cards. It's just extra expense to put on consumer ones and not worth it because barely anyone would use it. I think maybe some of the ultra high end consumer ones have it?
You know, if you keep pushing those Maxwell cards higher, you will get past the point of "diminishing returns", and start to gain performance again. To a point. Maxwell is a monster! It likes more power too! Should be able to hit at least 1300MHz on the cores, being as they were binned/matched by Nvidia for that card.
Or just go all Buildzoid on it, and solder an EVGA E-Power to it!
I wanna know what's going wrong with the profile inspector. Never seen it this bad. But great video timmy 👍
I have a GTX Titan X 12 GB GDDR5 graphics card; this card is still a beast.
i just got two of those for fun, how did yours do in games? ^^
You're gonna dig that 9900k, that's what I use and its a beast
i love the way you look at more of the older hardware than newer
I thought about buying one of these for my rig and shit a brick when I saw what people wanted for them, so I just bought a Vega 64 instead.
*_GPU with those designs are always sexy looking_*
I had some pretty good luck with SLI using two GTX 580 3GBs in Vermintide 2, Destiny 2, and Kingdom Come: Deliverance.
Did you use nvidia inspector to do it? Or just whatever driver support it has?
I'm thinking the new AMD architecture that is coming out is going to be awesome. I could be wrong, but you never know. I bought 2 Radeon VII cards so I'll be running those for a while in a custom loop. They're not running on the same motherboard but are going to be in a 2-system case.
Keep up on the great videos I like seeing how the older stuff compares to today's hardware
A good deal of VR titles work very well with SLI/CrossFire, allowing 1 GPU per eye rendering.
What's the scaling on that type of setup? And why don't any of the big VR channels seem to have this type of setup?
Love how it has one fan...
Right-click > Properties > Compatibility, check "Disable fullscreen optimizations".
Dude that benchmark music is fire!
I'm curious as I haven't messed with SLI/Crossfire in a few years. I see a lot of way over the top builds by some other tech tubers, using dual 2080ti's. I take it that all for show ?
I have one laying around almost brand new, sadly no box or anything else left.
It's heavy AF for an old card like this.
2022...i got one for $80 in Toronto....scores !!
we miss you timmy
Lighting is what broke AFR (Alternate Frame Rendering), which was more or less required for Crossfire and SLI to work. New Forward+ techniques and other "tile" based techniques produce workloads that cannot be split between two graphics processors (on a per-frame basis), as information from one frame needs to be relayed to the other GPU and vice versa. The way around this is DX12 multi-GPU, but it is barely supported. DX12 multi-GPU rendering techniques act sort of like NUMA does for CPUs, in that the game sees just one large assortment of GPU cores, and work is scheduled similarly to how asynchronous compute schedules work for GPU cores: idling resources are given work and thus utilized more effectively, keeping the entire GPU array fed and busy. So work is not split into GPU1 = Frame1, GPU2 = Frame2, GPU1 = Frame3 etc., but rather both GPUs act as a single array of cores working on each frame, or multiple frames if resources permit, more effectively.
The problem with this is that game developers have to code for asynchronous compute; it's not automatic, so the job of optimizing the proper use of idling resources falls onto the developer. This means more man hours to optimize code, which results in longer deadlines and higher costs. With such a limited market making use of multi-GPU... this makes little sense.
If memory serves me right, Battlefield 1 works with Crossfire and SLI when run in DX11 mode. Strange Brigade, on the other hand, works with multi-GPU when run in DX12, or maybe it was Vulkan, or maybe even both (I could be wrong, but I thought I read that somewhere).
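The AFR-versus-pooled distinction described above can be sketched as a toy scheduling model. This is plain Python, not real graphics API code; every name here is invented for illustration, and it only models which GPU gets which piece of work, nothing more:

```python
# Toy model of two multi-GPU scheduling strategies (illustrative only).

def afr_schedule(num_frames, num_gpus=2):
    """Alternate Frame Rendering: frame i goes wholly to GPU i % num_gpus.
    A temporal effect in frame 1 that needs frame 0's data must copy it
    across GPUs -- the inter-frame dependency that breaks AFR."""
    return {frame: frame % num_gpus for frame in range(num_frames)}

def pooled_schedule(num_frames, num_gpus=2, tiles_per_frame=4):
    """Pooled ("single array") model, in the spirit of DX12 explicit
    multi-GPU: every frame's tiles are spread across all GPUs, so each
    frame's data is already visible to the cores working on the next."""
    return {
        frame: [tile % num_gpus for tile in range(tiles_per_frame)]
        for frame in range(num_frames)
    }

if __name__ == "__main__":
    print(afr_schedule(4))     # {0: 0, 1: 1, 2: 0, 3: 1}
    print(pooled_schedule(2))  # {0: [0, 1, 0, 1], 1: [0, 1, 0, 1]}
```

Under AFR the whole of frame 2 lands back on GPU 0, so any data it needs from frame 1 has to hop between cards; in the pooled model no frame belongs to a single GPU in the first place.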
So glad you revisited this card!! I've been searching for some results in newer games but couldn't find anything; no wonder Nvidia nuked these cards with no driver support. Interesting, could someone sue Nvidia for this? I mean, $3000 for a product which was meant to be an absolute beast, and only a couple years later you just can't use it. I mean you can... but you can't utilize it fully.
Well, even if SLI doesn't work anymore, this card will always be good for one thing: workstation tasks, especially those involving DP (double-precision) workloads. DP performance on this card is not cut down, unlike all other Titans and Nvidia "gaming" GPUs that came after it.
Have you tried explicit multi GPU with a card like this?
The nice song at 02:03 onward = Elfl - Transhumanism
I would argue that due to the lack of SLI support in most modern games, in many cases it would be outclassed by a $120 RX 580, if with both GPUs working it only hits 980 Ti to 1070 numbers. Maybe I'm wrong.
If SLI were still supported this would be 1080 level.
If one core is around 780 Ti level, which is 1060 6GB level, then in SLI it would be around 1080 level.
There are SLI fingers on it. Get a second one and see how it does.
That's not a blower, it's a thicker axial fan. It's pushing air downwards into a solid diffuser plate. Blowers have vertical blades that pull air in from the centre and push it sideways. That card SHOULD have a blower and doesn't.
I kind of always loved the principle of dual GPU cards, even as far back as stuff like the 3dfx voodoo 5500 and even the esoteric as heck Rage Fury Maxx, in principle.
Most recently I snagged a 690 because I was able to get one for $68 shipped, compared to $130ish for a 680. Unfortunately, in the intended build project it was to go into (on an Asus P5WD2-E board), it refused to output anything. I was able to confirm it worked in every other machine but that one. And frustratingly, Radeon 7990 cards are still $200+, which makes no damned sense considering how Crossfire has always seemed to work worse than the Nvidia equivalent.
Timmy Joe, I have to admit I envy your collection.
Wow, it would be like buying a $600.00 GPU every 12 months. Too rich for me lol. Great video TJ, nice to see that old tech still kicking. Cheers!
Loved the fast-forward music btw, but always great videos kiddo....
Try disabling motion blur. That's what the glitching in Battlefield looks like to me, so maybe the motion blur function in some games is causing issues with SLI with the latest NVIDIA drivers so try turning motion blur off to see if it fixes it.
I LOVE the way you handle hardware!!
/ GONE SEXUAL / GONE WRONG
Back when I used to use 2 gtx 460's in sli, AFR2 was by far the most successful way of getting 2 cards to work when they shouldn't, the Nvidia inspector hacks were always hit and miss, mostly miss for me
It would be sweet to see a powercolor devil 13 390x2 reviewed
Well, the delta between this card and a Vega 56 on the market is three years, and I got a Vega 56 for 209 EUR last week. So do the math. But I guess the Titan Z wasn't the value option to begin with back then, and I'd seriously question the IQ of the people who bought this purely for "future proofing" themselves in gaming, if they even exist.
Still the best card nvidia has made on the consumer/quadro side for double precision workloads. To my knowledge the only better card is Radeon VII from AMD which is now just as end of life as Titan Z
I miss dual GPU cards. I never owned one because lol the price is never worth it, but they were cool.
also holy balls that music is on point TJ
Don't worry brother, keep on doing your thing and you will be reviewing a Titan V.
Very good video dude
Got it going at just over 1GHz. Factory-overclocked 1080 Ti laughs in 1974MHz.
Manually OC'd 1080 Ti laughs in 2020MHz.
My msi 1080 ti fe ran at like 2050mhz core
My titan Xp now runs at 2038mhz lmao
You would think that they at least would have had dual fans on a dual GPU card.
And for 3 grand they should have included a pair of extra thick headphones to cancel out the noise of that single fan @ 3000 rpm.
I bet it crushes the stalker series for multiple reasons
Probably because of GPU Boost 10.0 or whatever, there are none; no one has done an identical-MHz comparison against other-gen GPUs. It'd be interesting to see the IPC gain per clock.
I did run XFX 290X 8GB CrossFire before one of the fans died by accident. I blew out the fan motor when I was using a blower to clean the dust off, after my PC got turned on by my cat pouncing on a blue light.
Just curious Tim, did you try any rendering? Like Adobe or Blender? Maybe the card is possibly still really good for applications like that. However, I was very surprised after your testing that people paid 3k for this thing and some games aren't even playable. Granted it's older, but it should still perform very well. Especially Fortnite! Crazy, but good shit man, love watching your channel 👍
I am running Crossfire RX 580's.
90%+ of games run great, 150-200% versus 1 card. 4K runs great.
I was benching Shadow of the Tomb Raider as I watched this video!
DX12 has Multi-GPU support (which is Not "Crossfire") and most DX12 games run great.
Was I just lucky?
my friend bought this when it first came out i remember him paying 2500$ or so at micro center .
In '09, 2010, Crossfire and SLI were supposed to be the next big thing: a way anyone would finally be able to get the performance of the top-tier graphics cards for a lot less money. You were supposed to be able to buy two mid-range cheaper cards, link them, and finally experience the top tier of graphics. Sounded great, but in reality it was never really supported as it should have been according to the promises, and in the end it turned out to be just another instance of marketing hype. Sound familiar? Nvidia Ray Tracing, anyone? Ray tracing has been around for years now, and as long as it's being done on the software side of things, especially with Nvidia's push to make it part of their branding and claim it as their own, it's just more marketing hype...
Just works
Could you use Titan Z as a dedicated card for some software or physics or something?
Was a farce of a card, they had to downclock the cores so heavily it wasn't really competitive with the 295x2 at all. Nvidia didn't even sample this to reviewers, they just released it quietly and never mentioned it again.
This and the RTX 3090 are the only triple-slot cards from NVIDIA.
That's got an SLI connector on it lolz. What would happen if you added more of those cards together? Wait, only 1 connector, not 2. Is it an SLI connector or something else?
3:16 super soylent
hey man i used to drink soylent, not cool
ew
Drivers Drivers and more drivers.
What I like to do is not load the current drivers for those older cards. Unfortunately you have to research the hell out of that card and find out the driver that worked well.
For instance, I have some old AMD R7 370's. I had to find the right driver to get them to work in Windows 10 correctly. The 19.9.1.1 driver works well with those cards. When I changed the driver to 19.9.2.1 they stopped working right and crashed out, or had various video bugs like you are seeing with the Titan card.
Unfortunately again the newer games have little support in the way of crossfire or sli. It's dead.
Timmy, get the Gigabyte Z390 UD motherboard, freaking awesome VRMs!! Or Z370 boards are super cheap right now used. I just got the Fatal1ty Z370 mini-ITX for $100. Not too bad for Thunderbolt 3.
I know a kid at my work who bought 2 of these. I told him that was a waste of money.
Do you personally work for the Saudi sultan or something, because holy fuck.
Right now it is not as expensive as it was when it was new, but the Titan Xp has far better performance than the Z for far less than the Z's release cost.
@@austinwhite3132 i got one for 90$ no joke
@@sectoor7398 you're lucky. Most are still at least a couple hundred bucks on eBay.
At its time the Titan Z was the fastest card. Sure, can a single card perform faster now? Yes, but there is going to be a limit to how far you can push a single monolithic GPU; sooner or later we are going to need 4 GPUs on a single board. Oh wait, Nvidia wants to give us 8-way GPUs on a 3D-stacked design, meaning 8 GPUs working like a single GPU. It's a lot to think about, but that's the future of high-end GPUs. Hell, it won't be long before we get 8 GPUs that work together like a single GPU, with the work spread across all 8.
"9900K, Yeah we got one"...
The Verge
I wanna get every titan card and make a wall of fame
You should really get an electric air duster at this point.
Can you please make a video that explains the differences between Nvidia Game Ready and Nvidia Studio Drivers?
probably input lag and extra processes that would be unnecessary for games.
Windows 7 Ultimate is the best operating system for multiple GPU results, i run 2 1080s in my main rig and on windows 7 Ultimate i get great results overall with SLI
SADLY... only 10 games out of 100 support dual-GPU SLI on a single graphics card.
That card 7 years later is gold for miners, with 40 MH/s...
The Titan Z came out being called the ultimate gaming GPU, with a sky-high price to match, but it must have been the shortest-lived king, as it was soon blown away by AMD's 295X2 when you include the price difference. Unless I've missed a release, the best dual-chip GPU ever released is the PowerColor Devil 13, which had a pair of 8GB 390 chips in it. With SLI and Crossfire both going the way of the Dodo, the Devil 13 is likely to retain the dual-GPU king title forever more.
*Cries In SLI*
A GeForce GTX Titan Z connected with an SLI cable to a GeForce GTX Titan X is a good configuration at factory megahertz settings for Windows 8 Enterprise 64-bit.
Some people would call this a techno porn with this sexy naked dual gpu.
Hey Timmy Joe, what kind of thermal paste did you use on the Titan? I need to do my card soon. It's going on 5 years lol.
@Timmy Joe PC Tech I have one of those! But sadly it died... so now it just sits on my desk like Tech Syndicates 8800GTX 🤣
How do you get those high-speed panning shots? Good camera work (or post work), man.
A year or 2 ago I remember LTT saying the Titan Z was the poor man's Tesla, so I guess it had a compute niche.
If I had bought this GPU in 2014 as a developer, well, I would have saved 1.5k in 2016 and another 1.5k in 2018, so it's not that bad.
What about sli 2x titan z 🤔🤔 4 way sli
Just no!
Y e a h b a b y y
so what was the die in the middle of the 2 gpu chips?
might be a PLX PCI Express signal chip but I've got no idea.
Awesome video Timmy Joe. 😁
Hey Timmy Joe can you do a video about the cool new features from AMD like RIS, Input Lag and AMD Link ? That would be cool!
I was wondering for a while if it would be possible to make a cooling solution for a graphics card that uses the type of fan found in the indoor part of an air conditioner. You know, the fan that looks like a tube? It could be mounted on the side of the card facing the side panel and would run the entire length of the card... could that work? Just wondering.
Great video by the way☺️
The Titan V is not good at gaming since it's a special arch; it just crashes randomly most of the time in current games. I think the Titan RTX, like the Titan X, will hold up better, as they are the top bin of retail GPUs.
I've been gifted two of these (no joke). What the frek am I going to do with them now that the 30 series is out?
Have you tried using those cards with age-appropriate CPUs?
I'm using a heavily modified, OC'd daily Gigabyte WindForce GTX 660.
I've got a plain 680, still running strong, with an LGA 1156 motherboard and a slightly OC'd Xeon X3470.