6:10 Try to think from an industrial perspective and it makes total sense. How many phases do we need? We don't know... should we make a wild guess, realize we need more, and go through another design loop, or should we just add the footprints for more than we think we need, just in case we discover during testing that we need more? Producing something like this is always a balance of costs. A lot of the time you will see unpopulated footprints on a device like this.
Could the GPU's pump be making that sound from being at the highest point of the loop in the AIO? Jump to 8:50 to see what I am talking about, and 14:32 for the sound.
Holy crap, call the fire department near you!! Some disgruntled employee at Gigabyte didn't put all the VRMs on your board and wants to make another combustible PC component. Maybe there are supposed to be VRMs in those empty spots, but Gigabyte's CEO read a Stephen Edwin King book and wants to get his GPUs into the advertising for the upcoming Firestarter movie before it releases on May 13. O_o I hope they do another MONSTER GPU that uses all those VRMs; they could call it the XXXtreme 3090 Ti++ HAHAHAHA. You did an incredible critique of that beast GPU, and I wish more YouTubers would bring some fun into their reviews like you do.
I think it would be great if you could release some info on the thickness of the thermal pads, and even do a thermal pad replacement video. You could measure the difference between high-end thermal pads and the stock ones for VRAM temperatures, and apply some quality thermal paste to the GPU while you're at it. As seen in the video, it doesn't seem too tricky to remove the shrouds and install new pads and paste.
Try overclocking the display with the CRU app; some screens support that. It would be nice to see the difference between an overclocked screen and a real high-quality screen.
Seeing those frametimes and GPU usage/performance in BFV and Cyberpunk makes me think something is wrong with the system; not sure what. A 3090 Ti should definitely run better (and be at 100% usage, obviously), and my 1050 Ti feels smoother than that in BFV. Something is off, maybe the external drive you are using?
11:58 If the 3090 Ti can be at near 100% usage while the CPU is at around 40%, that means the GPU is the bottleneck. You heard it here first: the 3090 Ti is slow. Maybe the 4090 is better.
Doubt you will see this, but what about maxing out the FOV in the games tested (like BFV)? I don't remember ever seeing anyone play at a console FOV on PC. I think it'd give a more realistic FPS for the average player when testing products.
Hey Dawid. I actually use my main PC on a CRT monitor on occasion for some games that support 4:3/5:4 resolutions. Games like Cyberpunk 2077 with RT on look PRETTY GREAT at 2048x1536 resolution with that CRT GLOW and SMOOOOOOOOOOTHNES. Definitely recommend! If you can find a CRT that works well in your area, and supports at least 1280x1024 and over 60Hz. (85Hz is a sweet spot for what you can reasonably find in a surprising number of areas) I'd recommend it if you had the chance to give it a go. I'd offer my Samsung SyncMaster 1100DF but I feel the Seattle, WA USA area is A BIT FAR for that.
I would be very upset with that zip tied fan wire if I spent that much on a card, why is that hanging out instead of being run under the nylon sleeve...?
Hello, I have a question. I'm planning to transfer an older motherboard into an RGB PC case, with a new 850 watt power supply. My question is: my motherboard has a 4-pin CPU connection, so will a modern power supply work, or have I cut myself off with this? Gigabyte GP-P850GM PC Power Supply 850 W ATX 80 Plus Gold. GAME HERO® Geance Gaming PC Case, Tempered Glass Side Panel, RGB, 2 x USB 3.0, 4 x 120 mm RGB Case Fans.
I have seen the haze effect in one of your older videos, but I have a dual-monitor setup with mismatched VGA monitors, both over adapters, and I have never seen that effect.
I have the same PSU but 850W. I got it for the Asus TUF RTX 3080. I am a bit worried because after I bought it, I read some bad reviews about it on Amazon. Love your video :)
Why does a motherboard need a fancy backplate? It adds unnecessary height... imagine if you've got a tall cooler and the extra backplate clearance just makes the side panel unable to close.
That sound was air trapped in the pump of the GPU's AIO. The rad was lower than the pump/block, so air was trapped, and we were hearing the pump rev up to account for more activity.
Dude, my graphics card makes that exact noise when playing games; I believed it was defective or something. I've been playing with V-sync or a frame limiter since I got it, trying not to force it into making those weird sounds. Now that I've seen this video, I guess it's something normal at high utilization.
Some coil whine is pretty normal, especially at really high frame rates. Don’t worry though, unless it’s really loud, it’s most likely nothing to worry about. Some cards just have it worse than others.
“Hello Nvidia, this is Linus. Any idea where my 3090Ti went?” “You sent it to who? And he did WHAT with it? Wow!”
I can hear this in Linus voice
Hopefully Nvidia mentioned to Linus that Dawid did not drop the GPU! 🙂
I can hear his voice too
@@smd9591 Dogs start awooooing in the background
Linus doesn't ask for one, he asks for 10, as always, if not more
Dawid: runs everything off of a USB external drive.
Also Dawid: everything is stuttery
It’s USB C. 😂
@@DawidDoesTechStuff USB-C just means it's running at a higher speed; it means close to nothing when it comes to response time, hence the stuttering
I run games off a USB-C external SSD and it's not stuttery at all, tbh. I have the OSD set up the same way so I can see the frametime graph as well. No clue why it would be in this scenario... and Doom Eternal wasn't doing it. Could the VGA adapter or monitor itself affect it? Or maybe the USB cable/port or the external drive needs a scan?
I only figured out this was my problem with two games earlier this year, one of which I stopped playing 2 years ago due to the stuttering lol
@@CarkBurningMoo do you own a 3090 tho?
The 3090ti - undisputed king of 540p gaming.
800x600
@@gpubenchmarks7905 540p is a tiny bit heavier than 800x600 (960x540 is about 518k pixels vs 480k). But yeah, below that is just dog shit.
540p? Even my old GTX 960 could do 540p without getting hot! I upgraded from it to a 3060 and may sell the 960. I don't expect to get a fortune for it, I paid maybe $200ish for it back long before the scalping sh!tstorm began.
@@charleshines1553 gimme it for 20 bucks
@@charleshines1553 my 4090 absolutely destroys 144p
14:32 A lot of Nvidia's higher-end cards have coil whine like this. For anyone reading this and unaware of a solution, a mild undervolt may help, at a very mild performance cost. In the case of this 3090 Ti, I imagine the power delivery is quite aggressive, hence the resulting whine.
the whine comes from the vrms right?
@@falkez1514 The whine comes from the transistors in the GPU/CPU. It is usually associated with cheap power delivery (which includes the VRM).
@@Jmich69 oh shit, 1y ago and still came back for it, pretty chad ngl
so the whine becomes worse when the power delivery is hit, then; that makes sense, I guess. But I thought inductors were the biggest whine generators. How can transistors whine?
@@falkez1514 Something about the frequency of the transistors hitting an audible harmonic. I honestly don't know enough about the science and detailed reasoning behind it, but I understand that having a higher-quality VRM can help prevent a GPU from having coil whine; something about cleaner and more consistent power delivery. You most often find bad coil whine in cards with the bare minimum allowed power delivery. You can also find coil whine in a top-end ROG STRIX card. It's much less likely, but it can happen.
@@Jmich69 It's called coil whine for a reason. GPU power consumption varies during different stages of rendering a frame. This causes fluctuations in the inductor current of the VRM stages and therefore in the magnetic fields inside them. The copper windings of the inductors start to vibrate a little under the influence of this changing magnetic field. The reason high-end cards have this more often is because they draw more current per stage and have more stages that make the noise while also increasing the frequency as they are slightly out of phase by design, pushing it into a range where the human ear is more sensitive. Frame rate also affects the frequency, which is why in some games it's possible to "modulate" the frequency by turning the camera.
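The frame-rate link in that explanation can be sketched with some toy numbers (an illustration with assumed ballpark values, not measurements of any real card): the per-phase switching frequency of a VRM sits far above hearing, but the inductor current envelope rises and falls once per rendered frame, so it is the frame rate and its harmonics that land in the audible band.

```python
# Illustrative sketch with assumed ballpark numbers: the whine you hear
# tracks the frame rate, not the VRM switching frequency (hundreds of kHz,
# far above hearing). The load *envelope* repeats once per rendered frame,
# so the windings vibrate at the frame rate and its harmonics.

AUDIBLE_MIN_HZ = 20
AUDIBLE_MAX_HZ = 20_000

def audible_harmonics(frame_rate_hz, max_harmonic=200):
    """Harmonics of the frame-rate load modulation inside the audible band."""
    return [frame_rate_hz * n
            for n in range(1, max_harmonic + 1)
            if AUDIBLE_MIN_HZ <= frame_rate_hz * n <= AUDIBLE_MAX_HZ]

# Capped at 60 fps the fundamental is a 60 Hz hum; an uncapped menu running
# at 800 fps pushes the fundamental to 800 Hz, where ears are far pickier.
print(audible_harmonics(60)[:3])
print(audible_harmonics(800)[:3])
```

Turning the camera changes the per-frame load and thus the frame rate, which is exactly the "modulating the whine" effect described above.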
definitely should have tested it with a CRT monitor
viewing on my crt
@@KokoroKatsura same
I think the best match would be an 80-column monochrome or amber CRT ;-)
Some of those can actually go up to 100 Hz, and the colors are pretty vivid on them too. I think he should give it a shot.
Especially considering he's using a 1080p60 monitor which is still a decent resolution and VGA is perfectly capable of displaying it. This video was pretty pointless.
It's crazy how good the Doom game engine is. No frame dropping. Every other game you get so many frame drops!
because he's running it on an external USB drive
watching people play with tech i will never ever be able to afford makes me happy for some reason
Until 15 years later, when they become semi-affordable
That would make me extremely upset and jealous.
This setup made me a little jealous, and I have a Ryzen 9 5950X, 64 GB DDR4, and an RX 6900 XT.
@@TheRealNeoFrancois That’s true. 10 or 15 years from now this GPU will be like $100.
@@DawidDoesTechStuff Doubt it’d even be that much considering prices on flagship GPUs from 10-15 years ago now, although we’ll probably be in another shortage by then 😂
Discovered this channel a few days ago. The main reason I love it is that this madman will go out of his way to do incredibly stupid, incredibly entertaining things lol
Shame he didn't use a CRT with the Mesozoic-era connector; it would have been more nostalgic
You took the most powerful GPU right now and basically turned it into the butter robot from Rick and Morty. I love this channel.
What is my purpose
You turn a 3090 Ti into a 2060 Super
…….oh myyy Ghhhhhaaaaaaad
Me too. Really!
But I really do not like Rick and Morty or any kind of comparison with this glorious YT channel :D ;) No offense bro!
I love this YouTube channel (and that analogy), haters gonna hate.
@@JohnDoe-wq5eu True...
It emasculated me with its huge die, so I’ll emasculated it with VGA. 😂
The higher power thing was a really good hint about the 4000 series and the phase array those extremely power hungry ( according to rumors ) cards. Keep up the hard work!
Every Tech Tuber: "We got the 3090ti in to test. Look at this massive cooling set-up! Now check out these graphs."
Dawid: "Let's make this modern beast of a card, weep like a baby. I'm gonna defeat the purpose of all its power, by using a 1080p 60hz monitor from the before times."
🤣😂🙃😅😭
"Next week, I'll put the whole system into the oven at see 200C. Just to see if you can still game, while living on the planet Venus"
This cooling solution is crazy AF!
It's now starting snowing here on Venus!
Damn! That Venus idea is actually really good. 😂
It was 1080p; he could have gone for one of those shit 1366x768 monitors.
I have one of those. Believe me, it would be fun to see such a high-end card on a shitty monitor.
im fucking dead xd
@@DawidDoesTechStuff now you need a nasa sponsorship
I love that david makes videos about situations that we can all relate to and totally aren't batshit crazy
Can't imagine spending this much only to get coil whine
If it didn't use as much power as a particle accelerator, no doubt it wouldn't whine as much eh.....
lol it's pretty extreme as well; it's not supposed to be that variable and noticeable
coil whine is a combination of the PSU and GPU. Really depends on the luck of the mix on whether you get it or not. My 3080 coil whines badly with my EVGA 800W PSU but not with my Seasonic 1300W PSU.
I dunno, with some skill you could use it as a full-blown instrument. Sounds like a feature to me xD
@@tonymorris4335 Seasonic, never will ever buy a different brand.
Did not expect this to simply work with a generic adapter... I was under the impression that NVIDIA GPUs since the 10 series don't have analog output, requiring an active converter to output VGA.
I just used a DP to VGA cable for a LOOONG while on my 1080P monitor with my 3070. And it worked well
love how he get sent stuff by gigabyte and then hints at fire hazards with their psu
My friend actually had a GPU from Gigabyte that he only had for 1 year, and it stopped working. He sent it in to Gigabyte under warranty and they refused to fix it. They're not a very good company
(explosions in the distance)
@@marcogenovesi8570 (half life 2 explosion sound)
@@AtomSquirrel not the first time I've heard of that
NZXT is the bigger fire hazard…😁😆
DOOM is so well optimized, it always uses 100% of the hardware available. In CP 2077 you can see the GPU never hitting any higher than 80%... so basically more than 20 FPS potentially there, but the game isn't using the power needed to render them.
In every resolution comparison I've seen with high-end/enthusiast GPUs it's always the same: low GPU usage, so not that much performance gain going from even 8K down to 1080p (MSFS 2020 shows a 10 FPS difference from 4K to 1080p, for example).
cp
Love the dedication to saying "3090 Tie" every time, lol. What even is Nvidia's marketing anymore
oh yeah, we love being subjected to linguistic abuse. Was it wrong of me to pierce my earballs with a knife to stop the pain of listening to dawid say "TIE" over and over again?
saying "TIE" every time winds up Linus.😁
@@hamyncheese The Nvidia presenter in their intro video for this card, way back when they teased it, said "tie" instead of T-I.
This seems to be the _official_ pronunciation.
@@stephen1r2 They seem to use both pronunciations, depending on who from the company is talking at any given moment.
@@Vengir I think so; that's what makes it funnier. A very high-end marketing gaffe. The older ones _were_ marketed as T-I, weren't they? Since Ti is titanium, not a Star Wars throwback.
I'm fairly certain that the extra spaces for the power delivery (and the additional space for a second 16 pin power connector on the top right corner) is for the same PCB to be compatible with the RTX 4090 in the future which keeps some costs down
This PCB is the same one that's planned to be used for the 600W 4090. That's the reason for the blank spots.
Wtf. We gotta have a nuclear reactor to run that thing
@@alibozkuer5022 The founders edition will have a Molten salt reactor in the package, should be able to run the 4090 undervolted.
Oh cool! That makes sense.
@@DawidDoesTechStuff YOU'RE SPECIAL ED!!!!!
@@IR4TE 4090 ti AORUS Chernobyl master z360 edition
13:00 Do you know that VGA monitors have an auto-adjust button ?
Not sure if somebody else mentioned it, but if I’m not mistaken, the pcb is actually the same design that’s going to be used on the 40series, and this is basically testing the new power delivery connection
Hasn’t even been that long… come on give the gamers a break from the whole race with scalpers nvidia
heheha
@@Fxmbro "come on give the gamers a break from the whole race with scalpers nvidia" free market and capitalism, wouldn't trade it for anything ;)
1:13 I spy a bent fin on that VRM cooler, tut tut Gigabyte...😂
I think the next logical step is using this setup with a CRT monitor.
You can decrease resolution for higher refresh rate
1024 x 768 baby! Gotta see Crysis as originally intended.
@Monochromatik Early generation CRTs had translucent faces so they actually can have backlight bleed.
Exactly what I wanted to see. I would also like to see if those adapters could run at 1600x1200@85Hz, because that is what I still use on one of my gaming computers with an old high-end CRT. The best GPU with analog output is the GTX 980 Ti, but I'm currently using a GTX 970.
Dude, STOP using an external drive in your benchmarks! That's where half the stutters were coming from. You should know better than this! Just plug in a SATA SSD with the game files already on it. You can have Steam scan the folder and it'll recognize the game installs. Don't do this again! Some games like Cyberpunk would have run much faster with an SSD installed on the motherboard.
this should be common knowledge.
My external drive is fast enough to game on, his probably is too, I’ve had mine for 3/4 years…
4:29 Dawid giving Gigabyte what they asked for, the true Linus video experience
14:30 My MSI laptop (7300HQ, 960M) does that too lol; whenever I press a mouse button or do something in a game, it makes a strange noise like the one in the video
You should hook up 4 of the Mesozoic era monitors to it and see how it does with them. Might need a new monitor arm as well
that would be so epic for multi monitor setup then 🤣😂🙃😅
@@raven4k998 it really would be, I'd love to see him do it. He needs to use his porn music as he builds the setup
that's a good test, how many Mesozoic period monitors can a 3090 "tie" run?
@@Angmar3 ah yes, the 3090 "tie"... Imma go with 4 lol
@@Angmar3 Imagine it running 10 at the same time. The problem is the amount of adapters and shit you'd need for that sort of setup, though
I can see why they made the shell for the pcb plastic, I mean I’m pretty sure it’s already heavy with the block on it so the plastic would help with sagging
Total budget of the video: Like 10-20 $ for the Hdmi to Vga converter.
and $100 of power to run the 3090 Tie and 12900K, both of which I want so bad
@@shuttleman27c Same, my pc do be kinda old
The best thing they did with the Ti was get the memory all on one side. The back-side memory on my 3090 ran so freaking hot, 110°C, that I was forced to get a double waterblock, front and back. Works great, but really heavy.
Dawid, I would just like to say I've been subbed for a long time now and your growth has been amazing. You're my top 3 favorite tech tuber of all time. Your one liners are top tier caps and your content is top tier botulism antidote. Love you bro, stay awesome.
Isn't the 'weird flickering' you're referring to screen tearing? That's what it looks like in the video. Games tend to do that at high framerates on a 60hz display, so it'd make sense. G-sync or free-sync won't work over VGA, and turning on V-sync would limit your frame rates, so I'd wager that's the issue.
Looks like it
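A rough way to put a number on the tearing described above (toy arithmetic, assuming evenly-paced frames and no sync of any kind): with V-sync off, the GPU swaps buffers while the 60 Hz panel is mid-scanout, so each refresh is stitched from slices of roughly fps/refresh consecutive frames, and every slice boundary is a visible tear line.

```python
# Toy model: with V-sync off, one fixed-rate refresh shows slices of
# several consecutive frames; the slice count scales with fps / refresh.

def frame_slices_per_refresh(fps, refresh_hz=60.0):
    """Average number of distinct frames visible within one refresh."""
    return max(1.0, fps / refresh_hz)

for fps in (60, 144, 300):
    slices = frame_slices_per_refresh(fps)
    print(f"{fps} fps on 60 Hz -> ~{slices:.1f} frame slices per refresh")
```

This is why capping the frame rate or enabling V-sync hides the effect, and why it's so visible here: adaptive sync over VGA, as the comment notes, isn't an option.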
Hey Dawid, I love the video! When companies send over stupidly expensive products like this, do you have to send them back, ir do you get to keep them?
I think he gets to keep them, as the Corsair power supply used in the video was apparently sent to him a while ago.
Stupidly expensive stuff gets sent back; that's normal practice
It generally gets sent back; when they can keep it, they say so
@@insert_username_here Corsair's probably gonna wait a while before sending more things over XD
One thing I noticed: in some games I had to set the max frame rate in the Nvidia panel's global settings to my monitor's refresh rate, which fixed the stutters. The GPU was pushing more frames than the display could show, which must have been the cause.
Since AM4's end is near, can you make a tier list of low-low, low, normal, good, medium, medium-well, high, and "so high you need to sell your body" motherboards, or any AM4-compatible device? i.e. GPUs, APUs, or just which ones you think are worth the money-to-insides ratio?
Good idea. I feel less guilty hacking the shit out of cheap last-gen gear.
@@SchoolforHackers Why would you feel guilty about it? It is your hardware now, do as you please with it.
14:28 That's the coil whine, which is the most important piece of technology used to increase wattage (turbo speed) for the CPU and other parts; it's integrated on the motherboard.
The GPU has its own coil whine integrated on the chipset; that's why you need to connect the power cable separately to the GPU rather than connecting it to the motherboard.
It may sound annoying sometimes when there is a lot of load on the GPU/CPU.
Had a similar experience with a USB game drive: in many games the FPS is very stuttery, especially at the beginning of the game. It wouldn't go away even at USB 3.2 Gen 2x2 speed, and I was using an NVMe SSD in a USB external enclosure.
After switching to a SATA SSD as the game drive, the whole thing went away.
🤔
I have no idea what the video was about but you have an entertaining voice. Could listen to you for hours.
16:07 Who needs a guitar when you have a 3090TI.
Great vid. Thanx Dawid. you are my fav uploader!
when they are sending you their best products you know you've made it.
Here's the chain I want to see.
HDMI to VGA , VGA to RCA/AV, RCA to RF modulator/75 ohm coax, 75 ohm coax to 300 ohm transformer, 300 ohm connector to black and white tv.
I wish there was a "Dawid Option" for all sorts of entertainment and information. I enjoy all of your videos, and am glad that it looks like your channel is gaining momentum. Thanks for all the chuckles.
Muting the music for the protective film peels, this guy gets it.
Love this concept, Dawid! Might I suggest taking it further and running the least demanding games you can think of on it? Let's see if you can get that framerate counter up in the thousands in gameplay. :)
Would the frame rate budge from 999 in Half-Life 2 1080p High Settings?
F.E.A.R. Let Alma loose!
Windows Solitaire 240P 2560fps
i bet minesweeper would get insane fps on that card
Clustertruck at 480p. The coil whine will make glass evaporate.
Man, these golden videos just keep rolling! Keep it up man 👍
Dawid giving us some more useful consumer advice, now that $2000 price tag on the RTX 3090 Ti makes a lot of sense
I noticed the unpopulated 12 pin connector on the board at the top right which seems like it would face out the right side of the card. I would personally like it much more if that was used in place of the one on the top.
"That power button is worth $200 at least!"
My GOD Dawid!!! DONT TELL THEM THAT! 🤣
he doesn't need to
that's already what they're charging for them
@@Shpoovy Exactly what I was gonna type. 😂
@@DawidDoesTechStuff
ruclips.net/video/BNvoM0T8tXs/видео.html
man... and to think my old ddr2 ECS board has both power and reset buttons lol.
holy shit your channel grew fast. big grats
I would love to see a 3090 TIE being put onto a 1st gen PCI-E slot or a really low lane m.2 slot with many dongles for a 16 lane adaptor.
Just slap on some rgb to this setup and it will make the gaming 42069% better. Trust me bro
I love this channel for the creativity and not just like mainstream reviewing stuff.
You go out of your way for entertainment, doing things just for the heck of it rather than a straight review.
Love your videos man
Nvidia designing their GPUs to double as bagpipes is a feature I never knew I wanted.😁
14:30 for reference🎶
You know you could run 6 DisplayPort monitors off of this card, as DisplayPort can daisy-chain from one monitor to another. Just a thought :).
If I recall, the board design for the 3090 Ti can also be used on Nvidia's next-generation GPUs; that would explain the extra power delivery spaces, given the expected power draw of Nvidia's next-generation GPUs.
DUDE dat coil whine OMG 🤣
6:22 I believe it is 20 phases and 2 phases for the GDDR6X. It has 6 blank spots, for up to 26 phases! Good lord. It might be the same PCB design as next-gen Lovelace? The 4090 supposedly draws 500-600W!
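As a rough sanity check on why a card might want that many phases, here's a back-of-the-envelope per-phase current estimate. All figures are assumed round numbers (the rumored ~600 W draw from the comment above and a typical ~1 V GPU core voltage), not measured specs:

```python
# Back-of-the-envelope VRM math -- every figure here is an assumption.
board_power_w = 600      # rumored next-gen draw mentioned in the comment
core_voltage_v = 1.0     # typical GPU core voltage under load (assumed)
phases = 20              # populated core phase count per the comment

total_current_a = board_power_w / core_voltage_v   # P = V * I  =>  I = P / V
per_phase_a = total_current_a / phases             # ideal even current sharing
print(f"~{total_current_a:.0f} A total, ~{per_phase_a:.0f} A per phase")
# -> ~600 A total, ~30 A per phase
```

Spreading the same load across more phases lowers the current each inductor has to carry, which is one reason high-power boards leave room for extra phase footprints.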
You should have gone with a DP to VGA converter instead, since that gives you the most "native" VGA signal
6:10 try to think of it from an industrial perspective and it makes total sense. How many phases do we need? Don't know... should we make a wild guess, realize we need more, and go through another design loop, or should we just add footprints for more than we think we need, in case we discover during testing that we have to add more?
producing something like this is always a balance of costs. a lot of the time you will see unpopulated footprints on a device like this.
8:29, am I seeing that correctly, that he is only using one power connector for the CPU while it's the most power-hungry CPU to ever exist? :O
I mean it is an eight pin cpu connector
@@jordangames2560 but there are two of them, and I think plugging both in isn't a bad idea with a 12900K
Could the GPU's pump be making that sound from being at the highest point of the AIO loop?? Jump to 8:50 to see what I am talking about. 14:32 for the sound.
Fact: NVidia actually call it a tie, not TI like we all do
It's T I 💦💦
Only in recent years. Throughout the older gens, pre-30-series, it's been "T-I".
Is it really a tie with the 6900XT? 🤔
Yes, deriving from the word Titanium! Sounds stupid imo
Thai
Most entertaining tech CC out there. Always caught off guard when a joke is dropped in.
I love how you built this ultra expensive system on top of the boxes. Classic Dawid test bench setup. Awesome.
6:17 so was there a higher end version of this card?
Would go ballistic for a GPU like this.
Awesome vids, Dawid, thanks for testing all this stuff and making us laugh.
Holy crap, call the fire department near you!! Some disgruntled employee at Gigabyte didn't put all the VRMs on your board and wants to make another combustible PC component. Maybe there are supposed to be VRMs in those empty spots, but Gigabyte's CEO read a Stephen King book and wants to get his GPUs into the upcoming "Firestarter" movie advertising before it releases on May 13. O_o I hope they do another MONSTER GPU that uses all those VRMs; they could call it the XXXtreme 3090Ti++ HAHAHAHA. You did an incredible critique of that beast GPU, and I wish more youtubers would bring some fun into their reviews like you do.
I think it would be great if you could release some info on the thickness of the thermal pads, and even do a thermal pad replacement video. You could measure the difference in VRAM temperatures between high-end thermal pads and the stock ones. You could even apply some quality thermal paste to the GPU while you're at it. As seen in the video, it doesn't seem too tricky to remove the shrouds and install new pads/paste.
Try to overclock the display with the CRU app; certain screens support that. It can be nice to see the difference between an overclocked screen and a real high-quality screen
where is the performance information on the left from? is that like an app running on the side?
Is this card the same as Gigabyte's other 3090 Ti? I'm just planning for the future in case I decide to modify this to fit a custom loop
lol is that rock background music from Rebel Galaxy?
Seeing those frametimes and the GPU usage/performance in BFV and Cyberpunk makes me think something is wrong with the system, not sure what. A 3090 Ti should definitely run better (and be at 100% usage, obviously), and my 1050 Ti feels smoother than that in BFV. Something is off; maybe it's the external drive you are using?
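The "feels smoother" point can be made concrete: average FPS can look fine while individual frame spikes cause visible stutter. A minimal sketch with made-up frametime numbers (a real log would come from an OSD/capture tool):

```python
# Hypothetical frametime log in milliseconds; the 40/35 ms spikes are the stutter.
frametimes_ms = [8, 8, 9, 8, 40, 8, 9, 8, 35, 8]

avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)  # frames per second overall
worst_ms = max(frametimes_ms)                             # slowest single frame
print(f"avg {avg_fps:.0f} fps, worst frame {worst_ms} ms")
# -> avg 71 fps, worst frame 40 ms
```

A 71 fps average looks healthy, but a 40 ms frame in the middle of 8 ms frames is exactly the kind of spike that makes a fast card feel rougher than a slower one with consistent pacing.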
Great job screwing in the VGA connector at 0:26. "It's proper procedure" -Anthony
9:12 Rounded corners.
I think his Dell monitor might be newer than my (Dell 1909W) monitor.
@Dawid you say that heatsink for the MOSFETs is crazy; did you forget the P4/socket 775 era, with proper finned all-copper heatsinks and heatpipes with fans?
11:58 if the 3090 Ti can be at near 100% usage while the CPU is at around 40%, that means the GPU is the bottleneck. you heard it here first: the 3090 Ti is slow. maybe the 4090 will be better
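Joking aside, the usage-based reasoning in that comment is the standard bottleneck heuristic. A toy sketch (the 95% threshold is an arbitrary assumption, not a fixed rule):

```python
# Toy bottleneck classifier -- the threshold value is an arbitrary assumption.
def bottleneck(gpu_usage: float, cpu_usage: float, threshold: float = 95) -> str:
    """Whichever component is pegged near 100% while the other idles is the limit."""
    if gpu_usage >= threshold and cpu_usage < threshold:
        return "GPU-bound"
    if cpu_usage >= threshold and gpu_usage < threshold:
        return "CPU-bound"
    return "mixed / frame-capped"

print(bottleneck(99, 40))  # the scenario in the comment -> GPU-bound
```

If neither component is pegged, the limit is usually elsewhere: a frame cap, vsync, or (as other comments here suspect) storage or driver stalls.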
I love how when he enters the menus it's at damn near 1000fps
doubt you will see this,
but what about maxing out the FOV in the games tested? (like BFV)
I don't remember ever seeing anyone playing at a console FOV on PC.
I think it'd give a more realistic FPS for the average player when testing products
Hey Dawid. I actually use my main PC on a CRT monitor on occasion for some games that support 4:3/5:4 resolutions. Games like Cyberpunk 2077 with RT on look PRETTY GREAT at 2048x1536 resolution with that CRT GLOW and SMOOOOOOOOOOTHNES. Definitely recommend! If you can find a CRT that works well in your area, and supports at least 1280x1024 and over 60Hz. (85Hz is a sweet spot for what you can reasonably find in a surprising number of areas) I'd recommend it if you had the chance to give it a go. I'd offer my Samsung SyncMaster 1100DF but I feel the Seattle, WA USA area is A BIT FAR for that.
This is probably the most DANK tech channel on RUclips.
13:29 That bit cracked me up! you crazy man lol
Great video, tonnes of Dawid brand humour. That 3090ti was sexy!
I would be very upset with that zip tied fan wire if I spent that much on a card, why is that hanging out instead of being run under the nylon sleeve...?
14:30
Please, I need an explanation for this.
14:30 Haha The GPU was actually singing Doomly... 🤣 😂....
Hello, I have a question, I'm planning to transfer an older motherboard to an RGB PC case, with a new 850 watt power supply.
my question is: my motherboard has a 4-pin CPU connection; will it work with a modern power supply, or have I shot myself in the foot with this?
Gigabyte GP-P850GM PC Power Supply 850 W ATX 80 Plus Gold
GAME HERO® Geance Gaming PC Case Tempered Glass Side Panel RGB - 2 x USB 3.0 - 4 x 120mm RGB Case Fans
4:16 As an actual IRL oil baron, I seriously appreciate this comment.
I have seen the haze effect in one of your older videos, but I have a dual-monitor setup with mismatched VGA monitors, both over adapters, and I have never seen that effect.
When I was a kid, one of my brothers had a TI SR-56 calculator. It also made different noises when pressing keys and performing calculations.
I have the same PSU but 850W. I got it for the Asus TUF RTX 3080. I am a bit worried because after I bought it, I read some bad reviews about it on Amazon. Love your video :)
why does a motherboard need a fancy backplate?
adds unnecessary height.. imagine if you’ve got a tall cooler and the extra backplate clearance just makes the side panel not be able to close
Wait. I bought an RTX 3060 for my VGA monitor and I wanna use the adapter. Will it have a big impact on performance?
that sound was air trapped in the pump of the GPU. the rad was lower than the pump/block, so air was trapped, and we were hearing the pump rev up to account for more activity.
dude, my graphics card makes that exact noise when playing games; I believed it was defective or something. I've been playing with vsync or a frame limiter since I got it, trying not to force it to make those weird sounds. Now that I saw this video, I guess it is something normal at high utilization.
Some coil whine is pretty normal, especially at really high frame rates. Don’t worry though, unless it’s really loud, it’s most likely nothing to worry about. Some cards just have it worse than others.
Can you do a CLX review? Not many people have done that and I want to see how reliable the company is
What’s the specific brand and model name of the AIO with the screen that went on the cpu?
I'll give whoever answers 2 USD
2010: This is how you get a 10-15 fps boost
2022: This is how you get a couple hundred fps boost
@Dawid Does Tech Stuff The 3090 Ti PCB will be used for the RTX 4000 series. That is why you have blank spaces.
Can you change the fans? And if so where do you connect the new fans?
I have a question:
What CPU AIO is he using?
Edit:
It looks cool and I want one.
Dawid, what software are you using to see the CPU and RAM usage?