GUIDE: Z840 Fitment of ZOTAC RTX 3090 Ti - Unleashing the Extreme Power in a Workstation

  • Published: 27 Oct 2024

Comments • 102

  • @MohammedBadran-dk9qn
    @MohammedBadran-dk9qn 9 months ago +2

    I have a Z840 workstation with the 1125-watt power supply. Can I fit an AORUS RTX 2080 Ti Xtreme WaterForce 11G and an EK-CoolStream RAD XT 240?

    • @racerrrz
      @racerrrz  9 months ago

      Yes, the Z840 can power the RTX 2080 Ti without issue. Mounting the EK-CoolStream radiator will likely require a custom mounting solution. Drilling some holes in the case may be required - but it might fit where the rear exhaust fans are, and you could likely mount the rear exhaust fans to the radiator for cooling. The only downside is that the radiator will run warm, since the rear case fans are the "hot exhaust" for the case. The other option would be mounting the radiator to the HDD bay and then placing fans on the outside of the Z840's front panel. Messy and really inefficient cooling, but that would let you use the radiator. The best option remains a traditional GPU with a built-in fan module.

  • @wesleyx1
    @wesleyx1 a year ago +1

    I have an HP Z820 with dual E5-2690 v2 CPUs and the 1125-watt power supply, and a Founders Edition RTX 3090 on the way. I'm glad to see you were able to get the 3090 Ti running in your Z840. I was concerned about how it would work. I have a feeling I won't be able to get the internal PCIe cover on with the height of the card plus the power connector. Hopefully I'll be able to get the side panel back on. I guess I'll know soon.
    I'm getting into AI art generation and local Llama. Both are extremely GPU and VRAM intensive.

    • @racerrrz
      @racerrrz  a year ago

      Great combo there. The Z820 with the E5-2690 v2 CPUs makes for a solid platform. The RTX 3090 should work well in the system too, but yes, that side panel will likely be an issue. I understand there are converters that can switch the 12-pin connector to a 90° connector, which may resolve the Z820/Z840 cover issue. I haven't looked into it since I am also using riser cables to further expand my PCIe connectivity (10GbE NIC plugged in under the GPU). It could be worth checking out these or similar items: store.cablemod.com/12vhpwr-angled-adapter/
      AI is a hot topic and I am sure it will open new possibilities. The Z820 with the RTX 3090 should handle the task well. Just make sure the applications support multi-core - if they operate on only one core you will not get the best performance.

    • @DtheDry
      @DtheDry 8 months ago

      How did you find the 3090 in the Z820? I'm wondering about going down this path for my Z840. Mainly looking to run local Llama.

    • @wesleyx1
      @wesleyx1 8 months ago

      @@DtheDry I didn't have any problems installing the 3090 in the Z820. You need to get a special high-power cable for these machines. I've been using it for local Llama and for generating images with Stable Diffusion. I needed to take off the interior cover because the card is too tall for the enclosure. I run it with the back open and I haven't had any overheating or overload problems.

  • @abeelsiddiqui1989
    @abeelsiddiqui1989 4 months ago +1

    Hey, love your videos, but I have one concern: those 6-pin to 8-pin power adapters are infamous for catching fire, which is why I am a bit hesitant to use them. How has your experience been? Did you use any specific adapters with lower-gauge (thicker) wires that don't catch fire? In my particular case I would like to power a 3070 Ti in a Z440, which requires dual 8-pin connectors. Please guide me.

    • @racerrrz
      @racerrrz  4 months ago

      Thank you, I am glad you have found the videos helpful. Be sure to check out my Z440 Case Swap video if you haven't already - lots of ideas for the Z440 in there.
      I have seen the same claims about those cables. One concern I would have is incorrect wiring from the get-go (i.e. incorrect pin ordering in the plug). In terms of wire gauge - I would like to think keeping track of the power draw numbers will help to prevent any issues. What can happen quite easily is excessive power draw through a single adapter cable. The ATX standard is 75W for a 6-pin and 150W for an 8-pin, whereas HP's 6-pin output is rated to 12.2V @ 18A (219.6W). If you use a 6-pin adapter cable and you are pushing 219W through it, I can see why they catch fire (keep in mind the adapter cables are rated to ATX power limits - not HP's).
      With a little care you can use them correctly and get your system running right. I have listed the adapter cables that I bought in the video description. I converted the 3x 6-pin power cables to 3x 8-pin cables, and the Zotac RTX 3090 Ti can draw ~450W up to as much as 650W (transient spikes - though one benefit is that the GPU is bottlenecked in these older systems, so you may never see such high power draws). I have not had any issues in any of my systems and I have used a lot of these adapters. As long as you know the limits of the cable you shouldn't overload them. As for the RTX 3070 Ti, you should be able to power that without too much issue.
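The mismatch described above can be put into a quick sanity check. This is a sketch only: the ratings below are assumptions taken from this thread (ATX's 75W/150W connector limits and HP's 12.2V @ 18A 6-pin figure), not official documentation.

```python
# Hypothetical power-budget check for PCIe power adapter cables.
# Ratings are assumptions from the discussion above:
#   - ATX spec: 6-pin = 75 W, 8-pin = 150 W
#   - HP Z840 PSU: each 6-pin output = 12.2 V * 18 A = 219.6 W
ATX_LIMIT_W = {"6-pin": 75, "8-pin": 150}
HP_6PIN_W = 12.2 * 18  # 219.6 W available on the PSU side

def adapter_is_safe(load_w: float, adapter_rating_w: float) -> bool:
    """True if the expected draw stays within the adapter's rating."""
    return load_w <= adapter_rating_w

# An ATX-rated 6-pin adapter carrying the full HP 6-pin output is overloaded:
print(adapter_is_safe(HP_6PIN_W, ATX_LIMIT_W["6-pin"]))  # False
# The same PSU output feeding an 8-pin-rated leg at ~110 W is fine:
print(adapter_is_safe(110, ATX_LIMIT_W["8-pin"]))  # True
```

The point of the sketch is that the failure mode isn't the PSU side - it's pushing HP-level wattage through an adapter built to ATX assumptions.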

    • @abeelsiddiqui1989
      @abeelsiddiqui1989 4 months ago

      @@racerrrz the problem here is that the RTX 3070 Ti takes dual 8-pin. If I were to use the dual 6-pin to 8-pin adapter, I would be out of PCIe power cables, but I hope it's like you said and the card doesn't draw full power in the Z440

  • @gate9595
    @gate9595 a year ago +1

    Edit: After reading a few comments, I see you have actually answered my questions extensively, so thanks!
    What about the Z4 G4? Still PCIe 3.0, but what kind of performance upgrade is to be expected over a Z840 with the same configuration as this? I also assume the 4000-series cards would be heavily bottlenecked?
    This video was extremely helpful, and I'm going through all your other Z840 videos now. I plan on getting one of these, and was wondering what kind of frame rate you get while gaming, and how it compares to a modern CPU (Intel 10th-13th gen) with the same GPU (maybe you already covered this in your other videos - I'll find out).
    Also, have you had the chance to test RTX 4000-series cards? Since the Z840 is limited to PCIe Gen 3 and has a much older CPU, how much performance are you giving up when using a 4080 or 4090? What potential issues could be encountered with the 4090 since it has 16 pins?

    • @racerrrz
      @racerrrz  a year ago +1

      Hi, I am glad to hear you found the video useful. Yes, it is jam-packed with lots of info! The Z840 will see a substantial bottleneck with the newer RTX 4090 GPUs - at an estimate, between 25-40% (which is a lot!). The RTX 3090 Ti would have a ~10-30% bottleneck. These are estimates and the actual number will vary depending on the final configuration of the machine. If you are going to go down the Z840 route you should aim to get the E5-2697 V4 CPUs (best value at the moment). DDR4 server RAM (2400MHz) is becoming more affordable now too. For the cost of the workstation they are still great value. For bottleneck calculations you could try: pc-builds.com/bottleneck-calculator/result/0BD1ci/1/general-tasks/3840x2160/ (link may break over time)
      In terms of the RTX 4090, it is not technically feasible due to requiring 4x 8-pins to drive the GPU. The Z840 only has 3x 6-pins (12V @ 18A each), netting a theoretical max of 648W, and living at the max isn't a great idea. I am not aware of any power adapters that would allow 4x 8-pins to be reached without exceeding the power draw on a single 6-pin (216W) of the Z840's PSU. Thus, a new PSU would be required. On that note, I have found a way to replace the Z840's PSU with an aftermarket unit (a 1600W unit would likely suffice). So the RTX 4090 is possible, but it requires a bit of a Frankenstein machine. For now I am quite satisfied with the RTX 3090 Ti - it handles everything I throw at it.
      The ideal candidate now is the HP Z8 G4 (the Z840's successor), but hardware prices are still steep on them due to being newer. They also still only have PCIe 3.0. The HP Z8 G5 has PCIe 4.0, but check Linus's video for that machine's cost/performance (ouch and wow). The HP Z4 G4 would be a cheaper path if you can live with only one Xeon in your machine. The bottleneck will not go away any time soon though.
      In terms of FPS, I'll refer you to my secret Cyberpunk gameplay video (I had a hidden QR code in a video linking to it - but not many people found it 😂). ruclips.net/video/-4vHZ59Qllc/видео.htmlsi=dRP4Ltmoihf_DyZh
      The Z840 was recording in 4K in the background with OBS, so the FPS was lower than you would obtain in-game.
      I also compared the RTX 3080 and RTX 3090 Ti in Forza here (a cosy 4K 60FPS in Forza Horizon; the screen capture is 60FPS):
      ruclips.net/video/36g5HI2klA0/видео.htmlsi=zTgj6wotynnU6VHr

    • @gate9595
      @gate9595 a year ago +1

      ​@@racerrrz Thank you so very much for this thorough explanation. I actually just saw Linus' review of the Z8 G5 yesterday; easily a brand new German car right there 😂.
      I initially wanted to go with either the Precision 7920 or the Z8 G4. I'm more inclined towards the latter, but I'm gonna wait a little longer for the price to go down. I was very close to dropping $3,000 on a very nicely spec'd one this week, but I believe if I'm just a little bit patient, I can find what I really WANT (not what I need lol - 1.5TB RAM, 2x Xeon Gold 6230R for running multiple VMs, video editing etc.) by Q1 2024 at a very nice price. Or maybe, just maybe, a lower-end Z8 G5 that I can upgrade over time.
      Besides supporting more and faster memory, the 2nd-gen Scalable isn't that much faster than the first gen from what I've seen so far. I haven't been keeping up with the tech world the way I should have in the past few years, but everything points to "I should wait at least until next Q1 to make a long-term decision".
      Thanks again mate! I was around your 670th subscriber and I'll definitely let you know what system I get by the time you reach 2M subscribers, hopefully before the new year ✌😌

    • @racerrrz
      @racerrrz  a year ago +1

      @@gate9595
      Spot on there - HP Z8 G5 vs house deposit vs new luxury car. But only one of those can do VMs and video editing 😂.
      Actually, the Lenovo ThinkStation P620 Tower Workstation could be a solid alternative too. Their prices are lower than a Z8 G4/G5, but for the price point they offer a lot (e.g. Ryzen Threadripper PRO, DDR4 3200MHz ECC, 1000W PSU).
      Are you sure you want 1.5TB of RAM? I am only on 256GB and the Z840 workstation has completely replaced the need for a room heater in winter (aircon is needed in summer…lol). The longer you wait the better the deal will be, but at the cost of not having the system on hand to get work done faster now!
      The Z840 is aging a bit but for the price point they are still really good value. They can't game at 4K 120FPS, but with up to 44 cores / 88 threads (E5-2699 V4) and 2TB of DDR4 (LRDIMM; 2400MHz) they can get any task done, albeit slightly slower than the Z8 G4/G5.
      The only other catch is hardware pricing. I have considered updating to a Z8 G4, but the cost to build one that matches my current Z840's specification is 4x the value of my Z840 in its current state. It might be better to save for that house deposit / new car after all lol.
      Thank you for the sub! I'll have to fast-track some more Z840 upgrade videos, but right now the machine is working as it should and I don't have many bottlenecks I can address. CPU prices have come down, so there is that option.
      That sounds like a challenge, but who will win - the race to your Z8 G5, or to my 2M subs by Q1 2024? I know which one I would expect to see occur first (I'd be happy with 1k by then lol). Let us know how that buying process works out, but definitely scope out the Lenovo ThinkStations too - they are great value machines.

    • @gate9595
      @gate9595 a year ago +1

      ​@@racerrrz 1.5TB solely for its ridiculousness, as I plan on making YouTube videos of the system as well. Nothing too technical at first, but they'll still be educational. If I see a YouTube thumbnail that says "1TB of RAM", I don't care what the video is about, I'm clicking immediately; especially when it's used as a personal daily driver 😂
      -
      I'm indeed concerned about both the heat and the fan noise that could arise from that much hardware (2 Xeons, 1.5TB RAM, a 3090, 4 or 5 HDDs), and I truly hope it won't be too annoying. We shall see!
      I'm not into AMD systems yet. AMD doesn't get the workstation systems it deserves from the big three (Dell, Lenovo, HP). Beyond their raw CPU power, AMD workstations always lack features compared to their Intel counterparts - Precision 7960 vs 7965 is an example. Correct me if I'm wrong, but I don't think the big three ever attempted a dual-CPU AMD system. Perhaps that has something to do with their CPU design, but I don't believe the big three show the same love to AMD as they do to Intel. It makes a lot of sense from a business perspective, as Intel has had a hold on the majority of businesses for decades - not just corporate America, but across the world. It's like Amazon, eBay, Newegg etc. - Amazon started first, then eBay followed.
      Anyways, I digress lol. AMD has been in the game for decades, but its fame is fairly recent, and I believe they are gonna keep surprising us.

    • @racerrrz
      @racerrrz  11 months ago

      @@gate9595 I am slow to follow up here, sorry! I picked up a Z8 G4 and it has seen some limelight in my most recent HDD expansion video for the Z840. Thus far I am quite impressed with the Z8 G4. Its overall power draw appears to be lower than my Z840's (with nearly the same hardware fitted). I can't afford that thumbnail (1TB of RAM, ouch lol), but I will eventually install 256GB of 2933MHz ECC Reg. in the Z8 G4 (I tried to install it but the Z8 refused to boot - I presume I need Gen 2 Scalable Xeons for the faster RAM).
      The AMD workstations are quite powerful from what I have seen, but cost is a factor. If you could obtain a Precision 7960 with an Intel Xeon W processor, that would be the way to go in terms of performance. The AMD offerings are quite powerful in terms of cores/threads for the $$$, and for more general use they would offer solid performance. Something like the Precision 7875 Tower would give a solid experience, but at a cost. For the time being I'll see what I can get out of the Z8 G4. The Gen 1 Scalable Xeon processors are relatively affordable right now, which makes these systems a great upgrade over the older X99 workstations. The ThinkStations are generally priced quite well. I haven't had the privilege of owning a Precision or ThinkStation just yet, but they really look like well-built workstations.

  • @craigreich8046
    @craigreich8046 10 months ago +1

    I have a Z840 with two 2.60GHz CPUs, 24 cores, right now. What processors would be a good upgrade?

    • @racerrrz
      @racerrrz  10 months ago

      There are several options for the Z840, and most of them are quite affordable at the moment.
      Do you prefer more cores or more clock speed?
      The cheapest, top-performing CPU would be the E5-2696 V4.
      The best clock speed and core-count combination is the E5-2697A V4 (I just installed one of these in my Z440 NAS build - great CPU).
      A quick comparison between some of the top models:
      www.cpubenchmark.net/compare/2750vs2783vs2814vs2753vs2779/Intel-Xeon-E5-2696-v4-vs-Intel-Xeon-E5-2697-v4-vs-Intel-Xeon-E5-2697A-v4-vs-Intel-Xeon-E5-2699-v4-vs-Intel-Xeon-E5-2680-v4
      Affiliate links for your reference (I would earn a small commission if you buy through these links - I bought the E5-2697A from Ebay for $63 USD a few months back):
      E5-2680 V4: s.click.aliexpress.com/e/_DmQkIkl
      E5-2697A V4: s.click.aliexpress.com/e/_DllKF8t
      E5-2697 V4: s.click.aliexpress.com/e/_Dmzs0Lj
      E5-2696 V4: s.click.aliexpress.com/e/_Dm3gaGz
      E5-2699 V4: s.click.aliexpress.com/e/_DFakgXx

  • @TheAngeloMichael
    @TheAngeloMichael 6 months ago +1

    I have watched all your videos and learned a lot. I've had a Z840 for over 3 years with the large PSU, 256GB RAM, 24 cores. Mainly I use it as a desktop and for staking tokens, and I never had the need to upgrade the GPU. Recently they added AI to my edge node and I am trying to use what I have to help performance. I have a few 1060P 6GB cards from mining and a couple of Tesla P40 24GB cards. I want to use a 1060P with a Tesla - no one has really done it before; a lot of people have tried the K80 and M40, but all have had problems with the 6-pin to 8-pin connectors. I have the cooling worked out. From what I see and hear in your video, I should be able to use a dual 6-pin to an 8-pin for the P40 (240 watts) and the other 6-pin to 8-pin to run the 1060P (71 watts or so). Should that work? And is there a particular 6-pin to 8-pin to use? Some people have had issues with this. Any help would be appreciated.

    • @racerrrz
      @racerrrz  6 months ago

      Hi. I am glad you have found my videos of use. I have a decent Z840 video nearly ready to go out but I decided to film a bit more before releasing it - stay tuned for that!
      You should absolutely be able to run a couple of GPUs without issue in the Z840. The key is keeping track of the numbers. Each 6-pin can supply a max of 219W (12.2V @ 18A), while normal ATX 8-pins supply only 150W. To cover the Tesla P40 you would need to join two 6-pins into one 8-pin (netting ~439W max through that connector). The issue here is that although the Z840's PSU can handle that, I doubt the cheap 6-pin to 8-pin connectors will hold up with much more than ~160-180W before melting down. I suspect the people that have had issues drew too much power through a single 6-pin (rated to 75W) / 8-pin (rated to 150W) cable.
      I have had no issues with the adapter cables that I have used (I linked the exact items I bought in the video description and here: amzn.to/3U6Oo0C - affiliate links). I prefer the 6-pin to 8-pin adapter over the dual 6-pin, or the single 6-pin to dual 8-pin versions. The Tesla P40's ~250W TDP is aggressive for a single 8-pin connector. I would like to think the 6-pin to 8-pin adapters I am using would survive that. Side note - the 6-pin to 8-pin GPU power adapter cables have survived the RTX 3090 Ti without issue so far, and though I have moved over to the HP Z8 G4, neither workstation has complained with them fitted.
      For peace of mind you may want to consider getting an infrared heat gun so that you can monitor the wire and plug temperatures. I have been tempted to get one off Amazon (it's been in my cart for like 6 months haha - affiliate link: amzn.to/3U4okD3 ) to use in videos but I never got that far. I also liked the Thermal Grizzly WireView (affiliate link: amzn.to/49ow9YO - this is an 8-pin to 8-pin with a digital power read-out). Thermal Grizzly offer a range of GPU power adapters that might be worth considering (I was looking at getting a 3x 8-pin to 1x 12VHPWR with a readout to monitor GPU power draw). They should be of high quality, and with the digital power readout you'll be able to make sure the numbers aren't ever too crazy. The joys of cash flow not always being in our favour - let's just blame the global economy being in recession.
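The split described in that reply can be checked with simple arithmetic. A sketch, using the figures quoted in this thread (HP 6-pin ≈ 219W, P40 ≈ 250W TDP, 1060 ≈ 71W) as assumptions:

```python
# Sketch: splitting a GPU's board power across the Z840's 6-pin outputs.
# All wattages below are assumptions taken from the discussion above.
HP_6PIN_W = 219  # max per HP 6-pin output (12.2 V @ 18 A)

def per_cable_load(gpu_watts: float, cables: int) -> float:
    """Even split of a GPU's board power across its supply cables."""
    return gpu_watts / cables

# Tesla P40 fed by a dual 6-pin -> single 8-pin adapter: ~125 W per 6-pin.
p40_leg = per_cable_load(250, 2)
print(p40_leg, p40_leg < HP_6PIN_W)       # 125.0 True
# GTX 1060 on the remaining single 6-pin: well under the limit.
print(per_cable_load(71, 1) < HP_6PIN_W)  # True
```

So both legs stay well inside the PSU-side ratings; the remaining risk is the quality of the adapter itself, as noted above.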

    • @TheAngeloMichael
      @TheAngeloMichael 6 months ago +1

      @@racerrrz I want to thank you for your detailed, thorough explanation and links. With your explanation and experiences, I have confidence that I can put it together and it will work. The machine is an absolute beast even without the upgrade. It functions incredibly well as a homelab - for virtualization, staking tasks, a Plex server, OBS and the normal household functions - and it runs numerous tasks simultaneously. It can even run a number of Llama models locally from openchat on its memory and CPUs; although it is slow on all but the most streamlined models, that's quite an accomplishment for an older workstation without powerful GPUs. I am updating storage capacity, adding a NAS and working on a decentralized cloud-based node. It has plenty of life left in it. I am also working on running inexpensive dual tensor chips from Coral AI that can beef up any security system - they should work well, have low power consumption and are an awesome low-cost upgrade. They use an M.2 E-key slot, but there are PCIe cards that can stack from 4 to 16 chipsets and add a lot of processing power with low overhead. I picked up a Z820 to experiment with Proxmox after watching a few of your videos, and was shocked at how well it performs in that role using a more streamlined operating system. I have been eyeing the newer-generation Z8 and some of the newer chipsets, but the Z840 has a useful future - it's a reliable household appliance that can still be upgraded. Your videos have been quite helpful; I have shared your information and links, with my own opinions and experience, in Reddit threads, and it has helped several people and small companies in Egypt and India that use these machines for video production. Looking forward to your next video.

    • @racerrrz
      @racerrrz  6 months ago

      @@TheAngeloMichael Thank you for your feedback and for sharing the content to help others. Your homelab sounds very capable. It just makes sense to combine all of the modern needs into one reliable system. Getting some tensor chips in there will give you some room to push the workload further. I can imagine an H100 finding its home in a Z840 lol.
      The HP Z820 is still an absolute trooper for homelab work. Upgrades are cheap, and the system is relatively power efficient for the workloads it can handle. I think the biggest bonus is their low noise compared with a dedicated server platform of the same era.
      Right now the Z840 is the best bang for buck - CPU and RAM options are really cheap now. The Z840 barebones machines have also lost a lot of value on the used market - a year or two ago they were worth double what they are listed for now! I have a detailed Z840 overview video planned for release as the next video.
      In terms of the Z8 G4 (I am so slow with the videos! I have a few in the works for it!), for a main system they are more than capable. With the right CPU and RAM setup it would outperform many mid-range gaming systems. The only catch is cost - the Z8 G4s are still really expensive. The Gen 1 Scalable Xeon and 2666MHz ECC memory prices have come down quite a bit, however - and generally the Gen 1 Scalables are priced better than the Broadwell Xeons.
      I am keeping my eyes out for a cheap set of Gen 2 Scalable Xeons for mine. I bought 2933MHz ECC Registered RAM modules in the hope that they would work in my system, but I got caught out - the Z8 G4 did not recognize them. Hopefully with the new gen of CPUs I'll be able to make use of them. But for now my Gold 6142s are handling everything I throw at them.

  • @Grumpy_Bubbles
    @Grumpy_Bubbles a year ago +1

    Love the vids. Oddly enough I've got a VERY similar build. Bit of an all-around machine...
    Z840
    Dual E5-2690 V4s, 128GB RAM, 4070 Ti, Asus Hyper M.2 with 4x Crucial P3 4TB, 4x 16TB Seagate Exos SAS, 3x 16TB Seagate Exos SATA.🏎

    • @racerrrz
      @racerrrz  a year ago +2

      Hi Ryan, that's not a Z840 to be overlooked! The E5-2690 V4s are a powerhouse, 128GB RAM handles everything you can throw at it, and your storage collection looks solid. Exos SATA/SAS drives give solid speeds for their capacities. How do you find the Asus Hyper M.2? Any issues running 4x 4TB NVMes? Are you able to boot from your adapter?
      I had some issues booting from my AORUS Gen 4 adapter - with 4x NVMes I could not get it to boot into Win 10 from the adapter, but with 3 or fewer it booted fine...
      The RTX 4070 Ti should give solid performance. How has that stacked up for you? I settled on the RTX 3090 Ti for the extra VRAM, which has helped with video editing.
      I am sure your workstation will handle anything you throw at it.

    • @Grumpy_Bubbles
      @Grumpy_Bubbles a year ago +1

      @RACERRRZ no, I wasn't able to boot from them, but I'm sure there is a workaround. Right now I've got an SSD lying on the floor of the case. Lol. Looking into adding another single PCIe-to-NVMe adapter to see if I can boot that route. Also, Linus did a build recently on a Z420 using a USB Clover bootloader to boot from the NVMe storage, since the BIOS was somehow dead set on not allowing it. Where there is a will there is a way!

    • @racerrrz
      @racerrrz  a year ago +1

      @@Grumpy_Bubbles I understand the SSD floating around - but I found a cheap fix for that (future video on the topic - Z840 upgrades! Olmaster 5.25" bay to 4x 2.5" SSD with hot-swap and a built-in fan; Olmaster HE-2006).
      Yes, Clover is a solid way around this issue, but when I tested it on a Z420 it wouldn't work! I was hoping to use an internal USB plugged into a USB 3.0 20-pin socket-to-USB adapter (Z420 case swap) but that never panned out when Clover didn't detect the Win 10 ISO. I didn't spend long enough on it, but SSDs are quick and hence I booted from an SSD lol. Linus is just flexing booting from NVMes on his Z420 haha.

    • @Grumpy_Bubbles
      @Grumpy_Bubbles a year ago

      @RACERRRZ yeah, I saw the 5.25" to 4x 2.5" adapter. But I'm going to go with the 5.25" to 3x 3.5" hot-swappable adapter. Currently I've got 2 LG Blu-ray optical drives for ripping. Once I get my library finished I'll make the swap for more storage.

  • @Sam-tb9xu
    @Sam-tb9xu a year ago +1

    The z840 only has PCIe 3.0 slots. How much does that slow down a card like the PCIe 4.0 capable 3080 or 3090?

    • @racerrrz
      @racerrrz  a year ago

      Hi Sam. You are correct that the PCIe 3.0 slots will bottleneck the GPU, and the RTX 3090 Ti is designed to make use of the extra bandwidth of the PCIe 4.0 interface (x16 PCIe 3.0 = 15.75GB/s max in one direction; x16 PCIe 4.0 = 31.5GB/s max in one direction). Aside from the PCIe limitation, there is also a CPU limitation due to the aged Xeon CPUs (I have the E5-2697 V3s). I use pc-builds.com/bottleneck-calculator/result/0BD1aY/1/general-tasks/3840x2160/ as an example to calculate the CPU bottleneck. In practice, the bottleneck will be ~5-30% of performance, depending on the task complexity. Despite that, this still makes for a blisteringly fast PCIe 3.0 GPU in the Z840!
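The bandwidth figures in that reply follow directly from the per-lane rates and line encoding in the PCIe specs (Gen 3: 8 GT/s with 128b/130b encoding; Gen 4: 16 GT/s, same encoding). A quick sketch of the arithmetic:

```python
# Back-of-envelope PCIe link bandwidth, per direction.
# Per-lane raw rates (GT/s) and the 128b/130b encoding overhead
# come from the PCIe 3.0/4.0 specifications.
def pcie_bw_gbs(gen: int, lanes: int = 16) -> float:
    """Usable bandwidth in GB/s, one direction, for an x<lanes> link."""
    rate_gt = {3: 8, 4: 16}[gen]
    return rate_gt * (128 / 130) * lanes / 8  # bits -> bytes

print(round(pcie_bw_gbs(3), 2))  # 15.75
print(round(pcie_bw_gbs(4), 2))  # 31.51
```

Which matches the ~15.75 vs ~31.5 GB/s numbers quoted above: Gen 4 simply doubles the per-lane rate.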

  • @AlejandroMagnoRivas
    @AlejandroMagnoRivas 4 months ago +1

    Why do some recommend a 1000W+ PSU?
    Mine is an EVGA 850W 80+ Gold G2, bought back when the 1000 series was current - can't it run a 3090 Ti?
    What are these spikes?
    Is it true that a PSU should only deliver a constant 50-60% of its rated output?

    • @racerrrz
      @racerrrz  4 months ago

      Transient spikes are very short bursts of power draw that can overwhelm lower-powered PSUs. Something like the Zotac RTX 3090 Ti can draw as much as 650W for a brief moment, and if your PSU is unable to supply that power it can cause a PSU meltdown (capacitors get fried - leading to smoke and a fire hazard). It might seem odd given that the GPU is still rated for, say, a 450W TDP, but aiming for some power overhead is a safeguard to make sure you don't damage your system hardware.
      If you can, aim for a 30% minimum overhead. So if your GPU is rated for 450W, aim for a PSU that can deliver say 650W to the GPU (it's not easy to find, because 8-pin power cables are only rated for 150W each). My HP Z840 workstation's PSU is a 1450W Platinum unit that supplies 219.6W per 6-pin cable - a very over-powered PSU in these systems.
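That 30% overhead rule of thumb is easy to turn into a sizing formula. A sketch only - the `rest_of_system_w` figure below is a hypothetical placeholder for CPUs, drives and fans, not a measured value:

```python
# Sketch of the ~30% headroom rule of thumb for PSU sizing.
def min_psu_for_gpu(gpu_tdp_w: float, rest_of_system_w: float,
                    headroom: float = 0.30) -> float:
    """Suggested PSU size: total system draw plus a transient-spike margin."""
    return (gpu_tdp_w + rest_of_system_w) * (1 + headroom)

# 450 W GPU plus an assumed ~250 W for the rest of the system
# suggests roughly a 910 W PSU:
print(round(min_psu_for_gpu(450, 250)))  # 910
```

By this rough measure an 850W unit is borderline for a 3090 Ti at stock TDP, which is why 1000W+ is the common recommendation.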

  • @550F21CEMO
    @550F21CEMO a year ago +1

    Z640 here
    64GB DDR4 ECC 2133
    Xeon e5 2699V4
    Z420 Liquid cooler
    HP Turbo drive (1TB Corsair mp600 pro nvme) boot
    2x4TB hdd
    2 blu ray drives
    RTX 3090
    Fucking amazing…

    • @racerrrz
      @racerrrz  a year ago

      Nice, the Z640 is a solid performer, and that's a solid combination you have there. How is the Z420 cooler working out for you? I found the stock Z440 cooler to be more than enough to keep an E5-2697A V4 cool (side note: the Z840's stock cooler was insufficient, oddly enough).
      I am sure the HP Turbo Drive will serve you well. How are you powering the RTX 3090? I presume it's a 2x 8-pin Founders Edition? I had in mind that the Z640 only has 2x 6-pin power cables (but I have not had the pleasure of owning one). If there are 3x 8-pins you should be sorted for most GPUs, minus the newer 4x 8-pin variants.
      Also, if you have some cash for a RAM upgrade, prices have dropped a lot. I just picked up 64GB of 2400MHz Reg. ECC memory for my Z440 NAS build.

    • @550F21CEMO
      @550F21CEMO a year ago +1

      @@racerrrz The current Z420 AIO keeps temps well below what you would expect from a Xeon or an i7-6950X (used and tried on both) - the 2699 V4 with the AIO tops out around 47C under max load. The 3090 is powered with 2x 6-pin to 8-pin adapters; it's the Palit Gaming Pro. The Turbo Drive has the Corsair MP600, but I'm trying to find the one you have with the little blower 🥲. Side note: to get the AIO working on the Z640, due to pin differences between the mobo and the fan/AIO combo, I had to cut the clip on the side of the fan header to fit it.

    • @racerrrz
      @racerrrz  a year ago

      @@550F21CEMO Nice, the 2x 8-pin RTX 3090s are much easier to manage in these workstations.
      The smaller PCIe adapter can be sourced on eBay for $20 USD, but I would recommend going for the Asus Hyper M.2 V2 instead. It can hold 4x NVMes in a PCIe x16 slot (with bifurcation; you can also run just 1x NVMe), plus I found it to have excellent thermals (I compared the HP Turbo Drive Quad Pro, Gigabyte AORUS Gen 4 and a JEYI quad U.2-to-M.2 PCIe adapter against the Asus Hyper M.2 V2 in a related video: ruclips.net/video/xqg0uQ93KTg/видео.html).
      Nice job with the fan header modification - I have not seen that done before, but it makes sense to be able to repurpose the coolers. And given that the E5-2699 V4 pulls some power, it means they actually work really well!

  • @Hairybarryy
    @Hairybarryy a year ago +2

    The CPUs aren't fast enough to drive these cards in the first place. Maybe with a turbo boost mod, or maybe upgrading to Broadwell CPUs. I took the 3080 out of my Z840 with 2696 v4 CPUs and swapped in a Titan X Pascal. I put the 3080 into my encoding server that has a 7940X. That CPU is able to fully utilize the GPUs.

    • @Hairybarryy
      @Hairybarryy a year ago

      Also, with the Titan X Pascal the bottom duct can still be used

    • @racerrrz
      @racerrrz  a year ago +1

      Hi Nate. You are correct that the Z840's lineup of Xeons is technically too aged to fully support the capability of the more modern GPUs like the RTX 3090 Ti. The RTX 3080's performance was meant to be nearly within the "scope of reason" for the E5-2697 V3s, which is why I initially opted for it. I checked the potential bottleneck for different configurations here: pc-builds.com/bottleneck-calculator/ or www.cpuagent.com (as a guide anyhow).
      My conclusion was that there is no point upgrading to the E5-2697 V4s (I prefer higher clock speed over more cores) since they too are aged and will not fare much better. The Z840's hardware is aging and the only logical step up is the more modern HP Z8 G4, but that option is still cost-prohibitive. With that knowledge, the RTX 3090 Ti upgrade has still been the best upgrade for my workflow, because my main use is rendering video with the odd gaming session in 4K.
      In terms of the Titan X (12GB), the limiting factor for my creative ventures has been VRAM. I went from an RTX 2070 Super to the RTX 3080 12GB, expecting that to be enough VRAM. But right now every video project is using ~19-24GB of VRAM on the RTX 3090 Ti, which leaves me on thin ice. If I push a project beyond 24GB of VRAM, DaVinci Resolve crashes during renders. A logical future-proof upgrade could be the RTX A6000 (48GB), but given the price difference the RTX 3090 Ti still remains the best value.

    • @Hairybarryy
      @Hairybarryy Год назад +1

      @@racerrrz WOW, that's crazy. Are you editing in 8K+?
      You really shouldn't need that much VRAM. What camera do you use to film? What is the bit rate of the raw footage that you are editing?
      Are you using Premiere? I've noticed Premiere running like garbage on many slow cores versus fewer fast ones. Hence why I moved exporting to a separate high-frequency machine.
      Thanks again!

    • @racerrrz
      @racerrrz  Год назад

      @@Hairybarryy It sure is!
      I tried Premiere Pro, but the added price premium and need for AfterEffects for more flashy effects made it less logical. I use DaVinci Resolve Studio 18 (one-off fee) for everything from cutting, editing, effects, transitions, AI object/face tracking, green-screen removal, and audio adjustment with Fairlight FX.
      I have not dared touch 8K because 4K 60Hz is already straining enough for the RTX 3090 Ti, and DSLRs that can capture 4K 60Hz are already a big commitment. I mostly render with QuickTime in H.265 in Ultra HD 60Hz.
      The VRAM cost is less about bitrate in my application and relates more to how I process the video footage. Using DaVinci Resolve Studio 18's wide variety of Open FX filters I enhance aspects of the footage. The most taxing effects are the Noise Reduction filter and 3D Keyer, but combine a few of these filters and the VRAM cost increases dramatically, as does the processing power to complete the task. If a video was just 4K 60Hz footage with basic transitions I might get away with 12GB of VRAM, but why settle for less when you can do so much more with the same footage?

  • @alexisdane1886
    @alexisdane1886 Год назад +1

    I have a Z640 with a 925W Gold power supply. Can I connect an MSI Gaming GeForce RTX 3090 24GB GDDR6X (power consumption ~370W, 3x 8-pin)?

    • @racerrrz
      @racerrrz  Год назад

      Hi Alexis. There is really only one problem to solve to allow you to use the MSI RTX 3090 card in the Z640 with the HP 925W PSU. The issue is the 2x 6-Pin power adapters on the PSU, which means you need to split the power inputs into 3x 8-Pins. The 925W PSU can handle the RTX 3090's ~350W TDP. HP does a great job on the engineering side because each 6-pin cable can output 12V @ 18Amps, which is 216W, so with 2x 6-Pins the absolute maximum is 432W between the two cables. The RTX 3090 is rated to ~116W per 8-Pin (so a total of 350W), and I'll add that the Founders Edition RTX 3090s have 2x 8-Pins for power, which would suit better. Another issue with joining the outputs of the 6-Pin cables is that there is a risk of damage due to bridging the power rails. You would need an adapter that is able to convert 2x 6-pin male adapters into 3x 8-pin adapters. Unfortunately I am not aware of such an adapter (it would be called a "2x 6-pin female to 3x 8-pin male PCIe power adapter"). Converting 1x 6-pin into 2x 8-pin adapters is possible, but that would mean you would need ~232W from a single 6-pin, which is outside the scope of the PSU. My recommendation would be to upgrade your PSU to a gaming unit; ~800W platinum is ideal (a larger PSU would open up more upgrade options), although expensive (ebay/amazon search for: "ATX 24Pin to 18Pin + 8pin to 12pin Adapter" to convert your HP connections to the ATX PSU standard).

  • @Sammyvanw
    @Sammyvanw Год назад +1

    Don't you need 2x6-pins for one 8-pin? Didn't know the Z840 PSU is strong enough for a 3090ti. Thanks for your video, I'll update my Z840.

    • @racerrrz
      @racerrrz  Год назад +3

      Hi Samuel. The Z840 PSU is very powerful, provided you have the 1125W PSU (e.g. P/N: 758470-001). Each 6-Pin is 12V @ 18Amps (a max of 216W), and combining all three yields a max of 648W across the 3x 6-Pins. I wouldn't aim for the maximum rating, however, but they will run a RTX 3090 Ti, or even some of the RTX 4090s, without trouble. The 4x 8-Pin RTX 4090s might be an issue, but a quality connector might solve that problem too! HP's PSUs are very well engineered. The Z840 PSU actually outputs more power from a 6-Pin than what a standard PSU can output from an 8-Pin: 216W (Z840) vs 150W (ATX 8-Pin)!
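      The cable arithmetic above can be sanity-checked with a quick sketch. The 12V @ 18A figure is HP's rating as quoted in this thread, and the 75W/150W values are the usual ATX connector conventions; the variable names are my own:

```python
# Sanity-check the Z840 PSU power-budget figures quoted above.
# HP Z840 1125W PSU: each 6-pin GPU cable is rated 12V @ 18A.
VOLTS = 12
AMPS_PER_6PIN = 18
NUM_6PIN_CABLES = 3

watts_per_cable = VOLTS * AMPS_PER_6PIN               # 216 W per HP 6-pin
total_gpu_budget = watts_per_cable * NUM_6PIN_CABLES  # 648 W across all three

# Standard ATX connector limits, for comparison:
ATX_6PIN_W = 75
ATX_8PIN_W = 150

print(f"HP 6-pin: {watts_per_cable} W (ATX 6-pin spec: {ATX_6PIN_W} W)")
print(f"Total GPU cable budget: {total_gpu_budget} W")

# An RTX 3090 Ti (~450 W board power) fits inside the 648 W budget:
assert 450 <= total_gpu_budget
```

      This is only the theoretical cable maximum; as noted above, you wouldn't want to run at that rating continuously.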

    • @adeelraza2337
      @adeelraza2337 Год назад

      @@racerrrz How does the PSU of a HP Z620 compare to this? What options do I have if I am planning to plug in a 4070 Ti or 3090 using 3x 8-pin connectors when I have just 2x 6-pins available on the Z620?

  • @b.a.g2073
    @b.a.g2073 Год назад +1

    Thanks for this video. Do you think there are any benefits to putting in a 4080 vs a 3090 Ti? If they both get bottlenecked, I figure the power draw is less, but the VRAM is much lower too. I do 3D content creation / UE5.
    Want to upgrade from a 2080.

    • @racerrrz
      @racerrrz  Год назад +1

      Hi. I am glad you found the video of help. Pitting the RTX 3090 Ti against the RTX 4080 is a tough match-up. The extra speed from the RTX 4080 might be offset more heavily by the PCIe 3.0 bottleneck in the Z840.
      If your 3D projects are managing ok on the RTX 2080, and if you are not using a lot of VRAM, then you might be ok with the RTX 4080's 16GB.
      I would still settle on the RTX 3090 Ti for added cooling performance (mine sits around 55-65'C and I turned down the fan curve - MSI Afterburner fan curve function) and extra VRAM (24GB of VRAM more than makes up for the lack of rendering speed, in my application). If anything, I find I need more VRAM. A typical video project uses 20-29 GB of VRAM (including Virtual memory), and I have to be cautious to not overload the VRAM (DaVinci still crashes if I do).
      The difference in TDP might also be offset by the bottleneck. My RTX 3090 Ti has not pulled more than ~380W, even at 100% load in DaVinci.

    • @b.a.g2073
      @b.a.g2073 Год назад +1

      @@racerrrz Mmmm, those are great thoughts. Thanks for replying. It's such a shame the 4080 only has 16GB. I would like to experiment with training custom AI models as well, but particularly using UE5, Maya, 3ds Max, Blender, EmberGen, Gaea. Realistically the VRAM is the biggest requirement.
      If the Z840 can physically take a 4080 then it would take a 4090, since they're the same size, but would that be a complete waste of GPU/cash, you reckon? Could the 1125W even safely power a 4090?
      Love your content.

    • @racerrrz
      @racerrrz  Год назад +1

      @@b.a.g2073 Thank you for the feedback. There are a few challenges with the newer cards to consider.
      1) Although power draw isn't drastically higher, most of them require 4x 8-Pin connectors, which places them out of reach for the Z840 - unless you do something drastic (case swap + 1600W PSU, or an external PSU). Their physical dimensions also changed - particularly the card height, which creates conflict with the side panel when you factor in the 12-Pin top-mounted power connector.
      2) It may seem like a downgrade to get the older GPU - but VRAM really is key - in both gaming and 3D rendering. I suspect the RTX 4080/90s will be hurt more substantially by the PCIe 3.0 limit (and keep in mind the ever-powerful Z840 is also aging - e.g. the e5-2699 V4's are getting on in age and there is nothing better for them). However, in the end the newer GPUs will still handle tasks significantly quicker than older GPU hardware.
      For the price point the Z840 is still the best value, even if there is a bottleneck. A Ryzen Threadripper loaded with RAM becomes a house deposit lol.

    • @b.a.g2073
      @b.a.g2073 Год назад

      @@racerrrz Excellent points, thank you. We can dream, hey, for a Threadripper and RTX A6000! 😆

  • @EliezYT
    @EliezYT Год назад +2

    This is a great GPU; the only thing I wonder is, will there be a bottleneck?

    • @racerrrz
      @racerrrz  Год назад +1

      Yes, unfortunately there is a decent bottleneck! The degree to which the Z840 will bottleneck the more modern GPUs depends on your machine specification. For my setup the bottleneck can be anything from 5%-25%, depending on the test. This is a good place to check: pc-builds.com/bottleneck-calculator/result/0BD1aY/1/general-tasks/3840x2160/

    • @EliezYT
      @EliezYT Год назад +1

      @@racerrrz You can use the E5-2697A v4 it would be a great choice.

    • @racerrrz
      @racerrrz  Год назад +1

      @@EliezYT That is the exact CPU I wanted to get, but they were priced through the roof when I was buying my CPUs (Video for reference: ruclips.net/video/7nXv20A2lZQ/видео.html pause at 22:59; pricing was current in April 2022!). Prices have dropped heaps since then and I think you can get them for a solid price now!

    • @EliezYT
      @EliezYT Год назад +1

      @@racerrrz Yep, the eBay listings show you can get two of them for under $200, which ain't too bad. I mean, not only will your CPU power get a HUGE increase, but your GPU can be less bottlenecked for more performance.
      Also, I am curious how you feel about the Arc A770 16GB and how far the drivers have come since launch?

  • @sreedharkrkumar3959
    @sreedharkrkumar3959 7 месяцев назад +1

    Hi. Can I add an RTX 4070 OC edition (two fans) to an HP Z840?

    • @racerrrz
      @racerrrz  7 месяцев назад

      Absolutely! The RTX 4070 Founders Edition came with 1x 12VHPWR cable which splits into 2x 8-Pin power connectors (some of the high-spec models may use 3x 8-pin connectors). The HP Z840 has 3x 6-pin power cables that you can adapt to 8-pin power cables (product links in the video description). Once you adapt them you should have a solid GPU for your Z840.
      You can calculate any potential bottlenecks here: pc-builds.com/bottleneck-calculator/result/0IT1fl/3/graphic-card-intense-tasks/3840x2160/ Generally the Z840 will bottleneck the more modern GPUs by 5-30% depending on the tasks being completed. I am happy with my RTX 3090 Ti's performance - bottleneck or not - it has not missed a beat.

  • @MohammedBadran-dk9qn
    @MohammedBadran-dk9qn 10 месяцев назад +1

    I have a Z840 workstation with the 1125W power supply. Is it possible to run an MSI RTX 3090 Ti GAMING X TRIO (power connectors: 1x 16-pin) in the Z840? I'm going to use 6-pin to 8-pin converters.

    • @racerrrz
      @racerrrz  9 месяцев назад

      Sorry, your comment got blocked by RUclips for some reason! I just traced it now and let it through. The MSI RTX 3090Ti X Trio will work out the same as my Zotac RTX 3090 Ti in the Z840. The MSI RTX 3090Ti X Trio is only 325mm so it may slot in without needing to remove the HDD fan module as I had to do with the very long Zotac GPU. 3x 6-pin to 8-pin adapters is all you need. The 1x16-pin comes supplied with 3x 8-pin connectors at the other end (Seen @ 12:24).

    • @archdebiangentoo
      @archdebiangentoo 23 дня назад +1

      @@racerrrz where do i get the 3x 6-pin to 8-pin adapter?

    • @racerrrz
      @racerrrz  20 дней назад

      @@archdebiangentoo Hi. To be clear, there is no such thing as a 3x 6-pin to 1x 8-pin adapter cable. Under the ATX spec the 6-pin cable was designed for a max output of 75W, and the 8-pin cable outputs a max of 150W. Combining 3x 6-pins into a single 8-pin would risk damaging your system through excess current. All you need is 1x 6-pin to 8-pin adapter cable per connector; if your GPU has more than one 8-pin input you can buy additional cables to match. The only exception is the new 12VHPWR GPU cable, which combines 3x 8-pins to achieve the ~450W required by most modern GPUs.
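      As a rough illustration of why one-to-one adapters are fine on the Z840 while they would be risky on an ATX-spec cable, here is a sketch using the 75W/150W ATX figures and HP's 216W rating discussed in this thread (`adapter_ok` is a made-up helper name):

```python
# Per-cable ratings discussed in this thread (watts).
ATX_6PIN = 75      # ATX-spec 6-pin limit
ATX_8PIN = 150     # ATX-spec 8-pin limit
HP_6PIN = 12 * 18  # HP Z840 6-pin rating: 216 W

def adapter_ok(source_rating_w, demand_w):
    """True if the adapter's downstream demand stays within the source cable's rating."""
    return demand_w <= source_rating_w

# A 1x 6-pin -> 1x 8-pin adapter asks the source cable for up to 150 W.
print(adapter_ok(HP_6PIN, ATX_8PIN))   # 150 W <= 216 W on the Z840 cable
print(adapter_ok(ATX_6PIN, ATX_8PIN))  # 150 W > 75 W on an ATX-spec 6-pin
```

      The same check explains the warning above: stacking several 8-pin demands onto one source cable quickly exceeds even HP's generous 216W rating.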

  • @ericwanner7966
    @ericwanner7966 Год назад +1

    I sort of want a Z840 now because I’m impressed with the design and quality (and price). It looks pro grade. So might gamble on ordering one from EBay at some point. Or build the least expensive Threadripper (or Intel equivalent) when/if they become more available. In any case, I was thinking that a used A4000 GPU might fit/work in there nicely. Around $500 and is also pro grade. Thoughts on that?

    • @racerrrz
      @racerrrz  Год назад +1

      Hi Eric. For the price the performance you obtain from the Z840 is nearly unrivaled. A more modern system with a Threadripper 3970X for example will outperform the Z840 with ease, but to obtain that system you will be spending 3-5x the amount! For the end cost the Z840 provides a powerful platform, albeit a little aged now. I have not used the A4000 but it is one of the GPUs that I would recommend. Lower power draw and smaller form factor does mean you can install a couple of them if needs be. The main attraction is the ECC memory supplied on them. I settled on RTX gaming cards over the workstation grade card due to the processing power per dollar. If a video render fails I can restart the process with the only loss being time and power. For critical work the A4000 or A6000 (budget pending!) are great options. What was your end use for the machine?

    • @ericwanner7966
      @ericwanner7966 Год назад +1

      @@racerrrz Oh, interesting. So the workstation VRAM is ECC and is therefore less likely to fail during a render? Long story, but I've been a computer enthusiast wannabe for 30-plus years. Bought a new HP desktop in 1994 or so and became disillusioned by the obsolescence curve at the time. There was no way to keep up financially. Used functional laptops and MacBooks in between. Couple years ago to present did some upgrades and builds of desktops with the adult kids. I have a tendency to buy significant headroom based on experience. I was shocked to find that consumer-grade gaming/creator builds look like they can be equipped with multiple drives, but in fact there are hidden tradeoffs - namely the limited PCIe lanes. My Pro Art Creator MB with big case should hold 128GB memory, 3 M.2s, 4 SSDs, and 2 spinning hard drives. I was naïvely going to max this out for projects I have in mind. Then I found out that it would limit the 3090 among other things. So now I have a gaming PC. I want a workstation too. I don't have the money currently to buy a Threadripper but I can set some aside for it. House is paid off and whatnot. But the Z840 could be a fun project to keep me busy for a while at much lower cost. Just a little trepidation over finding one in good condition and upgrading it. Thanks! Probably TLDR…

    • @racerrrz
      @racerrrz  Год назад +1

      @@ericwanner7966 I specialize in producing TLDR content! Combining the workstation ECC memory with the GPU ECC VRAM will create a very stable machine, particularly with the Z840's 90% efficient PSU (I also have mine on a UPS for added protection).
      New hardware is best obtained once it ages, and the obsolescence curve summarizes the process well. The question is: is the Z840 heading for obsolescence anytime soon? I would argue it has another 10 years to go, placing us somewhere in the middle of the obsolescence curve, which = good bang-for-buck.
      The Pro Art Creator M/B is a solid system, everything you could want; the only limiting factor being the single CPU socket. Dual CPUs on a Z840 workstation expand the possibilities, but there are always some caveats. Adding in some overhead on your hardware is something only experience can teach. My next video is exactly this, actually! I am limited to 7 PCIe slots on the Z840 (I know, "only 7"), but there are many PCIe adapters that would make life easier. The RTX 3090 Ti is occupying 3.5 slots, and I am not prepared to give up those PCIe slots. Long story short, I am actively using two PCIe slots underneath the GPU, where I have a 10GbE card installed and a USB-C adapter (the outcome of the headroom mindset! This setup isn't particularly attractive, but it is working...). The Z840 bottlenecks my RTX 3090 Ti by at least 20%, and that's a number I can live with if it means my processes are more efficient.
      Once the Threadrippers age their prices will drop also, and I am certain that will create an opportunity down the line. If you keep an eye out you will find Z840s at below market price, and their hardware prices are dropping fast (e.g. the online marketplace price for my e5-2697 V3 CPUs fell ~3-fold over the last year)! I am keeping an eye on the HP Z8 workstation as well, but cost and availability still place them out of reach.

    • @ericwanner7966
      @ericwanner7966 Год назад

      @@racerrrz One more... I too use a UPS. And I will be watching the prices for both TR and Z840. So I went back and watched an episode of BuildOrBuy called 16 Core CPU Comparison - Intel vs AMD - I9-12900K, 5950X and 3955WX. Naturally Gill is steering us towards the Threadripper Pro. I priced out the 16 core variant with an ASRock MB at Micro Center and if you give it the minimum accompanying hardware to get it to POST then I think we could get started for under $4000. I know, I really want to put high capacity hard drives in it so it would go up from there but could do it sort of one piece at a time, a la Johnny Cash, except we aren't stealing it from the plant. Seems appealing to me, even though currently I could only buy like one component. Fun to think about though.

    • @jeffreygrindle6396
      @jeffreygrindle6396 Год назад +2

      Just saw one on eBay for 300 with shipping

  • @wilsione
    @wilsione Год назад +2

    Wait, WTF!!! The Z840 is still catching up!!! Bro, can you donate the 3080 to me? haha!

    • @racerrrz
      @racerrrz  Год назад

      Quite a few vids to catch up on haha. I wish I could, but that 3080 still has a heap of warranty and owes me several TB of 4K footage! I'll be using it in the Z240 case swap as a dedicated machine to handle recording camera footage (maybe even some gaming, if the 4-core CPU can hack it! The Z840 can't be used while it's rendering video, so... lol). MicroSD cards and camera batteries are a bit annoying. I have a work-around to plug the camera HDMI out into the machine (Elgato 4K capture card). It's really awkward to hit record on the camera every 10 min; I end up missing footage when comp building lol.

    • @wilsione
      @wilsione Год назад

      @@racerrrz I'm just joking. It's nice to know that the Z840 is still up to date. I think I'll upgrade the CPU next rather than the GPU. Do you think that's a good move?

  • @salwakotb1540
    @salwakotb1540 Год назад +1

    Can I fit a ZOTAC GAMING GeForce RTX 3080 Ti Trinity OC in a Z640 workstation?

    • @racerrrz
      @racerrrz  Год назад

      Hi Salwa.
      The simple answer: Yes, but not with the Z640's PSU.
      The longer answer:
      There is really only one problem to solve to allow you to use the RTX 3080 Ti card in the Z640 with the HP 925W PSU. The issue is the 2x 6-Pin power adapters on the PSU, which will mean you need to split the power inputs into 3x 8-Pins. The 925W PSU can handle the RTX 3080 Ti's ~350W TDP. HP does a great job on the engineering side because each 6-pin cable can output 12V @ 18Amps, which is 216W, with 2x 6-Pins the absolute maximum being 432W between the two cables. The RTX 3080 Ti is rated to ~116W per 8-Pin (so total of 350W). Another issue with joining the outputs of the 6-Pin cables is that there is a risk of damage due to bridging the power rails. You would need an adapter that is able to convert 2x 6-pin male adapters into 3x 8-pin adapters. Unfortunately I am not aware of such an adapter (it would be called "2x 6-pin female to 3x 8-pin male PCIe power adapter"). Converting 1x 6-pin into 2x 8-pin adapters is possible but that would mean you would need ~232W from a single 6-pin which is outside the scope of the Z640's PSU. My recommendation would be to upgrade your PSU to a gaming unit, ~800W platinum is ideal (larger PSU would open up more upgrade options) - although expensive (ebay/amazon search for: "ATX 24Pin to 18Pin + 8pin to 12pin Adapter" to convert your HP connections to ATX PSU standard).

    • @salwakotb1540
      @salwakotb1540 Год назад

      @@racerrrz Can I change only the power supply in my workstation? If the answer is yes, which model do you recommend?

  • @misterxxxxxxxx
    @misterxxxxxxxx Год назад +1

    Did you ever manage to close the case? :)

    • @racerrrz
      @racerrrz  Год назад +1

      Sadly no. But I seized the opportunity to place some hardware outside the case (a 10GbE NIC and a USB Type-C adapter; link for interest: ruclips.net/video/5tQct_LxPjo/видео.html ). The 12-pin connector protrudes beyond the case margin, so there is no way to refit the side panel unless it is cut and modified, which I am not prepared to do. Case temps have remained decent as well, but I do worry about dust.

    • @misterxxxxxxxx
      @misterxxxxxxxx Год назад +1

      @@racerrrz Thanks for sharing all this. I found a two-slot-wide 3090 that I'll try to fit together with a 5700 FE... first concern is power, which I have to get through the 3x 6-pin connectors... should be OK if I find good adapters.

    • @racerrrz
      @racerrrz  Год назад +1

      @@misterxxxxxxxx The two-slot 3090 will help out, but power will be an issue for the combination of both GPUs. The RTX 3090 will require ~300-350W under normal operation from the 3x 8-Pin connectors (I believe only the Founders Edition RTX 3090 was 2x 8-Pin). The AMD Radeon RX 5700 XT FE (1x 8-pin and 1x 6-pin) would require ~225W, which might not actually be possible from the Z840's PSU. The theoretical max output from the Z840's PSU for the GPU power cables is 648W (12V * 18A = 216W * 3x 6-Pins = 648W). To power two high-draw GPUs, I suspect you will need an additional PSU to power both cards reliably. Splitting a single 6-Pin into two 8-Pins (or 2x 6+2 Pins) can create a situation where the total power draw through the one 6-Pin exceeds what the Z840's PSU can deliver (the absolute theoretical max is 12V * 18A = 216W per cable). The 6-Pin cables are generally only designed to handle 75W and the 8-Pins 150W. Which models of 3090 and 5700 XT do you have? If you have managed to get the right combo of cards it might just make it, but it would be risky to power both cards at the same time given the 3x 6-Pin limit from the PSU.
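      A quick sketch of the dual-GPU budget math above, using the 648W cable total discussed in this thread. The `fits_budget` helper and the 80% safety headroom are my own illustrative choices, not a spec:

```python
# Check a multi-GPU load against the Z840's 3x 6-pin cable budget.
CABLE_W = 12 * 18        # 216 W per HP 6-pin cable (12V @ 18A)
BUDGET_W = CABLE_W * 3   # 648 W theoretical max across all GPU cables

def fits_budget(*gpu_watts, headroom=0.8):
    """True if the combined GPU draw stays within a safety margin of the budget."""
    return sum(gpu_watts) <= BUDGET_W * headroom

# RTX 3090 (~350 W) plus RX 5700 XT (~225 W):
print(fits_budget(350, 225))  # 575 W vs 518.4 W usable -> too tight
print(fits_budget(350))       # a single 3090 fits easily
```

      With the headroom applied, the 3090 + 5700 XT pair lands just over the comfortable limit, which matches the caution above about running both cards from the one PSU.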

    • @misterxxxxxxxx
      @misterxxxxxxxx Год назад +1

      @@racerrrz 75W is coming from the PCIe slot, so I could do 6→6+8 (for the 5700); that would be 150W on a single 6-pin cable, and then 6→8 + 6→8 for the 3090 with "only" about 140W each.
      The 1200W PSU should be able to handle that (1450W peak under 220V)...
      Anyway, my plans have changed a bit (I got screwed with a faulty 3090 :})
      Receiving a Quadro P6000 and a Radeon W6600 (to replace the 5700 FE) by the end of the week. Should make it easier.

    • @racerrrz
      @racerrrz  Год назад +1

      @@misterxxxxxxxx You should manage with your original planned setup if you can get the right number of 8-Pins (I have it that only the FE RTX 3090 had 2x 8-pins; if there is a 3rd 8-pin on the 3090 I presume the second GPU will not be feasible). Sorry to hear about the faulty RTX 3090! What went wrong with it (if you figured that out)? The P6000 is a heavy hitter with its 24GB of memory! You should have no trouble powering them; in fact you could sneak in a 3rd GPU pending PCIe slot usage! I am sure you will make use of the extra memory. The W6600 should help spread the load on tasks. I didn't realize how heavy I was on GPU memory until I broke out from the 12GB limit I had! Every video project pushes 20-24GB now. Let us know how you get on with the new setup.

  • @flichostudio
    @flichostudio Год назад +1

    Inspired by your video, I got a Gigabyte 3090 Gaming OC for my Z840. But the system refuses to start, giving a 6-beep error. Despite multiple attempts, and enabling legacy boot and disabling secure boot, the problem still persists. The system works when I install my old Quadro M5000. Can you please help?

    • @racerrrz
      @racerrrz  Год назад

      Hi there. Great choice with the RTX 3090, I am sure it will be possible to get it running. What power supply have you got in your Z840? How did you power the RTX 3090, cabling-wise? What BIOS version do you have? The 6 beeps on the Z840 can mean a few things, the most likely of which is that there is insufficient power to drive the RTX 3090 (which would indicate a weaker 850W PSU in your system). Another possibility is that the fitment wasn't perfect in the PCIe slot, but I'll assume you can rule that out since the M5000 runs fine (but just double-check that the card is seated fully and that the power connectors are all fully plugged in).

    • @racerrrz
      @racerrrz  Год назад

      Looking at the 850W PSU (P/N: 719798-001) that seems unlikely since it only has 1x6-Pin cable. How did you wire the 3x 6-pins? It is important to have the 6-pin to 8-pin connectors and the power must come from the 3x 6-pins. Taking power from a Molex or SATA power plug will not work and may cause an issue. If power checks out there might be something wrong with the RTX 3090! Can you confirm that it is working in another PC/ machine (if possible)?

    • @flichostudio
      @flichostudio Год назад

      @@racerrrz Thanks for the quick help. I have the 1125W PSU and am using all 3x 6-pin cables with the help of 2x 6-pin to 8-pin and 1x 6-pin to 8-pin adaptors. My BIOS version is M60 v02.59. I have checked the RTX 3090 on another machine and it's working fine, but when I place it in the Z840 the LEDs on the RTX 3090 light up and the system gives 6 beeps🙁

    • @racerrrz
      @racerrrz  Год назад

      @@flichostudio No trouble. Sorry to hear you ran into some issues there! If the PSU is the right specification, your BIOS is the latest version (02.59 Rev.A), and the RTX 3090 works in another machine, that really just leaves one option, which is that your Z840's PSU has an issue. As an example, I had an issue on a Z420 that I bought. The machine powered fine with a workstation GPU (like a Quadro K620 etc.) but gave beep errors when I tried to use a 6-pin plug to power a GTX 970. Online searches suggested a heap of potential issues, but I went for the logical one, which was that something like a capacitor had failed in the part of the PSU that negotiates the power supply to the 6-pin cable. A new PSU went in and it worked like a charm again. Thus, I would conclude that something may have gone wrong with your Z840 PSU, likely before the RTX 3090 went in. Although super durable, the Z8xx PSU isn't immune to hardware failures. Can you test each of the 6-pin power connectors independently? A smaller 6-pin GPU might work for the test. Just make sure each of the 6-pins is actually still functioning. A simple tool that could help is an ATX power-check tool like this one: shorturl.at/cmqN3 (short URL because ebay links are massive). I haven't used them myself but I think it could be worth having one for these situations!

    • @racerrrz
      @racerrrz  Год назад

      @@flichostudio One other thing, it may be worth testing alternative 6-pin to 8-pin PSU adapters. They are somewhat renowned for not always working perfectly. If you have spares (which is unlikely!), give that a test also.

  • @zCaptainz
    @zCaptainz 10 месяцев назад +1

    Hmm... it's not virtual memory, it's video memory. lol sorry
    I did like the video tho

    • @racerrrz
      @racerrrz  10 месяцев назад +1

      Thanks! We are both right, technically lol.
      The main issue is VRAM / video memory, but quite commonly some of the system RAM is allocated as a memory buffer - giving a larger pool of "GPU memory". In the video at 0:36 Windows Task Manager said my GPU memory was 156GB lol (the RTX 3080 12GB's VRAM + over half of my system DDR4 RAM). This combination of VRAM and system RAM is called virtual memory - like in the old days when you could allocate some of your HDD/SSD storage as RAM in Windows ("virtual memory").
      Hence, I was being somewhat sarcastic in that the 156GB of "virtual memory" was not enough. But the issue was that once I filled the GPU's VRAM the CPU's RAM was a bottleneck, and that often led my system to crash. So far so good with the RTX 3090 Ti - things still lag at times but it always recovers without crashing.

  • @wilsione
    @wilsione Год назад +1

    This video just make me drool

    • @racerrrz
      @racerrrz  Год назад

      Good to hear from you! The Z840 is loaded to the brim right now and well worth the drool haha. I trust yours is working well also? All I know is that the next Z840 clip will be even more outrageous. The X540-T2 10GbE NIC will be fitted underneath the GPU (sort of; and it's working well so far!).

    • @wilsione
      @wilsione Год назад

      @@racerrrz Yeah, I am fine. Just been busy at work these past few months. My Z840 is working well, and also dusty 😀

  • @mikhmikhser
    @mikhmikhser Год назад +1

    I have the Zotac RTX 3090 Ti Extreme Holo.
    Very good card. Most importantly, the memory sits at 60-65 degrees and the GPU at around 55 degrees, which is very good - cold, even!!!
    In The Witcher 3 Next-Gen the memory hits 67 degrees, which is fine, though I expected it to run hot!!!

    • @racerrrz
      @racerrrz  Год назад

      Hi there. Great to hear your Zotac is serving you well! The fan curves are very aggressive out of the box. I normally use MSI Afterburner to tweak my fan curves (although this can be done in the Zotac Firestorm utility). I think the default fan speeds can actually be turned down! I have mine spinning ~50% at 50'C but I think it could go lower. I am quite happy with the temps. I see ~63'C for the GPU temp in games like Cyberpunk, but after a long gaming session those temps can climb to nearly 70'C (fans at 95%), but I am quite happy with the temps being well below 70'C normally.