If you're mining at home with a 3090 and/or 3090 Ti, just beware of the heat, especially in the summer, which is basically around the corner. Mining with 3090s was a nightmare for me. I constantly feared them burning up or overheating no matter how much I cooled them down.
All my 3090s are water-cooled; temps are simply extremely good with big 3x120 rads per GPU.
Exactly. Better to have two 3070s (non-LHR) than one 3090 (Ti) for mining. Much healthier power levels for the long term, and they are more efficient, especially on ETH. Cheaper to buy and cheaper to run, because each 3070 can mine ETH at around 62 MH/s @ 50% TDP (110-125 W). The bigger 3070s are good even in hot ambient temperatures. My 3070 MSI Gaming X Trio can run at 60% fan speed and is almost inaudible mining ETH, with very healthy temps.
@@Wushu-viking yea 3070s are
62 MH/s. That's just a typo at your end, but I get it, because that's what I do.
@@noelventura2786 Edited. Thanks 😊
I mean, the Ti cooling is upgraded; it's better than any card I've used, tbh. Sure, they push heat, but if you get a good one, temps shouldn't be a worry for you lol
The size of video cards these days is just insane; I still remember my first video card, an ATI Radeon HD 2600 XT with its 65 nm processor cooled by a 20 mm fan. I actually still use it today in an ASRock H110 Pro BTC+ that only had a DVI port, and I bought the joker back in July of 2007.
Peace
The "conventional" design of GPU cooler is just obsolete in my opinion. Hell, even Linus pointed that out by putting a CPU cooler on a GPU - it worked wonder.
But well people stay with that inefficient way... Anything drawing >300W will not be cooled by that air cooling design: That tier requires at least built-in AIO, otherwise we're just wasting the card's potential and our money.
It doesn't surprise me, because it's all about thermal management, in that you need a big honking heatsink to be able to cool the GPU (along with the VRMs and VRAM). I love the fact that they moved all of the memory modules to the front of the PCB, so cooling the memory modules that sat on the back of the PCB on the "vanilla" 3090s is not an issue anymore (because that's STILL a colossal PITA to accomplish within the same given physical space).
The 3090 Ti moved all the memory to the front of the board, so cooling the backplate isn't as big of an issue compared to the 3090. I've been trying to get some more performance, but my best on ETH is about 131 MH/s at 345 W on my Founders Edition card. I can get up to 133, but stability is poor and it takes far more power to get there. I'm hoping the next driver release improves stability; the current driver feels rushed.
so that’s like $4 a day?
The FTW3 Ultra is doing a stable 134.x MH/s @ 320 W for me. How are the temps on the FE?
@@Thryce23 Depending on ambient temps, I'm at 80-88°C at 60% fan speed.
@@enzymebp My three 3090s are at 123-126 MH/s for 285-288 W; I think either the 3090 Ti can be optimized further, or it's just not a good GPU.
It makes no sense that a 3090 Ti would pull more than 75 W over the PCIe slot unless it is modded and OCed. 3-6 W makes sense, since the 16-pin can deliver up to 600 W and still be in spec. Typically Nvidia GPUs load-balance evenly using their dumb sensors (hence why shunt mods can work), so it might be that the 3090 Tis are made to load-balance in favor of the 16-pin over PCIe slot power instead. Figuring that out for sure would require a bit of testing.
I love how this video is basically just for me to see what the hashrate is, think about it, say "yeah, that would be nice to own," and then remember I'm broke and can't afford it.
Not efficient? My whole rig with 4x GTX 1080s gives 128 MH/s and draws about 640 W from the wall... the 3090 Ti is very efficient! For the price of a 3090 Ti I could buy 4.2 second-hand 1080s... and it takes half the power!
If we're comparing against other 30-series cards... not efficient. When comparing against older GPUs, then yeah, of course it's efficient! haha
For that price you'd probably be better off buying a non-Ti 3090 ;p
I had one of these delivered earlier myself; somehow it's doing 133 / 1000 MH/s (ETH + ALPH) while dual mining. Power @ 400 W... this thing is a beast!
can you please explain how you are mining 2 coins at once? Is it cpu mining?
@@msgdeletdbymod4471 Google "dual mining". It's become popular mostly for LHR cards and the 6900 XT (not CPU mining); you mine ETH plus pick one of the 2 or 3 shtcoins available for dual mining.
@@Erebos710 Ah, so that's why the LHR cards are always out of stock... good to know, thanks!
That is horrible... absolutely not for mining. I'd rather have 2x 3080s: better price, more hash and power.
0:00 tinnitus alert 😝
I think Red Panda is misinterpreting (or overstating) the inefficiency of the power supply at around one third of full load in these tests. He shows a graph where peak efficiency is located near 50% load, but the range where it is at or near 94% efficiency is pretty broad on that graph. If you do the math, when his PDU shows ~370 watts and he's reading in the high 390s at the wall, that is about right: 370 W / 0.94 ≈ 394 W is exactly what you'd expect at peak efficiency.
There are power supplies out there which may not have such a broad peak efficiency band, but the one he's using seems to be operating near that proper level to not be wasteful for mining. If you aren't sure about your power supply's peak efficiency curve, it's always a good idea to look it up from the manufacturer.
Thumbs up on the content, though; we need miners to know the 3090 Ti is not a mining card. Save your money, and your power capacity, for lower-density / higher-efficiency alternatives.
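As a quick sanity check of that math, a minimal sketch in Python (the 370 W PDU reading and ~94% efficiency figure come from the paragraph above; nothing else is measured):

```python
# Wall (AC) draw = DC power delivered / PSU efficiency.
def wall_draw(dc_watts: float, efficiency: float) -> float:
    return dc_watts / efficiency

print(f"{wall_draw(370, 0.94):.0f} W")  # ~394 W -- matches the high-390s wall reading
```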
nerd
At this point, what would you say the highest-efficiency 30-series card is?
Wow, but it said the 3090 Ti is the most profitable card at the moment; was that wrong?
You're gonna need an Octominer full of these now lol
It's gonna pull more at the wall even at peak efficiency, as you don't have 100% efficiency at any point with the PSU.
False, at 0w it's possible !!
3090s and 3090 Tis will be worth it once the 4000 series comes out, I think. I am mining off solar, so the power consumption doesn't matter for me. Also, I have an ice-cold basement.
How much did you invest into solar if you don't mind my asking?
I could listen to RPM saying "thick" all day, every day lol.
Get used to these power numbers, this is going to be the power draw of a midrange Nvidia card next gen.
Not really. Power limits are how you tune productivity... running cards at half their watts, you can still achieve good productivity.
*OH MAN... I was just thinking I should build a 12x 3090 Ti rig. Can't wait to see this*
wait, $2200 per GPU and only marginally above a 3090... I retract my above statement ;0
Beginning of video: pans to RTX 3090
Subtitles: *Applause*
Can you try mining Alephium? Curious to see what the power draw would be on that.
Doesn't make much sense: 4x 6600 XTs at 228 W mine the same and cost roughly a grand less for the same MH/s, and do more on Ravencoin.
I think Nvidia's mindshare is gonna drop hard next gen; this efficiency for mining and gaming makes little sense to base a build on, now that we know the 3090 Ti is a beta test for Lovelace. Hopefully the miner devs like lolMiner and GMiner expand the AMD coin options to match Nvidia's.
When mining is dead, I would much rather be gaming with my paid-for 3090 Ti. P.S. How much are you spending on breakout boards, PCIe power cables, risers, extra PSUs, etc.? Not to mention, I now have a GPU leash. None of that with the 6600 XT. :)
I feel once the 4000 series comes out, the 3090 Ti will be cheaper/better for the price.
Yeah, pretty soon you won't be able to get rid of these things; mining is dead, bro.
imagine being named gary and spitting opinions like facts
That's a lot of power. My 3090 is putting out 125 MH/s and pulling 358-361 watts. I'd expect much better from the Ti variant.
If your "vanilla" 3090 is only getting 125 MH/s at 358 W, that's actually quite inefficient (because that works out to be about 349 kH/W).
My 3090s are getting about 122 MH/s at about 300 W (varies a little depending on the make of the 3090 and where it sits in my open-air mining rig), which gives me a hashing efficiency of around 407 kH/W on average. My best card gets 122.82 MH/s @ 297 W for about 414 kH/W, and my worst is about 100 MH/s at 271 W, which is 369 kH/W.
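For anyone checking that arithmetic, hashing efficiency here is just hashrate divided by board power; a minimal Python sketch using the numbers quoted in this thread:

```python
# Efficiency in kH/W: convert MH/s to kH/s, then divide by board power in watts.
def kh_per_w(mh_s: float, watts: float) -> float:
    return mh_s * 1000 / watts

print(f"{kh_per_w(125.0, 358):.0f} kH/W")   # ~349 -- the 3090 above
print(f"{kh_per_w(122.82, 297):.0f} kH/W")  # ~414 -- the better-tuned card
```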
@@ewenchan1239 I was referring mostly to the power element. Let's see, I have about ten 3090s here between my nephew and myself. The one thing they have in common is that none of them are the same. My best hashrate, from one that happens to be mine, is 128 MH/s. Power is pretty cheap here, so efficiency is not a huge concern, especially considering the actual difference it makes to profit. I.e., save a few watts or get another 10 hash? I'll take the hash, thank you. The worst one we have is 99 MH/s, but that's because I turned it way down; it was throttling itself because he hasn't repadded it yet. Four need repadding pretty badly, and some just get unstable at rates others handle fine. Silicon lottery, it seems. That doesn't mean I ignore power, because I don't, mostly because I have to juggle circuits so they can handle the load. Try that with a Zinsco breaker panel.
The efficiency of the PSU shows up pretty much correctly. It's a Platinum PSU with less than 10% loss, around 91% efficiency at that draw. So it's not about the load; even with double the load, the efficiency change would amount to less than 10 watts. Nice PSUs, those server Platinum ones, by the way. :)
The Gigabyte card is a lot cooler at 70°C, but it draws 352 W in the software for me with similar OC settings. Can't seem to get it lower without a huge hashrate drop-off.
Hey, what do you mean by "draws 352 W"?
The power at the wall is what matters; the software sometimes shows the wrong value with some BIOSes.
Man is Nvidia the new AMD in terms of not displaying correct power usage??
This is basically how the next generation of NVIDIA GPUs will roll: energy-wasting hardware :$
I wonder if it is worth getting this vs. building a rig with the same MH/s rating. The cost of the extra wiring, risers, frame, etc. is significant. I wonder at what point it makes sense to run a single 3090 Ti vs. 4x 6600 XTs, say. If I picked an efficient power supply that would sit in its efficiency range while mining with this card, and didn't have to buy many PCIe cables and risers, I feel like this GPU makes sense for mining. Plus you can game on it.
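As a rough sketch of that break-even question in Python, using hashrates from this thread (~131 MH/s for the 3090 Ti, ~32 MH/s per 6600 XT) and placeholder prices you should swap for your own:

```python
# Hypothetical upfront cost per MH/s: one 3090 Ti vs. a 4x 6600 XT rig.
def cost_per_mh(cards: float, extras: float, total_mh: float) -> float:
    return (cards + extras) / total_mh

ti = cost_per_mh(cards=2200, extras=0, total_mh=131)          # no risers needed
xt = cost_per_mh(cards=4 * 380,                               # placeholder card price
                 extras=4 * 25 + 100, total_mh=4 * 32)        # risers + frame, placeholder
print(f"3090 Ti: ${ti:.1f}/MH, 4x 6600 XT: ${xt:.1f}/MH")
```

Power draw and resale value would shift the answer further, so treat it only as a template.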
If you get 131 MH/s with the Ti version, shouldn't you get more with the "regular" version of this card?
I can't push my 3090 Ti FTW3 past 1200 on the memory clock in Windows or I get "cuda error in cudaprogram.cu:256".
I'm "only" getting 127 MH/s; anyone know what's wrong here, or do I just have a bad card?
I have my core clock locked to 1245.
They didn’t bother making it LHR cause they already know it sucks.
Would like to see someone flash this with a 3090 BIOS maybe???
Is it a good time to spend 10 to 20k on a mining rig, and which is better, used or new? I'm new to mining, but learning. What GPUs should I look for?
@RPM
It would be great to see you run some algos that aren't so commercially known... something like RYO, HTH, XEQ, BTG, or one of the newer projects.
Thanks for all you do; I've learned a lot from you over the last 4 years.
I just installed the custom copper plate in my FTW3 3080 Ti and the temperature dropped 20 degrees, down to 70°C... I wonder if that would help the 3090-series GPUs.
You could potentially daisy-chain SATA cables to multiple risers for the 3090 Ti, lol
I have the same settings for ETH and I'm getting the same hashrate, but my wattage is consistently at 330 W. Why is this?
Maybe the difference in riser power consumption is because the riser runs at 1x while direct-to-motherboard runs at 16x. I dunno.
The 3090 Ti doesn't mine when used alongside any other GPU; have you figured out a solution yet?
P.S. I love how you've cleaned up your cryptomining garage.
You guys are lucky to get the GPU at this low price of USD 2200. In India it costs around USD 3000.
It's more likely that your PMD is not working properly than your GPU only pulling 6-7 watts from the riser on everything; that's probably where the 20-30 watt delta resides.
Hey Panda, great vid, brother. I was just wondering if you have a graph for a 2400 W Platinum power supply? I bought it a couple of years back from Parallel Miner but can't seem to find the efficiency curve on their page.
my psu exploded just watching this video
2900 MHz (+1450 in Afterburner) was the highest stable OC you could get on the VRAM for ETH?
New LHR meta 2022.
Make cards that fit in neither a rig nor a PC.
Hello! I would like to see a video about efficiency mode on each card. For example, a 3070 LHR can produce 72 sol/s on Flux at 220-230 W, but with proper settings it can achieve 0.445 sol/W efficiency at a lower hashrate. I have gotten a 3070 LHR to produce 59-60.5 sol/s at 135 W, and I'm pretty happy about it. Since the energy problem affects every country, that would be an amazing video!
Why are the memory temps so high? My 3090 Ti doesn't go over 72°C at 133 MH/s. Same model.
Not possible.
@@EricLbs That's weird, since mine is doing just that. Other people I've talked to with the same model get around the same temps as me.
@@ordanumn7494 Yeah, I get 74-76°C under mining load with my EVGA RTX 3090 Ti FTW3 Ultra in a closed case. I also thought the temperatures of this card in the video, in an open environment, were pretty high. I wonder if some cards just have higher memory temperatures due to variance in chip binning.
It's manufacturing tolerances; we're talking less than 0.1 mm can make a 10°C difference.
Either he has a screw that's not tightened all the way, something is bent from the factory, or the pads are bunched up or too thin.
I've been taking apart multiple brands of 3000-series cards for a while now, trying to fix GDDR6X temps; almost half the time I have to disassemble and reassemble to make sure I'm within tolerance.
Just re-tightening stuff will work more often than not, but I have an ASUS TUF 3080 Ti that's going to need individual heatsinks on each memory chip; it just will not go below 90°C.
Was thinking of getting one of these, but man, that's a lot of juice for one card. I already have a few 3090s and they pull a lot of power.
Can you show us whether each 8-pin connector carries 150 W (450 W total), or whether some draw more than others?
How much wattage does each 8-pin draw? Please answer.
Since Flux isn't using much of the memory, try dropping the memory clocks for it; that could bring down the wattage and the mem temps.
What's your opinion about Hive increasing their payout threshold?
Mine is pulling the same 7 to 8 watts at the riser 🔥🔥🔥🔥
That looks good, but can you make a video on how to unlock the hash limiter for the RTX 3090 (not Ti)? It doesn't go up to 120 MH/s or higher; the Nvidia driver still caps it at 60 MH/s max.
If you have a video on how to unlock that for the Cudo Miner platform on Ethash/Phoenix, that would be helpful :)
Horrible heatsink and fan design; the fans on the Strix are so weak, bro, they can't cool those tall fin stacks. They need to make the fins thinner and wider so the air can also cool the board. But what do I know, I've only been mining for a minute now. The old 1080 Ti minis had great cooling and the design didn't look so ridiculous. EVGA sucks, by the way. Worst Nvidia cards ever.
I think that it really depends on which "vanilla" 3090s you were able to get your hands on before this 3090 Ti was released.
Not all 3090s are made the same. Some 3090s (between OEMs) will hash better than others.
I would LOVE to be able to swap this in place of one of my underperforming "vanilla" 3090s because it is entirely limited by VRAM temps on the back of the PCB. (It doesn't have a big enough heatsink to deal with all that heat.)
It probably works fine as a gaming card for a "vanilla" 3090, but for mining, the inadequate thermal management solution is definitely limiting its ability to perform.
Also, I'm sure that if you set the power limit in HiveOS, you'd be able to keep the absolute power draw in check. If anything, the 3090 Ti gives you more headroom to be able to optimise for hashing efficiency.
It has nothing to do with the heatsink; it's actually the lack of good thermal pads, or they could change the cooler to touch the VRAM and core directly. They just want our money, and for them it would cost $1 or $2 to do so.
@@pav1u
"it has nothing to do with the heatsink, its actually the lack of good thermal pads"
That is patently false though.
Even if you changed the thermal pads, a metal backplate with no fans pushing air over it does not and cannot increase the heat transfer coefficient when you're effectively dependent solely on natural convection for heat transfer.
A heat source that is actively cooled, with a heat sink (which increases the surface area for convective heat transfer, by design) is going to be more efficient at transferring heat away from said heat source vs. the same heat source, with ONLY passive cooling (e.g. dependent on natural convective cooling rather than forced convective cooling), and NO heatsink.
This is just basic heat and mass transfer.
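To put a rough number on it (a standard textbook relation, nothing specific to these cards): Newton's law of cooling gives the convective heat flow as

$$\dot{Q} = h \, A \, (T_s - T_\infty)$$

where $h$ is the convective heat transfer coefficient (typical textbook ranges: roughly 2-25 W/m²·K for natural convection in air versus 25-250 W/m²·K for forced air) and $A$ is the exposed surface area. A finned heatsink with fans increases both $h$ and $A$, often by an order of magnitude; swapping thermal pads changes neither.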
"or change the cooler to touch directly in the vram and core"
And... how do you propose manufacturers change the cooler design so that the heatsink would "touch directly" the VRAM that is sitting on the BACK of the PCB?
The heatsink and the heatpipes already make contact with the GPU core, and even then you still have a thermal interface material in between to fill any air gaps arising from the microstructure differences between the surface roughness of the bottom of the heatsink/heatpipe assembly and the top of the GPU die.
Therefore; why do you think that it would be any different between the top of the VRAM modules to the heatsink where they have the thermal pads (which is also just another thicker thermal interface material) before it makes contact with the heatsink?
"they just want our money, and for them, it would cost 1 or 2$ to do so."
Run the thermal analysis simulation, especially for the VRAM modules that are on the BACK of the PCB. You can let me know what your findings are from your studies/work here. :)
@@ewenchan1239 Everyone reports an insane heat reduction when replacing the stock thermal pads; instead, they use copper shims coated with thermal grease on the two sides that touch the RAM and the heatsink. Each card needs a different copper shim thickness; for example, people report the most efficient thickness for the RTX 2000 series is 1.2 mm. Anyway, that's what @Pav1upt meant.
@@Shacharikodikopiko
"Everyone reporting an insane heat reduction when replacing the stock thermal pads"
Yeah, I changed my thermal pads out for Thermal Grizzly thermal pads as well and on one of the cards, it didn't make ANY difference (other than I think it's either dielectric grease or the binder that's in the stock thermal pads that will separate out from said stock thermal pads at sustained moderate to high temperatures).
"instead they Use copper shims coated with a thermal grease on the 2 sides which touch the RAM and the Heatsink, each card needs a different copper shims width, example: the rtx2000 most efficient card needs 1.2mm people reports, anyway that's what @Pav1upt meant."
Interesting. I haven't heard of that.
I would be wary of using copper shims like that, because you are depending on the pressure fit to hold them in place (yes, this is an assumption on my part that they're not affixed with mechanical fasteners of any kind that would maintain their proper position).
That either means you need enough pressure for the pressure fit to hold the shim in place, or, if you don't have enough pressure, you risk a piece of metal falling onto your board.
Even if it is just sheet copper shim that's 1.2 mm thick, that's still quite the risk associated with it if something goes wrong.
People might be amazed to find out how much the tool and die costs are to be able to implement this, at the OEM level, en masse.
Could you do a hashrate test for the reference AMD 6800 XT? I have a rig with 7 of them running at 61.9 MH/s each.
Keep up the amazing videos, btw, I love them!
do you have a link for the PMD?
A cool video would be a little PSU efficiency video showing that graph, the PMD, and the watt meter. Also, when testing different coins, show dual mining too: TON and Alephium.
When I first turned them on, I shit myself because of the memory heat. I've got the rig in my shed with the smaller rigs, but as soon as I turn on the 3090 rig the heat is crazy. The ASUS cards have given me no problems.
I think you need to multiply that riser power number by 3; that makes up the 20-plus missing watts. 7 W x 3 = 21 W. ????
A 3070 Noctua should never be in a mining place!
Amazing video as always; thanks for your opinions and experiences over the years. I can't tell you how many times I've looked up your videos when I get in a jam... Could you please show us what that BIOS button does, the one you mention at the end of the video? The overclock and performance button... I would like to know how it impacts dual mining ETH and any other coin. Thanks again, RPM!!!
Could the power fluctuation on Flux be due to the LHR? These cards are mega crazy expensive, $3300-$3400 AUD here. Oh well, I can still dream of having one 😞
The memory is actually all on the front for the 3090 Ti.
If you use a locked core clock on Ergo, you will actually get a lot more hashrate.
cheers, rdm
Will the results be different if you use LHR mode in T-Rex?
A shame you didn't test the card without the fan blowing on the back, since that's the main change from the RTX 3090, meant to fix the main issue of this series: how effectively do the one-sided memory modules fix the infamous temp issue that plagued the RTX 3090's memory? Best,
It cost me 900 dollars more per card to get a 3090 here in Central America; fuck, they are expensive... I'm getting 118 MH/s at 303 watts, for an efficiency of 380-something kH/W.
Hello, at 7:56 in the video you use a device to measure the power consumption of the 3090 Ti.
Can you tell me the device name and an AliExpress link so I can buy one?
Congrats on the videos.
I went to bed at 60% to payout (0.01 ETH).
Today HiveOS is telling me I'm at 30% to payout (0.02 ETH). Same for everyone else?
The difference between the PMD watt reading and the wall is, like you said, 30-40 watts, but it's wrong to blame the PSU when you forget that the CPU and SSD also need power. A lot of people only calculate profit with the GPU's power draw and forget about the rest.
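A back-of-the-envelope sketch of that accounting in Python (every number below is an illustrative placeholder; plug in your own readings):

```python
# The GPU alone never explains the wall reading: add the rest of the rig,
# then divide by PSU efficiency to estimate the AC-side draw.
gpu_w = 300                       # placeholder GPU board power
cpu_w, ssd_w, fans_w = 15, 3, 5   # placeholder draws for the rest of the rig
psu_efficiency = 0.92             # placeholder efficiency at this load

dc_total = gpu_w + cpu_w + ssd_w + fans_w
print(f"DC side: {dc_total} W, estimated wall draw: {dc_total / psu_efficiency:.0f} W")
```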
Is the redpandamining channel on Odysee yours? Why not post there?
Can't wait for ETH 2.0!
Nice card, but for the price and power, I'd get a rig of A2000s.
Hey RPM, could you do a test on a Windows setup mining FLUX with a power limit in the .bat file? I'm curious if it will surge past the PL in a Windows environment like it surges past your set PL in HiveOS.
Just use HiveOS: get the Linux build and put it on a USB 3.0 drive or better. Linux over Windows for mining.
@@thomasdriskill5254 The point was to see if HiveOS's overclocking software is allowing the surging, which can destroy hardware, and to test whether the same surging occurs in a Windows environment using the overclocking built into the mining software.
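One way to actually run that check on either OS, as a minimal Python sketch (it polls nvidia-smi, which ships with the NVIDIA driver; the 300 W limit is a placeholder for whatever PL you set):

```python
import subprocess
import time

POWER_LIMIT_W = 300.0  # placeholder: the PL you configured in the miner/driver

# Poll the driver's power reading once a second for a minute and flag any
# sample above the PL. Note: nvidia-smi samples coarsely, so millisecond-scale
# transients can slip through; a hardware PMD (as used in the video) catches those.
for _ in range(60):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    draw = float(out.stdout.strip().splitlines()[0])  # first GPU only
    if draw > POWER_LIMIT_W:
        print(f"surge: {draw:.1f} W > {POWER_LIMIT_W:.0f} W limit")
    time.sleep(1)
```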
Jeez, he actually ordered this extra-expensive card. Tbh it's way over the limit of how much a GPU should cost. I'm happy with my 3080; even if I had the money, no chance. It's a bit ridiculous now.
Looks like a beast; can't wait to see it next to the new 4000 series coming out.
I was about to order two of these, but not anymore; will stick with 3090s.
Also, do you think the 3070 Ti will be the best mining GPU after ETH goes PoS, since it has a very good hashrate on the other coins?
What about the 5700 XT: 56 MH/s at 95 W?
@@Balabaneczek You are talking about software watts... 5700 XTs draw a minimum of 110 W at the wall with Samsung memory and 140 W with Micron memory.
@@arfathkhalid1099 Not sure how much it draws at the wall, as I have three different ones.
The Adrenalin software says 95 W on one, and another draws 110 W.
@@Balabaneczek I have a bunch of 5700 XTs and I have measured their wall wattage... don't believe software watts.
Hmm, not sure, because I have a mix of GPUs: 3x 5700 XT (two PowerColor, one Dragon and one Devil, plus one Sapphire),
plus a 2080 Ti and a 6800 XT.
Across the 5 cards I get 292 MH/s with 870 W at the wall; I don't know how much each card draws or whether that's good efficiency.
Got the exact same wall wattmeter but with a UK plug on my rig
Looks like a 3090 Noctua is coming now lol, can't wait to see how big that bad boi is haha. Also, is that a Strix 3090 behind it?
Missing a test with the power limit at 50% or 70%... it could be much more efficient, and mean a much longer life for the GPU.
3090 at 120 MH/s.
How do you get it down to 101-125 W? Video tutorial please.
The BIOS switch changes everything🌚
I just got the 3090 as well but not the ti... yet....
Can't wait to see you snag a 6-slot Noctua 3090 Ti.
I have an 850 W Gold PSU from Seasonic; I am nervous pairing it with an RTX 3090 Ti and a 5800X CPU. 👀
You should
The 3090 Ti has no memory on the back of the card; no need to cool the back.
You're telling me this beast is barely 3x better than my ancient 1070 SC, wtf?
What hashrate would you get if you dropped the power limit to 240 W? Could you please try?
Y'all really out here mining these poor new GPUs to death? Okay, whatever... such a waste. Anyways, I've got 30 GPUs running right now mining Ethereum and RVN. What? Y'all think I was really gonna be negative or something? Have a nice day, miners!
Looks like the 3090 Ti has power over-consumption issues!!!
I feel Ergo with my 3070 Ti and 3090 is very efficient when it comes to power and hashrate:
GPU #1: RTX 3070 Ti - 171.41 MH/s, [T:54/72C, P:131W, F:60%, E:1.31MH/W], 610/611 R:0.16% I:0%
GPU #2: RTX 3090 - 274.22 MH/s, [T:47/80C, P:244W, F:55%, E:1.12MH/W], 972/974 R:0.21% I:0%
Y'all keep suggesting interesting ideas for another video, but no one has ever actually suggested he clean up the mess on each and every table ever seen in his videos 😂
Come on RPM, make a cool video on a properly cable-managed, nicely zip-tied, nice-looking, well-organized farm/rig. Been there, done that 😂
That price tag though...
$2199.99😵
Your memory temps are higher than normal. I wonder if this card was opened up before
Yeah... I'd prefer 2x 6800 XTs over this beast 😂
2x 3070 non-LHR, which are being sold now and are available, is better: 2x 3070 non-LHR = 240 W in total for 120 MH/s, clean, smooth, and cold, and they'll work for years without touching the OC.
@@valentinoutlaw2977 yeah but sold for how much?
I can't get this thing over 800 on the memory in Windows or it pukes. Settled on 305 W @ 122 MH/s.
Not bad, but 260 W at 110 MH/s, if it works, would be even more efficient...
Still, 2x 3070 non-LHR = 240 W for 122 MH/s, and those cards are available and cheaper.
The 3090 Ti looks like it has a lot more surface area than that other card you compared the cooler with :)
If this gpu falls on the ground, I bet there would be an earthquake.
Please tell me which site is best for mining.
😂 Electricity monster
Wow, I'm glad I watched this; I was thinking about getting some for mining 😵