Nvidia for sure did this to block laptop companies from making a lower-tier card run faster than a higher-tier one, like what Lenovo was doing with the 30 series laptops. Now if you want a faster GPU you have to buy the next tier up, instead of a lower-end card at a higher wattage outperforming the higher-end card.
10 mins ago I was thinking it's worth paying extra for a 140W 4070 :-D Now it's between a slim 4070 (100-120W) and a 4080 (170W), although those laptops are noticeably bigger and cost a kidney. I wonder which is better fps for the buck: the 4070 (low wattage) or the 4080 (full power). Thanks for the awesome reviews!
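Since fps-for-the-buck keeps coming up in this thread, here's a trivial way to compare once reviews publish averages. Every price and fps number below is a made-up placeholder for illustration, not data from the video:

```python
# Hypothetical "fps per dollar" comparison. The fps averages and USD
# prices here are placeholders for illustration only.

def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Average benchmark fps divided by laptop price."""
    return avg_fps / price_usd

laptops = {
    "4070 slim (100 W)": fps_per_dollar(95, 1600),
    "4080 (175 W)": fps_per_dollar(130, 2800),
}

for name, value in laptops.items():
    # scale to fps per $1000 for readability
    print(f"{name}: {value * 1000:.1f} fps per $1000")
```

On these made-up numbers the cheaper 4070 wins on value even though the 4080 wins on raw fps; plug in real benchmark averages and street prices to decide for your case.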
Great confirmation in actual games of the good old leaks which said AD107/AD106 don't scale past ~105W. Now waiting to see if desktop cards with these chips (4060 and 4060 Ti) have the same problem.
Linus tested boosting the 4090 to use over 600W, and the performance barely improved even with voltage regulation disabled. I think this might be an architectural limitation.
I think Nvidia and the manufacturers don't want to see more laptops on the market where, for example, a 3060/3070 beats a 3070/3080 because of the higher power limit on the lower-tier card.
Well, most laptop brands are still using last-gen cooling designs and power-inefficient Intel H/HX CPUs in 15.6-18 inch SKUs. If they had taken their time and prepared more 15.6-18" models with vapor chambers in time for Zen3+/Zen4 Ryzen supply to catch up, then we would have *A LOT* more designs with 8-16 "P-core" CPUs running at full tilt and dGPUs dynamic-boosting 14-33% higher to 185-200W TGPs, instead of the same 150-175W limits. Just look at the 25-39% jump in TGP wattage in 14-inchers, from 75-100W last gen to 95-125W this gen. 🤯
Wow this was a really interesting find, Jarrod! I wonder if this will blow up and force Nvidia to unlock the power that people thought they were paying for.
Thanks Jarrod, it's unbelievable how much work went into testing this. Every game, at ALMOST EVERY WATTAGE, in 10W increments?!?! YOU'RE THE REAL MVP. For real, thanks.
@ElBandito_Gaming Did you get your laptop? How is the 3070 Ti at 150W running modern games? How are the temps and cooling? Also, which laptop did you get? I'm looking to buy the Lenovo Legion 5i Pro (i7-12700H, RTX 3070 Ti) instead of a 40 series laptop after watching a few of Jarrod's revelation videos.
@ Check out the Lenovo Legion subreddit. It seems like they dropped the quality, and more and more laptops are dying for no reason. I went for an HP Omen 17 with a 4090 (got a very good deal on it).
I'm guessing this was their solution to all the overheating problems earlier. Faced with a massively expensive recall, they chose to secretly limit them. Well, the phone makers did and do this too, for the same reason.
I think the benefits of the 4070 are in thin and light laptops; a lower-wattage 4070 can provide great performance in a 17mm thin chassis. The Book 3 Ultra, Yoga 9i Pro, and Asus Zenbook 14 Pro/16X for 2023 are the only laptops I'd consider with a 4070 series GPU.
I'm glad I got a RTX 4080 laptop (ASUS Strix Scar G18), as that's the only RTX 40 series GPU alongside the RTX 4090 that I think are worth upgrading to if you're a few generations behind - I was upgrading from a GTX 1050Ti, so safe to say, the difference is huge.
Asus definitely has the best deals in today's market. But they have a bothersome issue: coil whine. I want to buy a Scar G16 but this issue is holding me back; the closest option is from MSI, and it has a design stuck in the 2000s, with a price tag almost double that of the G16.
Really curious how well they would do with MSI Afterburner tuning; undervolting and overclocking may actually allow them to make the most of that wattage. Just food for thought.
Thank you so much Jarrod for this information! Best laptop reviews for sure! Could you also test the performance difference on video editing tasks like Davinci Resolve? Appreciate your work!
I think LTT made a video about an RTX 4090 1000W test today that reached the same conclusion as this one. They got the same average fps using an Asus ROG 4090 with an 80 watt difference (360W vs 440W).
Great video, but disappointing to see no focus on whether a manual core overclock or a full voltage/frequency curve sees a significant performance uplift or higher sustained power draw in gaming loads. Being limited by VREL means there's power available, but the voltage itself, or rather the lack thereof, will not allow *GPU Boost* to go any higher within its default boost range. That doesn't mean manual tweaking isn't viable. TSMC's new node is MILES more efficient than Samsung's, and the fact that Nvidia was confident enough in it to just artificially limit the available voltage at their expected generational uplift means there should be massive overclocking headroom. What's the minimum voltage required on these laptops to sustain stock boost? It's probably much, MUCH lower than the VREL point. How much can you push the final voltage point when overclocking? I wish you'd gone into this, as it's the natural endpoint of this revelation that increasing power brings no improvement.
That was exactly my reasoning for getting a 4070 laptop: to get the most fps out of an affordable and quiet machine. Plus hopefully it will live longer due to being run colder, but we'll have to see in a couple of years. Yeah, 4080 and 4090 would crank out even more frames but those laptops usually cost at least double - this is where I draw the line of hunting for efficiency xD
Exactly like Apple reduced performance in their older models so that people got frustrated, believed their iPhones were getting slower with new demanding titles, and switched to the newest iPhones.
For lighter laptops, the RTX 4050/4060/4070 still make sense. And whereas it seems that the 4070 is only a minor improvement over the 4060, there seems to be a huge difference in efficiency. 60fps in 1440p seems to require only roughly 57W for the RTX 4070 (min 4:02), but 85W for the RTX 4060 (min 3:28): a 50% increase (= 33% less efficient)! The RTX 4080/4090 are more efficient still, but their capabilities and minimum power consumption do not make them viable candidates for light laptops that are meant to last on the battery, especially if you intend to cap power consumption to
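The efficiency claim above can be checked arithmetically, using the quoted readings (roughly 57 W on the 4070 vs 85 W on the 4060 for the same 60 fps at 1440p):

```python
# Efficiency comparison at a fixed 60 fps / 1440p target, using the
# approximate wattages quoted from the video.

fps = 60.0
watts_4070 = 57.0
watts_4060 = 85.0

eff_4070 = fps / watts_4070   # ~1.05 fps per watt
eff_4060 = fps / watts_4060   # ~0.71 fps per watt

extra_power = (watts_4060 - watts_4070) / watts_4070 * 100
efficiency_loss = (1 - eff_4060 / eff_4070) * 100

print(f"4060 draws {extra_power:.0f}% more power for the same 60 fps")
print(f"4060 is {efficiency_loss:.0f}% less efficient here")
```

So the "~50% more power / 33% less efficient" figures in the comment check out, give or take rounding (the power increase is 49%).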
Why do companies do this?? 😠😠 I was thinking of buying a 4060/4070 laptop, but now I just don't know what to buy… SHAME ON YOU NVIDIA. We should get what we pay for.
You might be able to work around the voltage limit by using the MSI Afterburner curve editor. You will get a lower clock speed, though, but you might be able to get more wattage instead… I don't know how each will perform, so you have to test it out. Try setting the voltage to 0.9V, and make sure to tick constant voltage in MSI Afterburner.
LTT basically said Nvidia is changing the process timing in the GPU when it becomes unstable to keep it from crashing, kind of like latency in memory. Just because it's running at a higher clock speed and consuming more power doesn't mean it's actually faster. They even hacked a desktop 4090 to push it past 500W and found no performance increase above that point.
Here in the EU there might be a case for suing on grounds of "abuse of dominant market position", as DG COMP has done in the past with Microsoft, Apple and Google. Although, given their limited capacity to prosecute, this might be too small a fish.
Jarrod, can you tell us more about the cooling solutions here? Because as far as I can tell, the RTX 4070 and below stop improving at 100W, so they will run cooler in a max-powered chassis. Are the lower temperatures and build quality worth the price? Most temp charts show these cards running at around 60-80°C and the CPU at not more than 90°C, which is extremely nice for longevity.
@@JarrodsTech It would be really nice if you could just post a temperature average for this generation at max load, because for folks like me who live under a tropical sun, that makes a world of difference. For the temps in charts I usually have to add 3-5°C for my location.
Well, if the laptop is properly configured to run the GPU at 140 W sustained (like in that Haven? benchmark) then it will surely be cooler and/or quieter at 100W. So that might be a reason to get a higher tier laptop, even if the 140W is kind of useless to you (at least in today's games)
Jarrod, thank you very much for this video. I am currently considering buying a new laptop, and choosing the GPU is a major headache. Since the 4080/4090 are just unreachable, after seeing your video I can just pick a thin and light 4070 laptop with no doubt. Good job as always!
@@jackali5014 3070 Ti laptops are not cheap either; I have considered them too. It's just that if I buy a new laptop, I want it to be the latest model, not already-outdated hardware. It is what it is 🤷🏻♂️
I'm honestly wondering what this will mean for the equivalent mid-range AMD laptops when they do come out. The efficiency of these chips is awesome, but I can't brush off the feeling that an Advantage Edition situation is brewing in the background.
Definitely smells of artificial product segmentation. Thank you, Jarrod, for all of the effort you put into these excellent videos. I'm seriously considering getting a new laptop this year, so I'm happy to save a buck for a negligible performance penalty.
Finding and identifying this deceptiveness, and filling in the community has been invaluable, thanks. Joined your patreon to support you like I have for GN and HWU. Independent reviewers shouldn't be revealing deception like this, it should be a known fact.
Thank you for this video breakdown! I'm glad I purchased the Legion 7i RTX 4080 32GB 2TB option, because I felt like buying anything lower would be a joke as far as performance. You also posed a question that I had in mind as well: is Nvidia going to release an RTX 4070 Ti? I honestly feel like that's what they are going to do, which would still make me glad I went with the 4080 option. Keep the videos running bro, you're the best at what you do!
No; not all games support this feature, and unless you have the budget for a 4080 or 90 it's pointless to spend an extra 20% for a feature, because the 40 series doesn't have the same raw power the 30 series has. ESPECIALLY when you consider that you can get a 3070 laptop for 4060 or even 4050 money. Plus some of them have 500-nit displays.
@@mehmetgurdal Jarrod made a comparison video between the 4070 and 3070 Ti. While the 3070 Ti was 2% faster on average, the 4070 was about 62% faster with DLSS 3. More and more games are using this feature, and in coming years games are expected to be very demanding with Unreal Engine 5. So DLSS 3 will be necessary.
The RTX 40 series still has a performance gain over the 30 series. All this means is that you should not base your purchase on the GPU TDP. Feel safe buying a lower-powered version, or look for deals on higher-powered ones. It is shady advertising to sell something labeled as something it isn't, and I won't defend that, but less power draw with more FPS is not a bad thing. Nvidia should have just labeled them as 100 watt GPUs and this wouldn't even be a problem. It's a laptop; laptops should be designed to be power efficient.
This is a huge f you to customers, can’t believe there’s such an artificial limiter. Seems like going with laptops really isn’t the way this gen as 4080 or 4090 laptops are expensive and the lower tier specs are handicapped.
@@lil_brumski YouTube ReVanced (Android) or the 'Return dislike button' extension on Windows. I don't recommend YouTube ReVanced though; there are a lot of scam websites that bundle viruses with it.
Forgot to mention in the video, I also tried Cyberpunk with ray tracing on the 4060 to see if maybe using the RT cores would bypass the limit, but it was no different.
I think this policy isn't legal; it's planned obsolescence. Apple did the same thing with the CPU and battery on iPhones (in France that's a crime).
Market Share Ready® for AMD Radeon nom nom 🐊😋
On desktop it's actually the simpler games that consume more, because they have higher GPU occupancy. Ray tracing could actually be counterproductive, as it causes lower occupancy due to its inherently incoherent instructions and memory access patterns. DLSS lowers the internal resolution, which may hurt occupancy too; maybe DLAA would push more.
@Jarrod's Tech Can you please make a review of the Predator Helios Neo and Helios 16 series (2023)?
Hey Jarrod. Is this hard voltage lock on desktop as well? Also what's your theory on this?
I'm thinking board partners were kind of in the dark, or they would not have designed things around higher TDPs and the extra cost that entails. 🤔
Please let's take a minute to appreciate what Jarrod is doing for us here. I salute you, man. I don't know how we would learn about these dirty tricks by Nvidia without you. Thank you so much.
very much agreed
Agreed completely. This is extremely helpful to the public.
I cannot give your comment a whole bunch of likes, but i can give you these 👍👍👍👍👍👍👍👍👍👍
What's with the "take a minute" bs? I'm already taking 8 minutes to watch it.
Excuse me, what is dirty here? They made a power-efficient chip, then sold it to other hardware manufacturers. So the laptop makers already knew how much TDP this chip can consume, and they can market their laptops however they want. And you want to tell me Nvidia cheated or lied to you?
It's the scam of the decade from Nvidia, selling cards claimed to be "140W" when in reality they barely reach 100W, smh. Great video Jarrod
It's even weirder considering people don't buy a laptop for wattage, they just want frames!
On one hand, it's good because these GPUs can achieve the same performance without sucking down as much power as Ampere. On the other hand, the scaling is literal garbage once you hit 90 watts on these, yet they'll still be sold under the pretense of being more powerful than they are with the voltage limit they have.
Thin and lights running on low power definitely have an edge with these cards inside. 😅
I dunno how Nvidia hasn't been sued for false advertising yet.
@@thetrustysidekick3013 because they're a big corporation and they're gonna win the lawsuit regardless. Just how the world works now
So basically Nvidia has deliberately hindered performance so as to not give any extra performance than they decide the consumer should get?
Yes
Yes
Yes
seems about right!
Yes, to make you pay more for the laptops whose model numbers end in the higher digits, 8 and 9.
No one does a better job than you! Thank you for putting in the work you do!
Double Plus Good!
Laptop god
Yuppp!! Fr!!
Tbh I take a totally different approach to reviewing laptops, but I get a LOT of inspiration for my channel from Jarrod.
Unbiased clean info as always, sir
What's the best laptop around 2500 USD for a working professional?
Screen size: 17.3 or 18 inch
Need performance near-about a 3070 Ti, and processor: i7-i9
~Data science and AI graphics
~Computer modelling and networking
~Virtualization
Should I buy this gen, or 12th gen with a 3070 Ti?
I really love these power scaling tests you do. They allow us to get a very good idea of the performance of whatever laptop we might want to buy, and uncover oddities like the one shown here.
Thank you!
big thanks for the power scaling charts
ScamVidia won't like this video.
cry
@@il_principe u seem to be the one crying
@@il_principe bro crying 😂
Bro really said "🥺:cry" @@il_principe
I call shame on Nvidia.
This is amazing information Jarrod❤
finally first on a reply xD does this count?
Wow 😒😮😳
Nshitia has clearly made the 4050 appear as the 4060 and forced higher power limits on it. If this is what next gen is supposed to be, I can do better.
@@Naraayanay Do you need more than 8GB of VRAM? If so you're gonna need something with an RTX 3080TI, 4080, or 4090 Mobile.
Thanks for the valuable investigation. Very informative and useful to help people make an informed purchasing decision. It also confirms my decision to skip anything below 4080 for this gen.
I believe that's exactly what Nvidia wants. They want everyone to buy the more expensive laptops.
@@ttkttk7463 No question. It's terrible TBH. I have my eye on 3080 and AMD 6850M XT right now due to the pricing issues of 4080 machines.
@@darrell9616 Just wait until christmas time. Thats when the specials come out.
@@darrell9616 But I think those CyberPowerPC laptops are somewhat fairly priced. RTX 4080, i9-13900HX, 32GB RAM, 1TB SSD is going for around 2,300. It ain't cheap, but it's better than the rest. I just couldn't find many reviews of it on YouTube.
@@ttkttk7463 You're right. That's the best value going right now for a 40 series machine. I can't find any either, and support from these smaller companies is always a concern.
It's great to have options.
Give this man an award, equal to or greater than that of investigative journalism. He is the only laptop reviewer who appears to bring things like this to buyer/consumer attention. Jarrod has my respect all the way.
He talks like an NPC, doesn't that bother u?
@@kaywonderer Who cares how he talks, as long as the information is real and accurate.
Lol it only bothers You bud @@kaywonderer
@@kaywonderershut up NPC
Hi Jarrod. You are by far the clearest and most data-driven tech reviewer I've ever watched. You deserve way more subscribers!
Thanks for another piece of great coverage, Jarrod. Nvidia is definitely forcing their enthusiast laptop gamers to pay a shitload of money for a 4080 or higher. I'd been considering buying another gaming laptop this year to replace my old Alienware m15 with a 1070 Max-Q inside, but after watching your clips on the laptop 40 series I changed my mind and will get a desktop instead. They just don't deserve it... Shame on you Nvidia.
I think u can also buy AMD ones, or prev-gen laptops which are on sale
For laptops I'd honestly just look at either AMD or intel this generation if you want to buy new.
You shame Nvidia, and after that u r going to go shopping and buy their desktop GPU lol
That's really groundbreaking! They're only allowing higher TGPs in some particular GPU workloads to avoid lawsuits. They only want their 4080/4090 series to reach the maximum performance limits!
@@Naraayanay buy a 3070ti max power.
Thanks for the support! Sorry for the massive delay, YouTube isn't the best at notifying for these!
@@Naraayanay Buy an Alienware on the second-hand market; those laptops have a good cooling system.
I wish you would make an Nvidia 3000 series vs 4000 series video with the same type of graphs (performance per watt and performance per dollar) 😊
Maybe that's why the 4070 and 3070 look so similar: the 4070 is running at 100W while the 3070 is closer to 140W. A true 140W 4070 (if this thing got fixed somehow) would probably show the improvement that we would expect.
It's strange, but I think those higher power limits will serve you best with a slight overclock. For example, on my Legion 5 with the 115W RTX 2060 refresh, going full throttle at 115W gets me nothing other than extra heat; the FPS is the same as running in 80W mode. But if I just add a 115MHz OC to the core, then my FPS jumps up by 10-15, putting it on the same level as a stock desktop 2060. Doing that same OC in 80W mode gets me nothing, maybe 3 FPS at best.
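A quick sanity check on those numbers, assuming fps scales roughly linearly with core clock when the card isn't power or thermally limited. The 1680 MHz sustained clock and 140 fps baseline are hypothetical, purely for illustration; only the +115 MHz offset comes from the comment above:

```python
# Linear fps-vs-clock estimate. Base clock and base fps are made-up
# values; only the +115 MHz offset comes from the comment above.

def fps_after_oc(base_fps: float, base_clock_mhz: float, offset_mhz: float) -> float:
    """Estimated fps after a core clock offset, assuming linear scaling."""
    return base_fps * (base_clock_mhz + offset_mhz) / base_clock_mhz

gain = fps_after_oc(140.0, 1680.0, 115.0) - 140.0
print(f"estimated gain: ~{gain:.1f} fps")
```

On these assumed numbers the estimate lands just under 10 fps, in the same ballpark as the reported 10-15 fps gain.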
It's quite ridiculous that every alternate generation Nvidia becomes super greedy and anti-consumer, gets roughed up in the market, and then responds well with the next generation. At this rate we might as well just wait and see what they do for the 5000 series.
AMD not taking advantage of this gap is equally disgusting
what is intel, chopped liver? the arc cards are good.
AMD is 100% taking advantage of this gap, by joining in. They are making their money.
@@muizzsiddique Market Share Ready® for AMD Radeon nom nom 🐊😋
Sorry, but AMD has nothing that competes with the RTX 4070 at 80W. Their GPUs don't scale very well at extremely low power.
It's incredibly amazing that you point this out, Jarrod. You honestly do the most and I'm here for it! ♥️
That has been known for quite some time now, but no one had done a proper job of testing it and putting numbers next to it. Great job.
That being said, I am pretty sure Nvidia just (mis)used this limiter to restrict performance some more when they found out the cards were still too strong even after cutting down those dies.
Thank you jarrod for exposing nvidia
This has been known for a couple of months; Jarrod didn't expose anything.
@@niebuhr6197 It did for me, so I don't know what you're on about... Stop gatekeeping information about computers, not everyone watches or reads about these things 24/7, most people only game casually and actually have a job to be busy with unlike you.
Wow, seriously you deserve wayy more clicks. This is important stuff and we all thank you for your very hard work! Excellent video, as always! Nvidia has some explaining to do, even though I doubt they will.
That chart comparison blew my mind right away. I'm sticking with my 30 series.
X14 aw 3060 here. Still happy with it.
I swear, even though I'm not a laptop user, Jarrod's analyses always hit home. Such great work.
This is the kind of techtuber that knows their stuff and shares it with viewers: useful, groundbreaking information, not just brand promotion wrapped around not-so-useful information. Good job JarrodsTech
hey thanks
Jarrod'sTech this video is VERY good timing for something I am working on. Your testing will be critical for explaining some decisions companies are making about upcoming products... ;)
Cool, I was wondering if the desktop cards that use the same GPU dies will be subject to this too or something similar.
HELLO TOM!!! 🐊🔥🔥🔥
I wonder if undervolting them would let them hit power limits later and eke out some extra performance.
If you watched the video, you would know that the problem is the voltage limit and not the power limit.
My 4060 hits the voltage limit long before the power limit or thermal limit, in a thin and light (!!). These cards are already clocked pretty high so undervolting doesn't do much since it's not thermally limited. Max I've gone is +250 core and +1250 memory.
In power-virus benchmarks like FurMark, it does reach the power limit before the voltage limit. But not a single game has done that for me.
They aren't hitting power limits though so that's not the issue.
What you'd want to do is overclock: run the GPU at the absolute max +core offset it can handle at that voltage, to use the extra power budget.
@@prajaybasu My mistake. I guess undervolting would let you use more of the power or something. It's a bit confusing.
Lol ppl here really forgot to use their brains.
If you undervolt, you can reach higher clock speeds at the same voltage. Which means more performance.
I know you think undervolting means less power consumption, but that's not entirely true. At a given state, if you only lower voltage, power consumption will go down, sure. But at a given state, if you only increase frequency, power consumption will increase.
So by undervolting, at that max voltage level Nvidia is holding ppl to, you can have higher clocks and therefore higher power consumption.
So OP is right and y'all need to think before commenting
@@tarfeef_4268 Undervolting is holding a certain frequency while feeding the core less voltage, at the price of stability. If it maxes out at 1 V for 1 GHz, you can set it to reach 1 GHz at 800 mV, but you increase instability. How stable you can get depends on chip quality, but you will get lower power consumption for the same performance.
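For what it's worth, both sides of this thread can be illustrated with the textbook CMOS dynamic-power approximation, P ≈ C·V²·f. This is just a rough sketch: the capacitance constant and the clock/voltage numbers below are made up for illustration, not measured from any real GPU.

```python
# Rough model of CMOS dynamic power: P ~ C * V^2 * f.
# All numbers are illustrative, not measured from real hardware.

def dynamic_power(voltage: float, freq_ghz: float, c: float = 100.0) -> float:
    """Relative dynamic power draw for a given core voltage (V) and clock (GHz)."""
    return c * voltage ** 2 * freq_ghz

# Hypothetical stock point: 1.00 V at 2.2 GHz.
stock = dynamic_power(1.00, 2.2)

# Undervolt at the same clock: less power for the same performance.
undervolted = dynamic_power(0.90, 2.2)

# Hold the capped voltage but raise the clock (what the thread suggests
# for these voltage-limited cards): more performance, and power still
# rises with frequency even though the voltage is unchanged.
overclocked_at_cap = dynamic_power(1.00, 2.5)

print(f"stock: {stock:.1f}")
print(f"undervolted, same clock: {undervolted:.1f}")          # lower than stock
print(f"same voltage, higher clock: {overclocked_at_cap:.1f}")  # higher than stock
```

So both replies are right about part of it: lowering voltage at a fixed clock cuts power, while raising the clock at a fixed (capped) voltage raises both performance and power.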
Nvidia for sure did this to block laptop companies from making a lower tier card run faster than a higher tier one, like what Lenovo was doing with the 30 series laptops. Now if you want a faster GPU you have to buy the next tier up, instead of a lower end card running at a higher wattage than the higher end card.
Yes, unfortunately. I've seen reviews where the Legion's RTX 3060 beat notebooks with an RTX 3070.
10 mins ago I was thinking it's worth paying extra for a 140W 4070 :-D Now it's between a slim 4070 (100-120W) and a 4080 (170W), although those laptops are noticeably bigger and cost a kidney. I wonder which is better fps per buck: the 4070 (low wattage) or the 4080 (full power). Thanks for the awesome reviews!
Why would you want a 4070 when it's slower than the 3070 mobile in most games?
Jarrod just saved a lot of people hundreds of dollars.
*Thousands*
Jarrod you are a life saver, I initially wanted to get an M16 with 4060/4070, but after this, I'm going for last year's RTX 3070ti.
I went with 3070ti as well. WORKS like a beast
What laptops are you using for that GPU?
Pretty solid data for a class action lawyer to pounce on. The TI theory is plausible.
Great confirmation in actual games of the good old leaks, which said AD107/AD106 don't scale past ~105W. Now waiting to see if desktop cards with these chips (4060 and 4060 Ti) have the same problem.
Linus tested boosting the 4090 to use over 600W, and the performance was barely improved even with voltage regulation disabled. I think this might be an architecture limitation.
Desktop chips won't have such low voltage limits; they can push 1.1V and 3GHz...
Quite interesting.. so now the
I think Nvidia and the manufacturers don't want to see more laptops on the market where, for example, a 3060/3070 beats a 3070/3080 because of the higher power limit of the 3060/3070.
Well most laptop brands are still using last-gen cooling designs and power inefficient Intel H/HX CPUs in 15.5-18 inch SKUs.
If they'd taken their time and prepared more 15.6-18" models with vapor chambers in time for Zen 3+/Zen 4 Ryzen supply to catch up, then we would have *A LOT* more designs with 8-16 "P-core" CPUs running at full tilt and dGPUs dynamic-boosting 14-33% higher, to 185-200W TGPs instead of the same 150-175W limits.
Just look at the 25-39% jump in TGP wattage in 14inchers from 75-100W last gen to 95-125W this gen. 🤯
Nice shirt Jarrod, my first concert was COB lol
🤘
This is top tier content, thanks for all the effort and time put into this investigations and benchmarks
Looks like Nvidia fumbled the bag here
At 6:41
The RTX 4060 at 35W gets a higher score than the RTX 3060 at 60W.
The RTX 4060 at 60W scores higher than the RTX 3060 at any power level (even at 140W).
Yeah there's no doubt the newer stuff is more efficient, just seems weird that it hits a wall before the advertised power in games.
Wow this was a really interesting find, Jarrod! I wonder if this will blow up and force Nvidia to unlock the power that people thought they were paying for.
Market Share Ready® for AMD Radeon nom nom 🐊😋
Thanks Jarrod, it's unbelievable the huge work testing this, every game, at ALMOST EVERY WATTAGE, in 10W increments?!?! YOU'RE THE REAL MVP. For real, thanks.
Me and my wallet are thanking you Jarrod! Well explained!
145W 3070 Mobile or RX 6800S/6700S laptops deals here I coooooome!!!! 😁
@@handlemonium I hope to get a 3070 TI at 140ish watts. I don't care about dlss at all.
@ElBandito_Gaming Did you get your laptop? How is the 3070 Ti at 150W running modern games? How are the temps and cooling? Also, which laptop did you get? I am looking to buy a Lenovo Legion 5i Pro (i7 12700H, RTX 3070 Ti) instead of a 40 series laptop after watching a few of Jarrod's revelation videos.
@ Check out the Lenovo Legion subreddit. It seems like they dropped the quality, and more and more laptops are dying for no reason. I went for an HP Omen 17 with a 4090 (got a very good deal on it).
I'm guessing this was their solution for all the overheating problems earlier. Faced with a massively expensive recall, they chose to secretly limit them. Well, phone makers did and do this too, for the same reason.
I think the benefits of the 4070 are in thin and light laptops; a lower wattage 4070 can provide great performance in a 17mm thin chassis. The Book 3 Ultra, Yoga 9i Pro, and Asus Zenbook 14 Pro/16X for 2023 are the only laptops I'd consider with a 4070 series GPU.
Please do a Lenovo legion Pro 5/5i 4060 vs Legion 5 Pro 3060 gaming comparison
I'm glad I got a RTX 4080 laptop (ASUS Strix Scar G18), as that's the only RTX 40 series GPU alongside the RTX 4090 that I think are worth upgrading to if you're a few generations behind - I was upgrading from a GTX 1050Ti, so safe to say, the difference is huge.
But the cost difference is huge... even against a 4070-equipped laptop.
nVidia is just taking your money.
If you come from a GTX 1050 Ti you will notice a difference even with a 3070 laptop, or just get a 90W 4060 laptop if you want more portability.
Asus definitely has the best deals in today's market.
But they have a bothersome issue: coil whine. I want to buy a Scar G16 but this issue is pushing me back; the closest option is from MSI, and it has a design stuck in the 2000s, with a price tag almost double the G16's.
You must feel the burn of seeing $999 laptops with 4060 hanging alongside your bad investment
@@eliubfj Or the 3060 still cost $999.
I don't know what you're talking about, there are clearly 1-frame differences that make all games 1% more playable 🤓🤓
Lol
Really curious how well they would do with MSI Afterburner tuning; undervolting and overclocking may actually let them make the most of that wattage. Just food for thought.
Awesome analysis and explanation!
Thank you so much Jarrod for this information! Best laptop reviews for sure!
Could you also test the performance difference on video editing tasks like Davinci Resolve?
Appreciate your work!
this
What a banger! Also waiting for your 2023 G14 review.
The data from these tests is amazing, thanks for all the work you do! It's very sad that the RTX 4050 - 4070s perform this way.
damn, this kind of extensive testing is what I'd hoped LTT would be doing
Labs isn't fully set up yet although they will probably focus more on PCs than laptops anyways
I think LTT made a video today about testing the RTX 4090 at 1000W that reached the same conclusion as this one. They got the same average fps using the Asus ROG 4090 with an 80W difference (360W vs 440W).
Great video, but disappointing to see no focus on whether a manual core overclock or a full voltage/frequency curve sees a significant performance uplift or higher sustained power draw in gaming loads. Being limited by VREL means there's power available, but the voltage itself, or rather the lack thereof, will not allow *GPU Boost* to go any higher within its default boost range. That doesn't mean manual tweaking isn't viable. TSMC's new node is MILES more efficient than Samsung's, and the fact that Nvidia was so confident in it that they just artificially limited the available voltage to reach their minimum expected generational uplift or target performance means there should be massive overclocking headroom. What's the minimum voltage required on these laptops to sustain stock boost? It's probably much, MUCH lower than the VREL point. How much can you push the final voltage point when overclocking? I wish you'd considered going into this, as it's the endpoint of this question/revelation that increasing power brings no improvement.
You sound like an Nvidia engineer
Thank you very much for letting us know these findings !
The good thing out of this, is that if you have a cooler that can cool a 140W GPU, it will be very cool and quiet with only 100W needed.
That was exactly my reasoning for getting a 4070 laptop: to get the most fps out of an affordable and quiet machine. Plus hopefully it will live longer due to being run colder, but we'll have to see in a couple of years.
Yeah, 4080 and 4090 would crank out even more frames but those laptops usually cost at least double - this is where I draw the line of hunting for efficiency xD
Exactly like Apple reduced performance in their older models so that people got frustrated, believed their iPhones were getting slower with new demanding titles, and switched to the newest iPhones.
Nvidia never fails to disappoint.
Excellent work sir. I was specifically looking for the 4050, as I haven't tested it much. S+ tier video as always
Nvidia has done great convincing you to strip your wallet on more costly laptops and GPUs.
Wow, if not for Jarrod no one would ever have noticed that; it would have gone uncovered, with people struggling with performance.
This is why Jarrod is the best laptop reviewer on YouTube
Crazy situation. Thank you for covering it!
6-8GB VRAM GPUs are not gonna last long.
Well upscaled 900p on a 12-14.5" screen looks fine and upscaled 1080p is serviceable on 15.6-18" displays.
For lighter laptops, the RTX 4050/4060/4070 still make sense. And whereas it seems that the 4070 is only a minor improvement over the 4060, there seems to be a huge difference in efficiency. 60fps in 1440p seems to require only roughly 57W for the RTX 4070 (min 4:02), but 85W for the RTX 4060 (min 3:28): a 50% increase (= 33% less efficient)! The RTX 4080/4090 are more efficient still, but their capabilities and minimum power consumption do not make them viable candidates for light laptops that are meant to last on the battery, especially if you intend to cap power consumption to
Agreed.
Laptops vs Desktop have two very different uses imo.
The RTX 4070 Laptop is an excellent GPU for mobility, and it performs great.
Jarrod you are a legend! Appreciate and love your work 💙
Why do companies do this?? 😠😠 I was thinking of buying a 4060/4070 laptop, but now I just don't know what to buy... SHAME ON YOU NVIDIA. We should get what we pay for.
A quick thought on this; does the voltage limit mean that undervolting the GPU might actually unlock more performance?
Remarkable research. Thank you Jarrod.
You might be able to ignore the voltage limit by using the MSI Afterburner curve editor. You will get a lower clock speed, though, but you might be able to get more wattage instead... I don't know how each will perform, so you have to test it out. Try setting the voltage to 0.9V. Make sure to tick 'constant voltage' in MSI Afterburner.
LTT basically said Nvidia is changing the processing timing in the GPU when it becomes unstable to keep it from crashing, kinda like latency in memory. Just because it's running at a higher clock speed and consuming more power doesn't mean it's actually faster. They even hacked a desktop 4090 to go over 500W and found no performance increase above 500W.
Does this hidden voltage wall warrant a class action lawsuit? 🙃
Here in the EU there might be a case for suing on grounds of "abuse of dominant market position", as DG COMP has done in the past with Microsoft, Apple and Google. Although given their limited capacity to prosecute, this might be too small a fish.
You rock my friend as always. Great review. Best wishes from Türkiye.
Why does the 4070 mobile have less VRAM than the desktop's 12GB?
Looks like the RTX 3060 mobile with 6GB versus the desktop with 12GB.
It's a shame they don't have the same amount of VRAM.
To force the user to upgrade more often
Even 12GB of VRAM on the desktop part is too low; it should've come with 16GB minimum.
And probably because of increased power consumption and board space too
@@qusaifarid513 you are crazy
Awesome work as always. Thanks. Can you please do same test with FG ?
Jarrod, can you tell us more about the cooling situation here? Because as far as I can tell, the RTX 4070 and below stop improving at 100W, so they will run cooler in a max-powered chassis. Are the lower temperatures and build quality worth the price? Most temp charts show these cards running at around 60-80°C and the CPU at no more than 90°C, which is extremely nice for longevity.
I didn't actually monitor the temperatures at different levels here, but below 100w I'd assume they're cooler
@@JarrodsTech It would be really nice if you could just post a temperature average for this generation at max load, because for folks like me who live under the tropical sun, that would make a world of difference. For temp charts I usually have to add 3-5°C for my location.
Well, if the laptop is properly configured to run the GPU at 140 W sustained (like in that Haven? benchmark) then it will surely be cooler and/or quieter at 100W. So that might be a reason to get a higher tier laptop, even if the 140W is kind of useless to you (at least in today's games)
You just earned a subscriber. Keep up the good work.
Era of thin gaming laptops 😎 Praise zephyrus lineup 🥳
Watching this from my 2022 zep g15
It almost sounds like Nvidia is making the same graphics card but selling it as different SKUs with different voltage limits.
Jarrod, thank you very much for this video. I am currently considering buying a new laptop and choosing the gpu is a major headache, since 4080/4090 a just unreachable and after seeing your video I can just pick a thin and light 4070 laptop with no doubt. Good job as always!
Don't buy 40 series laptops, don't support Nvidia; if you must have one, get an RTX 3070 Ti laptop.
@@jackali5014 3070 Ti laptops are not cheap either; I have considered them too. It's just that if I buy a new laptop, I want it to be the latest model, and not buy already-outdated hardware. It is what it is 🤷🏻♂️
@@n1ce-0 Just don't buy a 4070, it's a meme GPU
The 8GB 4070 is already limited by its VRAM, man. You will have to lower settings (low-medium) at 1080p in new games.
Absolutely fantastic video & research mate. Keep it up! :)
I'm honestly wondering what this will means for AMD equivalent laptops at the mid-range when they do come out. The efficiency of these chips is awesome, but I can't brush off that this feels like it's an Advantage Edition situation brewing in the background.
Man, such a detailed video... hats off man 🎉
Definitely smells of artificial product segmentation. Thank you, Jarrod, for all of the effort you put into these excellent videos. I'm seriously considering getting a new laptop this year, so I'm happy to save a buck for a negligible performance penalty.
Really quality info, Jarrod, thank you
3:30 showing results of 4060 in QHD instead of FHD is kinda scummy
Thank you Jarrod for your valuable insights
I can't imagine the amount of time it took to research all of this. Incredible work sir. Thank you!
Finding and identifying this deceptiveness, and filling in the community has been invaluable, thanks.
Joined your patreon to support you like I have for GN and HWU. Independent reviewers shouldn't be revealing deception like this, it should be a known fact.
Thank you for this video breakdown! I'm glad I purchased the Legion 7i RTX 4080 32GB 2TB option, because I felt like buying anything lower would be a joke as far as performance. You also posed a question that I had in my mind as well: is Nvidia going to release an RTX 4070 Ti? I honestly feel like that's what they're going to do, which would still make me glad I went with the 4080 option. Keep the videos coming bro, you're the best at what you do!
Got this info from other reviewers as well. Thank you for making a full video on this topic.
No problem, the more people who know the better!
-High Powered- RTX 4050/4060/4070 Laptops Are a Waste
Thing is, you get DLSS 3 with the 40 series. And that's a big factor in choosing a gaming laptop these days.
No; not all games support this feature, and unless you have the budget for a 4080 or 4090, it's pointless to spend an extra 20% for one feature, because the 40 series doesn't have the same raw power the 30 series has.
ESPECIALLY when you consider that you can get a 3070 laptop for 4060 or even 4050 money.
Plus some of them have 500-nit displays.
@@mehmetgurdal
Jarrod made a comparison video between the 4070 and 3070 Ti. While the 3070 Ti was 2% faster on average, the 4070 was about 62% faster with DLSS 3.
More and more games are using this feature, and in the coming years games are expected to be very demanding with Unreal Engine 5. So DLSS 3 will be necessary.
This has grounds for a lawsuit. I suggest people also share this info with big channels like Linus etc. so everyone sees Nvidia's scam.
The RTX 40 series still has a performance gain over the 30 series. All this means is that you should not base your purchase on the GPU TDP. Feel safe buying a lower powered version, or look for deals on higher powered ones. It is shady advertising to sell something labeled as something it isn't, and I won't defend that, but less power draw with more FPS is not a bad thing. Nvidia should have just labeled them as 100W GPUs and this wouldn't even be a problem. It's a laptop; laptops should be designed to be power efficient.
RTX 3060 vs 4060 vs 4050
This is a huge f you to customers, can’t believe there’s such an artificial limiter. Seems like going with laptops really isn’t the way this gen as 4080 or 4090 laptops are expensive and the lower tier specs are handicapped.
Depends. A few weeks ago you could get a Legion 7 with a 4080 for around 2200 dollars.
497 likes vs 3 dislikes
The 3 dislikes must come from Nvidia or an OEM
You can see the dislikes?
@@lil_brumski YouTube ReVanced (Android) or the 'Return dislike button' extension on Windows. I don't recommend YouTube ReVanced though; there are a lot of scam websites that bundle viruses with it.
Great video, appreciate the efforts!
They better "fix" this, or this is material for a class action. Thanks Jarrod, you're a hero.
According to them, nothing is "broken"
That's awesome information that nobody knew till now.
Well done Jarrod