Find out how the ASUS ROG Ally runs with RTX 4090 next! ruclips.net/video/gNlgrxNlt7E/видео.html
DOES IT HAVE A USB VERSION
Do I have to use this gpu or can I find a cheaper one
this is not a usb c but a thunderbolt cable
newer In Flames sucks bro
Would like to see in your analysis the OCuLink or PCIe x4 eGPU instead of Thunderbolt as it has more bandwidth. 😊
Welcome to the new episode of which I can't afford
I can afford that but the price/performance is so bad.
Come back on Black Friday (better be quick when that day comes).
@@alanharperchiropractor exactly, the 80 series is even worse: 75% more over the 3080 10GB, a 95% price increase in Europe, and for that you're getting 50%-60% more performance from the 4080 16GB. Jensen is seriously taking the piss.
Even if you can afford it, it still won't be worth the bucks.
@@alanharperchiropractor yep, the 3060 is the most cost-effective GPU btw
I believe the main reason for the performance discrepancy between the two laptops is the way Thunderbolt 4 is implemented. The i7-1260P has TB4 built into its main CPU die, while the i9-12900HX does not have native TB4, so it requires another IC, the Intel Maple Ridge TB4 controller, which may be causing problems
Here with the HX CPU, because of the discrete Thunderbolt controller, the data flow has to go through the motherboard chipset first and thus has to compete with other things like storage or wifi. In-CPU Thunderbolt implementation is definitely the way to go to have the best eGPU performance as it means the eGPU is pretty much directly connected to CPU PCI Express lanes.
TB4 has not changed bandwidth with respect to TB3. It still has 4 lanes of PCIe 3.0, while the 4090 is a 16-lane PCIe 4.0 card, or 8 times more bandwidth. That's why it runs like crap
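Rough numbers behind that 8x figure, as a quick sketch (assuming the usual ~0.985 GB/s per PCIe 3.0 lane and ~1.97 GB/s per PCIe 4.0 lane after encoding overhead, and ignoring Thunderbolt tunnelling overhead, which cuts the eGPU side down further in practice):

```python
# Back-of-the-envelope PCIe bandwidth comparison for the eGPU discussion above.
# Per-lane throughput after encoding overhead (approximate, GB/s).
PCIE3_PER_LANE = 0.985   # PCIe 3.0: 8 GT/s, 128b/130b encoding
PCIE4_PER_LANE = 1.969   # PCIe 4.0: 16 GT/s, 128b/130b encoding

tb_tunnel = 4 * PCIE3_PER_LANE    # TB3/TB4 tunnels a PCIe 3.0 x4 link
desktop   = 16 * PCIE4_PER_LANE   # desktop RTX 4090 slot: PCIe 4.0 x16

print(f"TB3/TB4 eGPU link : ~{tb_tunnel:.1f} GB/s")
print(f"PCIe 4.0 x16 slot : ~{desktop:.1f} GB/s")
print(f"Ratio             : ~{desktop / tb_tunnel:.0f}x")   # roughly 8x
```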
Probably, I just assumed they were the same as the Intel Ark page didn't really make it clear that there were Thunderbolt differences, so this seems to be a downside for HX.
@@JarrodsTech quite a shame you went through all the trouble with the problematic HX instead of an ideal, higher TDP laptop such as a 12700H, 12800H or 12900H
@@AdriMul Well said!
I can confirm that resizable BAR is an issue for Thunderbolt eGPUs. Under Linux I've had to outright disable it in order to increase eGPU performance. Fortunately with Linux I can set kernel boot parameters to enable me to do so regardless of BIOS settings or anything. I have no idea if Windows has any similar options or functionality.
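Not sure about Windows either, but on Linux you can at least check which devices advertise the capability before touching boot parameters. A minimal sketch, assuming `lspci` from pciutils is installed (run it as root so the capability list isn't hidden; the exact wording can vary between pciutils versions):

```python
# Sketch: list PCI devices that report a Resizable BAR capability by
# scanning `lspci -vv` output. Device header lines start at column 0,
# capability lines are indented under them.
import subprocess

out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

device = None
for line in out.splitlines():
    if line and not line[0].isspace():        # new device header
        device = line
    elif device and "Resizable BAR" in line:  # capability line under that device
        print(device)
        print("   ", line.strip())
```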
considering moving to Linux rn, since my company just ditched Office and is using Google Sheets. Like, how many hours are needed to set it up? Does SteamOS have any major problems using an eGPU with USB 4?
Is it possible to run the eGPU with the old drivers without resizable BAR?
@@alexamderhamiltom5238 Depends, any distribution with a graphical installer should take less than 30 min to install. If it's your first time trying Linux I reckon Ubuntu would be a good option
This is great, but I wanted to see a benchmark comparison before and after the eGPU on the same laptop as well.
It would be horrendous with the smaller laptop without a dedicated graphic card.
just search up intel hd graphics 770 or something like that, and you will get the results
Jarrod is just wonderful. All those testing. Massive results.
I'd be most interested in testing the 3070 Ti with the i7-1260P compared to the 3070 Ti in the tower. I think this is likely true for 95% of others watching this video, as very few want to drop stupid money on a 4090. Something even more common, such as a GTX 1080 or 1070 eGPU i7-1260P / tower comparison, would be even better.
Agree. 4090 as eGPU is simply stupid.
Most people want an eGPU not because they want the ultimate performance, but because they don't want to buy both a laptop and a desktop.
I have a 1240P with a 3060 eGPU, it's pretty fine. But a laptop with a 3060 can outperform it I guess
@@bogdand.4987 is it possible to connect an eGPU (RTX 3080) to my laptop, an MSI GP62 6QF with Type-C 3.1?
USB-C looks identical to Thunderbolt ports. If it's just USB-C 3.2 it won't work. It has to be a Thunderbolt port.
Exactly. A comparison of "eGPU vs dGPU" with the "same" card is the comparison I want. I don't need gaming on the go as much but I would like to use the same laptop for everything and connect it to the eGPU for gaming and 3D modeling at home.
Jarrod, I bought a Razer Core X several yrs ago as a gamble to boost up my outclassed old i7 7700HQ Omen15 1060MaxQ, using an RTX2060. It benchmarked at approx 95% of the average desktop 2060, and was amazing tbh. Maybe pushing the limits, it was upgraded with a 3060Ti last yr, and STILL benchmarked 95% of desktop average for that card, and (with CPU undervolting, GPU overclocking, & 16GB RAM) I can still play competitively with my ancient 7700HQ laptop, e.g. 105fps on Battlefield 1 ultra. But, I do think this is the limit for TB3/4, and my next PC will have to be a desktop build, I can't see eGPU being a viable use for an expensive RTX4-series graphics card on any laptop. RIP eGPU beyond RTX3060Ti.
Lower-tier GPUs use less bandwidth, that's why. TB3/TB4 is limited to 40Gb/s, which lower-end GPUs will typically cap out at anyway, even in a normal PCIe slot in most desktops. They just don't really use the rest of the throughput. The 4090, though, DOES utilize more than 40Gb/s because of how fast and data heavy it is, which is why it performs so poorly compared to a normal desktop setup. Sadly, Thunderbolt 5 with 80Gb/s will need to be released for eGPUs to become viable again because of how bandwidth dependent newer GPUs are compared to cards from 2015-2017 (when the Core/Core X was in its heyday)
Jarrod is a tech scientist at this point. Running various experiments to advance the gaming life
Thanks for the amazing vid.
EGPU's have never stopped gaining my interest, I just wish they got better with new tech.
Just as an fyi, Halo Infinite needed to have ReBar turned off in Nvidia Profile Inspector for driver 522.25 for it to work right on my eGPU 3070. I assume that's the case here alongside Watch Dogs: Legion. With my 3070 I get 70-90 FPS at 4K and 130-160 FPS at 720P, so more graphics horsepower like the 4090 has would have definitely made the eGPU shine as it's not that bandwidth limited.
Nice info, I will remember this when I get an eGPU setup in the future
@@naufalkusumah2192 yeah if you need instructions I posted them in the eGPU sub, just search it up as my comment kept getting deleted when I posted the link
@@omegamalkior1874 In Reddit?
@@omegamalkior1874 I’m curious about that, if there’s a link or a place to find that info let us know
@@killertruth186 yes
Basically, if you can afford a 4090, you can afford a PC.
Common sense bro
you can afford both a PC and a laptop
@@GewelReal honestly... the biggest issue with a 4090 is the cost because i can buy a pretty ballin gaming laptop for 1600
@@wnxdafriz yet the 4090 desktop will be twice as fast as a 3080ti laptop, if not more.
Affording a pc is not the issue, portability is. The reason people pay more for a laptop... If I chose to build a pc I can get so much more real bang for the buck. People don't buy gaming laptops to save money.
Shocker, eGPU's are still a terrible buy if you put anything above a low to mid range card in it. Not changing anytime soon either, seeing as TB4 has the same bandwidth as TB3.
If you put a low-end card in it, it's also not that worthwhile because the eGPU enclosure costs as much as a low-end card (I believe?). Honestly, I believe the best application of an eGPU is for other workloads that depend less on the PCIe bandwidth, such as compute, rendering or video editing. Sadly, it seems that gamers predominantly use it instead.
Except with the Xg Mobile from Asus 🙃
eGPUs are a lifestyle where you throw out performance per dollar and will never be considered for budget builds
@@sparkz6381 Also throw out raw performance as well
Unless you use PCIe 4 or 5 SSD slots...
Or disable resizable BAR.
I am surprised no one has yet made an eGPU that can combine 2 TB ports for double the bandwidth, as many laptops have two TB ports now?
Yeah. I think it's the fact that the higher end laptop has a dedicated GPU. I remember in the early years of Thunderbolt 2/3 it was easier to install and run an eGPU on an iGPU-only laptop vs a laptop with a dGPU.
The question here is, why don't you perform the tests with USB 4.0 (40Gbps)? That's the top tier in the eGPU market now.
I've looked all over for the USB 4.0 eGPUs and I can't find any. Where are they?
Or with oculink egpu setups
You mean TB4
I use the eGPU (Dell XPS 9710 with a 4060 Ti 16GB eGPU) when I'm running a batch of denoising in Lightroom... it makes a world of difference. Of course my RTX 3090 with Intel's 13900K desktop is faster.
Very informative video. I'm using an eGPU for work coz I don't own a desktop. So it would be interesting to see the rendering speed using Twinmotion or Lumion. To explain, I'm an architect by profession. My laptop goes around with me from office, client presentations, job sites and home. And I have an eGPU for office and home.
Similar here.
A better comparison would be testing different eGPU setups to see which graphics cards are pointless or most efficient given the bottleneck. For example, does a 2070 or 3090, or even a 1070 eGPU get similar framerates given the Thunderbolt bottleneck? Also, you mention USB4 is faster, so does this mean that the upcoming GPD Win 4 with USB4 will have better Thunderbolt bandwidth and better performance with eGPUs?
Timestamps
01:44 - Comparison of connecting RTX 4090 to a laptop vs desktop
03:28 - Comparison of CPU performance between different laptops and desktop for eGPU setup
05:12 - eGPU setup comparison between laptops and desktops
06:56 - RTX 4090 gaming laptop performance limited by Thunderbolt bandwidth and processor
08:40 - Performance discrepancies with different hardware configurations and eGPU setups.
10:24 - Desktop setup outperforms laptop in gaming performance due to CPU and Thunderbolt limitations.
12:08 - RTX 4090 eGPU performance on gaming laptop compared to desktop.
13:50 - Considerations for RTX 4090 eGPU setup
Can you please test this with Blender benchmarks, and 3D workstations? Thanks
Would love to see the same test as I'm considering an eGPU for blender
Thunderbolt is far too limiting for an eGPU, I think the 2080 would be the most I would pair it with, and even then only if your laptop doesn't have a half decent video card. I wonder if the iGPU in the HX was interfering with it. The HX line uses the older Intel HD graphics. I would try disabling it via Device Manager.
Yep. iGPUs were always a terrible idea unless you wanna connect an ultrabook (like LG Gram or something) to a mid-range GPU. But even then some bandwidth would be lost.
Also, hi!
@@tabalugadragon3555 hey Sergei! It has actually reminded me. I have the Razer Core and Alienware Amplifier. I should sell those oversized paperweights, lol
Would be interesting to test again with an external monitor on the eGPU instead of sending video data back to the laptop. Considering the eGPU would normally be used in a desktop scenario, and most people use an external monitor at home/work with the laptop anyway.
Isn't that what he said he did?
If there's a way to do it, it would have been interesting to see one of the worst performing games for the laptops in a setup where the desktop is using the eGPU, making it possible, for example, to check the impact of resizable BAR, or maybe to use it as a best case scenario.
I got the impression an EGPU works best with laptops without dedicated graphics.
Hey Jarrod, the 12900HX was performing worse because 16 Gen 4 lanes were connected to the 3070 Ti, leaving the i9 with 4 to spare, of which 2 lanes are for the storage. That leaves us with 2 lanes of Thunderbolt for the 4090, while the 1260P had 4 Gen 4 lanes on Thunderbolt
Is this really the explanation? I didn't see reduced performance on my Spectre 16 with RTX 3050 vs my Spectre 14 when I used my eGPU 3070 with it, tbh, which should've shown the same issue if that were true
@@omegamalkior1874 the 3050 laptop only has a max of 8 gen4 lanes
@@omegamalkior1874 also the 3070 is a weak gpu compared to 4090
Wondering how much of a gap there would be with something more like a 4070 Ti, 7900 XT (or even more so, a 7800 once available). Going all out on the top of the top with an eGPU clearly doesn't make sense at the moment, but depending on how much performance loss there is for the more attainable, upper midrange products, this may still be quite interesting.
Context: considering waiting for the next generation of Frameworks (which should hopefully include the Thunderbolt bandwidth upgrades you mentioned) and looking at a viable eGPU for an above-midrange solution. Coming from an Omen 15 4800H/1660Ti.
Now try this test again with a modern laptop that has PCIE 4 M.2 x4 slots with a GEN 4 eGPU dock.
For better performance you could hook up directly to the PCIe 4 SSD slot...
It carries none of the bottlenecks of Thunderbolt and should get twice the bandwidth, assuming you can get it to work?
But what if you're connecting external displays to the eGPU and not running through the laptop screen? That's what I'm keen on, can I run my laptop as a CPU / OS and everything else connected to a dock and eGPU via TB3/4? Anyway, maybe a test for next time?
That's what he misses in this video: the bandwidth seems low because half the bandwidth is going to feed the laptop screen, which also adds more load to the CPU. If he was using an external display he'd be getting the full 4GB/sec of bandwidth
Good review. I'd thought about having an external GPU with an HX CPU, but not based on these results. Looks like an HX with a 3070 or 3080 is in my future. Depends on Black Friday pricing.
Please compare content creator workloads: octane render (octane benchmark), redshift render, blender… Egpu 4090 with laptop vs desktop and 4090
I'm really interested to see how this plays out next year with Zen 4 and 13th gen laptops, ESPECIALLY Zen 4 Dragon Range which focuses on CPU performance and is made to pair with another GPU. I'm thinking a possible 50% jump in performance in a setup like this.
Dragon Range will not be available for at least 5-6 months though, so you're gonna have to wait quite a long time. At least the 13th gen CPUs for laptops are supposed to launch in Jan 2023
@@ucle9955 In fairness, most of the heavy lifting -- if not all of it -- is already done for Dragon Range.
Hey, just bought a GPD Win 4 and enjoying it, but thinking for the future about docked usage with an eGPU OCuLink setup. Do you think a 4090 or 4090 Ti would be worth it in that case? And on that note, if it is, would a 4090 Ti fit in that case?
I heard that a PCI-E M.2 connection is more direct and has less overhead than using Thunderbolt. It would be interesting to compare with this connection as well, who knows. Of course with a 4-lane PCI-E setup, as there are plenty of 1-lane setups on the market.
I think oculink does that. No usb4/thunderbolt overhead. A very janky setup but has a better compatibility with most games.
Would be interesting to compare against a M.2 4.0 PCIE adapter.
Yeah, then we can gauge the pure performance falloff without CPU interference from the desktop.
I believe PCI-E 5.0 x4 M.2 will benefit these types of GPU a lot, like quite a lot, since it's basically PCIe 4 x8. That is more than enough.
@@XxGAMERxXPS3 100%. Who knows how long it'll be before we see PCIE 5.0 M.2 adapters and (more importantly) laptops with PCIE 5.0 M.2 slots though?
With how cheap used GPU's are I decided to just build a gaming PC (5700 XT and i3-12100). Only cost about $600 and should beat the performance of most EGPU setups (assuming no RT).
@@vinyfiny There are laptops with PCIe 5.0 M.2. I believe MSI has one. Give it 1-2 more years and we will see more laptops with PCIe Gen 5. Hopefully we get a USB5 or whatever TB with Gen 5 x4.
Maybe the best implementation of eGPU is the in-house one, e.g. the ROG Flow, where I recall that using Thunderbolt alone reduces the performance by much more than using the full XG interface
I'm confused. These 4090 egpu benchmarks (specifically for red dead 2) are far lower than your 3080 egpu benchmarks from your previous video. I don't see how this test could be accurate if the 4090 is performing worse than the 3080. Both were tested at High settings at these same resolutions.
This is why I tested multiple laptops, it's the way it is. It's probably because the older gen laptop had thunderbolt on die rather than separate.
Wonderful video, Jarrod. Can you also test the eGPU + laptop performance on Blender, Redshift rendering? I currently have the Legion 12700+3060, wondering whether to buy an eGPU to boost my laptop or to sell my laptop and buy a whole new 4090 setup. Thanks bro.
What about using an eGPU for laptops for rendering and other 3D applications?
Would have been nice to see you using an M.2 slot for the eGPU as well because the speed would be way faster. No one I have seen has done it this way while also using a 40 series GPU. Would be nice to see the results compared to the 7950X etc.
And connect eGPU to the external display to reduce overhead.
Thank you man for uploading
Hey Jarrod!
Can you try comparing egpu performance between midrange RTX cards and Arc cards?
Arc is pretty trash and no one is really going to be able to get one.
Actually ARC dGPU laptops are in abundance but no one is really going to buy one
2 years later, what the hell is going on :/
4:00 It should not be legal to call something TB3 or TB4 when the speed is whatever it is depending on implementation. As a user, I can't test it, I have nothing faster to test it with. Same as HDMI speeds.
Isn't the point of standards that we can trust them?!
Thanks for testing these things.
Question: is there a way to mod an eGPU to combine 2 TB ports for double the bandwidth?
so glad I watched this video! Thank you.
I was just about to pull the trigger on an ultrabook with thunderbolt and get an eGPU for desktop use.
Now I can see it's not worth it yet. A gaming desktop and a cheap laptop or tablet would be a better use of money.
Hi Jarrod, please can you test the eGPU with content workloads. 🙏🙏 looking at an eGPU for rendering.
I took the liberty of posting this to r/egpu. There's some discussion there already about 1 or 2 things in the analysis. First is about Halo Infinite having different behavior with bugged driver versions (522.25, this appears to have been discussed in Jarrod's Discord). Second is about the HX variant of the 12900 mobile CPU apparently having both the PCH and Thunderbolt controller separate from the CPU, unlike the 1260P and (apparently) 12x00H (no X, such as the 12900H). The latter could explain the boatload of results where the 1260P came out ahead of the 12900HX, and is a situation that dates back to Ice Lake, where the integrated PCH and controller brought a noticeable improvement for eGPU use.
Imagine if this is how future GPUs will look.
Hope they are not huge like this and use less wattage.
Hello Jarrod, if it isn't too much trouble
Could you do the same tests, but with this year's Zenbook to see how the bandwidth improvements improved the gameplay
I don't have a laptop or a PC but I wanna see if ROG will be making a portable 4090 like they did with the 30 series that you can plug into their laptop
Do you have an M.2/PCIe eGPU connector? Would it perform better if you used it? Or is Thunderbolt superior on laptops?
It would do extremely well on the price/performance ratio if using a GTX 1080, I believe.
@6:36 Thanks for the clarification that the eGPU with RTX 4090 is even worse than the RTX 3070 Ti laptop GPU, otherwise I was under the wrong impression that if the 4090 desktop is 4x > the 4090 eGPU, then it must be at least 10x better than my RTX 3070 Ti laptop GPU.
Interested in 4070ti performance on eGPU! But I guess it's definitely worse than 4090
Can you try this setup again with DLSS3 and Frame Gen enabled? Maybe even include comparisons with other 4000 series cards? I saw on some boards that frame gen compensates significantly for the drop with TB3/4 bandwidth.
Would love to see that with an AMD machine.
12th gen Intel is faster
AMD is vastly more efficient tho
Does AMD support Thunderbolt tho🤔
@@ezrapierce1233 Some of the laptops do now with USB 4.
@@ezrapierce1233 USB 4 don't discriminate, they just penetrate.
You don't really need a 4090 for the eGPU setup. If you get, for example, a Razer Book and pair it with a 3060Ti, you can get stellar performance and you still have an advantage that a desktop doesn't have - you can unplug and get great battery life on the go allowing you to get some work done, do presentations, etc, on one device.
Honestly I think manufacturers have forgotten about eGPU solutions. When was the last time we saw a new one come to market? The XG Mobile is sadly the best one currently, and it is so expensive that honestly you might as well grab a mid range desktop PC and a budget gaming laptop and probably come out ahead. I really miss Alienware's eGPU. I know it was proprietary, but it had much better performance. I really wish Dell was putting all that money towards making a v2 and making it a no brainer choice to buy their machines.
Laptops with QLED. QD-OLED fast refresh rate displays?
If you ever find out why the i7 was outperforming the i9 later, please tell us!
Melting aluminium, memory difference, some sort of difference on how both laptops handle I/O as TB will compete with other I/O traffic. Hard to tell.
What case would a 4090 fit in for egpu use
Hey Jarrod, would much rather a comparison between desktop and eGPU performance for a card like the 1650 or 3060 or something in that range because people who can afford a 4090 probably won't have any problems buying other PC components
Very welcomed video, many thanks for sharing.
Great video. Thanks for always doing these egpu tests. Would love to see it paired with a 6800u laptop to see how new handhelds might perform
You got the 3090TI to fit in the eGPU? I couldn't get my Founders Edition to fit. I however, did not try just taking the case off.
We desperately need Thunderbolt 4 docks to properly use these beefier cards. Even with the 10 series, TB3 was leaving a good chunk of the bus bandwidth the card could use on the table. eGPUs are still an amazing concept and I hope someone steps up to give the standard even more improvements
Damn, this is the video I was looking for, thanks mate
Thanks Jarrod. As usual, eGPU is not worth it due to bandwidth limitations of thunderbolt but boy is it interesting.
Not true. Just connect an external monitor to your eGPU box and you will see that it works much faster. If the internal display is in use, the TB3(4) link sends data in both directions. Just imagine someone who has the eGPU but no external monitor.
i don't find anyone else doing this. Thank you for the test
No problem, thanks for watching!
It would be good to see the 3070 Ti mobile included in all tests! To compare that scenario too!
Would be a good idea to see lower tier RTX 4000 series (4050, 60 or 70). There should be less significant differences (logically thinking)
Was so curious about the amount of bottleneck
Thanks for another good eGPU video Jarrod! I would really like to see a full eGPU setup guide video from you! I've been fighting with an eGPU to get it working properly for months now, but it's hard to find any good troubleshooters or guides out there, especially videos.
I found out that there is an issue with Windows 11 and TB4 drivers with Intel 12th gen CPUs. The Thunderbolt controller isn't present in Device Manager. I had to install Win10 just to connect to my RTX 3090/Mantiz eGPU. But even now, performance is better with the laptop's inbuilt 3060 card in almost every single case than it is with the 3090. External display with the laptop screen and GPU turned off doesn't make any noticeable difference either, which is weird as well. I imagine that I would know a lot more about TB issues and other eGPU errors if you had included some of that installation process in your videos, because I've watched all of your eGPU videos and will keep doing so! 🙂
Have you checked the processor's operating frequencies? Has there been a frequency drop of about 2 GHz?
Time for manufactuers to update their egpu enclosures, it seems
No, it's time to wait for the new thunderbolt spec
No, it's time to wait for manufactures to update their eGPU encloses for new thunderbolt spec, it seems
Very interesting video! Could you do a comparison between thunderbolt and m.2 to pcie x16 adapters? They also have only 4 lanes of course, but they don't have the other thunderbolt limitations.
It's really weird that the gaming laptop had so much more of a bottleneck compared to the Zenbook when it should have more internal PCIe lanes and the power to handle them. If you can, you should talk to Intel and ask for an explanation of the problem
More lanes don't matter if you're limited to PCIe 3.0 x4 no matter what. Thunderbolt 3 is outdated; it was already shaky for 10 series GPUs, but it's pretty much unusable for 40 series. Even SSDs offer more than double the speed that TB3 is capable of supporting.
We've now reached GPU speeds where we get benefits from saturating PCIe 4.0 x16, which is equivalent to PCIe 3.0 x32; compared to x4, you can see why they perform suboptimally
meanwhile me on my 9 year old shitty laptop trying to run Flappy Bird with no lag
I hope one day to see the possibility to easily upgrade CPU and GPU on gaming laptops. That would be amazing!
Nice idea
This will never happen, gaming laptops back in the day used to come with upgradable CPUs and GPUs like my Alienware I used to own - M17xR4 with i7-3610hq and Radeon 7970m
These times are gone and will never come back, one can dream though 😜
@@EvilijoUK why did it end tho?
@@alanharperchiropractor Manufacturers are pushing ever so thinner and lighter machines. It's impossible to have replaceable processors and graphics cards in this form factor. My alienware for example was almost 2 inches thick with weight of about 4.5kg. These days people say that 2.5kg and 1 inch are not that portable. To be honest socketed cpus and mxm graphics cards were always a niche product for enthusiasts only. Things are different now, I'm content with gaming laptops that have room for at least 2 sodimm and m2s. And if I'm honest I'm happy with my Legion 5 that has decent battery and doesn't crush my knees when used 🤣
With an eGPU that's possible, just get a Framework Laptop and hook the eGPU
Love this Frankenstein experiment haha. Results are all over the place
Really interesting reporting here, thank you. It would be fascinating to see these results compared with the ROG XG Mobile with an older GPU (3080) and also the next generation of PCI-E eGPU when available.
Hi J., Lenovo Legion Slim 7 (AMD) or Asus Zephyrus G15? Which one is better? Also, is it true that the battery life of the Slim 7 is as poor as Matthew Moniz says?
Very solid review. thank you!
or you could've used an ADT-Link M.2 to eGPU 4.0 board and have a way better result since it uses PCIe Gen4, but whatever I guess
I'm sorry man but you should've done more research regarding the types of eGPU setups
like the way you did it is the worst case scenario for performance
and since the ADT board connects directly to M.2 it doesn't have the same overhead as TB4, like bruh
In Flames T-Shirt. Nice
Can you test this GPU setup with an AMD Ryzen 6000 laptop through USB4?
Can you do this with an NVMe 4.0 x4 to PCIe 4.0 x16 adapter? Also plz test an eGPU with the 80Gbps Thunderbolt 4 v2 or whatever it's called that's coming
It's USB4 v2.0, or Thunderbolt 5
@@samega7cattac yeah whatever they call it I can't wait for it
So an external eGPU with a modified box and power supply, connected to an external display. This isn't so much a laptop + eGPU test as it is a desktop + eCPU* test.
At this moment in time for compact portable gaming, the way to go is an ITX sandwich case instead of a laptop. It's not like laptops are used to play while you're on the move in the plane or train or whatever, but to unpack and plug into the mains power supply and use at the various destination locations. The laptops used on your lap far away from a power plug are those ultrabook things that you use to drop in loads of photos and do a quick Photoshop edit and upload on some blog. Not for gaming.
In the future I hope case manufacturers will expand the compact form factor ITX cases that have a very small depth, meaning the GPU and the mobo are installed on the same plane side by side, and develop a CPU cooling system that isn't a high tower but has a small height and a large surface area on the horizontal plane. This would result in a similar footprint to 17" laptops with a slight height increase, thus useful to carry around. With a portable 17" monitor you end up with a full gaming rig with a volume not much larger than a gaming 17" laptop
I have a Vivobook Pro with an i5-11300H and Iris Xe and Thunderbolt 4.
I want an eGPU setup for connecting one or (if possible) two 4K monitors and maybe from time to time playing a game at moderate settings.
Value/performance wise, what graphics card and eGPU box would make sense?
I would love to know how this setup works for 3D renders (using the eGPU of course) with Cinema 4D, Blender, etc. Would this be ok or disappointing too? I am forced to use a laptop to travel around and would be amazing using an eGPU at home to speed up my renders, :) What do you think?
same here
If the 4090 directly outputs to an external monitor, that will minimise the side effects of this setup, and I guess you can have a fairer comparison of the CPU bottleneck then.
Great analysis, thanks!
One of the most valuable eGPU vids I've seen, GREAT WORK!
thanks!
How does the heat distribution work exactly? Does a laptop run cooler when using an eGPU?
Time is the most valuable asset in the world and you saved me an hour of researching the limitations of Thunderbolt 3-4. Thank you.
Oh my fudge, I was literally planning on doing this in the future since I just bought a Razer Blade 17 3070 Ti 150W TDP, and I'm planning on making this my full setup. I'm happy as hell you can try it out first.
Best and most professional presentation, thank you very much.
This eGPU enclosure was released in tandem with the 12.5 inch Razer Blade Stealth. When it debuted it featured soldered RAM and 7th gen Intel processors with a 15W TDP. It was a fine laptop but it had massive bezels and the build quality left a bit to be desired. Really these enclosures only make sense with ultrabook type devices. If you already have a gaming laptop there really is no point. And even still, is it really worth spending all that money on a desktop class card to get such middling performance given how much better the card can do? Back in the day it made more sense because the cards weren't as powerful and the Thunderbolt bandwidth wasn't as much an issue, but nowadays they make no sense. The eGPU enclosure over Thunderbolt is a lost cause until the day there's a new cable standard with enough bandwidth to take full advantage of the cards of that time and beyond.
I've noticed some drivers just do not work with an eGPU... It's not well enough supported, and Win 11 made it worse by a lot; these tests need to be done in Win 10
So, I feel that it is time to ask the following question (stopped watching at 7:20): does it actually make sense to "degrade" the RTX 4090 (or any other desktop dedicated graphics card like this one) by connecting it to a laptop? I think that it does NOT make sense; I think it makes sense to use it for the original purpose it was meant for - in a big DESKTOP PC. My humble opinion ...