Irony is that my M1 MacBook Air eliminated the need for an eGPU. I'd originally looked into an eGPU for my Windows PC in order to do 4K timelines in DaVinci Resolve without having to use 1/4 proxies to even get close to realtime playback during edit. Turns out that the USB-C connector was USB 3.2, but not Thunderbolt 3. Now I'm not missing it at all.
Sorry, but I doubt this will ever happen. ARM chipsets still have a long way to go to beat x86-64 CPUs when it comes to AAA gaming. That is why gaming laptops, let alone eGPUs, never die.
I respectfully disagree that eGPUs are going to die off, for several reasons: First, powerful gaming laptops are often exorbitantly expensive, costing far more than their non-gaming counterparts and desktops. However, they all suffer from one basic problem: the internal discrete GPU is soldered onto the motherboard and cannot be upgraded. Since graphics cards advance in power and efficiency so quickly, the current generation will always be outpaced in 2-4 years by successive ones. This would force the gamer to buy a new gaming laptop to replace the one they own in order to keep playing the newest games at very high frame rates several years down the track. Second, not everyone has the desk space to house a full-sized desktop PC. Further, there are those who don't want a desktop (which is non-portable), or who don't want both a desktop and a laptop. There are those who want a thin and light laptop that they can carry around for school or work, then plug into an eGPU for gaming when they are at home. Third, gamers (particularly the hardcore ones) and non-gamers have very different needs and outlooks on what works for them. Those who game on laptops and are on a budget understand the indispensable value of eGPUs, and for some this is the holy grail that will allow them to have the best of both worlds.
Oddly, one of the only exceptions to this rule of slower performance was the use case of 3D rendering in Octane and Redshift. In their case, COMPUTE isn't as affected by the slowdown as realtime stuff.
Also, so far the M1 and M2 chips are kinda crappy and hobbled for 3D rendering compared to eGPUs, sadly... They didn't drop it because of that; they did this because they want to control the stack and make more $$$$$$$$
The latency issue could be solved by using 2 Thunderbolt cables, since you have at least 2 Thunderbolt ports on the MacBook. An internal GPU is the way to go; Thunderbolt still can't beat PCIe.
That solves bandwidth, not latency. It could actually increase latency, as the controller would have to deal with information coming from two ports simultaneously. The same thing happens with NVMe RAID: double the bandwidth, with increased latency.
Killed the future of eGPUs? No, I don't think so. I'm still not a firm believer in ARM-powered chipsets because you can't upgrade the CPU in the future. If you want to experience a new ARM chipset, you basically have to buy a new machine. It's the same story with smartphones whenever a new chipset is launched. That is why I'm not jumping on the Apple Silicon bandwagon and am staying with Intel/AMD x86-64 CPUs. Even when I'm switching between my Mac Mini and my Intel NUC from time to time, I can still keep using the same eGPU for both of them.
It wouldn't be a new Apple product if they didn't limit some kind of functionality and people start with the "actually, taking this feature away is good for you because reasons"
This right here. The reason I bought an eGPU was to be able to use the same machine for macOS and Windows with Boot Camp and have decent graphics performance. I have the last generation of Intel MacBook Pro before the M1 was launched and I don't plan to upgrade anytime soon (it's way too expensive!). After my Mac dies, perhaps I'll still be able to use it with a different x86-based sleek laptop.
eGPUs are still good for older laptops, but pricing needs to be much lower to make upgrading a better option than just buying a whole new system. Then of course you have the arguments regarding planned obsolescence and landfills overflowing with more than capable hardware.
I reused the 5700 XT from my eGPU enclosure in a new virtualization server so I could run Big Sur on Unraid. It's overall a much less janky setup; I don't think I'll miss the old one. I do hope new MacBooks have support for 4+ monitors though…
How did you do that? Which mainboard did you use? If I'm not mistaken, Thunderbolt on PC always runs through the chipset. I don't see how you could separate the IOMMU group that the Thunderbolt controller is in without causing issues with the host. Please share!
@@franzpleurmann2585 That is exactly right. Also, Thunderbolt support on most chipsets is pretty flaky (I think Asus and ASRock have some mobos that incorporate TB3, but I wanted a high core count CPU from AMD, and AMD + TB3 is still kinda jank). So I just removed the GPU from the enclosure and put it directly into the machine. The hardware I used was pretty consumer grade since I'm a college student and money is kind of tight with covid and everything: an MSI X470 Gaming Plus Max (not really recommended), a used Ryzen 2700 8c/16t CPU, and a 5700 XT GPU I bought for MSRP like a year ago. Unraid 6.9 makes it really easy to stub PCIe devices so you can pass through GPUs, NICs, and even USB controllers if you miss hot plug functionality. You might also have to enable the ACS override patch if you go with the MSI board above, but honestly I would recommend spending a bit more on a better board or a server-grade board with more discrete IOMMU groups if you intend to go deep into VMs and have the money. This MSI board was cheap but has caused problems in the few weeks I've had it. Also, getting Big Sur working in a VM is super easy with the Macinabox plugin for Unraid (way easier than when I tried it on Proxmox). Check out "Spaceinvader One" here on YouTube for a more in-depth tutorial. I'm still pretty new to this stuff, but if you have any other questions I'll try my best to answer them :)
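For anyone wondering what "stubbing" a PCIe device actually means here: it's binding the card to the vfio-pci placeholder driver at boot so the host never claims it and a VM can. A minimal sketch of the generic kernel-parameter approach, assuming a Radeon RX 5700 XT (Unraid 6.9's System Devices page does the equivalent binding for you; the PCI IDs below are the usual ones for that card, but verify your own with lspci first):

```
# Append to the kernel line in /boot/syslinux/syslinux.cfg (Unraid)
# or your bootloader's config. Find your own IDs with: lspci -nn
# 1002:731f = Navi 10 (RX 5700/5700 XT), 1002:ab38 = its HDMI audio.
append vfio-pci.ids=1002:731f,1002:ab38 initrd=/bzroot
```

The GPU and its audio function have to be passed through together, since they usually sit in the same IOMMU group.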
I have a MacBook Pro 16. I want to play Windows games with the best graphics and performance on Windows via Boot Camp. Kindly suggest which eGPU will be best for the task.
There is still a point to eGPUs: when you have 4 different computers with good old Intel graphics and a Thunderbolt port, you can share the GPU power between them by just connecting them to it. I bought a used Razer Core 2 for that use case and I can say it works pretty seamlessly. It's true that M1 Macs, and especially the M1X and next generations, will certainly limit the eGPU market as we know it, unless these things happen:
- A step forward in connectivity; TB4 won't be that, but maybe TB5 or an evolution of the USB-C protocols
- A step forward in the PCI norm or equivalent internal connectivity tech
- A revolution in GPUs, as the M1X may already be close to the high end of the dedicated GPU market
Maybe all of these will come from Apple anyway. The integrated/SoC type of architecture allows them to bypass PCI and other technologies which still rely on Intel. But if they want to make an ARM Mac Pro, they'll have to build their own internal connectivity tech, I guess.
Do YOU guys agree with my conclusion? Let me know why or why not down below!
Why can't you connect 2 Thunderbolt cables between them so there's no bottleneck?
My only question is: can the M1 Max TRULY support editing workflows using two 2K or 4K monitors at more than 60Hz? To me, THIS is what eGPUs really do; they let me expand the laptop into a proper desktop editing experience. I haven't seen ANYONE talking about whether the new Max chips can do this and would love to see you test it out!
My specs / scenario is this:
6K BRAW 12-bit footage exporting to H.264 on Premiere
15" MBP 2019 maxed out (except for SSD storage)
Razer Core eGPU + Radeon 5700 XT
I'm running two monitors at 2K (one at 75Hz and the other at 120Hz)
How would the 16-inch M1 Max 64GB fare with the same footage / export settings while also running two 2K monitors (one over HDMI and one through Thunderbolt)?
I want to use an eGPU when my laptop's GPU isn't modern anymore and I don't want to throw away the whole laptop.
Not really; the performance comparison you are seeing is mostly ASIC codecs. I already use an eGPU daily on all my platforms: my DisplayLink dock. It's a different protocol, yes, but still.
What I always find fascinating is how many of these tech YouTubers have tunnel vision when it comes to the laptop as a professional tool. They only look at it as a video editing rig... because they're all video editors, I guess. Newsflash: most laptop users aren't video editors. Some of us are architects, industrial & mechanical engineers, designers, etc. I use an eGPU with a Quadro card because I'm a SolidWorks user. With that setup connected to my XPS 9310 (which obvs uses TB4), it's amazing how capable that machine is. So, maybe eGPUs don't have a future in the Apple ecosystem, but that doesn't mean they're dead.
Yes. This. Not every pro user is editing video. And the examples he gave where eGPU showed no improvement is because of the T2 chip which has special hardware for some specific encodings.
@@zf1786 I do have a desktop. But, it's a desktop with an RTX 2080, a great graphics card for gaming - which I use it for - but not one designed for Solidworks. This is the dilemma many people who need Quadro cards for CAD programs have. So, with the eGPU, I can still do work at home while having portable computing power while traveling.
@@zf1786 I already have the MacBook Pro for gaming, with an i9 and 32GB of RAM. Why would I buy a whole new PC if I can just buy an eGPU and plug in a single cable?
I honestly don't think that "generalists" tech YTbers can actually appease the whole world at once i.e. every professional laptop user. The sheer amount of resources needed to cover that wide a base with small (or one-man) production teams would, IMO, be too great a burden and a poor ROI for the channel.
And I guess the market for video editors (professional or otherwise) is much larger and more sustainable than smaller professional and/or niche occupations/industries. I'd wager there are more people who edit videos on laptops (including budding "home porn" producers) than there are those who render architecture modeling as a pastime.
@@zf1786 I'm unaware of a 13" thin & light that comes with a discrete Quadro card. I have no need nor desire to lug around a 15" or 17" laptop. But, do need access to Solidworks when working from home.
People didn't buy an eGPU for video editing, but for gaming... It's only in the minds of most streamers that video editing performance is more important than gaming performance...
Actually lots of people bought egpus for video editing.
Just not you (I assume) or me
@@RyanLennoxBradley oh, really? Go to egpu.io/forums and try to find "lots" of discussions about video editing using an eGPU.
They are the 0.01%; the rest of us get it for gaming. Those YouTubers are completely deluded donkeys
Man you’re wrong. I don’t work with video, but people DO get eGPU’s for video editing.
For me, an eGPU in 2021 is still attractive. This is because I already own a laptop. It's dramatic how fast graphics technology evolves: when I bought this machine in 2017, the 4GB card seemed more than enough. Today, not so. And yet, an eGPU and GPU can supercharge my laptop for a quarter of the price of either a new PC or a new laptop. We might all be stunned when Apple's new Macs arrive, but something tells me there won't be widespread support for Windows gaming on the Mac right out of the gate. As it stands, I'm glad the eGPU option is still around for someone in my situation, but for anyone buying a new laptop or desktop today, the choice is obvious.
Bought the MacBook Air because of you - best device I have ever bought! Thanks!
@@lukeszns it was the 2019 MacBook Air which I ended up selling.
It thermally throttled often for me, it struggled while multitasking, and the battery life, though quite good, is not near the M1 MacBook Air's.
@@raiyan8978 I did the same thing: sold my 2019 MBA and bought the M1 MBA
How is the battery life for your average task ?
Can you play games on that? Like NFS games?
eGPUs require drivers. Apple simply won’t spend the money to create drivers for them.
that and the drivers need to be done from the gpu maker's side as well. AMD isn't likely gonna want to invest much time into that
No, Apple absolutely has no issue spending the money to do that, if it is worthwhile. The point is what was made in the video.
@@chidorirasenganz Totally wrong.
@@Zeegoner nope but nice try
I would rather have a Razer, Omen, ROG, or Gigabyte laptop over this shit
Apple not supporting eGPUs is far from the end of eGPUs in general. It's just the end for Apple users where that was important to them. Still benefits in other use cases and devices: content creation is not the only reason to have GPU power.
It makes it a hard pass for me getting a Mac anytime soon.
Base? Super? What?
yes I bought an eGPU 1 month ago and I sold it back within 2 weeks, totally not worth it
What computer were you using it with?
Just no games anyway?
Another reason to not get MacBooks, because Apple is anti-consumer and anti-right-to-repair. eGPUs are here to stay :D
What limited the potential proliferation of eGPUs was the rarity of Thunderbolt 3 ports on laptops. However, even with Apple dropping Mac support, the higher-bandwidth Thunderbolt 4 port will be ubiquitous on all 11th-gen Intel CPU powered laptops. This means a huge increase in the potential market (customers) for eGPUs. We may see Thunderbolt 4 optimized eGPUs becoming much more common.
The reason eGPUs are not good for what you showed is mainly the overall bandwidth, and secondly latency.
The Thunderbolt 3 standard only carries 4 PCIe lanes, which is a bottleneck for the rest of the system, and it causes latency because, as you mentioned, the same amount of data needs to travel the same route with a quarter of the bandwidth, so it takes more time, creating latency. BUT if Intel releases Thunderbolt 5 or 6 or whatever with the option of carrying 16 PCIe lanes, this latency will just disappear because the bottleneck I mentioned earlier won't be there.
So the problem you mentioned is solvable! Especially with PCIe Gen 4, every lane has double the bandwidth of Gen 3, so even with 8 lanes Thunderbolt would have the bandwidth of 16 Gen 3 lanes (4 times the performance it has now without too much complexity, eliminating any latency).
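To put rough numbers on that lane math, here's a minimal sketch (assuming 8 GT/s per PCIe 3.0 lane, 16 GT/s per 4.0 lane, and 128b/130b line coding; real eGPU throughput over TB3 is lower still, commonly cited at around 22 Gbps usable, because of Thunderbolt protocol overhead):

```python
# Rough one-direction PCIe bandwidth for the lane/generation claims above.
def pcie_gbps(lanes, gt_per_lane):
    """Raw bandwidth in Gbit/s after 128b/130b line coding."""
    return lanes * gt_per_lane * 128 / 130

print(f"TB3 tunnel (PCIe 3.0 x4): {pcie_gbps(4, 8):6.1f} Gbps")
print(f"PCIe 4.0 x4:              {pcie_gbps(4, 16):6.1f} Gbps  (= 3.0 x8)")
print(f"PCIe 4.0 x8:              {pcie_gbps(8, 16):6.1f} Gbps  (= 3.0 x16)")
print(f"Desktop slot (3.0 x16):   {pcie_gbps(16, 8):6.1f} Gbps")
```

Strictly speaking, wider links cut transfer time rather than per-packet latency, but for bulk frame data the transfer time is the number that matters.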
Time will tell but I also think he jumped the gun with this video.
In conclusion, the new ROG Flow 13, which comes with a 3080 eGPU, already fixed the bottleneck with a custom connector
So it looks like I'll be sticking with my 2020 13-inch Intel MacBook. It's the only one that can run x86 Windows via Boot Camp and connect to my eGPU.
As they say,
“An apple a year,
makes your money disappear...”
Well, right now the M1 Macs are the cheapest option for the performance you get
I was looking for info about eGPUs for a long time. This is the first video that answers all of my questions about them. Many thanks.
Don't listen to this guy, he's an Apple fanboy. They think that just because Apple stops doing it, PC is going to copy; it's funny. I'd recommend an Intel laptop, all their newer PCs support eGPUs
eGPUs are still limited by Thunderbolt bandwidth; fortunately, it's evolving along with other technologies.
Like? What info do you have on these "other technologies"?
Thunderbolt 4 guarantees more PCIe bandwidth than Thunderbolt 3 (a 32 Gbps minimum instead of 16 Gbps), though the link is still 40 Gbps
I do disagree. TB3 uses PCIe Gen 3 x4 (TB4 too). But with PCIe Gen 4 they will soon be able to double the data rate, running a GPU at basically x8 Gen 3 speed.
Considering only very high-end GPUs slightly exceed x8 Gen 3 speeds, TB will definitely not die. External, properly cooled GPUs will ALWAYS outperform internal graphics. Thunderbolt 5 will be a game changer with double the bandwidth and lower latency.
Apple might drop PCIe support for real graphics cards completely. But because they will probably give the Mac Pro's replacement a dedicated GPU, I don't think so. They might still drop it for laptops, though.
But when max performance is key, proper cooling is absolutely necessary! For this reason I guess Apple will offer a solution, maybe completely in-house.
The rest of the industry will stick with PCIe and Thunderbolt for max performance in light laptops.
Create a problem, then sell the solution; that's Apple's motto!
Correction: Thunderbolt 4 uses only 1 lane of PCIe Gen 3 natively. This info is hard to find, but I found it on the eGPU forum. Those lost lanes are what allow 3 downstream Thunderbolt 3 ports on docks.
@@_reZ Thank you for the input!
From my research I found that Thunderbolt is very confusing because it "tunnels" multiple protocols interchangeably (USB, DisplayPort, PCIe).
TB4 is PHYSICALLY connected using 4 PCIe 3.0 lanes and 2 DP 1.4 interfaces! (Example: the ASUS ThunderboltEX 4 expansion card.)
TB3 and TB4 are able to allocate all available PCIe 3.0 (x4) and DP lanes to the connection, or even split the connection between multiple protocols.
When connected, the target device is able to choose the "tunnelling protocols" from the TB connection, and eGPUs choose PCIe (and a video return if needed (maybe DP is bidirectional?)).
The controller in source 1 somehow mentioned "Other native interfaces: x1 PCIe", where I am unsure how native interfaces differ from "tunnelling".
Unfortunately my prediction hasn't come true yet. Asus is using its proprietary (yet cool) connector for eGPUs, and TB is still waiting to be updated for an increase in data rate. It will come though🤞.
Source1 TB4 Controller:
“Tunneling capabilities (32G PCIe, USB3(10G), 2 displays (up to DP1.4) […] Other native interfaces: x1 PCIe 8GT/s, 1x USB3 (10G)” ark.intel.com/content/www/us/en/ark/products/189982/intel-jhl8440-thunderbolt-4-controller.html
Source 2 TB3 Techbrief (nothing available for TB4 lol):
“[…] the silicon extracts and routes up to 4 lanes of PCI Express Gen 3(4 x 8 Gbps) and up to two full (4 lane) links of DisplayPort out over the Thunderbolt cable and connector to the device(s) attached downstream from the host system.” www.thunderbolttechnology.net/sites/default/files/Thunderbolt3_TechBrief_FINAL.pdf
But please do your own research if you are interested in the topic. I’ve always wanted to dig deeper into thunderbolt and you gave me a reason :) Thank you! If you got things to add, don’t hesitate :)😊
@@oso3557 cool np
Video editing yes, but for a 3D motion graphics designer eGPUs were a godsend. GPU render engines seem to be trendy now, and Apple kicked the whole industry to use buggy-as-hell Windowses.
I'm having second thoughts about purchasing an Apple laptop due to its GPU's inadequate support for the Unreal Engine application, and with the recent news that they're discontinuing support for external GPUs, I've made the decision to permanently part ways with Apple.
It seems as if Apple doesn’t want to take the focus off the M1 by (easily) supporting eGPUs with simple OS-level support in Big Sur, or take the focus off the M1s 8 GPU cores. Apple may fear that allowing eGPUs (for now) might run the risk of suggesting that the M1 is in any way deficient. (FWIW, rumor has it that the 14 and/or the 16 inch M[?] MacBook Pros will have native PCIe 3 support…)
The ROG Flow 13 is the perfect kind of laptop for an eGPU.
It's a far superior product to every single MacBook in the market.
Thunderbolt 5 coming to the rescue.
Shoutout to your Appleinsider days! You guys have come a long way!
That's a dumb assumption.
An eGPU is just a case for a GPU. Saying "eGPUs will die" therefore doesn't make any sense, because GPUs won't die for sure, and somebody is going to manufacture a case for a GPU.
The latency argument also doesn't make any sense, because other laptops prove that it's entirely possible to do without latency, especially because nobody cares about the cable distance when we are at lightning speed.
Let's also not forget that Thunderbolt is a developing technology.
Let's hope Asahi Linux realizes full eGPU support on Apple Silicon
Don't they still need to fix more basic things before they worry about a niche like this?
@@hwy9nightkid Why should they limit themselves to the basics? If they can implement sought-after features, it will increase support!
Until the 40Gbps bandwidth limitation is resolved, then yes, it will likely cease to exist. Even Thunderbolt 4 is still stuck at 40Gbps
I noticed the same issue with FCP. I was getting dropped frames on 60fps footage, but when I unplugged the eGPU my integrated Intel graphics played it back with no issues
I agree. eGPUs made sense when the only available GPU on your laptop was weak Intel integrated graphics such as a UHD 630. eGPUs are also at a disadvantage compared with laptops that include a discrete GPU, such as the MacBook Pro 16, as the internal discrete GPU typically connects to the CPU with 8 or more PCIe data lanes, whereas a Thunderbolt 3 port has a max bandwidth of 4 PCIe data lanes. The M1 chip provides an early indication of how good integrated GPUs could become on Apple Silicon M-series processors, and I even wonder if Apple will get rid of discrete GPUs in their entire laptop product range and only use them in their desktop Macs. The closer you can get the GPU to the CPU, the lower the latency, with performance gains. It then becomes a thermal management tradeoff, as you are starting to concentrate a lot of processing performance, with its power consumption and associated thermal load, into an exceedingly small space.
Maybe, though I don't know if Apple can actually create a high end competitive GPU. If they can, great. If not they need to buy them from AMD or Nvidia
iMac (2021) is only using Apple Silicon's iGPU
That sucks, because the M1 CPU actually is pretty good... if you work with 3D stuff or video editing, a connected A6000 would be top notch...
At the moment I am getting better results with my Mac mini 2018 (64GB RAM, i5 model) and an eGPU with a Vega 64 than I am getting with the MacBook Air M1 (16GB RAM) in video editing. I think that will change in the future, so next time I buy a new computer I won't need the eGPU. But I think that is some years further down the road; I'm happy with my setup at the moment.
Man I want to buy a Macbook M1 but no egpu support is a deal killer for me
Yeah, I mean sometimes you want to do some gaming.
These videos remind me of Beavis and Butt-Head kicking a 20k server. "Hehe, why are you so slow, hehe, uhh"
Rip in the chat ;(
Everything looks fine in the tests, except heat. An eGPU heats its own radiator, so the CPU can run at higher speeds in this case. A lot of gaming laptops have throttling issues.
Guys, I'm a subscriber and watch every video you put out. But don't you think there are only so many videos you can make on the M1 Mac? I counted 37 so far... and that's not counting iPhones or AirPods Max. I say this with love; I really want you guys to thrive, and I'm eager for more of the content that you make so well
For raytracing applications, however, an eGPU is still better, because they offload all the data to the GPU in a one-time operation, and then the GPU crunches on that data. It's not back and forth like video editing.
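As a toy model of that difference (all numbers are illustrative, not benchmarks; the ~22 Gbps figure is the usable PCIe throughput commonly cited for Thunderbolt 3):

```python
# Why one-shot offload (GPU rendering) tolerates a slow link while
# per-frame round trips (video editing) are dominated by it.
LINK_GBPS = 22.0  # approx. usable PCIe throughput over Thunderbolt 3

def transfer_s(gigabytes):
    """Seconds to move a payload of the given size over the link."""
    return gigabytes * 8 / LINK_GBPS

# Render engine: ship the scene once, then the GPU crunches on its own.
scene_xfer = transfer_s(4.0)               # hypothetical 4 GB scene
render_total = scene_xfer + 120.0          # plus 2 min of pure GPU work

# Editing: every frame travels out, gets processed, and travels back.
frames, frame_gb, gpu_s = 600, 0.05, 0.005
edit_xfer = frames * 2 * transfer_s(frame_gb)
edit_total = edit_xfer + frames * gpu_s

print(f"render job: link is {scene_xfer / render_total:.1%} of the time")
print(f"edit job:   link is {edit_xfer / edit_total:.1%} of the time")
```

With these made-up numbers the link is about 1% of a render job but nearly 90% of an edit job, which is the shape of the tradeoff the comment describes.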
It's too bad, an eGPU module the shape of the Mac mini would be fantastic. Love my new M1 mini, but would really appreciate a bit more gpu power.
Theoretically you could still connect an eGPU, as I believe the M1 Macs still have Thunderbolt; however, as far as I am aware, Nvidia, AMD, and Intel don't have ARM drivers for their desktop GPUs
Sounds like bad news for Mac gamers like me. I want to stay on macOS, but the inability to use an eGPU (or Boot Camp) seems like a real future deterrent for me. Unless Macs actually get good graphics performance for gaming... (I'm not holding my breath). Dammit, now I'm all depressed! :(
Seems you were right... sadly. I wish I could use my RTX via eGPU with my new M1 Max to help rendering 3D.
What about the ROG flow 13? I think that is the perfect setup for an e gpu
The price.....
It's still good for gaming though, since gaming doesn't require crazy bandwidth.
Oh, just realized the ASUS Flow has a dedicated connector for the GPU, which should have more bandwidth, so editing performance should be good too.
It has a dedicated port for it so it’s fine
@Harsh Gurnani A Thunderbolt 4 port still can't deliver full GPU performance. The proprietary connector on the ROG Flow X13, however, can.
Intel just matched the dedicated connector's speed with Thunderbolt 5. Asus needs to make a PCIe 6.0 version of the XG connector for the next version of this device to keep up. They also need to offer a desktop version of the eGPU; the mobile version is a bit pricey.
Have you thought about making another video two years later on the same theme? Today the cost of Apple computers is skyrocketing, at least in Italy, and buying one has become really expensive. I use Vectorworks CAD, and the 2023 version is really resource-hungry; the Mac Ultra gives only average performance with it, so it does little about the fact that you can get greater performance from a Windows PC with higher-performing Nvidia graphics cards. I'd like to buy a Mac but I'm hesitant, because I don't want to spend €8,000.00 for a Mac Studio with 192GB of integrated memory and 72 GPU cores. Also, how does the Mac GPU work?
.. They didn't drop eGPU support, dude. The M1 Macs are first gen. eGPUs are supported by Thunderbolt on the M1 Macs. The issue is drivers; there are no ARM drivers for macOS.
Man the content you put out is just amazing. Quantity and quality. Good job!
I always enjoy the amount of research that goes into these videos. Very educational, keep it up!
I can see external graphics card enclosures remaining useful, but not for graphics. Similar to how some tasks are better performed using a cluster of Raspberry Pis than a single PC or Mac that is overall more powerful, there are other tasks that graphics cards are useful for where more is always better. Things like Bitcoin mining & distributed computing - BOINC, Folding@home, etc. - i.e. eGPUs won’t be used for streaming graphics from the CPU back to the display, but they will be used for “dumping” computational models onto graphics cards, letting them run the process at hand, and then getting the results back once they’ve completed.
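A minimal sketch of that "dump it on the card, collect the results later" pattern, using PyTorch purely as a stand-in for any GPU compute framework (assumes PyTorch and a CUDA-capable card; the workload itself is an arbitrary placeholder):

```python
# Compute offload: one bulk upload, a long on-device crunch with no
# host traffic, and a single download at the end.
import torch

dev = torch.device("cuda" if torch.cuda.is_available() else "cpu")

work = torch.randn(4096, 4096)   # problem data starts on the host
work = work.to(dev)              # one bulk transfer over the (slow) link

for _ in range(1000):            # GPU churns away with no round trips
    work = torch.tanh(work @ work.T)

result = work.cpu()              # one transfer back when finished
print(result.shape)
```

The link speed only affects the first and last lines, which is why this kind of workload hardly notices a Thunderbolt bottleneck.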
Just because it's slower in video editing doesn't mean it's bad or dead. The future could bring even faster ports with more power and multiple channels, allowing more data with huge bandwidth.
We haven’t yet reached the
It may well be dead for the Mac world, but it is in no way even close to dead in the Windows world, particularly in the business environment and for the many people working from home. There it allows creative professionals to use portable ultrabooks with high-quality, high-resolution screens (gaming laptops are generally geared towards high-refresh 1080p displays), which can then just dock with an eGPU hot-desk station and get instant graphics performance and multi-monitor support.
Preach! That is why Windows 10 is still my preferred desktop OS.
@@FAT8893 Hell, they're great for mini PCs. I don't have a lot of space in my room for a full gaming desktop rig; I have a Skull Canyon NUC i7 and have been gaming on it with an eGPU. It's my favorite PC. I swear Apple fanboys are dumb
But what about multi-monitor support? I use an eGPU so I can hook up 4 external monitors at 4K. Currently none of the M1 Macs can support more than 2 external monitors
This dude's logic is epically dumb. Apple's graphics improvements are good for Apple's products; that doesn't affect Windows PCs. Also, Apple is nowhere near Windows gaming compatibility.
Fanboy logic is always dumb. He's clearly an Apple fanboy, thinking people only game on Apple
You make very valid points. My concern is that with Boot Camp gone and now eGPU support gone, AAA gaming is basically dead on the Mac. It seems like Apple is betting everything on AAA developers coding for the Mac, or maybe buying up some studios... which is far from a sure thing. As for streaming services, I think the USA's spotty broadband coverage doesn't make them a good solution either. Which is why I'm going to rock an eGPU on my MacBook Pro 16" when graphics cards become available again ;-).
Why didn't you just get a separate gaming PC? The extra cost wouldn't have been that much higher and your gaming would have been better in the long run..
You can run a virtual machine to play games. Haven't tried it myself yet (at least not with any graphically challenging games), but I saw a video of a guy running The Witcher 3, and it was decent.
What's most upsetting about this is that the M1's GPU really isn't great. I don't know why so many tech YouTubers gloss over this. I'm worried that even the 16" won't have a GPU that can compete with RTX 30 and RDNA 2 laptops.
Welcome to the party pal. You're on the conveyor belt, now. Start walking. They'll have you running soon enough.
So because Apple dropped eGPU support, all eGPUs are dead? No. You see, other laptop brands still exist out there. While Apple holds a sizeable part of the market, just because they don't support eGPUs anymore doesn't mean other brands stop; other brands still have their fair share of laptops with support. It is a bummer to not have support anymore, but most people still have other laptops WITH support, so I think eGPUs will not die out
eGPUs didn't start with the Alienware 13 in 2014; they were around before, it's just that most of them were niche products or DIY hack-togethers. You could even do eGPUs with Thunderbolt/Thunderbolt 2 MacBooks (I remember seeing a video of a dude playing games on an Air)
I had a Thunderbolt 2 eGPU with a modded Akitio Thunder2. I had to mod a PSU for it. Pretty hacky, but oh boy, it did work. The video is good, but what he says is not true. Apple will drop support for eGPUs because Apple doesn't make GPUs, and they won't need drivers for Nvidia or AMD GPUs because no Mac with an ARM processor will have any of those. Simple as that. Also, an eGPU still makes sense if you have an MBP for work on macOS and game on Windows (Boot Camp). That's literally the best of both worlds.
@@ricardonacif5426 Why not just get a separate gaming PC? The extra cost wouldn't be that much higher and your gaming would be better in the long run..
@@PeterKoperdan Well, for people who already have a MacBook Pro and want to game, the options are:
1- eGPU:
$250 for eGPU enclosure
GPU
2- Separate gaming pc:
GPU
Ram
Processor
Motherboard
SSD
PSU
Case
So...
@@ricardonacif5426 I see. I thought the price was closer to 350 which would get it closer to a PC build.
Imagine Apple making their own GPUs, with ray tracing and whatever other tech they throw into it
A Thunderbolt 4 Plus cable will allow faster two-way data flow. This is my product, coming soon, along with a new eGPU dock... really cool.
I think there will eventually be some standardized solution for external PCIe. This would allow for your GPU to do double duty for both your desktop and laptop when needed.
Only programmers and coders know what went wrong. Basically, in short, Apple told its coders to no longer support eGPUs, for marketing purposes
I bought a MacBook Pro M1 when I was living in Japan and was planning to use an eGPU, but after this video I'm thinking twice about whether it's really worth it 🤔🤔🤔
The Sony VAIO Z came with an eGPU (non-changeable) in 2011, I think. Why not use that as the historic example? It's the 10-year anniversary, after all.
Thanks! This info helped me make a purchase decision
I think eGPUs will stay around. I think they will turn into docking stations, with the advancement of handheld PCs, mini PCs, and laptops being able to better utilize them
I think eGPUs on Macs are not dead. The initial eGPU support on Macs showed that there is a lot of potential, but also a lot of room for improvement. I believe that Apple did no more than run a public beta test of the capabilities of eGPUs. They learned a lot from the feedback coming from the market and communities. Learning 1: Thunderbolt 3 is not fast and robust enough. Learning 2: eGPUs bring gaming-PC performance to Macs while handling heat and power consumption externally. Conclusion: with the gained experience, Apple could offer their own ARM-based eGPUs for even higher performance. This won't be available for TB3 anymore; Apple would leverage TB4 only, to reduce the bottleneck (requires a new Mac mini, new Mac Pro, new MacBook Pro).
I don't think it's up to Apple to support eGPUs, because it already has the Thunderbolt 4 ports. It's AMD and Nvidia that have to write the drivers 🤔
Apple's M1 and M2 are based on the ARM architecture. While an ARM SoC like Snapdragon is extremely efficient and gets long battery life, its internal GPU (iGPU) is lackluster compared to an eGPU.
The one weakness of an eGPU is video editing, as said in the video.
Since data has to travel from the CPU to the eGPU and back, video editing on an eGPU is limited by the transfer speed, even over PCI Express lanes.
If you're a YouTube content creator worried about video editing speed, then M1 or M2 is the way to go.
For gaming applications, an eGPU is better.
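To put rough numbers on that round trip (a back-of-the-envelope sketch in Python; the link rates are nominal spec figures, and the frame size and fps are assumptions for illustration, not measurements):

# Why the CPU -> eGPU -> CPU round trip hurts video work. All numbers illustrative.
width, height, bytes_per_pixel = 3840, 2160, 4            # assume one uncompressed 4K RGBA frame
frame_mbit = width * height * bytes_per_pixel * 8 / 1e6   # ~265 Mbit per frame
fps = 60                                                  # assumed timeline rate

one_way_gbps = frame_mbit * fps / 1000   # ~15.9 Gbit/s in each direction
round_trip_gbps = 2 * one_way_gbps       # frames out to the eGPU, results back

tb3_gbps = 40          # Thunderbolt 3 nominal link rate, shared with display/USB traffic
pcie3_x16_gbps = 126   # roughly what a desktop PCIe 3.0 x16 slot offers

print(f"round trip needs ~{round_trip_gbps:.0f} of {tb3_gbps} Gbit/s on TB3")
print(f"the same traffic is ~{round_trip_gbps / pcie3_x16_gbps:.0%} of a PCIe 3.0 x16 slot")

That's roughly 32 of 40 Gbit/s gone before any display or USB traffic, so the cable becomes the bottleneck long before the card does, while a desktop slot barely notices the same load.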
A serious gamer doesn't game on a Mac. Everyone knows that!
I do..
noobplayswithmac LOL
And I just bought an eGPU for my 2020 13" MBP lol!
I bought mine for my 2016 MBP, and I can say it works better in both DaVinci and Steam games.
Does it work? I was about to return my 2020 Intel model to buy the M1 chip.
@@samyoe An eGPU? Yeah, with Intel Macs it works. But it will never work with an M1 Mac. There is no support for external PCIe devices.
Thunderbolt 4 should fix this issue but do not quote me on that
Since AMD is still working with Apple, either Apple still produces Intel Macs or it has a workaround to integrate AMD GPUs with its new chips.
Maybe I'm just uneducated on this topic, but wouldn't they just do what they have been doing with the Intel Macs? Have the M1X or whatever chip and have dedicated AMD graphics. Just have the graphics card sit idle while it's not in use.
I miss videos like these.
eGPUs may be on their way out, but dedicated, high-end graphics cards aren't going anywhere. Even Apple will continue to use them in their high-end Mac Pros. I currently depend on an eGPU to drive my Apple Pro Display XDR from my iMac Pro.
I think you might have missed the tech jump with the M1. It doesn't have a graphics card; the GPU is built into the chip and borrows from the RAM. You obviously run a very high-end system and I do not. So explain how a dedicated high-end graphics card works better than a system that can self-contain graphics within the chip, when developers can add more GPU cores to the chip as it progresses to higher-end machines. I am not in that end of the market myself; I just can't see how a technology that uses base RAM to run graphics can underperform if you increase the available cores and the available RAM. It seems that if you purchase the machine with base RAM you will suffer, because you exceed your ability to compensate with high-end graphics even when more GPU cores are added. But that is baseline RAM, not a dedicated component.
@@MrEiniweini I have an M1, and yes, I know it doesn’t support an eGPU. People that buy the Mac Pro need the very best graphics available, and quite often, they upgrade those graphics boards several times during the life of the machine (I certainly have on previous “pro” Macs). I’m also guessing that the Mac Pro will continue to have discrete RAM chips just like the Intel version does today.
@@keithwalls6316 I don't think Apple really wants you to keep the machine running past the "use by" date. It is a marketing thing that likely affects all users, but it would be especially painful when using a high-end and expensive machine. When they struggled to keep up with Windows machines on graphics, they allowed a certain amount of third-party intervention to kick them back into the ballpark. By integrating their own graphics and CPU, they can trickle out just enough for you to see it as the best, but not enough to keep it the best for 4 years. Everything in this last rollout says they are aiming squarely at the home-user market in a way they never have before. With the home-user market comes the business market. You won't need an eGPU until such time as Apple decides you do.
The whole idea behind eGPUs is being able to separate graphics from the laptop. I just upgraded my graphics card so I could play newer games with better fps on my MacBook Pro. The laptop is perfectly fine, but the graphics card was the letdown. Without eGPUs, I can only use my MacBook Pro for work. You're suggesting that I have two laptops? One for work and another one for gaming, because other manufacturers have decent graphics? That's crazy!
the bottleneck is just too great tho
I think the reason Apple Silicon doesn't support eGPUs is that these GPUs can only work with x86-based CPUs (like Intel), not ARM-based chips.
The irony is that my M1 MacBook Air eliminated the need for an eGPU. I'd originally looked into an eGPU for my Windows PC in order to do 4K timelines in DaVinci Resolve without having to use 1/4 proxies to even get close to realtime playback during edits. Turns out the USB-C connector was USB 3.2, but not Thunderbolt 3. Now I'm not missing it at all.
Sorry, but I really doubt this will ever happen. ARM chipsets still have a long way to go to beat x86-64 CPUs when it comes to AAA gaming. That is why gaming laptops, let alone eGPUs, will never die.
So that’s why game developers refuse to support Apple
The eGPU is useful when your internal GPU is corrupted. An eGPU also has better heat dissipation, which prolongs the lifespan of your laptop's fans.
You don't need a corrupted iGPU. eGPUs are for improving performance.
Can you use two cables to double your bandwidth on your eGPU? Some boxes come with two ports.
Actually, it might be tricky. Maybe, maybe not.
@@aryanbhushan2118 thunderbolt 4 changes the game
I respectfully disagree that eGPUs are going to die off, for several reasons:
First, powerful gaming laptops are often exorbitantly expensive, costing far more than their non-gaming counterparts and desktops. However, they all suffer from one basic problem: the internal discrete GPU is soldered onto the motherboard and cannot be upgraded. Since graphics cards advance in power and efficiency so quickly, the current generation will always be outpaced within 2-4 years by successive ones. This would entail the gamer having to buy a new gaming laptop to replace the one they own in order to stay viable in playing the newest games at very high frame rates several years down the track.
Second, not everyone has the desk space to house a full-sized desktop PC. Further, there are those who don't want a desktop (which is non-portable), or those who don't want both a desktop and laptop. There are those who want a thin and light laptop that they can carry around for school or work, then plug it into an eGPU for gaming when they are at home.
Third, gamers (particularly the hardcore ones) and non-gamers have very different needs and outlook on what works for them. Those who game on laptops and are on a budget understand the indispensable value of eGPUs, and for some this is the holy grail that will allow them to have the best of both worlds.
Oddly, one of the only exceptions to this rule of slower performance was the use case of 3D rendering in Octane and Redshift. In their case, compute isn't affected by the slowdown as much as realtime stuff is.
Also, so far the M1 and M2 chips are kinda crappy and hobbled for 3D rendering compared to eGPUs, sadly... they didn't drop it because of that; they did it because they want to control the stack and make more $$$$$$$$
Sounds great. From this video I am concluding that with my old MacBook Pro from 2018 (Intel) I can leverage an NVIDIA card as an eGPU, right?
Nope, only AMD is supported in macOS.
The latency issue could be solved by having 2 Thunderbolt cables, as you have at least 2 Thunderbolt ports on the MacBook. An internal GPU is the way to go; Thunderbolt still can't beat PCIe.
That solves bandwidth, not latency. It could actually increase latency, as the controller would have to deal with information coming from two ports simultaneously. The same thing happens with NVMe RAID: double the bandwidth, with increased latency.
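To put toy numbers on that tradeoff (everything here is assumed for illustration; the fixed overhead is a made-up stand-in for controller/protocol cost, not a measured Thunderbolt figure):

# Toy model: striping across two links helps bandwidth, not fixed per-transfer latency.
def frame_time_ms(frame_mb, link_gbps, overhead_ms):
    # 1 Gbit/s moves 1 Mbit per ms, so Mbit / Gbps comes out directly in ms
    return overhead_ms + (frame_mb * 8) / link_gbps

frame_mb = 33        # one uncompressed 4K frame, assumed
overhead_ms = 0.05   # assumed fixed setup cost per transfer (illustrative only)

one_link  = frame_time_ms(frame_mb, 32, overhead_ms)       # ~8.3 ms
two_links = frame_time_ms(frame_mb, 64, 2 * overhead_ms)   # striping can pay overhead per link

print(f"one link: {one_link:.2f} ms, two links: {two_links:.2f} ms per frame")

The bandwidth-bound part halves, but the fixed part doesn't shrink and may grow with the extra coordination, which is the NVMe RAID effect described above.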
The eGPU will soon be as obsolete as my Intel MacBook Pro.
Eventually, the discrete graphics card will die completely.
May I ask if the Mac will work even if I don't connect a monitor? Will it affect performance? Thank you
Killed the future of eGPUs? No, I don't think so. I'm still not a firm believer in ARM-powered chipsets, because you can't upgrade the CPU later. If you want to experience a new ARM chipset, you basically have to buy a new machine. It's the same story with smartphones whenever a new chipset launches. That is why I'm not jumping on the Apple Silicon bandwagon and am staying with Intel/AMD x86-64 CPUs. Even as I switch between my Mac Mini and my Intel NUC from time to time, I can still keep using the same eGPU for both of them.
I think it's kind of like the external disc drive: technically pro-consumer, but not that useful in the modern day.
How about a Mac Pro with M1 chips and a PCIe GPU slot?
I trade stocks. Is it worth getting the 16-inch MacBook Pro and an eGPU, or should I wait for the 16-inch MacBook Pro with the M1 chip?
It wouldn't be a new Apple product if they didn't limit some kind of functionality, and people didn't start with the "actually, taking this feature away is good for you because reasons"
ohhhh hella naw. No eGPU compatibility is a no-go for a MacBook......
Just as well. You can't even buy the latest graphics cards anyway to install in an eGPU.
I'm not sure this is true. An Intel Mac with Boot Camp and an eGPU will still be able to play loads of games the M1 Mac can't.
This right here. The reason I bought an eGPU was to be able to use the same machine for macOS and Windows with Boot Camp and have decent graphics performance. I have the last generation of Intel MacBook Pro before the M1 was launched and I don't plan to upgrade anytime soon (it's way too expensive!). After my Mac dies, perhaps I'll still be able to use it with a different x86-based sleek laptop.
eGPUs are still good for older laptops, but pricing needs to be much lower to make upgrading a better option than just buying a whole new system. Then of course you have the arguments regarding planned obsolescence and overflowing landfills full of more-than-capable hardware.
I reused the 5700 XT from my eGPU enclosure in a new virtualization server so I could run Big Sur on Unraid.
It's overall a much less janky setup. I don't think I'll miss the old setup. I do hope new MacBooks have support for 4+ monitors though…
How did you do that? Which motherboard did you use? If I'm not mistaken, Thunderbolt on PC always runs through the chipset. I don't see a way you could separate out the IOMMU group in which the Thunderbolt controller is located without causing issues with the host. Please share!
@@franzpleurmann2585 That is exactly right. Also, Thunderbolt support on most chipsets is pretty flaky (I think Asus and ASRock have some mobos that incorporate TB3, but I wanted a high-core-count CPU from AMD, and AMD + TB3 is still kinda janky).
So I just removed the GPU from the enclosure and put it directly into the machine.
The hardware I used was pretty consumer-grade, since I'm a college student and money is kind of tight with covid and everything: an MSI X470 Gaming Plus Max (not really recommended), a used Ryzen 2700 8c/16t CPU, and a 5700 XT GPU I bought for MSRP about a year ago.
Unraid 6.9 makes it really easy to stub PCIe devices so you can pass through GPUs, NICs, and even USB controllers if you miss hot-plug functionality. You might also have to enable the ACS override patch if you go with the MSI board above, but honestly I would recommend spending a bit more on a better board, or a server-grade board with more discrete IOMMU groups, if you intend to go deep into VMs and have the money. This MSI board was cheap but has caused problems in the few weeks I've had it.
Also, getting Big Sur working in a VM is super easy with the MacInABox plugin for Unraid (way easier than when I tried it on Proxmox). Check out "Spaceinvader One" here on YouTube for a more in-depth tutorial.
I'm still pretty new to this stuff, but if you have any other questions I'll try my best to answer them :)
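For anyone wondering what the "stubbing" step actually does: Unraid boots Linux, so it comes down to binding the card to the vfio-pci driver before the host claims it. A minimal sketch of the boot-parameter approach, assuming a 5700 XT (1002:731f / 1002:ab38 are the commonly listed Navi 10 GPU and HDMI-audio device IDs; treat them as examples and verify your own with lspci -nn before copying anything):

# /boot/syslinux/syslinux.cfg kernel append line (illustrative; device IDs are examples)
append vfio-pci.ids=1002:731f,1002:ab38 pcie_acs_override=downstream,multifunction initrd=/bzroot

vfio-pci.ids reserves those devices for passthrough at boot, and pcie_acs_override is the ACS override patch mentioned above: it splits up IOMMU groups on boards that lump everything together, at some cost to isolation guarantees. If I remember right, Unraid 6.9 also exposes the same binding as checkboxes under Tools > System Devices, so hand-editing the append line usually isn't necessary.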
Watch at 1.75x.
I have the 16-inch MacBook Pro. I want to play Windows games with the best graphics and performance in Windows via Boot Camp. Kindly suggest which eGPU would be best for the task.
Go to eGPU.io.
So there has to be separate input and output for the graphics...? Can that solve the problem?
There is still a point to eGPUs. When you have 4 different computers with good old Intel graphics and a Thunderbolt port, you can share the GPU power between them by just connecting them to it.
I bought a used Razer Core 2 for that use case and I can say it works pretty seamlessly.
It's true that M1 Macs, and especially the M1X and next generations, will certainly limit the eGPU market as we know it, unless these things happen:
- A step forward in connectivity; TB4 won't be that, but maybe TB5 or an evolution of the USB-C protocols will
- A step forward in the PCI standard or equivalent internal connectivity tech
- A revolution in GPUs, as the M1X may already be close to the high end of the dedicated GPU market
Maybe all of these will come from Apple anyway. The integrated/SoC type of architecture allows them to bypass PCI and other technologies which still rely on Intel.
But if they want to make an ARM Mac Pro, they'll have to build their own internal connectivity tech, I guess.
-_- Not everyone needs an eGPU for editing. I would love to boost the graphics on my M2 Air for gaming only, but Apple says 🖕