There are some things I want to comment on before this video goes live:

1) THANK YOU for 1 million views on my previous video! I am blown away by all your comments and glad you really seem to like it.

2) This video will be much more about explaining all the steps properly, with some minor shortcuts and references to Part 1. I had the feeling Part 1 (3080 Mobile upgrade) was a bit confusing here and there.

3) No, I don't offer this as a service, and I don't think that would be a good idea anyway. I live in Germany and the cost of living is quite high. For example, I pay roughly 1000€ for rent (all in, which is still quite cheap in Germany for an apartment as big as mine) and another 1000€ for feeding my family. You can imagine how much I would have to charge for such an upgrade once you consider the material costs (e.g. 400€ GPU + 90€ VRAM + 10€ flux, solder and such), renting a proper workshop, covering the risk, and making some profit to pay my bills. I guess a 3080 upgrade would be 800€. Anything below that would make no sense because of all the costs :( But I am fine with continuing to build up my channel so I can eventually scale my main job (40h a week) back to part time and make more YT videos. If that works out, I can imagine upgrading a viewer's laptop (or desktop GPU?). As of today, though, it can't be a widely available service.

4) Before this comes up again, I want to make clear that this mod requires the new GPU's pads to be compatible with the old GPU's. Ada Lovelace AD104/103 (RTX 40 series), for example, have the same footprint (visual pad layout) as Ampere GA104/103 but are unfortunately NOT compatible electrically. Some pads carry different signals. However, I am working on something to keep upgrading further (slowly, no priority at the moment). I hope I can update you on this in the near future. First, I will explore some other ideas for future videos. :)

5) Temperatures are not an issue. A lot of people asked about that. At least not if you run the upgraded GPU at the same TGP level as your old GPU. My old 3070 Mobile ran at 115W TGP, and so does my new 3080 Mobile in the same laptop. Temperatures are pretty much identical. Except for one upgrade I put into my daily machine: the watercooling-modded heatsink from one of my earlier videos. Together with a quick-detach external water cooling loop I can run 150W TGP easily, with very good temperatures, GPU 70°C max, CPU 70°C-ish (it jumps a lot), while the air cooling fans in the laptop run at quite low speed. The max TGP of 175W should be usable as well, but I don't want to play the game of diminishing returns and put a lot of stress on the VRMs and peripherals.

6) This video is also available as a blog post in case you want to take a closer look at some things: techmodlab.com/xmg-apex-15/xmg-apex-15-max-2022/upgrading-a-soldered-laptop-gpu-part-2/
Really wish these types of things were more practical and cost-effective. It really sucks that devices aren't made to be serviceable or upgradeable anymore.
I have a question: could you upgrade a Dell Precision 5510? Could you put a better GPU on it? It currently has an NVIDIA Quadro M1000M and an Intel Core i7-6820HQ @ 2.70GHz.
It's also worth noting that if that GPU is substantially faster and more powerful than the chip originally on the board, not to mention the faster RAM at probably double the previous capacity, the GPU will be trying to stash lots more textures in its RAM (or at least at a faster pace), which means you're doubling the graphics workload for the CPU's prefetch pipeline. All those instructions have to be ordered and sent to the GPU, and I have seen (on older systems primarily) that a GPU well matched to the processor always had a bit of CPU overhead, but with a newer GPU the CPU would sit at 99% load almost constantly, even if it isn't thermal throttling. Older CPUs also don't have the larger on-die cache of the newer chips. That's especially true of Intel, with their whole 24 MB of L3 cache compared to any newer AMD. AMD also leverages its on-die cache for more things and more efficiently than older Intel chips do.

It might be worth checking whether the BIOS is limiting the power, or whether your temps are high enough for the nearby board sensor to force your VRMs to change phase because it thinks they're drawing more current than they actually are. The most common way a system dies is cascade failure. For instance, your CPU can take 80+°C before it tries to cool down, but there are other components whose max temp is usually 20-30 degrees below the hottest components, and this pushes them out of tolerance, like a resistor bleeding a bit more voltage through, knocking the next component a bit out of spec as well, causing microfractures or other damage such as losing proper electrical isolation between individual modules within a component. Eventually one will fail, leaving many other components to take the brunt of that failure.
Can't wait to see it. I own the same laptop and copied your heatsink mod, and it really helped. I plan on trying this eventually (depending on how hard it is).
Nice! Which of the two heatsink mods I showed? The watercooling one? If not, then definitely think about doing that one. It is a true game changer. I am on a 360 external radiator now with a D5 pump, and oh, sweet water cooling... the fans stay at near-silent RPMs when playing games at 115W GPU and 80W CPU. Best mod in terms of quality-of-life improvement so far.
1:16 This GA103 silicon was originally meant to be the desktop RTX 3080. It has 7680 CUDA cores and a 320-bit memory bus when fully enabled. Due to poor yields, Samsung ended up with a lot of partially defective GA102 silicon, which Nvidia got almost for free, so consumers got a sort of upgraded RTX 3080 with 8704 CUDA cores and a bigger die. Nvidia found it cheaper to use GA102 for the RTX 3080 than GA103.
@@EtaCarinaeSC Well, MLID gets criticized quite a bit for claiming to have sources which later prove to be wrong. So IDK, he is no real source in my opinion. But I will look out for his video anyway, thx! The info you gave me makes sense. Don't get me wrong, I think it could be accurate.
@@TechModLab Yeah, I mean, sure, he can be misled, but he is more often correct than wrong. Plus, making these claims way past Ampere's prime gives him no real boost in anything...
@@TechModLab Oh, and on the topic of performance: yes, memory at 14Gbps performs better. Nvidia messed something up with that GPU. But 14Gbps also saves about 5W vs 16Gbps, so this should be taken into consideration too. Some manual voltage control should be added for testing as well.
This is so freaking satisfying!!! I have been following your videos since the very first one on these XMG laptops, and I get even more excited after every new video. You're unstoppable!! Great job man, regardless of the results!
@pufborekkymal7091 But an Acer employee replaced the original liquid metal during a repair and now it's bad (he applied liquid metal too). I don't know what he did, but temps are horrible: the CPU goes to 100°C and the GPU to 89°C. So I won't use the laptop for heavy work until I reapply it myself or go to a repair shop.
@@lucifergaming839 That is so unfortunate. But be aware of the risks, considering the liquid metal is (probably) already wrongly applied. Maybe send the laptop to Acer again, because something going wrong is more likely when it is already badly applied. If things go wrong while you repair it, that will be on you, but if it goes wrong during Acer service you can request a warranty replacement.
@@pufborekkymal7091 When they repaired my laptop it was on the last day of warranty. Fortunately I opened the laptop before running it, and it had 3-4 leaked LM bubbles, which I cleaned carefully. I think the problem is that the heatsink and the CPU/GPU are not in proper contact: temps are normal most of the time, but during games it goes to 100°C, so I stopped playing games until I repaste it. EDIT: Also, after a few days I opened it again to check for any more leakage, and fortunately there was none.
Get a PTM7950 pad. Thermal Grizzly sells authentic ones under their own name. Perfect for laptops, and it lasts indefinitely without dry-out or pump-out. @@lucifergaming839
I always wanted to try something super technical like this with some of my older laptops. You have a lot of talent, and watching the process is an art in itself. I only wish I had the tools lol
Ampere is a very power-hungry arch thanks to that tuned-down Samsung 10nm process (yes, I know it's called 8nm, but it's based on an optimized 10nm process), so the best gains I've seen on mobile come from limiting core voltage to somewhere in the 800-850mV range for highly intensive GPU-bound scenarios. Anything more will leave the core starved and clock-stretching like mad.
Amazing work! Thank you so much for sharing this experience with us. I was supposed to be sleeping now, then I saw this and I was like: no way man, I gotta see this! LOL can't wait for the next
22:10 You should look at the MOSFET power stages. Those dips in clock speed make it look like you are getting too much voltage sag on the core, and you might need to step up the MOSFET spec or add more capacitance. Also check the number of phases; you might be lacking phases because of the smaller chip that was on there originally.
That is one of my suspicions as well. The 3080 Ti draws much more current at lower voltage, so the load on the inductors and caps is much higher. But in that case I would have expected better performance at lower TGPs, because the strain on them is way lower when TGP is low. I will investigate that soon.
@@TechModLab A long time ago my GTX 965M laptop died. The problem was voltage (over 1.3V) and temperature (I had hacked the temp lock to unlock 99°C), plus an unlimited power limit. It may have been a 3-phase VRM.
Great work mate. One pro tip: when you reflow the new BGA GPU, place 2 coins on the die. That way you don't need to tap it from the top, avoiding mistakes; the coins push it down far enough without applying too much pressure, as you can with your hand.
Mind-blowing work! Owning a Lenovo P1 4th gen, 11th-gen i9, 3080 16GB, I feel super lucky. It cranks to 5GHz no problem. A total powerhouse for all my 3D laser scanning and post-processing. Definitely not just a laptop; well worthy of its "mobile workstation" class.
Nice video, bro. The only downside of this project is that it takes a lot of time and money, but it is still a worthy project. Imagine having a custom laptop designed by you. It is like Tony Stark building his customized Iron Man suit.
I think this would be basically impossible, but in theory, since it's a B550 chipset, you could put any PCIe 4.0 GPU into it, like a 3090 or a 3090 Ti. The method of power delivery would need to be changed, and running on battery would basically be gone.
At that point you might as well tap the PCIe lanes directly, since this is an AM4 socket with plenty of those, and make a full-size PCIe slot to run an eGPU.
@@yarost12 NVMe OcuLink is only 4 lanes, so you'd be limited to x4 Gen 4 (roughly x8 Gen 3 bandwidth), which bottlenecks GPUs more than the effort is worth when you've already got something like a laptop 4080 installed and working. The difference between laptop and desktop GPU models nowadays is around 15-30%, and the OcuLink penalty is also around the 20% mark.
You madman. :-) Really well done. I don't even dare to repair my A770, even though the laptop repair worked out, and you just go and put a new GPU including RAM onto the board.
The wizardry involved in replacing soldered hardware here is impressive. Curious whether you keep data graphs for all temperature sensors during your benchmarks. Wondering if the processors are hitting their max temperature due to the cooling solution. That might prevent the performance from being higher, but it would still be an honest result for longer sessions.
Will add them later on my blog, linked in the video description. :) The 3080 temperature results I had earlier were bugged. I have to repeat the runs to make them truly comparable. But temperatures were mostly OK for the GPU up to 150W. At 175W it hit a little above 80°C max. Nothing to worry about either, but a little too high in the long run for my taste. The CPU didn't have much load in those synthetic benchmarks, so its temperatures should be negligible. I will post them later anyway.
@@TechModLab Sorry, I should have been clearer that I meant the graphics processor temperatures (not the central processor :). Good to know that the laptop cooling solution is sufficient up to 150W. Might be worth using Afterburner and adjusting the frequency-voltage curve to undervolt for greater performance within the thermal limit. :) Appreciate you taking the time to share your thorough response and data. This video will break 100k views for sure. :D
Looking at this while rocking my 1650 Ti Mobile is quite sad. But like you, I really want to test some things out, and who knows, I may get into my BIOS one day and adjust TDP etc. properly (it's locked with my i7-10750H; even with a 65W GPU BIOS it still runs at 50W). After all that useless information, let's just say: good job, and good luck with adjusting the MOSFETs etc. too. I have a good feeling about that. Keep going 🙂
My assumption here would be that the number of cores being powered is what stifles performance at low wattages, as they're all being starved. I dare say, VRMs permitting, the 3080 Ti would gain a lot from a shunt mod.
You are amazing and inspirational. I wish I knew how you just put those chips on without doing any kind of BIOS flashing. Or was that part not necessary when just changing the GPU and VRAM?
You are right! I just noticed from your comment that I forgot to mention it in Part 2, oops! Part 1 got it right: the vBIOS has to be flashed, and I flashed it externally. I found that all 3080 Ti Mobile vBIOSes work just fine, except for some display outputs. You have to try different ones until you find one that makes all your display outputs work. You just have to make sure the DeviceId is correct. In my case that's 10DE 2420, which is the non-G-SYNC vBIOS.
@@TechModLab *taking notes* I wonder if the Acer Nitro motherboards can be upgraded as well. I just want to do it now. I have the equipment, and I have several Acer Nitro 5 laptops I need to replace the PCH chip on, so I might as well try to upgrade them, VRAM and GPU. Would you mind giving any advice on these? I'll research the DeviceId, but I'm not sure where to start my search at the moment. They are all 3050 Ti, so I have some to try this out on. A unicorn laptop would make a great holiday gift.
@@Hexauslion The 3050 Ti has a smaller GPU package and can't be upgraded further as far as I know. You should look for RTX 3060 laptops to do that.
@@TechModLab That's a shame, but very helpful, thank you. Okay, maybe I can upgrade the memory in that case? That should be straightforward, because the 3050 Ti Mobile has an 8GB model and the ones I have are all 4GB. So I assume I should try upgrading that and then flash the vBIOS with the appropriate version to have it recognized.
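A quick way to sanity-check the DeviceId inside a vBIOS dump, as discussed in this thread, is to read the PCI expansion-ROM header, which points to a "PCIR" data structure holding the vendor and device IDs. A minimal Python sketch; the synthetic ROM bytes built at the bottom are purely illustrative, not a real vBIOS:

```python
import struct

def rom_pci_ids(rom: bytes):
    """Read the vendor/device ID from a PCI expansion ROM image (e.g. a vBIOS dump).

    Follows the PCI expansion-ROM layout: a 0x55AA signature at offset 0,
    then a 16-bit little-endian pointer at offset 0x18 to the PCIR data
    structure, which holds the vendor ID at +4 and the device ID at +6.
    """
    if rom[0:2] != b"\x55\xaa":
        raise ValueError("missing 0x55AA ROM signature")
    pcir_off = struct.unpack_from("<H", rom, 0x18)[0]
    if rom[pcir_off:pcir_off + 4] != b"PCIR":
        raise ValueError("missing PCIR data structure")
    vendor, device = struct.unpack_from("<HH", rom, pcir_off + 4)
    return vendor, device

# Tiny synthetic ROM for illustration: 10DE 2420 is the non-G-SYNC
# 3080 Ti Mobile DeviceId mentioned above.
rom = bytearray(64)
rom[0:2] = b"\x55\xaa"
struct.pack_into("<H", rom, 0x18, 0x20)             # pointer to PCIR
rom[0x20:0x24] = b"PCIR"
struct.pack_into("<HH", rom, 0x24, 0x10DE, 0x2420)  # vendor, device
print("%04X %04X" % rom_pci_ids(bytes(rom)))        # → 10DE 2420
```

Running this against an actual dump read out with an SPI programmer should print the ID pair to match against the chip before flashing.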
You know, at lower TGP limits, memory power consumption can make a huge difference, especially since you've changed it for a faster one... It can draw up to 40W for all 8 chips. I would check it in GPU-Z while running a benchmark.
It is around 36W max, and yes, it does make a difference. However, the power draw at lower TGPs is actually the same as with the 14Gbit/s 3080 and 3070 Mobile GPUs, because the 3080 Ti Mobile clocks the VRAM down from time to time when below 165W. At 115W it almost never goes to 16Gbit/s (@1.35V), but switches most of the time between 12 (@1.25V) and 14Gbit/s (@1.35V). At 90W and less it stays at 12Gbit/s.
@TechModLab Interesting. My 3060 runs at a fixed 14Gbps until very specific conditions are met, like the GPU only being used at 60% and drawing less than 75W; only then does it run at 12Gbps. Also, it's Micron memory, which is something like 30% less efficient than Samsung.
@@TechModLab Also, 12Gbit/s seems extremely slow even at lower core speeds. Have you tried raising the memory clock? The voltage should stay the same in that case, no?
@@panjak323 You can check out my blog. I added two more interesting graphs there, which show the benchmark score at fixed video memory clocks. You can see in them that 12Gbit/s is faster than 14 and 16Gbit/s at lower TGPs; it is the other way around at higher TGPs. This behavior is coded into the vBIOS. There was an issue with earlier RTX 3070 Mobile vBIOSes as well, where they jumped between 12 and 14Gbit/s all the time. It got fixed shortly after launch day with a new vBIOS. When you look at the additional graphs on my blog you can see another anomaly, which might have something to do with the video memory clock lock applied by the nvidia-smi tool: the 3080 Mobile clearly performs better on "auto" memory clocks compared to locked, but the 3080 Ti Mobile does not. I want to do another run in the future with the 3080 Ti Mobile set to the 14Gbit/s straps and see if the memory clock behavior is different. It would be crazy if that made it perform better in the end. Let's see.
@@panjak323 Overclocking the video memory does not increase the vmem voltage. An additional note to my previous comment: the 3080 Ti Mobile is special because it has 3 clock tables for different loads (one table for each memory speed: 12, 14, 16Gbit/s), while the other Ampere GPUs have only two (12, 14Gbit/s). Three might be less stable than two; that's why I suggested trying the 14Gbit/s straps earlier.
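For context on what those memory straps mean in raw throughput: peak GDDR6 bandwidth is the per-pin data rate times the bus width. A small sketch, assuming the 3080 Ti Mobile's 256-bit bus (the bus width here is my assumption, not stated in the thread):

```python
def bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int = 256) -> float:
    # Peak GDDR6 bandwidth in GB/s: per-pin data rate (Gbit/s)
    # times bus width (bits), divided by 8 bits per byte.
    return gbps_per_pin * bus_width_bits / 8

for strap in (12, 14, 16):
    print(f"{strap} Gbit/s -> {bandwidth_gbs(strap):.0f} GB/s")
```

So the jump from the 12Gbit/s strap to 16Gbit/s is a third more peak bandwidth, which is why it is surprising that the lower strap can score better at low TGPs.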
Yeah, Clevo laptops were just amazing. I still have my old Clevo D901C: desktop CPU, 3x HDD, 2x GPUs in SLI, upgradeable. I first had two 8800 GTX cards, then upgraded the CPU and the GPUs to two 9800 GTX. It was a beast, and yes, this thing ran Crysis back in the day like a champ. Unfortunately, nowadays laptops are made to look good and be as thin as possible rather than functional or practical. Modern laptops are just complete garbage.
If you're going to do this again and again, I suggest you use the AMAOE magnetic stencil kit with solder paste (not solid balls), as it's easier to use (no balls flying away, ...) and much faster. Just don't forget to dry the paste (pressing it with a paper towel) and use some strong non-magnetic tweezers to press the stencil down (as it warps at high temperature). Also don't use too much heat (300°C max) and a medium airflow.
Amazing video, what you are doing is super interesting. Any chance you could link the tools you use, or at least give their names? I know from your accent and some screenshots shown in your videos that your tools are from a market I might have access to, and not just from some American hardware website.
Definitely taking a look into that. It would be more efficient at the high end, yes. The 3080 Ti puts a lot more strain on the VRM because it runs at higher current but lower voltage. The poor GPU core VRM has only 6 phases; laptops with 150-175W TGP come with 6-8 phases, so I am at the lower end. At the highest TGP (175W) roughly 30W of power is lost in the VRMs. Something like 20W would be normal.
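The VRM figures above translate directly into conversion efficiency, output power over total input power. A quick check of the numbers quoted in the comment:

```python
def vrm_efficiency(output_w: float, loss_w: float) -> float:
    # Conversion efficiency of the VRM: delivered power divided by
    # delivered power plus conversion losses.
    return output_w / (output_w + loss_w)

print(f"measured: {vrm_efficiency(175, 30):.1%}")  # ~85.4%
print(f"typical:  {vrm_efficiency(175, 20):.1%}")  # ~89.7%
```

So the 6-phase VRM is running a few points below the roughly 90% one would expect, consistent with it being undersized for the 3080 Ti's current draw.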
Can you make a video about DTRs (desktop-replacement laptops)? In my country they are not common at all, and I heard about them first in your previous video. I wonder what their disadvantages are, because upgradeable, replaceable CPUs in a laptop are really a dream for me 😢
The APEX 15 MAX with the Ryzen 9 5900X board was likely designed around the 3060 and 3070 TGPs, with current draw rated to support those GPUs and memory timings. You got lucky pushing it with the 3080, but the 3080 Ti is a beast that was likely sucking the MOSFETs/caps dry on phases and crashing the pipeline performance accordingly with resets/stutters, which probably took out 20% of the expected performance gains. I'm very interested in following your other entertaining videos, as usual, and seeing you conquer this thorny issue! I haven't, to my recollection, seen a good video out there yet (haven't looked either, just saying I haven't yet come across one) that describes how to upgrade the supporting connective tissue of the motherboard/memory and the steady power curve between the GDDR6, GPU, CPU and the motherboard. Well, I suppose that's what the manufacturers' engineering teams do... so asking you to do that is really pushing it. Sort of like you did with your insane 3080 Ti mod. Hahah, fun!
So a cap mod to increase capacitance and lower ESR, for example? Done ^^ No gains, just some more OC headroom! I checked some data from JarrodsTech about the 3080 Ti Mobile as well. Seems like my results are in line with his. I guess the GA103 is starving, as you said. It starts to scale above 130W TGP and probably continues well above 175W.
You could start heating up the balls while the sieve matrix is still atop the chip. I've seen others do it this way, and they had no issues with their balls moving or being imperfect.
Try to optimize the frequency-voltage curve for consumption, not for the highest frequency. Reduce the voltage by 0.025-0.075V at the maximum frequency the GPU boosts to. In the end you should get the most stable boost frequency and lower consumption. I don't know how much headroom laptop chips have for undervolting; on desktop ones it is quite large.
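The undervolting advice above follows from CMOS dynamic power scaling roughly with the square of the voltage at a fixed clock (P ≈ C·V²·f). A rough sketch of the expected saving; the voltages are example values in the range discussed elsewhere in the comments, not measurements:

```python
def dynamic_power_ratio(v_new: float, v_old: float) -> float:
    # CMOS switching power scales with V^2 at a fixed clock (P ≈ C·V²·f),
    # so dropping core voltage alone cuts dynamic power quadratically.
    return (v_new / v_old) ** 2

ratio = dynamic_power_ratio(0.850, 0.900)  # a 50mV undervolt
print(f"~{1 - ratio:.0%} less dynamic power at the same frequency")
```

That quadratic scaling is why even a 50mV drop is worth real watts at a fixed boost clock, which the GPU can then spend on holding that clock more stably.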
The grounding system could be upgraded: shift it to 0.01V negative, 19.1V positive, and loop all the grounds with cables so the resistance is 1 ohm throughout. Upgrade the MOSFETs to extremely better ones and add a tiny thermonuclear active choke. Stay healthy, OK.
I'd be curious to know if you see the same clock dips and performance on Linux. I personally upgraded to a 6800 XT that I troubleshot for days in Windows, where it was completely unusable: the clocks, voltage and performance were all over the place, while there was no issue in Linux. I haven't figured out why, so it could be a totally different cause to yours, of course. But I'm curious whether you might see a bit of a difference there.
Dude, fantastic job on this upgrade. Given the amount of skill and steady hands required, I'm sure my clumsy ass would knock the GPU off the table or something and mess everything up. Amazing job on this upgrade.
Makes me wish I could replace the 1650 in my laptop with at least a 1650 Ti and adjust the power curves on the CPU and GPU to sane values. The CPU tends to get pushed too hard for the cooler, and the GPU doesn't get pushed nearly hard enough. On top of that, the on-battery performance is set by temperature, not by power limit. A 50W TGP on the GPU while plugged in makes sense; a 60°C temp limit on the GPU while on battery does not. It averages the GPU limit to around 15W, which would make sense, but because it's limited by temperature instead of board power, games stutter A LOT even when they really shouldn't.
Not sure what you mean exactly, but here you go. Reflowing the loose (unattached) solder balls: 170°C + hot air, OR 200°C without hot air. Soldering in the reballed GPU: 170°C with leaded solder balls, 180°C with lead-free.
The 3080 Ti in my MSI GE67 laptop holds up great. It's powerful enough for anything at best settings in 1440p. But what really sold me was the 16GB of VRAM, which is a lot and hasn't been surpassed by newer laptops.
By any chance, do you have any Time Spy or Port Royal results for your GPU without OC (auto-OC disabled in MSI Center at the highest power preset)? I wonder if you see the same memory clock behavior, rapidly switching between clocks.
Hi, this is very interesting. I have an RX 6800M and I want to flash the vBIOS of an RX 6850M XT. Do you think it will work? I have an SPI programmer, the chips are the same and so are the memories; it's just a different SubsystemID. What do you think?
Phew, good question... well, I don't know AMD GPUs too well, so I can't tell you if it will work. On Nvidia it would not, because the GPU checks whether the vBIOS GPU ID matches the burned-in ID of the chip. The RTX 3080 Mobile is the same GPU as a desktop 3070 Ti, but the vBIOSes are not interchangeable.
I sent you a message on your website. As I mentioned, everything is the same, only the SubsystemID is different, but I guess maybe there is some way to edit it. I tried using amdvbflash, but the SubsystemID is the problem, so I thought of 2 options: 1) edit the SubsystemID to match and allow flashing, or 2) use the SPI programmer, which might allow flashing without editing; if not, I could try to modify this parameter. I would like to chat with you by some other means to find out if this is possible. Some time ago the community managed to flash the 6600 XT vBIOS onto the 6600M and gained a lot of performance, so possibly this can be achieved. I hope so, since it would gain performance and also unlock the power settings in the AMD panel, which AMD blocked on this PC, and MorePowerTool no longer works. I await your response, friend, thank you very much!! @@TechModLab
Well..., as your video shows, "impossible" doesn't exist. I have one more question: does the socket determine which GPU upgrade can be done? Thanks for showing in detail how to do it right. Great, please keep it up!!!
First of all, the laptop has hardware-based TGP monitoring based on its cooling capacity. I got a laptop from a friend of mine to repair, but I bought it for 150 bucks. A basic laptop: GTX 1050 Ti (desktop version), i7-7700HQ, etc. I took out all the guts and made an open-air construction from flooring sheets (I know, it's messed up), got two water cooling blocks from AliExpress, a water pump and tubing. Even though I managed to keep temps well under 50°C (almost 50°C below the maximum temperature limit of the original cooling design), I wasn't able to get higher boosts from the GPU; it always sagged whatever I tried.

The original heatsink in this laptop is designed to handle 120W total, because those two fin stacks are connected via a large, fat heatpipe, and this power can't be exceeded by the CPU and GPU combined (45W + 70W is only 115W, but there is 5W of headroom for something). You can manually do a shunt mod if you find that exact measuring resistor, but I didn't find anything on my board. There are some 0Ω resistors, but they act like vias. The only real obstacle is the circuit that measures what goes to the phases for the CPU and the GPU, and for your laptop the heatsink is a problem as well. You could push that 3080 Ti above 200W, but it will pull a ton of power that has to be dissipated somewhere. If you have some room between the lid and the heatpipes, buy a fat, bulky one and connect all the fin stacks together if possible.
I made a watercooling mod a few videos back. Don't worry, it can be cooled well :) It was running on air in this video though, which is good enough for 220W combined. In combined loads (e.g. games) I do run into laptop power limiting: the GPU is limited to roughly 150W when the CPU draws 70W plus peripherals (wifi, screen, USB, etc.). So I guess there is a 230W limit measured at the input shunt. I thought about modding it to allow 175W to the GPU all the time, but tbh, I think 150W is more than enough. I kind of like the built-in throttling by the EC.
@ Yep, that could also be the case. Everything in that laptop runs with some headroom, but always running it at max (VRM, step-down converters, etc.) isn't good for longevity. The CPU and GPU might survive, but other components might not.
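The shared input-power limit described in this thread is simple budget arithmetic; a tiny sketch using the approximate numbers from the comments (the 230W limit, 70W CPU draw, and ~10W for peripherals are estimates, not measured values):

```python
def gpu_budget(total_w: float, cpu_w: float, peripherals_w: float) -> float:
    # Power left for the GPU when the EC enforces a shared
    # input-power limit across CPU, GPU and peripherals.
    return total_w - cpu_w - peripherals_w

# ~230W input limit, 70W CPU, ~10W for screen/wifi/USB,
# leaving roughly 150W for the GPU, as observed in games.
print(gpu_budget(230, 70, 10))  # → 150
```

This is why the GPU only throttles below its 175W cap in combined loads: the budget shrinks as the CPU and peripherals take their share.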
I can recommend bismuth-containing solder (melts below 140°C) ;) Mix it with the solder on the board using plenty of flux, and the chip practically falls out on its own.
Bro, I have an HP ProBook laptop which has GPU and VRAM pads just like yours, but they are empty because it was the integrated-graphics variant. Should I put a GPU and VRAM on it? It supports something like an NVIDIA GeForce GTX 680M or AMD Radeon HD 7970N. And where should I buy them? Please reply soon, sir ❤
Remember the removable dGPU modules that were used in the past, which Framework recently brought back? They are a fairly easy way to get good GPU performance. But anyway, I love this video as usual 😅
I know you mentioned that temperatures are not an issue. But can the fluctuations in the VRAM clock be associated with memory chip thermal throttling?
Can you do a 3050 replacement?
It's also worth noting that if that GPU is substantially faster, & more powerful than the chip originally on the board, not to mention faster ram & probably double what was available prior, means the GPU will be trying to stash lots more textures in it's ram for availability, (or at least at a faster pace) which means your doubling the graphics workload for the prefetch pipeline of the CPU. All those instructions have to be ordered & sent to the GPU, & I have seen (on older systems primarily) when paired with a GPU better matched to the processor it always had a bit of CPU overhead, but with a newer GPU the CPU would be tasking @ 99% almost constantly, even if it isn't thermal throttling. Also the older CPU's don't have the larger on-die ram buffer of the newer chips either. That's especially true of Intel- with their whole 24 Mbytes of L3 cache compared to any newer AMD. AMD also leverages it's on die ram for more things & more efficiently than older Intel chips do.
It might be worth checking whether the BIOS is limiting the power, or whether your temps are high enough for the board sensor near the VRMs to force them to change phase because it thinks they're drawing more current than they actually are. The most common way a system dies is cascade failure. For instance, your CPU can take 80+°C before it tries to cool down, but there are other components whose max temp is usually 20-30 degrees lower than the hottest components. That makes them perform out of tolerance, like a resistor bleeding a bit more voltage through, knocking the next component slightly out of spec as well, causing microfractures or other damage such as being unable to maintain proper electrical isolation of individual modules within the component. Eventually one will fail, leaving many of the other components to take the brunt of that failure.
Can't wait to see it. I own the same laptop and copied your heatsink mod, and it really helped. I plan on trying this eventually (depending on how hard it is).
Nice! Which of the two heatsink mods I showed? The watercooling one? If not, then definitely think about doing that. It is a true game changer. I am on a 360 external radiator now with a D5 pump and oh, sweet water cooling... the fans stay at near-silent RPMs when playing games at 115W GPU and 80W CPU. Best mod in terms of quality-of-life improvement so far.
Just the first one with the better VRM contact, but I'm planning on doing the watercooling one some time in 2025.
😮
Also the shunt mod, but that's about it.
how would i find one
1:16 This GA103 silicon was originally meant to be the desktop RTX 3080. It has 7680 CUDA cores and a 320-bit memory bus when fully enabled. Due to poor yields, Samsung ended up with a lot of partially defective GA102 silicon, which Nvidia basically got almost for free, so consumers got a sort of upgraded RTX 3080 with 8704 CUDA cores on a bigger die. Nvidia found it cheaper to use GA102 for the RTX 3080 than GA103.
Good info, thanks! Do you have any sources for that I can read through?
@@TechModLab I have the sources; Moore's Law Is Dead made a video about it.
@@EtaCarinaeSC Well, MLID gets criticized here and there for claiming to have sources which prove to be wrong afterwards. So IDK, he is no real source in my opinion. But I will look out for his video anyway, thx!
The info you gave me makes sense. Don't get me wrong, I think it could be accurate.
@@TechModLab Yeah, I mean, sure he can be misled, but he is more often correct than wrong. Plus, making these claims way past Ampere's prime gives him no real boost in anything...
@@TechModLab Oh, and on the topic of performance: yes, memory at 14 Gbps performs better. Nvidia messed something up with that GPU. But 14 Gbps also saves around 5W vs 16 Gbps, so this should be taken into consideration. Some manual voltage control should be added for testing too.
This is so freaking satisfying!!!
I have been following your videos since the very first one on these XMG laptops, and I get even more excited after every new video. You're unstoppable!! Great job man, regardless of the results!
And here I am, afraid to change the liquid metal on my laptop.
You don't need to change it anyway. I don't recommend it.
@pufborekkymal7091 But an Acer employee replaced the original liquid metal during a repair and now it's bad (he applied liquid metal too). I don't know what he did, but temps are horrible: the CPU goes to 100°C and the GPU to 89°C. So I won't use the laptop for heavy work until I either reapply it myself or go to a repair shop.
@@lucifergaming839 That's so unfortunate. But be aware of the risks, considering the liquid metal is probably already wrongly applied. Maybe send the laptop to Acer again, because something going wrong is more likely when it's already applied wrong. If things go wrong while you repair it, it's on you, but if it goes wrong during Acer service you can request a warranty replacement.
@@pufborekkymal7091 When they repaired my laptop it was on the last day of warranty. Fortunately I opened the laptop before running it, and it had 3-4 leaked LM bubbles which I cleaned carefully. I think the problem is that the heatsink and the CPU/GPU are not in proper contact: idle temps are normal, but during games it goes to 100°C, so I stopped playing games until I repaste it.
EDIT: Also, after a few days I opened it again to check if there were any more leaks, and fortunately there were none.
Get a PTM7950 pad. Thermal Grizzly sells authentic ones under their own name. Perfect for laptops, and it lasts indefinitely without dry-out or pump-out. @@lucifergaming839
I always wanted to try something super technical like this with some of my older laptops. You have a lot of talent, and watching the process is a true craft. I only wish I had the tools lol
I still can't believe that someone really did this in their home/garage.
Hats off to you sir!
TYVM! Been waiting for standard and better than MXM laptop upgrade potential all my life :D
Ampere is a very power-hungry arch thanks to that tuned-down Samsung 10nm (yes, I know it's 8nm, but it's based on an optimized 10nm process). The best gains I've seen on mobile come from limiting core voltage to somewhere in the 800-850mV range for highly intensive GPU-bound scenarios; anything more will leave the core starved and clock-stretching like a madman.
Found your channel and have been binging all of your videos. I cannot stop watching! Keep it up.
It's mind-blowing when I think about all the knowledge and skill you have that enables you to do this! 🤯
To give the floating ICs a nudge, I would recommend using something that bends easily at the right amount of force.
This; I messed up a few reflows because my shaky hands nudged the IC a bit too much.
Impressive, great work. Now I know I wouldn't be able to upgrade a graphics card like you do. Thanks man.
Glad I could help 😅
Yes, soldered parts are beyond the ability of most of us to modify.
Congrats. ❤ the 2nd part... sometimes you just have to do it to see if it's worthwhile.
You are an artist, man! A lot of talent!
Thank you so much! I will try some in the future
I just can't believe your skill at identifying which part is which.
I mean the memory config straps.
It's just unbelievable.
Keep up the hard work, man.
Man with balls of steel. Back at it again.
He said the balls were leaded
They're actually 60% tin and 40% lead
I was told this was not possible. I told them that just because they don't know how does not mean that it's impossible!
You forgot to add thermal paste under the GPU as well!
I love seeing this type of work, subscribing.
Amazing work! Thank you so much for sharing this experience with us. I was supposed to be sleeping now, then I saw this and I was like: no way man, I gotta see this! LOL can't wait for the next
When I was a child, jumpers and DIP switches on 8086s were my toys. Never made it this far.
22:10 You should look at the MOSFET power stages; those dips in clock speed make it look like you are getting too much voltage sag on the core, and you might need to step up the MOSFET spec or add more capacitance. Also check the number of phases; you might be lacking phases because of the smaller chip that was on there originally.
That's one of my suspicions as well. The 3080 Ti draws much more current at lower voltage, so the load on the inductors and caps is much higher. But in that case I would have expected better performance at lower TGPs, because the strain on them is way lower when TGP is low.
I will investigate that soon.
Nice to see you here, hi fri
@@TechModLab A long time ago my GTX 965M laptop died. The problem was voltage (over 1.3V) and temperature (I had hacked the temp lock to unlock 99°C), plus an unlimited power limit. It may have had only 3 phases.
I love watching these and thinking, man, I would never have enough technical knowledge to do even 10% of this.
This dude has me wanting to attempt to swap the GPU out of my laptop
Impressive, you're really talented 👍🏼
I’m curious where did you learn how to do this, stuff like this is pretty damn interesting
Impressed mate. You gained a sub.
Great work mate, one pro tip, when you reflow the new BGA GPU, place 2 coins on the die that way you don't need to tap them from the top avoiding mistakes, the coins will push it far enough without putting too much pressure as you can with your hand.
Great introduction to my 2025! Things I never bothered asking myself, yet didn't know I yearned for, are peak YouTube.
Mind blowing work!
I own a Lenovo P1 4th gen: 11th-gen i9, 3080 with 16 GB.
I feel super lucky.
It cranks to 5 GHz no problem.
A total powerhouse for all my 3D laser scanning and post-processing.
Definitely not just a laptop.
Well worthy of its "mobile workstation" class.
Nice video, bro. The only con of this project is that it takes a lot of time and money, but it is still a worthy project. Imagine having a custom laptop designed by you. It's like Tony Stark building his customized Iron Man suit.
I think this would be basically impossible, but in theory, since it's a B550 chipset, you could put any PCIe 4.0 GPU into it, like a 3090 or a 3090 Ti. The method of power delivery would need to be changed, and using the battery would basically be gone.
At that point you might as well just tap the PCIe lanes directly, since this is an AM4 socket with plenty of those, and make a full-size PCIe slot to run an eGPU.
That would be really cool, still hard though. @@ActualRealKayamariOfficialVEVO
@@ActualRealKayamariOfficialVEVO nvme -> oculink adapters exist, might be a better option.
@@yarost12 NVMe OcuLink is only 4 lanes, which means you'd have to drop from a 16x link down to 4x, and that bottlenecks GPUs more than the effort is worth when you've already got something like a laptop 4080 installed and working. The difference between laptop and desktop GPU models nowadays is like 15-30%, and the OcuLink penalty is also around the 20% mark.
You madman. :-) Really well done. I don't even dare to repair my A770, even though the laptop repair worked out, and here you are just putting a whole new GPU including RAM on the board.
Man your videos are jus awesome 😭❤️💯.
The wizardry involved in replacing soldered hardware here is impressive. Curious if you keep data graphs for all temperature sensors during your benchmarks. Wondering if the processors are hitting their max temperature due to the cooling solution. That might prevent performance from being higher, but it is still an honest result for longer sessions.
Will add them later on my blog, linked in the video description. :) The 3080 temperature results I had earlier were bugged. I have to repeat the runs to make them truly comparable. But temperatures were mostly OK for the GPU up to 150W. At 175W it hit a little above 80°C max. Nothing to worry about either, but a little too high in the long run for my taste. The CPU didn't have much load in those synthetic benchmarks, so its temperatures should be negligible. I will post them later anyway.
@@TechModLab sorry, I should have been more clear that I meant the graphics processor temperatures (not the central processor :). That's good to know that the laptop cooling solution is sufficient to 150w. Might be worth using Afterburner and adjusting the performance curve to undervolt for greater performance within the thermal limit. :) Appreciate you taking the time to share your thorough response and data. This video will break 100k views for sure. :D
Many people are ready to pay you for these modifications... including me, boss!!!!! Congratulations bro🎉🎉🎉🎉
Looking at this while rocking my 1650 Ti Mobile is quite sad. But like you, I really want to test some things, and who knows, I may get into my BIOS one day and adjust TDP etc. properly (it's locked, with an i7-10750H; even with the 65W GPU vBIOS it still runs at 50W). After all that useless information, let's just say: good job, and good luck with adjusting the MOSFETs etc. too. I have a good feeling about that. Keep going 🙂
What could we replace our 1650s with? Maybe a 2050 or 3050?
@@kaanasdf8131 Sadly that won't work; we need to stick with the 1650-1660 Ti. It's smarter to buy a newer one haha.
Nice video, I really liked parts 1 & 2. Keep it up!
very good work ...
too clean and precise . . . 👌👌
Nice attempt. I guess the 3xxx chips are not compatible with 4xxx ones. If they were, it would be interesting, because basically any laptop could be upgraded.
That's exactly the reason
Possible if you've mastered electronics 😂
Just imagine a true Clevo P870 successor chassis with a 20"-21" display on an AM5 platform with next-gen GPUs... Dreams.
You’re a surgeon dude, incredible work.
The skill level you have. Amazing work :)
This is insane. Thanks for sharing your findings.
Great work and really interesting to watch; it's just that this accent triggers me so hard (I know you can't help it).
My assumption here would be that the number of cores being powered is what stifles performance at low wattages, as they're all being starved. I dare say, depending on the VRMs, the 3080 Ti would gain a lot from a shunt mod.
You are amazing and inspirational. I wish I knew how you just put those chips on without doing any kind of BIOS flashing. Or was that part not necessary when just changing the GPU and VRAM?
You are right! I just noticed from your comment! I forgot to mention that in Part 2, oopsie! Part 1 got it right: the vBIOS has to be flashed, and I flashed it externally. I found that all 3080 Ti Mobile vBIOSes work just fine, except for some display outputs. You have to try different ones until you find one that makes all your display outputs work. You just have to make sure the DeviceId is correct. In my case that's 10DE 2420, which is the non-G-SYNC vBIOS.
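For anyone following along at home: before hunting for a matching vBIOS, it's worth confirming which DeviceId your board actually reports. On Linux, `lspci -nn` prints the vendor:device pair in brackets at the end of the line. Here's a small Python sketch that pulls it out; the sample line is illustrative, not captured from this laptop:

```python
import re

def pci_ids(lspci_line: str) -> tuple[str, str]:
    """Pull the last [vendor:device] hex pair from an `lspci -nn` line.

    The device-class code (e.g. [0300]) has no colon, so it is skipped;
    taking the last match also tolerates a trailing "(rev a1)" suffix.
    """
    pairs = re.findall(r"\[([0-9a-f]{4}):([0-9a-f]{4})\]", lspci_line)
    if not pairs:
        raise ValueError("no PCI ID found")
    return pairs[-1]

# Example line in the shape `lspci -nn` produces (illustrative only):
line = ("01:00.0 VGA compatible controller [0300]: NVIDIA Corporation "
        "GA103M [GeForce RTX 3080 Ti Laptop GPU] [10de:2420]")
vendor, device = pci_ids(line)
print(vendor, device)  # 10de 2420
```

On Windows, the same pair shows up in Device Manager under "Hardware Ids" as `VEN_10DE&DEV_2420`.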
@@TechModLab *taking notes* I wonder if the Acer Nitro motherboards can be upgraded as well. I just want to do it now. I have the equipment, and I have several Acer Nitro 5 laptops I need to replace the PCH chip on, so I might as well try to upgrade them: VRAM and GPU. Would you mind giving any advice on these? I'll research the DeviceId, but I'm not sure where to start my searching atm. They are all 3050 Ti, so I have some to try this out on. A unicorn laptop would make a great holiday gift.
@@Hexauslion The 3050 Ti has a smaller GPU package and can't be upgraded further as far as I know. You should take a look for RTX 3060 laptops to do that
@@TechModLab That's a shame, but very helpful, thank you. But OK, maybe I can upgrade the memory in that case? That should be straightforward, because the 3050 Ti Mobile has an 8GB model and the ones I have are all 4GB. So I assume I should try upgrading that and then flashing the appropriate vBIOS version to have it recognized.
You know, at lower TDP limits, memory power consumption can make a huge difference. Especially since you've changed it for faster one... It can draw up to 40W for all 8 chips.
I would check it under GPUz whilst running a benchmark.
It is at around 36W max, and yes, it does make a difference. However, the power draw at lower TGPs is actually the same as with the 14 Gbit/s 3080 and 3070 Mobile GPUs, because the 3080 Ti Mobile clocks the VRAM down from time to time when below 165W. At 115W it almost never goes to 16 Gbit/s (@1.35V), but mostly switches between 12 (@1.25V) and 14 Gbit/s (@1.35V). At 90W and less it stays at 12 Gbit/s.
@TechModLab Interesting. My 3060 runs at a fixed 14 Gbps until very specific conditions are met, like the GPU only being used at 60% and drawing less than 75W; only then does it run at 12 Gbps.
Also, it's Micron memory, which is like 30% less efficient than Samsung.
@@TechModLab Also, 12 Gbit/s seems extremely slow even at lower core speeds.
Have you tried raising the memory clock ? The voltage should stay the same in that case, no ?
@@panjak323 you can check out my blog. I added two more interesting graphs there, which show the benchmark score at fixed video memory clocks. You can see in them that 12GBit/s is faster at lower TGP than 14 and 16GBit/s. It is the other way around for higher TGPs.
This behavior is coded into the vBIOS. There was a similar issue with earlier RTX 3070 Mobile vBIOSes, where they jumped between 12 and 14 Gbit/s all the time. It got fixed shortly after launch day with a new vBIOS.
When you look at the additional graphs on my blog you can see another anomaly, which might have something to do with the video memory clock lock by the nvidia-smi tool. The 3080 Mobile clearly performs better on "auto" memory clocks compared to locked. The 3080 Ti Mobile does not.
I want to do another run in the future, when the 3080 Ti Mobile is set to the 14GBit/s straps and see if the memory clock behavior is different. It would be crazy if this would make it perform better in the end. Let's see.
@@panjak323 Overclocking the video memory does not increase the vmem voltage.
Additional note to my previous comment: the 3080 Ti Mobile is special because it has 3 clock tables for different loads (one table for each memory speed: 12, 14, 16 Gbit/s), while the other Ampere GPUs have only two (12, 14 Gbit/s). Three might be less stable than two. That's why I suggested trying the 14 Gbit/s straps earlier.
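To make the clock-table behavior described above explicit, here's a toy Python sketch; the wattage cut-offs are my rough reading of these comments, not values taken from the vBIOS:

```python
# Toy model of the 3080 Ti Mobile VRAM-speed behavior described in this
# thread (three clock tables).  Thresholds are assumptions from the
# comments, not decoded vBIOS data.
def likely_vram_speed(tgp_w: int) -> str:
    if tgp_w <= 90:
        return "12 Gbit/s @ 1.25 V"
    if tgp_w < 165:
        return "switching between 12 Gbit/s @ 1.25 V and 14 Gbit/s @ 1.35 V"
    return "up to 16 Gbit/s @ 1.35 V"

print(likely_vram_speed(115))
```

The interesting anomaly is that at low TGP the 12 Gbit/s state actually benchmarks faster than 14/16, per the blog graphs.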
Amazing and precise job, well done.
Taking the lead-free solder balls off made sense to me, until I realized you had to melt them anyway to remove them. Save the heat cycles; leave the lead-free ones on.
Yeah, Clevo laptops were just amazing. I still have my old Clevo D901C: desktop CPU, 3x HDD, 2x GPUs in SLI, all upgradeable. I first had two 8800 GTXs, then upgraded the CPU and the GPUs to two 9800 GTXs. It was a beast, and yes, this thing ran Crysis back in the day like a champ.
Unfortunately, nowadays laptops are made to look good and be as thin as possible, not to be functional or practical. Modern laptops are just complete garbage.
If you're gonna do this again and again, I suggest you use the AMAOE magnetic stencil kit with solder paste (not solid balls), as it's easier to use (no balls flying away...) and much faster.
Just don't forget to dry the paste (press it with a paper towel), and use some strong non-magnetic tweezers to press the stencil down (as it warps at high temperature). Also, don't use too much heat (300°C max) and use medium air flow.
25:00 this shot looks amazing 🤩
Thank YOU for all the good videos.
Amazing video, what you are doing is super interesting. Any chance you could link the tools you use, or at least give their names? I know from your accent and some screenshots in your videos that your tools are from a market I might have access to, and not just on some American hardware website.
13:56 16KB of video memory; that unlocked some old memories of a vintage Commodore lmao.
Interesting. If we modify the mobile 3080 Ti's power delivery circuit for more power, will there be an increase in performance? Good video.
Definitely taking a look into that. It would be more efficient at the high end, yes. The 3080 Ti puts a lot more strain on the VRM because it runs at higher current but lower voltage. The poor GPU core VRM has only 6 phases, while laptops with 150-175W TGP come with 6-8 phases, so I am at the lower end. At the highest TGP (175W), roughly 30W of power is lost in the VRMs; something like 20W would be normal.
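The VRM numbers above work out roughly like this. The ~0.8 V core voltage under load is an assumed figure, and TGP also includes memory power, so the per-phase current is an upper-bound estimate:

```python
# Back-of-the-envelope VRM check using the figures from the reply above:
# ~30 W lost at 175 W TGP across a 6-phase GPU core VRM.
tgp_w = 175.0        # stated max TGP
vrm_loss_w = 30.0    # stated conversion loss
vcore_v = 0.8        # ASSUMED core voltage under load, not measured
phases = 6           # stated phase count

efficiency = tgp_w / (tgp_w + vrm_loss_w)   # output / input power
core_current = tgp_w / vcore_v              # upper bound (TGP incl. VRAM)
per_phase = core_current / phases

print(f"VRM efficiency ~{efficiency:.0%}")       # ~85%
print(f"per-phase current ~{per_phase:.0f} A")   # ~36 A
```

Around 85% conversion efficiency and ~36 A per phase is why only 6 phases is uncomfortable at 175 W, and why a 20 W loss (~90% efficiency) would be the healthier target.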
@@TechModLab never thought it was that deep, thanks for sharing your skills.
damn i wish i had these skills, great video!
Subscribed!
If you open a claim with Ali and say it's a counterfeit item, their system triggers on the word "counterfeit" and it helps out.
*Great work, so cool!*
I like this idea, but it's crazy that this works!
Can you make a video about DTRs? In my country they are not common at all, and I heard about them for the first time in your previous video. I wonder what their disadvantages are, because upgradeable, replaceable CPUs in laptops are really a dream for me 😢
This is the best youtube channel ever
The APEX 15 MAX with the Ryzen 9 5900X board is likely specced for the 3060 and 3070 TGPs, with current draw rated to support those GPUs and memory timings. You got lucky pushing it with the 3080, but the 3080 Ti is a beast that was likely sucking the MOSFETs/caps dry on phases and crashing pipeline performance accordingly with resets/stutters, which probably took out 20% of the expected performance gains. I'm very interested in following your other entertaining videos, as usual, and seeing you conquer this thorny issue! I haven't, to my recollection, seen a good video out there yet (haven't looked either, just saying I haven't come across one) that describes how to upgrade the supporting connective tissue of the motherboard and memory, and the power delivery between the GDDR6, GPU, CPU and the motherboard. Well, I suppose that's what the manufacturers' engineering teams do... so asking you to do that is really pushing it. Sort of like you did with your insane 3080 Ti mod. Hahah, fun!
So a cap mod to increase capacitance and lower ESR, for example? Done ^^ No gains, just some more OC headroom! I checked some data from Jarrod'sTech about the 3080 Ti Mobile as well, and my results seem to be in line with his. I guess the GA103 is starving, as you said. It starts to scale above 130W TGP and probably continues well above 175W.
Performance uplift is minimal, but now you can run decent LLMs.
You wrote about the costs, but considering how expensive current and 5000-series laptops are, it could be worth it for a well-kept machine.
My hands start to shake just watching this art
Well done, Viel Erfolg!
You could start heating up the balls while the sieve matrix is still atop the chip. I've seen others do it this way, and they had no issues with their balls moving or being imperfect.
awesome work bro , loved it❤ you earned a sub❤
Thanks 🔥
What I would give for this skill. Holy moly, genius.
I hereby officially apply for an upgrade, with my XMG Apex 17 with a 3070. 😍
VERY cool video! How are there 33 people who have given this video a thumbs down?!?!
Bro is an artist!
Try to optimize the frequency-voltage curve for consumption, not for higher frequency. Reduce the voltage by 0.025-0.075V at the maximum frequency the GPU boosts to. In the end you should get the most stable boost frequency and lower consumption. I don't know how much headroom laptop chips have for undervolting; on desktop ones it is quite large.
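Rough math on why that undervolt helps: dynamic power scales roughly with f·V², so shaving voltage at the same boost clock cuts core power quadratically. The 0.90 V stock boost voltage below is an assumption, not a measured value for this GPU:

```python
# Estimate of the dynamic-power saving from the suggested undervolt,
# using the common P ~ f * V**2 approximation (leakage ignored).
v_stock = 0.90   # ASSUMED stock boost voltage in volts
v_uv = 0.85      # stock minus a 0.05 V undervolt, same boost frequency

saving = 1 - (v_uv / v_stock) ** 2
print(f"~{saving:.0%} less dynamic core power at the same clock")
```

In a TGP-limited laptop that saved power goes straight back into sustaining higher clocks, which is where most of the undervolting gains come from.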
I would say go with the higher-temperature solder balls, but it also depends on the laptop's cooling and how cool it can keep the chip.
Absolute legend!
The grounding system could be upgraded: shift it to 0.01V negative, 19.1V positive, and loop all the grounds with cables so the resistance is 1 ohm throughout. Upgrade the MOSFETs to extremely better ones and add a tiny thermonuclear active choke. Stay healthy, OK.
I'd be curious to know if you see the same clock dips and performance on Linux. I personally upgraded to a 6800 XT that I troubleshot for days in Windows, where it was completely unusable: the clocks, voltage and performance were all over the place, while there was no issue at all in Linux. I haven't figured out why, so it could be a totally different cause than yours, of course. But I'm curious whether you'd see a bit of a difference there.
Dude, fantastic job on this upgrade. The amount of skill and steady hands required... I'm sure my clumsy self would knock the GPU off the table or something and mess everything up. Amazing job on this upgrade.
Makes me wish I could replace the 1650 in my laptop with at least a 1650 Ti, and adjust the power curves on the CPU and GPU to sane values. The CPU tends to get pushed too hard for the cooler, and the GPU doesn't get pushed nearly hard enough. On top of that, the on-battery performance is set by temperature, not by power limit. A 50W TGP on the GPU while plugged in makes sense; a 60°C temp limit on the GPU while on battery does not. It averages the GPU to around 15W, which would make sense, but because it's limited by temperature instead of board power, games stutter A LOT even when they really shouldn't.
Would love to see you upgrade a Steam Deck to 32gb GDDR5 and WIFI6
This is just amazing!
What temperature do you use on the preheater for the GPU? Appreciate your work 🎉
Not sure what you mean exactly, but here you go.
Reflowing the loose (unattached) solder balls: 170°C plus hot air, OR 200°C without hot air.
Soldering in the reballed GPU with leaded solder balls: 170°C.
With lead-free: 180°C.
@TechModLab While putting the balls on the GPU, then lifting it and placing it on the preheater, what temperature do you keep the heater at during that time?
@@eFixcenter It was off. I turn it on after the GPU is placed there. It heats up to 150°C in 60 seconds and 200°C in roughly 2 minutes.
@@TechModLab 🤗
Amazing skill! 👍✨
Could you do a video about upgrading soldered ram?
Brother, are you crazy? This video is perfect 🥳🥳🥳🥳
The 3080 Ti in my MSI GE67 laptop holds up great. It's powerful enough for anything at best settings in 1440p.
But what really sold me was the 16GB of VRAM, which is a lot and hasn't been surpassed by newer laptops.
By any chance, do you have any Time Spy or Port Royal results for your GPU without OC (auto-OC disabled in MSI Center at the highest power preset)? I wonder if you see the same memory clock behavior, rapidly switching between clocks.
Great work!
Very nice work but not for everyone 😊
Wow what a coincidence
A new video is coming. And I have a Legion 5 Pro with a 3070 and a 5800H.
Hi, this is very interesting. I have an RX 6800M and I want to flash the vBIOS of an RX 6850M XT. Do you think it will work? I have an SPI programmer, the chips are the same and so are the memories; it's just a different SubsystemID. What do you think?
Phew, good question... well, I don't know AMD GPUs too well, so I can't tell you if it will work. On Nvidia it would not, because the GPU checks whether the vBIOS GPU ID matches the ID burned into the chip. The RTX 3080 Mobile is the same GPU as a desktop 3070 Ti, but the vBIOSes are not interchangeable.
@@TechModLab I sent you a message on your website. As I mentioned, everything is the same, only the SubsystemID is different, but I guess maybe there is some way to edit it. I tried using amdvbflash, but the SubsystemID is the problem, so I thought of 2 options: 1) edit the SubsystemID to match and allow flashing, or 2) use the SPI programmer, which might allow flashing without editing; if not, I could try to modify that parameter. I would like to be able to chat with you by some other means to find out if this is possible. Some time ago the community managed to flash the 6600 XT vBIOS onto the 6600M and gained a lot of performance, so possibly this can be achieved. I hope so, since it would gain performance and also unlock the power settings in the AMD panel, which AMD blocked on this PC, and MorePowerTool no longer works. I await your response, friend, thank you very much!!
this is so impressive. amazing.
Man I wish I had your level of skill and steady hand 🥲
Yes... as your video shows, "impossible" doesn't exist. One more question: does the socket determine which GPU upgrades are possible? Thanks for showing in detail how to do it right. Great, please keep it up!!!
Great video man!!! Might as well upgrade my 3050M with 4GB of VRAM...
not sure if a 3050 is upgradable. I think it is physically smaller and has less pads on the board
Yeah, you are right, it has 4 memory chips of 1GB each. I'm planning to get the same chips but 2GB each; I will order them soon from AliExpress.
I have decent soldering skills that might help 🤞
First of all, the laptop has hardware-based TGP monitoring based on its cooling capacity. I got a laptop from a friend of mine to repair, but I bought it for 150 bucks: a basic laptop with a GTX 1050 Ti (desktop version), an i7 7700Q, etc. I took out all the guts and made an open-air build from flooring sheets (I know, it's messed up), and I got two water cooling blocks from AliExpress, a water pump and tubing. Even though I managed to keep temps well under 50°C (almost 50°C below the maximum temperature limit of the original cooling design), I wasn't able to get higher boosts from the GPU; it always sagged whatever I tried. The original heatsink in this laptop is designed to handle 120W total, because the two fin stacks are connected via a large, fat heatpipe, and this power can't be exceeded by the CPU and GPU combined (45W + 70W is only 115W, but there is 5W of headroom for something). You can manually do a shunt mod if you find the exact measuring resistor, but I didn't find anything on my board. There are some 0Ω resistors, but they act like vias. The remaining problem is the circuit that measures what's coming into the phases for the CPU and the GPU, and for your laptop the heatsink is the problem as well. You could push that 3080 Ti above 200W, but it will suck a ton of power that has to be dissipated somewhere. If you have some room between the lid and the heatpipes, buy a fat, bulky one and connect all the fin stacks together if possible.
I made a watercooling mod a few videos back. Don't worry, it can be cooled well :) It was running on air in this video though, which is good enough for 220W combined.
In combined loads (e.g. games) I do run into laptop power-limiting issues. The GPU is limited to roughly 150W when the CPU draws 70W plus peripherals (WiFi, screen, USB, etc.). So I guess there is a 230W limit measured by the input shunt. I thought about modding it to allow 175W to the GPU all the time, but tbh, I think 150W is more than enough. I kind of like the built-in throttling by the EC.
@ Yep, that could also be the case. Everything in that laptop runs with some headroom, and running it always at max (VRMs, step-down converters, etc.) isn't good for longevity. The CPU and GPU might survive, but other components won't.
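The power budget guessed at above works out like this; the 230 W input limit and the ~10 W peripheral draw are inferred from the observed throttling, not confirmed values:

```python
# Sketch of the total-power budget inferred in the thread above: the EC
# throttles the GPU once CPU + peripherals eat the rest of an assumed
# ~230 W limit measured at the input shunt.
input_limit_w = 230    # INFERRED from observed behavior, not confirmed
cpu_w = 70             # stated combined-load CPU draw
peripherals_w = 10     # ASSUMED wifi/screen/USB draw, rough guess

gpu_budget_w = input_limit_w - cpu_w - peripherals_w
print(f"GPU budget under combined load: ~{gpu_budget_w} W")
```

That lands right on the ~150 W GPU limit observed in games, which is why the 230 W input-shunt theory fits.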
Insane. And just yesterday I was despairing while desoldering my charging port 😂. Greetings from Berlin ❤
I can recommend bismuth-containing solder (melts below 140°C) ;) Mix it with the solder on the board using plenty of flux, and the port practically falls out on its own.
Bro, I have an HP ProBook laptop which has GPU and VRAM footprints just like yours, but they are empty because it was the integrated-graphics variant. Should I put a GPU and VRAM on it? It supports something like an Nvidia GeForce GTX 680M or AMD Radeon HD 7970M. And where should I buy them? Please reply soon, sir ❤
Love your video ! You have new subscriber🤩
Remember the removable dGPUs that were used in the past, like what Framework did recently? Those are a fairly easy way to get good GPU performance.
But anyway, I love this video as usual 😅