Just an FYI to anyone looking to cop these models: the RAM in these laptops is strictly soldered. As someone who games for at least a few hours a day, I also worry about OLED screen burn-in, but it might be inconsequential if I get an external display for my gaming needs.
Burn in is an overblown issue. Maybe if you're playing the same game with the same HUD elements for several hours per day you might see burn in after 3-4 years.
If you're gaming enough to worry about burn-in, just get a desktop, my guy. Burn-in takes hours per day of sitting in a game with a stationary HUD in the same place.
@@Adierit Fair point but I'm a comp sci student with some ML work and I'm running from class to class for half the day. I kinda wanted this more professional laptop thing to be an all in one package especially considering the less "gamer" aesthetics and the better battery life (which is possible even with the Intel CPUs now). Right now my laptop is about 5-6 years old and has a 1050 and I plan on using this new one for at least 3 years.
@@rightwingsafetysquad9872 You might be right, I probably have to look into it but I have issues with an OLED TV from 2021 and its already showing some burn in. I'm sure there's some new tech, so thanks for the tip.
Cool video and cool marketing that actually explains decisions! Having said that, the laptop is sooo not for me. I switched to only having USB-C 2 years ago and never looked back. Literally having 3 Thunderbolt cables that can charge, transfer data, and do whatever I need them to for all of my devices is a godsend. I sometimes find my XPS 15 lacking with only 3 USB-C ports, let alone this Asus having 2, and realistically 1 would be used for charging on the go. Also, not having a fingerprint reader is annoying. I love that the XPS has fingerprint (which is kinda crap, but still works) AND face recognition
@@Daunlouded but you do. I wouldn't be caught dead carrying another manufacturer-specific brick that weighs a tonne and has only a single use. I use my tiny 120W Anker charger with 3 outputs for everything - my laptop, cameras, drone, peripherals, even my shaver and toothbrush charge from that. With the same 3 cables. It's so much more convenient
Was thinking more or less the same, except my XPS 17 has 4 Thunderbolt ports. Recently I got the 200W Anker charger to use while traveling for all my devices. It is such a simpler life to just use one type of cable and charger(s). It's also great to be able to charge on either side; you can't get that with a proprietary port.
I take a USB-C charger when I am traveling, and when I am gaming at home I charge with the laptop's own brick. 100W of charging just is not enough for gaming. For non-gaming laptops, sure, USB-C is enough. I love that I can charge with USB-C though, and I hope these laptops have power delivery... I have the G15 2022 version and it's amazing.
As far as OLED displays on phones go, what was said around the 6:20 mark is only true for certain OLED displays. LTPO OLED displays can go from 1Hz to 120Hz.
After looking at different middle-budget gaming laptops for years, my G14 (RTX 2060, R9 4900HS, 1080p 165Hz) was the first gaming laptop that I could say had perfectly balanced features for the price. A lot of the ones I came across had 4K screens with budget GPUs, low refresh rates, very expensive with bad inputs, no SSD space, etc. I was able to get mine used, a year old, for ~$700. A GPU that was perfect for 1080p, a great processor to pump out the frames, and high refresh to display them. Looks like they realized that and are pushing further towards that market!
The new G14 seems really uncompelling. I drive 2022 G14 and this new chassis is a step backwards. The kickstand is an amazing product design and they just ditched it, shaved off 4mm and called it progress. Shaving 4mm off the chassis is a giant's leap backwards. 4mm of extra thermal management - now that's a sales argument. Not once have I felt like my G14 is too heavy or too thick.
Personally I don't like those feet; they make the laptop a bit weirder to open, and it all changes if I tilt the screen. Also, putting it on your lap must be weird.
@linustechtips I think the audio descriptions/graphic are mismatched at 3:10? It’s also kinda early for me so might not be firing on all cylinders yet today 😅
Video Editor: On the display section, video shows G16 laptop on the left and G14 laptop on the right. The text overlay added in the editing process has the displays' specs on the opposite sides.
5:41 If you'll remember 15 years ago the old Plasma TVs had the same 960hz refresh rate, for a different reason but it's the same method of maintaining consistent brightness.
So... how different is it if the RAM is soldered vs slotted? Is the difference really that big in FPS if we're talking about gaming? Cause recently I upgraded a previously 8-gig laptop to 32 gigs of RAM, so the benefit of soldered RAM doesn't seem enticing to me if it turns out we're gonna need more RAM down the road
The important difference that I know of is the speed. Having the RAM closely integrated with shorter traces, closer to the CPU, lets it run at higher speeds, like 6200+ MT/s. But I don't know exactly how much that improves gaming. Otherwise I would also prefer being able to upgrade it.
Honestly for me I don't understand why it has to be an either or. You could have the default 8 soldered on for speed and have a free slot available to expand. You would sacrifice speed and you'd have to adjust the timings of the ram soldered to the board but if you just need a boat load of ram for rendering or something then give the people their damn expandability pls ✨
The max rated speed of SODIMMs is around 4800 MT/s, vs soldered at up to 7800 MT/s at the moment. For gaming it can be useful, but for more creative tasks I prefer the slower but more upgradeable SODIMM
Actually scratch that. You could have a ram cache on the machine using the slower ram from the expansion. You could allocate specific processes to run on the faster soldered ram and then everything else could be dumped in the slower ram
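To put the SODIMM vs soldered speeds quoted above into rough numbers, here's a quick back-of-the-envelope sketch; these are theoretical peak figures, and how much of the gap shows up in FPS is a separate question.

```python
# Rough peak-bandwidth math for the 4800 vs 7800 MT/s figures above.
# DDR5 moves 64 bits (8 bytes) per transfer per channel; most laptops run
# dual channel. These are theoretical ceilings, not measured gaming numbers.
def peak_gb_per_s(mt_per_s, channels=2, bytes_per_transfer=8):
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

for mts in (4800, 6400, 7800):
    print(f"{mts} MT/s dual channel -> ~{peak_gb_per_s(mts):.0f} GB/s peak")
# ~77, ~102 and ~125 GB/s respectively. The gap matters most when an iGPU is
# fed from the same pool; with a dGPU and its own VRAM the FPS impact shrinks.
```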
I noticed that the G14 and the new Dell XPS 14 are basically the same laptop, just with different looks, ports, and processors. I just find it weird that these laptops are so close together.
So great to finally have an answer about USB-C high power. The standard came out in 2020 and I haven't seen any laptops implementing it. Even though the technical limit is 240 watts, it seems the practical limit is around 150.
The biggest issue is honestly the size of the port and its physical fragility. Electricity does not discriminate, and tiny flimsy pins aren't conducive to high-wattage transfer. The heat dump laptops can give off should always have been a sign that it makes no sense not to have a USB-C port that can slow-charge and accept the higher-end charge rates
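For anyone curious where those wattage ceilings come from, here's the arithmetic as I understand the USB PD spec: the cable and connector cap current at 5 A, and the newer Extended Power Range levels get their extra watts purely from higher voltage.

```python
# USB-C power ceilings: current is capped at 5 A by the cable/connector, so
# every step past 100 W comes from raising the voltage, not pushing more amps
# through those tiny pins.
pd_levels = {
    "USB PD SPR, 20 V @ 5 A": 20 * 5,   # 100 W, the classic ceiling
    "USB PD EPR, 28 V @ 5 A": 28 * 5,   # 140 W
    "USB PD EPR, 36 V @ 5 A": 36 * 5,   # 180 W
    "USB PD EPR, 48 V @ 5 A": 48 * 5,   # 240 W, the spec maximum
}
for name, watts in pd_levels.items():
    print(f"{name}: {watts} W")
```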
Loved that you had a rep in the room who could discuss the technical "why" with you on the decisions being made! I do think though that repairability should be a standard segment in _all_ laptop (and phone) videos, and I assume the lack of a segment means this laptop has real bad repairability.
I completely agree that repairability is a very important category. I appreciated that he made sure to point out twice that the ram is soldered to the board. Whether or not that is worth the hit to repairability is up for debate, but that definitely tells us what we need to know about its lack of repairability, without saying something they wouldn't approve in a sponsored video.
@@John-b1v6u The issue with repairability is that it depends a lot on what the manufacturers make available after release, which is questionable at best these days. Not to mention, it is not guaranteed that the internal design will not change. As for the soldered RAM, there is a bit of a necessary evil here, since soldered LPDDR5 can reach faster speeds than SODIMMs. I would rather have it not on the board, but not everyone agrees with me. Some people just want the highest numbers on their spec sheet...
It's always very interesting to get a manufacturer's insight, even if the video is sponsored. However, I really feel like you should talk about repairability, sponsor or not. I gather the soldered RAM comes from the performance limitations of SODIMMs. Still, how easy it is to open is a question that I feel should've been tackled. Also, you didn't specify, but the laptops can still charge over USB-C, I imagine?
Not sure how much I like this refresh. I have a current-gen G14 anyway, so I'm not in the market right now. Still, the lack of SODIMM slots is a little iffy. Mine has some kind of fault where it uses too much RAM at idle, so not seeing that expanded is not exactly encouraging. Plus, my biggest fault with the 2023 model is literally that it doesn't have per-key RGB... so we'll see where this puts them in a few years. I also hate the light-bar-thing.
Per-key rgb is your biggest fault? That's just cute. Such a stupid thing to focus on. Isn't your ram usage just Windows paging and indexing files like it always does? It's normal in Windows.
@@PhyrexJ for real. That's such a weird problem. If per-key RGB is the only missing thing, that must mean that the laptop is basically perfect in every other way.
@@Killermonktr my 2022 has that issue as well, it really is a bummer. Im a big fan of carrying battery packs for traveling and while i can still use them i do have to be more aware and unplug when i hit 90 percent or so. That and the crappy wifi card were my only gripes though. Im guessing if that was something they could have fixed via updates then they would have already lol
He says facial recognition is convenient (around 2:15) with the change - but is it actually better? Speed of entry is a factor, but a MUCH bigger one for me is security - can the laptop's facial ID be tricked? Is it up to the perceived standard of Apple's system? We know twins can access Apple phones - can a pro camera and properly printed pictures get into Windows? Does it store the data on secured hardware that is resistant to attack? If it was me designing the system, I would have the option to require a fingerprint AND a facial scan at the same time. If they want to get swanky, add voice recognition too for a triple lock on entry.
My 2020 G14 with the 1650 Ti still works to this day, and for my use case it really has no major problems, only some inconveniences. I even use my laptop more than my PC.
@@Mikazuki13 minor inconvenience like running out of battery in 2 hours? How about getting a fried computer if you plug in the charger and a powered USBC cable
5:36 I can't believe this isn't how it was always done. This is how basically all LED lighting has controlled brightness since LED lighting started becoming popular.
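For anyone who hasn't seen how that works, here's a tiny sketch of PWM dimming; the 960Hz figure is the one mentioned around that part of the video, and the peak brightness number is purely an assumption for illustration.

```python
# PWM dimming in a nutshell: the LED (or OLED pixel) is switched fully on and
# off very fast, and perceived brightness roughly tracks the duty cycle.
PWM_HZ = 960        # cycling frequency mentioned for the panel
PEAK_NITS = 400     # assumed peak brightness, illustration only

def pwm(duty, freq_hz=PWM_HZ, peak=PEAK_NITS):
    period_us = 1e6 / freq_hz             # one on/off cycle, in microseconds
    return duty * peak, duty * period_us  # (perceived nits, on-time per cycle)

for duty in (1.0, 0.5, 0.2):
    nits, on_us = pwm(duty)
    print(f"{duty:4.0%} duty -> ~{nits:.0f} nits, lit for {on_us:.0f} of every {1e6/PWM_HZ:.0f} us")
```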
Wait, so they made it even worse. The half-soldered RAM with 1 DIMM slot was always a turn-off to me with the old G14s, and now it's soldered only. When I was shopping for a 5800H laptop I opted for the Lenovo Legion 5, which offered 2 DIMM slots, 2 M.2 slots, more USB (2xC, 4xA) and a 1Gb NIC.
As someone who also owns a Legion 5, these are very different products. The Zephyrus laptops are much lighter and more portable, so if you want to carry them around, use them for work as well as play etc., that's much more achievable
@@Drakshl I've been hands-on with several of the G14s at work as I do computer repair. Don't think I've managed to do a proper drag race against a 5800H/5900HS one yet because every one of them had a board failure. The slightly thicker chassis of the Legion still affords it better cooling potential, so it likely wins anyway. Having 32GB of dual-rank DDR4-3200 probably helps a little. My Legion does beat even AIO liquid-cooled 10700K desktops in multicore perf though, let alone most Intel mobile systems up to 11th Gen. They aren't wildly different except for the footprint and IO. Oh, and the price: the G14 was overpriced. I don't get the supposed "more portable" as if the Legion is in desktop replacement class. It's not. I carry my 15" Legion to work every day, which includes at least a half mile worth of walking with it. It's lighter than the Dell Precision I used to carry. Heck, I come across a lot of relatively small people that carry around real behemoths, like 17" Alienwares. Petite little ladies. You put them all in a bag and throw it on your back. The G14 is lighter, sure, but less functional for the work part, because you give up so much IO and it didn't have a webcam. Still has a bulky charger that is most likely overkill, but you're going to need it when gaming. The website says 180w, but I feel like every one someone brought in had the 240w. I don't game a lot on my Legion, but it has a 165Hz display and RTX 3060. It's fine when I do game, but I've got a desktop with a 34" ultrawide and VR, which is where I prefer to do it anyway. The only thing I don't like about my Legion is that it has a 300W power adapter, for basically no reason. It maxes out at 170w from the wall if pushed with full synthetic workloads on the CPU & GPU simultaneously. Being an unrealistic draw in daily use, 180w would've been enough. That's useless extra weight, whereas the bulk in the laptop promotes function and longevity. Might have been a trend where many OEMs just started shipping the extra-large adapter with the AMD systems, as I was pretty sure it was supposed to come with a 240w. The trackpad is kinda crap too. Always have a mouse on me anyway, and ports to spare.
considering the amount of ASUS bloatware they had installed and locked behind a protected windows directory on my last laptop from them, I doubt I'll ever go back to another system that has their name on it.
@@Hendlton It should be, unless they have some specific drivers that require installing such bloatware, otherwise you can't use the functionality, or reconfigure it. It's usually rgb stuff that does it I think, I've seen it. On the other hand if someone is so concerned about bloatware it would stop them from buying it - why even use windows? /hj
@@Hendlton It's possible even without the format, but you'll need to boot into a non-Windows OS off of something like a USB stick and then remove the directories from there. Once you've done that, you'll also need to delete the services in Windows to make sure they don't get reinstalled, as well as delete the premade recovery drive and create one of your own, or else you'll have to repeat this process if you ever have to recover/reinstall Windows again.
2:03 what do u mean facial recognition is more convenient? Asus used to have a feature where it would cache ur fingerprint when u turned the laptop on and use the cached data to log in, that's so much nicer, and plus why not both? I'm sure it doesn't cost them more than a few dollars to add
For the power I/O, Linus asked why they used a proprietary plug instead of USB-C, which would be a smart question if this was a phone or tablet. It's not. It's a laptop with laptop-level power draws. Most people understand that the power draw of a laptop exceeds USB-C capabilities. He did provide a good explanation for the majority of people who probably didn't know already. The better question is why didn't ASUS go with a non-proprietary barrel plug? It appears to be the same size, and a barrel plug can definitely handle the power demands of that laptop without the added expense of engineering a proprietary solution.
I agree. That's one thing HP, Dell and Lenovo have gotten right: they use a standard voltage and plug, so my old 2007 laptop power brick still works on a new laptop as a spare. Protip: Dells will run off HP power bricks but won't charge the battery, but HPs won't run off Dell power bricks
My current work laptop can use USB-C for charging, but of course only up to 90 W. Then they have a power brick with a proprietary plug to ensure I get enough power if needed (think it is rated for 170W)
I wonder if the USB C port will be able to charge the device as well as the dedicated port. It would be cool to have the USB C port still draw the 20v (or whatever the closest voltage of the standard is) and act as a backup for just in case you forget your charger. Having a little note on the bottom saying that the USB C port can't draw the necessary wattage to run the laptop at full tilt would be a nice touch as well.
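A hedged back-of-the-envelope for that backup-charging idea; all the numbers below are assumptions for illustration, not this laptop's actual specs.

```python
# If USB-C input is smaller than the load, the battery covers the difference
# and slowly drains; if the load is light, USB-C alone keeps it topped up.
USBC_IN_W  = 100   # assumed: a typical 100 W PD charger
BATTERY_WH = 73    # assumed battery capacity
LOADS = {"gaming": 160, "light desktop use": 25}   # assumed draw in watts

for name, load in LOADS.items():
    net = USBC_IN_W - load
    if net >= 0:
        print(f"{name}: ~{net} W of headroom, USB-C alone is fine")
    else:
        print(f"{name}: drains ~{-net} W, flat in ~{BATTERY_WH / -net:.1f} h even while plugged in")
```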
Nice video, this is such a big deal! As the technology advances, the entry-level should not be left behind, people will "pay more for less" in specs if they can get a better experience -- besides, an iGPU from today can easily beat many old cards.
The power brick system is something I really like about HP, Dell, and Lenovo, as they've all settled on their own power voltage and barrel design and stuck to it for a long time, so I can use an old 2007 Dell power brick to power a new one and it just works. Sure, if it detects it's not enough wattage, Dell or HP run in a low-power limp mode and ThinkPads just give a POST error, but for me that's not that common. Whenever I see a Toshiba, Fujitsu, Acer or even Asus laptop come across my desk, it's never been consistent what they use, and I've got to use one of the universal bricks that has a selector switch for 10 different voltages and a million plugs, because I never have an adapter that would work with it. In a few years when people start getting rid of these and they start showing up at e-waste places, a lot of them are going to get scrapped even if they work, because the people testing them can't without their weird power brick, and at least here in the States, with revised R2 laws, they won't be able to sell them even for parts "untested". Edit - Protip: Dells will run off HP power bricks, but because their BIOS can't detect the wattage it won't charge the battery but will otherwise function. HPs, at least the models I've tested, won't POST with Dell power bricks
To elaborate on your protip - most Dells I have tested (mostly a variety of XPSes) vary between draining their battery down to 3% before using power from an HP charger, or not being able to use it at all (usually the higher-power XPSes), while yes, I have never gotten an HP to take power from a Dell charger
@@Flimzes Good to know. My experience is mostly outside of an OS, just key-testing laptops for resale (does it POST and is it locked/Computraced, remove HDD, document specs, serial number, any cosmetic damage, etc.), and so all I usually see is the Dell BIOS where it says battery health and state of charge (if it still has its battery) and what power supply is connected. Since I have to go through a lot per day, I've lined up 70% of my work shelf with HP power bricks with a small box of those big-to-small blue barrel adapters, and the rest are Lenovo square type. If I know I'm going to be getting a lot of Lenovo stuff to process I'll add some more of those. The one nice thing I'll say about HP, at least the ones that use DDR3L, is that they'll still POST with regular DDR3 and just give a warning, whereas most Dells and Lenovos will just give out beep codes.
@@edison700 Dells will technically run without a known charger (center pin), but in addition to not charging, they're locked at minimum clocks that make them unusable aside from checking function. For instance, my 3793 with an i7-1065G7 normally runs mid-high 3GHz at 15-18W TDP, but on a 'universal' Onn charger from Walmart with no center pin connection it will run at something like 500MHz with a correspondingly ultra-low TDP of a few watts. My older Precision M6600 had the same logic and would run MUCH faster on battery than with an unrecognized charger. Dell's 4.5mm charger/jack is notorious for the DC jack breaking (the plastic breaks around the center pin contacts, resulting in an unrecognized charger) if dropped or handled like an average, kinda careless user would.
Quick side note. I am not sure if Linus meant first on a GAMING laptop, but my Galaxy book 3 pro 360 has a 3k 120hz display that is variable refresh from 48-120hz, and it works great in games with no screen tearing. It has no GPU in it tho and only relies on the built in iGPU, so I am not really able to test it with anything too thicc. The ultra version does come with a 4070 tho, and its in the same form factor with the same display, so I am not sure what exactly he means by the first, unless he means specifically in the gaming focused segment, rather than thin and light on the go workstation
They could have put a 40-42V USB-C input @ 5A and a capacitive divider inside the laptop that has very high efficiency, leaving 20V @ 10A = 200W with like 3-5W of power loss inside the laptop. It would need an expensive USB-C charger, but it is possible
@@TheHalfBorg yes, you can also look it up as a switched-capacitor charger. Basically it uses MOSFETs to charge up capacitors in series and then switches them to be discharged connected in parallel. Over 90% efficiency; it's used in phones for the fastest battery charge stage because it generates less heat. I think that was the reason we got USB PD PPS mode, where you can request whatever voltage and current you want. The divider cuts the voltage in half and doubles the current, and with the PPS charging mode, the fine voltage control and heat generation is moved to the power brick.
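Running the numbers from that comment through the 2:1 divider, with an assumed (not measured) efficiency figure:

```python
# 2:1 switched-capacitor front end: halve the voltage, (roughly) double the
# current, with only a few watts lost as heat inside the laptop.
V_IN, I_IN = 40.0, 5.0       # what the USB-C brick would deliver
EFFICIENCY = 0.975           # assumed; switched-cap stages are commonly >95%

P_IN  = V_IN * I_IN          # 200 W in
P_OUT = P_IN * EFFICIENCY
V_OUT = V_IN / 2             # divider output voltage
I_OUT = P_OUT / V_OUT

print(f"in : {V_IN:.0f} V @ {I_IN:.0f} A = {P_IN:.0f} W")
print(f"out: {V_OUT:.0f} V @ {I_OUT:.2f} A = {P_OUT:.0f} W")
print(f"heat left inside the laptop: ~{P_IN - P_OUT:.0f} W")
```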
I would consider it if it could still charge via USB-C. I understand it being cut down if I don't plug it in with their specific plug, but it's just not for me if I can't also charge while on a trip via the same cable I'm using for everything else.
I love my G14 (2021) because it is not a MacBook. I've upgraded the RAM, repasted the thermal paste, I clean the fans 4 times a year, and I've replaced the battery. It also feels substantial while being compact. I'm worried this new design's gonna take those things away; guess I have to wait till iFixit gets their hands on it.
I'm glad I watched this, the OLED VRR explainer was great and I was actually wondering about this, having seen the refresh-rate-change-flicker on my new phone. Thanks!
I have a G14 from 2020 which is still going strong. Love the thing to death but it's really the thermals that limit these laptops. If I'm playing anything remotely demanding I usually manually max out the fans (which are much louder than a desk fan when at full power) in order to keep my temps as low as possible, but I still often reach 95C. I would be worried about them trying to fit the same power hungry SoCs into a thinner chassis... As an aerospace engineer, I deal with convective heat transfer stuff all the time, really the only solution (without using liquid coolant and a radiator) is to simply move more air through the system. As the air temperature in your case gets closer to the temperature of the components, the heat transfer will slow down. Liquid metal and vapor chambers sure do reduce the thermal 'resistance' for heat leaving the CPU, but often times it is the air that is the limiting factor. Hopefully the solid state fan tech will fix this, and perhaps make it a bit quieter.
This is an issue for all thinner laptops of the past 5-8 years or so. They all have grossly inadequate fans and heatsinks and, given encouragement from Intel and others, basically run at 100C steady state under load. It's nothing like the past, with cooling that could maintain 60-70C at max power; the only laptops coming close are the higher-end thick models that are sadly kind of niche now. It sucks, and I hate it, but the OEMs no longer care about cooling, and model revisions like this happen because the previous generation had to have such low power limits that it performed terribly with the higher-end hardware. Since the RTX 2000 gen, there's been little performance difference between a xx70 and xx90 mobile GPU because most models can't handle the extra power and heat.
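A minimal sketch of the convective limit described above (Newton's law of cooling); every number here is made up purely to show the trend, not measured from any laptop.

```python
# q = h * A * (T_surface - T_air): once the air inside the chassis heats up,
# the same fins and the same airflow simply can't carry as much heat away.
def convective_watts(h, area_m2, t_surface_c, t_air_c):
    return h * area_m2 * (t_surface_c - t_air_c)

H      = 60.0   # assumed forced-convection coefficient, W/(m^2*K)
AREA   = 0.05   # assumed effective fin area, m^2
T_FINS = 80.0   # assumed fin surface temperature, C

for t_air in (25, 40, 55, 70):
    q = convective_watts(H, AREA, T_FINS, t_air)
    print(f"air at {t_air} C -> ~{q:.0f} W carried away")
# Which is why "move more (and cooler) air" eventually beats shaving a little
# more resistance off the heatsink side.
```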
The 2022 and 2023 G14 models did not have a fingerprint sensor either. It is IR face unlock. I am currently using the 2022 G14 with the RX 6700S and it's kept up well!
Defending Asus for replacing the fingerprint scanner with face recognition but complaining day in and day out about Apple that has done that before. Don't get me wrong I hate that Apple only offers FaceID but I hate hypocrisy a lot more.
I'll never use a laptop again for work & gaming, but nonetheless this is always exciting to see, and that Asus sent some guys over to you, Linus, made it even better, and funnier when you confront them about design choices and whatnot haha xD These videos in my opinion are better than the normal review videos, not that those are bad of course! Can't wait to see what more neat tech everyone has to show this year! :D
Well I don't think that's the case, with LTT dropping Asus as a sponsor as discussed in the last WAN show. They're still going to publish the current videos they've committed to, but won't be working with Asus again until the ongoing issues are solved.
Yeah as cool as these laptops are, I can't ever buy or recommend an Asus machine again. My 2022 G14 bricked itself such that the laptop no longer runs stably on battery. Forced BIOS update ruined MUX functionality so badly that even rolling back to every BIOS on their website doesn't fix it. The laptop had just ended warranty coverage by about 2 weeks when the update rolled out and I was told I was SOL because I didn't buy the extended 2-year warranty.
At 3:18 the screen refresh rates are mixed up: It says 240Hz for the 2.5K, 120Hz for the 3K (after which Linus triumphantly lifts up the (not) 120Hz laptop and says "I think you'll know which I prefer")
I can actually see ASUS's logic on the slim powerjack™️. There's what Linus mentioned about the voltage, but also the proprietary connector looks like it will hold up to more strain than USB C. I wish they kept the barrel jack, but that was probably hard to fit on a thin and light laptop.
at least give us the option for convenience; the proprietary connector just means more money to Asus if you need to buy a new one. Totally NOT worth it
Clarification: @ 3:09, the G16 is explicitly labelled as having an OLED, while the G14 is not. Linus' voiceover is correct: both laptops feature OLED panels.
Lol why wouldn't they label it as one? What's their marketing team doing
At 7:19 Linus talks about running Forza on the AMD machine while using the G16, and then at 7:54 he says "turning our attention to the AMD side of things" when switching to the G14.
@@bula288 Linus did not make a mistake. The G14 is AMD and the G16 is Intel. At 7:16, he says "while we get our Forza files validated on the AMD machine let's check out [the Intel machine]." He's gaming on the G16 while the game loads on the G14.
@@bula288 I think he said "while we get the files validated on the AMD" so he was playing on the G16 which is intel I believe, while waiting for the G14 which is AMD
@@bula288 Maybe one has an Nvidia GPU and the other doesn't? Idk honestly
I love hearing manufacturers talk about their strange decisions. Really annoying when they won't explain them
Upgradable ram was a big selling point in the past. You could run 40 or 48gb just by buying a ram kit.
@@yensteel Modern high-speed RAM is running into trace length issues, so while I hate it, having soldered RAM actually is justifiable if it's very highly clocked
@@RepChris Sure, if it's a high-performance device and you're sticking 32GB of super highly clocked DDR5 in, solder it up!... But most people can't tell if a computer is using DDR4 or 5, let alone whether it's clocked high or low. The vast majority of laptops sold open Chrome, Office and Adobe... The **vast majority** of laptops don't need soldered RAM.
@@almc8445 I can kinda understand it in super low-end computers. Sockets cost money, and for some people it might be worth a cost reduction.
But for mid-range machines socketed RAM absolutely makes sense. CPU generational gains are quite slow, while the theoretical performance limits of RAM aren't that much of an issue (since they aren't pushing speed limits).
@@wojtek4p4 It's $1.30/SODIMM socket, and $14/module. Solder on memory is $13/chip. This part's just opinion, but I don't think the $2.30/unit difference is the real driver for the decision, it seems more plausible to me that the driver is them wanting people to just buy new devices...
(All figures are for DDR5, from Digikey Aus converted from AUD to USD @ 0.67)
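Taking those Digikey figures at face value, the per-unit math is simply:

```python
# BOM delta between socketed and soldered memory, using the prices quoted
# above (already converted to USD).
SOCKET, MODULE, SOLDERED_CHIP = 1.30, 14.00, 13.00

socketed = SOCKET + MODULE
print(f"socketed : ${socketed:.2f}")          # $15.30
print(f"soldered : ${SOLDERED_CHIP:.2f}")     # $13.00
print(f"delta    : ${socketed - SOLDERED_CHIP:.2f} per unit")  # ~$2.30
```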
Not in the market for this, but it is cool to see Asus make an effort to show how the sausage is made and to build a more balanced system that makes more sense
What do you mean? For both you can go with a 4060 or 4050 (well, I wouldn't recommend the 4050), and the 4060 is very reasonable in performance and power needs. And I'm quite sure you can also go down to a more midrange CPU like an R7, maybe an R5, or on the Intel side a Core Ultra 5 (maan, this naming scheme is so stupid). With these, the cooling will be very good, it will probably always be very silent and not hot, and it also won't cost a ginormous amount either.
Ok, thinking about it, the screen might be overkill for some, especially since it probably adds to the cost too. Maybe there are cheaper options for that too, I wouldn't be surprised.
In an industry of bigger numbers and adding more Xs it's nice to just see a basic pitch about who they made a product for and why it's good for that user
That "not everyone uses these for gaming" hits home with me, I got a zephyrus for my college work & 3d modelling. My home rig is what I game on. My G14 is awesome, and its one of the lower model ones and its still great! I have the GA401Q. My only compaint is the matte black paint it came with is peeling. I would definitely probably prefer the G16 in the future though, just because I prefer the larger laptops (considering my use case is programming & 3d modelling).
Shame it only seems to come in Intel flavour, and Meteor Lake is barely as good as Intel initially claimed, as per usual Intel marketing BS.
Same here! Got the 2022 Flow X13 though, as I was sick of the 17'' chonkers I kept carrying around; figured I could game, model and use it as a tablet when I needed to, all in a slim package. Then I got a Steam Deck... The tablet mode wasn't very good (lots of parallax, buggy writing, and just unwieldy as a tablet), so I got a tablet as well. Asus crapware and Windows gremlins made me sick of it too, though that's all sorted out now. Screen is great, battery life is not the best but serviceable, USB-C is amazing, though you need to leave some breathing room for the turbo mode. I am considering a 14'' laptop that's just a laptop now.
@@damienlobb85 Amd is even worse. So..........
@@neon_archTBD
100% with you on this one and good luck in your 3D modeling endeavors!
It is great that companies are letting you guys do your testing at HQ and not do quick summaries on the show floor. Hope more channels go this route.
lmao, I didn't even realize it was CES time until I read this comment
i forgot about CES too
They do the testing, but because they're the sponsor, they choose which results get published
@@DoorKnob92 I'd rather have some cherry-picked LTT numbers than Samsung's or any other brand's, and then sure, I'll probably wait until the full GN video with their testing for 100% accuracy, but that's my opinion
@@KenS1267 I think last year, they did half and half like Alex and Jake went down for actual CES and most of those videos were on shortcircuit
Appreciate the in-depth discussion about the laptops' specs and the reasoning behind Asus's decisions. Looking forward to the release!
You don't need to add extra s if the word ends with s.
Did you pay for 100k subscribers? Trying to figure out how a channel with over 100k subs only has an average of 50 views or so on their videos
You know what's so much more convenient than Face ID? Having both Face ID and fingerprint, in case it doesn't pick up your face for one reason or another (like the angle of the screen, or how close you are to it because you're in bed etc). Keep both! (same for phones!)
Or in a dark room
The fingerprint reader on my Zephyrus doesn't seem to be very accurate; sometimes it fails to recognize my fingerprint a few times in a row. Anyone have the same problem?
more is better in this case
I find fingerprint is always faster. It's nice to have both but if I can only have one its fingerprint scanner all day.
As well as 3.5 headphone jack. Give us all options everywhere!
The way Linus can balance each laptop so effortlessly while waving it around is impressive
A lot of that comes down to how ridiculously light these specific laptops are. I'm pretty sure they are the lightest gaming laptops ever made at their respective size.
They do drop, a lot of times
That right there is what we get in tradeoff for the drops lol
As someone who has been using a G15 for 3 years, this is making the decision between a Framework and one of these INCREDIBLY difficult lol
With a Framework you can upgrade or even switch the motherboard and CPU
I know, I've had a G14 for a year now and was gonna get the Framework at the end of this year, and dang is that new one looking sweet.
I have a G15 I bought last year and I was suuuuuper tempted to get a Framework, but I'm glad I didn't (yet). It's such a nice lil laptop. A 3060 and a Ryzen 9 6900HS in mine; 1440p@165Hz makes it a Rocket League BEAST.
@@DarthAwar with the price you can mostly buy a new laptop.
Get the 16" Framework w/ the GPU add-on if you're buying for gaming purposes. You can upgrade it later instead of having to buy an entire new laptop every upgrade.
Man, if all the manufacturers out there could do something like this, I would love it. Like obviously it’s sponsored and so their talking points are never going to be overly critical of themselves, but hearing *their rationale* for engineering decisions they’ve made instead of leaving us all in the dark to just guess at what’s going on behind the curtain is a nice breath of fresh air.
Yeah its really refreshing they forgot the print screen key again
@@jaketyler2702 The number of people who use that key is a single digit percentage. That’s not “they forgot” that’s “There’s little to no benefit to including this physical key since its only common function can be mapped to another key combo”
2 videos in a day? Now that's a win in my books.
Give us a 3rd Linus!
As long as they aren't overworking themselves...
But you’ve never written any books
Yay, let's overwork them again
@@Lefthandup it's a phrase
Found an error: at 3:16 the left-hand laptop has the picture and specs for the G16 but is labeled G14, and the laptop on the right is labeled G16 but has the picture and specs of the G14 (so it looks like the G14 and G16 labels are switched). Hope this helps! Happy New Year, LMG team!
Also came here to mention this
ECC team hired by LTT is trash 😂
they keep messing up somehow
@@teeblackgold97 turns out everyone saying they could do a better job couldnt
they're true though
These sponsored videos are surprisingly good. Companies seem to be more willing now to really pull back the curtain and it’s been rad. Nice work
TBH the framework will almost certainly be a better laptop long term. But this will be more convenient
@@mitsoko I hear you, but I doubt the build quality of the Framework even comes close to how rigid and nice my G15 feels. I can pick it up by the very corner with my thumb and it does not flex.
Certainly I do want to get a Framework eventually but for the money and timing I got a g15 last year. 165 hz 1440, ryzen 9 6900hs and 3060
Sucks for ASUS then that they didn't meet the goals that LMG set for them last year and LMG is dropping them once the prior contracts are executed on (with this being one such prior contract from last year). The team at ASUS did a great job on this, but it's just another case of one hand not knowing what the other is up to, and parts of the company just continually fucking things up for their customers despite the efforts of the engineers to bring their A game
@@TheOriginalFaxon What goals did they set for them/where? I genuinely forgot.
@@Adowrath They didn't say what exactly IIRC. They talked about it on one of the recent WAN shows and basically said that while they feel like their contacts at ASUS did want to / try to push for change in accordance with the goals set by LMG it was effectively stonewalled at an upper level and so LMG dropped them from sponsorship negotiations for now.
I had the original G14 model; cool to see that it's a form factor they seem committed to refining further. The work done on the design refinement and build quality (which was already great aside from a fan issue) looks great
The specs at 3:18 are titled wrong! The G14 is listed as the G16 and vice versa.
2 videos in one day, with a mistake. Way to change, LMG
I mean they do it on movie posters all the time...
yea somethings up there
Not only that, but only one of them is labeled "OLED" - they are both OLED panels.
Good thing I checked B4 putting a comment here!
Mini Tech Tip: The Intel AX200 and AX210 are compatible with both Intel and AMD platforms, granted your motherboard does not have a network card whitelist that omits them. However, the AX201, AX211, and AX411 are all Intel-specific, since they require an Intel chipset to both authenticate and function.
They also connect using CNVi, which AMD boards cannot have.
I hate that. I don't even know if it's an open standard or not, but even if it were, introducing a new one just to gatekeep WiFi improvement iterations is still a real asshole move. And I'm saying that as a dedicated Intel WiFi user, because nobody else even comes close for the combination of performance + compatibility.
There's also the newer Intel BE200, with pending WiFi 7 certification, but it only works with an M.2 E-key slot. I don't know whether the G14 2024 uses an M.2 A-key or E-key slot for the WiFi card though.
I'm annoyed and confused why our GAMING LAPTOPS that have the LANES for it, don't come standard with 2x USB 4/Thunderbolt 4 ports. I mean at least the IO on this is actually pretty decent, but still. If portable handhelds can have 2 USB 4 ports (don't know if its technically shared lanes), then a GIANT GAMING LAPTOP with a non-mobile "U" variant chip in it should be able to have them as well.
Would you rather have more thunderbolt chips? Or more heat pipes? Everything has a cost. If you want everything, there's always a higher end line of laptops that has everything.
@@hk07666 PCIe lanes are in the CPU itself
@@captainheat2314 the connectors require space.
@@XGD5layer laptops have that space as we are talking USB c size and not ethernet
funny thing is that business laptops with Intel chips have had dual thunderbolt for years now. My mom's LG Gram 14 and my ASUS Zenbook 14 both have 2 thunderbolt 4 ports.
Nice! The G14 is one of the most sensible and practical gaming laptops; glad to see improvements. If this were my machine, I'd love to have a fingerprint sensor for Linux, tho.
2 videos in a day! We're full swing into CES
Can we get an updated version of this video now? Curious about fan noise & the frames you are actually able to get
Wait, did ASUS advertise a computer with soldered memory in 2023 instead of burying that fact in fine print or never saying it at all? Thats the kind of thing you couldn't have gotten out of me without waterboarding if i was them.
You kinda need to have it soldered for those speeds. It sucks, but that seems to be the best way ATM to get those ridiculous MT/s. Hopefully that'll change soon
SODIMM DDR5 rn still tops out at 4800MT/s, unless it's some enthusiast-specific overclockable model. LP modules aren't swappable, but that's the only way to get the high-speed performance
The proprietary connector is far more egregious. No amount of marketing talk is gonna convince me this is better for the customer than a standard barrel connector.
This entire video is Linus explaining how a massive downgrade is actually for our own good.
@@baoquoc3710 5600MT/s, not 4800, but yes, less than soldered sadly
I was gonna say this laptop was nearly perfect, especially with that OLED display. But then the removal of the fingerprint scanner and the removal of the SODIMM slot make me fairly glad I bought the G14 2023 recently.
But the G14 2023 anyways doesn’t have a fingerprint scanner
@@jivewig Wait, did they remove it on the last model? I'm still using my 2020 G14 that does have a fingerprint scanner, and that seems like a really dumb thing to have gotten rid of.
@@jivewig oh, wtf really? That's disappointing I was hoping it had both. Weird they said so much about how good that fancy sensor is only to remove it from the laptop the literal next year. Welp guess I will have to use something like howdy on linux.
@@-Burb they removed it back in 2022. Idk why Linus didn't know that.
@@zerotwo_.002 when he pointed at the triangle button calling it a fingerprint sensor, I ran to check if I had been missing out on it, but no, my 2022 model doesn't have one, and neither does the 2023 version.
As an Asus fanboy, one of my biggest problems with my laptop is how hot it gets on the chassis as part of the heat dissipation. It gets so hot that you literally cannot touch it when gaming anywhere except the keys and palm rest. It doesn't look like they would solve that judging by those metal plates you were holding up. I also will miss the fingerprint reader.
this laptop will get hotter than the surface of the sun
Why would you need to touch it elsewhere?
I always wonder why not move the CPU/GPU to the display and use the back as a heat sink/cooler and have more room for a larger battery under the keyboard.
Did you clean your vents and repaste with PTM7950? It helps a lot
@@AlexDatcoldness there's only 8GB of it as well, so you'll be bottlenecked as soon as you load up Windows
Soldered RAM is a massive dealbreaker. The previous gen at a discount started looking like a better deal 😅
Between 6400mhz soldered ram and 4800mhz unsoldered ram, I prefer the 6400mhz. The ram being unsoldered is pointless if there isn't an upgrade path.
Some of the previous models are still set up that way. I have a G14 that had 8 soldered and 8 not, that could be upgraded to 16.
@@Hathos9 Makes sense for an iGPU with shared RAM; when using a dGPU with dedicated memory, the gain is minimal. I can however see 4800MHz RAM being a dealbreaker for most people tho, since higher means better without checking.
@@Hathos9they could’ve done atleast 5600mhz like Razer did. Then that difference would’ve been minimal for choosing soldered over upgradeable.
TWO LTT VIDEOS IN ONE DAY!
It must be Christmas, or CES :D
Three if you include ShortCircuit's Showcase...
@@nojacko omg yes! Gonna watch that next :)
It's nice that this is going to be one of the last Asus ads on this channel.
Its amazing how far they have come since the Asus laptop I bought back in 2011
Currently running a 2018 ZenBook. From gaming to business, ASUS is really good at keeping their machines stable and durable by design. Not to mention they don't copy other manufacturers; everything is truly designed in-house. The new Omen looks like a 2020 spin-off.
The G14 is my fav laptop of all time. I didn't think they could make it look even better, and they did. I wonder what the price will be.
I asked at ces and they said price is not increasing. You'll have to confirm once launched, but hopefully that's right!
@@isaacr7416 that’s good news! Thanks
Is 4mm of thickness and 200g of weight really making that much of a difference for anyone? Once you get it in a bag, with a mouse, the power brick, and everything else you may carry in your bag, will you even notice? Just keep the slightly thicker frame and use the space to improve the battery life, cooling, and repairability.
Agree. I'd rather they make it thicker but lighter. Maybe like carbon fiber: super light, but it can be thick for great cooling and power
It does, 4mm difference is a 25% improvement over the last one, and 200g is 15% improvement. It adds up
200g lighter is now in the zenbook territory so I say yes, it does make a difference
@@uruacufutsal1 I'd never even notice a 200g difference on a device that weighs much more than 1kg, and 4mm of thickness isn't really a big deal IMO if you really want better battery life and thermals, unless you're a weirdo that's embarrassed about having a thicker device.
@@uruacufutsal1 Yeah, those are impressive-sounding percentages, but that doesn't explain **why** it's better. Also, at a certain point percentages become useless. If your daily sales are usually $1, going up to $3 is technically a 200% increase, but not actually impressive.
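For anyone checking the math in this thread, here's a minimal sketch of how those reduction percentages (and the $1-to-$3 example) work out. The old/new chassis figures below are hypothetical placeholders, not official specs for either G14 generation.

```python
def percent_reduction(old: float, new: float) -> float:
    """How much smaller 'new' is than 'old', as a percentage of 'old'."""
    return (old - new) / old * 100

def percent_increase(old: float, new: float) -> float:
    """How much bigger 'new' is than 'old', as a percentage of 'old'."""
    return (new - old) / old * 100

# Shaving 4 mm / 200 g off a hypothetical 16 mm / 1650 g chassis:
print(f"{percent_reduction(16.0, 12.0):.0f}% thinner")   # 25% thinner
print(f"{percent_reduction(1650, 1450):.1f}% lighter")   # ~12.1% lighter

# And the sales example: going from $1/day to $3/day
print(f"{percent_increase(1, 3):.0f}% increase")          # 200% increase, i.e. 3x the original
```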
Quiet? I like! I call my G14 (2021) Harry Cane, because, you guessed it, it sounds like a Hurricane ^^ (and every piece of tech needs a name :D )
At the 3:08 mark it has the stats flipped and overlaid over the wrong models shown in the background, unless I am mistaken.
Well, there are still problems in editing videos, LTT: at 3:13 you swapped the G14 and G16.
I like how with the speaker test you could hear the bass thump to the point it rattled the table.
yeah loved the test
It is distortion, not table rattle
At 3:16, the config charts are on the wrong sides. Great video though. Daily driving the G14 (2022) and this looks great.
3:08 the refresh rates seem to be swapped incorrectly or something
Just got my G16 with the 4090, loving it thus far. I am a die-hard Asus ROG user and have only used Asus ROG in my desktops and laptops for years. I will say Asus quality control is going downhill. My last 3080 Ti Strix had dead RGB from the factory, so I had to RMA it. My 4090's temp delta skyrocketed about 6 months in, hitting 110 degrees, so I RMA'd that. Now my brand new G16 straight out of the box has a dead power brick and/or plug port. I am working with the vendor first to get a replacement brick and hoping I do not have to RMA a brand new machine.
I get it, RMAs happen, but my last RMA experience with my Strix 4090 was a bad one. I sent in a pristine card with no cosmetic issues and received back a "refurbished unit" which had lots of scratches and a bent PCI bracket. Sure, I straightened the bracket, but the scratches look like garbage in my Hyte case. I am still sticking with Asus for the time being, but they need to do better in the RMA department and quality control. Just my 2 cents for what it's worth.
The G16 is a beast and I cannot wait to get the power brick issue resolved so I can really put it through its paces. One other thing to mention about the G16: if you want to run an external monitor over 60Hz off the Nvidia card, you have to use the right-hand side USB-C (Thunderbolt) port by the SD reader. I tried all kinds of cords on the HDMI port and could not get over 60Hz. Overall a high quality product from a physical build aspect, the screen is baller, and overall noise and thermals seem great.
Aren't the G14 and G16 backwards at 3:09? I thought the 16 inch had the bigger, better, faster panel?
3:14 you say there is a 3K panel @120Hz on the G14 and QHD+ @240Hz on the G16, but in the video it's exactly the opposite, plus the resolutions are also mixed up.
Yup. Oops
Just an FYI to anyone looking to cop these models: the RAM in these laptops is strictly soldered, and as someone who games for at least a few hours a day I also worry about OLED screen burn-in issues, but it might be inconsequential if I get an external display for my gaming needs.
Burn in is an overblown issue. Maybe if you're playing the same game with the same HUD elements for several hours per day you might see burn in after 3-4 years.
@@rightwingsafetysquad9872 a lot of people only play a couple of games...
If you're gaming enough to worry about burn-in, just get a desktop, my guy. That takes hours per day of sitting in a game with a stationary HUD in the same place.
@@Adierit Fair point but I'm a comp sci student with some ML work and I'm running from class to class for half the day. I kinda wanted this more professional laptop thing to be an all in one package especially considering the less "gamer" aesthetics and the better battery life (which is possible even with the Intel CPUs now).
Right now my laptop is about 5-6 years old and has a 1050 and I plan on using this new one for at least 3 years.
@@rightwingsafetysquad9872 You might be right, I probably have to look into it, but I have issues with an OLED TV from 2021 and it's already showing some burn-in. I'm sure there's some new tech, so thanks for the tip.
I think you got the display specs backwards at 3:09
Cool video and cool marketing that actually explains decisions!
Having said that, the laptop is sooo not for me. I switched to only having USB-C 2 years ago and never looked back. Literally having 3 Thunderbolt cables that can charge, transfer data and do whatever I need them to for all of my devices is a godsend. I sometimes find my XPS 15 lacking with only 3 USB-C ports, let alone this Asus having 2, and realistically 1 would be used for charging on the go.
Also, not having a fingerprint reader is annoying. I love that the XPS has a fingerprint reader (which is kinda crap, but still works) AND face recognition
I'm the same, moved to a Dell XPS 15 and switched to a Samsung Z Flip phone a year ago. I only use USB-C now.
But you don't need USB-C for charging.
@@Daunlouded But you do. I won't be caught dead carrying another manufacturer-specific brick that weighs a tonne and has only a single use.
I use my tiny 120W Anker charger with 3 outputs for everything: my laptop, cameras, drone, peripherals, even my shaver and toothbrush charge from that. With the same 3 cables. It's so much more convenient.
Was thinking more or less the same, except my XPS 17 has 4 Thunderbolt ports. Recently I got the 200W Anker charger to use while traveling for all devices. It is such a simpler life to just use one type of cable and charger(s). It's also great to be able to charge on both sides; you cannot get that with a proprietary port.
I take a USB-C charger when I am traveling, and when I am gaming at home I charge with the laptop's own brick. 100W of charging just is not enough for gaming. For non-gaming laptops, sure, USB-C is enough. I love that I can charge with USB-C though, and I hope these laptops have power delivery... I have the G15 2022 version and it's amazing.
3:10 @ whoever is in charge of editing, you swapped the G14 and G16 titles by mistake. G14 is 3k 120hz and G16 is 2.5k 240hz
As far as OLED displays on phones go, what was said around the 6:20 mark is only true for certain OLED displays. LTPO OLED displays can go from 1Hz to 120Hz.
After looking at different middle budget gaming laptops for years, my G14 (rtx 2060, r9 4900hs, 1080p 165hz) was the first gaming laptop that I could say had perfectly balanced features for the price.
A lot of the ones I come across had 4k screens with budget gpus, low refresh rates, very expensive with bad inputs, no ssd space, etc.
I was able to get mine used, a year old, for ~$700. A GPU that was perfect for 1080p, a great processor to pump out the frames, and a high refresh rate to display them.
Looks like they realized that and are pushing further towards that market!
Doesn't Zephyr mean wind? Without that kickstand to take in the extra air, it seems like a different model.
That's the stupidest qualm I've ever heard.. you could also say that it's still "Zephyr"us bc of the extra fan inside.
It's already lifted up by the tall rubber feet. The hinge-feet are awful if you want to use a laptop stand
The new G14 seems really uncompelling. I drive a 2022 G14 and this new chassis is a step backwards. The kickstand is an amazing piece of product design and they just ditched it, shaved off 4mm and called it progress. Shaving 4mm off the chassis is a giant leap backwards. 4mm of extra thermal management, now that's a sales argument. Not once have I felt like my G14 is too heavy or too thick.
Personally I don't like those feet; they make the laptop a bit weirder to open, and it all changes if I tilt the screen. Also, putting it on your lap must be weird.
@@EitanGelfgat yeah the feet sound great on paper but practically, not so much
@linustechtips I think the audio descriptions/graphic are mismatched at 3:10? It’s also kinda early for me so might not be firing on all cylinders yet today 😅
linus uploading a video brings joy into my life..
unless he drops something. then it brings even more joy 😆
03:19 was an exciting minute!
@@stalincat2457 oof, that is terrifying to look at
Video Editor: On the display section, video shows G16 laptop on the left and G14 laptop on the right. The text overlay added in the editing process has the displays' specs on the opposite sides.
Where Linus talks about the screens on the G14 and G16 at minute 3:15, which one is correct? What the editor put or what he says?
5:41 If you'll remember 15 years ago the old Plasma TVs had the same 960hz refresh rate, for a different reason but it's the same method of maintaining consistent brightness.
So... how different is it if the RAM is soldered vs slotted? Is the difference really that big in FPS if we're talking about gaming? Because I recently upgraded a previously 8GB laptop to 32GB of RAM, so the benefit of soldered RAM doesn't seem enticing to me if it turns out we're gonna need more RAM down the road.
The important difference that I know of is the speed. Having the RAM closely integrated, with shorter traces closer to the CPU, lets it run at higher speeds like 6200+ MT/s. But I don't know exactly how much that improves gaming. Otherwise I would also prefer being able to upgrade it.
Honestly, for me I don't understand why it has to be either/or. You could have the default 8GB soldered on for speed and have a free slot available to expand. You would sacrifice speed and you'd have to adjust the timings of the RAM soldered to the board, but if you just need a boatload of RAM for rendering or something, then give the people their damn expandability pls ✨
The max rated speed of a SODIMM is around 4800MT/s vs soldered up to 7800MT/s at the moment. For gaming it can be useful, but for more creative tasks I prefer the slower but more upgradeable SODIMM.
The difference is if the soldered ram ever fails, you need to give the company more money
Actually, scratch that. You could have a RAM cache on the machine using the slower RAM from the expansion slot. You could allocate specific processes to run on the faster soldered RAM and then everything else could be dumped in the slower RAM
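To put rough numbers on the speed argument in this thread, here's a back-of-the-envelope sketch of theoretical peak bandwidth at the transfer rates people are quoting, assuming a typical 128-bit-wide laptop memory bus. Whether that gap shows up as FPS depends heavily on the game and on whether an iGPU is sharing that memory.

```python
# Theoretical peak bandwidth for the transfer rates quoted in this thread.
# Assumes a 128-bit total memory bus (e.g. two 64-bit DDR5 channels); real
# LPDDR5X channel layouts differ, but the total width is usually comparable.

BUS_WIDTH_BYTES = 128 // 8  # bytes moved per transfer across the whole bus

def peak_bandwidth_gbs(mt_per_s: int) -> float:
    """Theoretical peak bandwidth in GB/s for a given transfer rate in MT/s."""
    return mt_per_s * BUS_WIDTH_BYTES / 1000  # MT/s * bytes -> MB/s -> GB/s

for speed in (4800, 5600, 6400, 7500):
    print(f"{speed} MT/s -> {peak_bandwidth_gbs(speed):.1f} GB/s peak")
# 4800 -> 76.8, 5600 -> 89.6, 6400 -> 102.4, 7500 -> 120.0
```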
I noticed that the G14 and the new Dell XPS 14 are basically the same laptop, just with different looks, ports and processors. I just find it weird that these laptops are so close together.
Excluding the fact that the XPS only has an 80W combined CPU and GPU power budget (the GPU takes 40-50W)
So great to finally have an answer about USB-C high power. The standard came out in 2020 and I haven't seen any laptops implementing it.
Even though the technical limit is 240 watts, it seems the practical limit is around 150.
Framework 16 has it. Starts shipping in a couple of weeks.
The biggest issue is the size of the port and its physical fragility, honestly. Electricity does not discriminate, and tiny flimsy pins aren't conducive to high-wattage transfer. The amount of heat laptops dump out should always have been a sign that it makes no sense not to have a USB-C port that can at least slow charge as well as accept the higher-end charge rates.
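For reference on the "240 watts technical limit" mentioned above, here's a quick sketch of the published USB PD 3.1 voltage levels and what they multiply out to; whether any given laptop or charger actually supports the EPR levels is a separate question.

```python
# USB Power Delivery maximums: the "classic" SPR range tops out at 100 W,
# and the PD 3.1 EPR fixed-voltage levels go up to 240 W. Support for the
# EPR levels is still rare on both laptops and chargers.

pd_levels = {
    "SPR 20 V": (20, 5.0),   # 100 W, what most USB-C laptop charging uses today
    "EPR 28 V": (28, 5.0),   # 140 W
    "EPR 36 V": (36, 5.0),   # 180 W
    "EPR 48 V": (48, 5.0),   # 240 W, the spec's theoretical ceiling
}

for name, (volts, amps) in pd_levels.items():
    print(f"{name}: {volts * amps:.0f} W max")
```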
The one finger lid open shot was impressive.
Two uploads in one day? What a nice surprise!
3:14 What Linus is saying is not what is written on the screen. Someone made an error.
Loved that you had a rep in the room who could discuss the technical "why" with you on the decisions being made! I do think though that repairability should be a standard segment in _all_ laptop (and phone) videos, and I assume the lack of a segment means this laptop has real bad repairability.
I completely agree that repairability is a very important category. I appreciated that he made sure to point out twice that the ram is soldered to the board.
Whether or not that is worth the hit to repairability is up for debate, but that definitely tells us what we need to know about its lack of repairability, without saying something they wouldn't approve in a sponsored video.
@@John-b1v6u The issue with repairability is that it depends a lot on what the manufacturers will make available after release, which is questionable at best these days. Not to mention, it is not guaranteed that the internal design will not change.
As for the soldered ram, there is a bit of a necessary evil here, since soldered lpddr5 ram can reach faster speeds than sodimms. I would rather have it not on the board, but not everyone agrees with me. Some people just want the highest numbers on their spec sheet...
It's always very interesting to get a manufacturer's insight, even if the video is sponsored. However, I really feel like you should talk about repairability, sponsor or not. I gather the soldered RAM comes from the performance limitations of SODIMMs. Still, how easy it is to open is a question that I feel should've been tackled.
Also, you didn't specify, but the laptops can still charge over USB-C, I imagine?
Not sure how much I like this refresh. I have a current-gen G14 anyway, so I'm not in the market right now. Still, the lack of SODIMM slots is a little iffy. Mine has some kind of fault where it uses too much RAM at idle, so not seeing that expanded is not exactly encouraging. Plus, my biggest fault with the 2023 model is literally that it doesn't have per-key RGB... so we'll see where this puts them in a few years.
I also hate the light-bar-thing.
Per-key rgb is your biggest fault? That's just cute. Such a stupid thing to focus on. Isn't your ram usage just Windows paging and indexing files like it always does? It's normal in Windows.
@@PhyrexJ for real. That's such a weird problem. If per-key RGB is the only missing thing, that must mean that the laptop is basically perfect in every other way.
My problem with the 2023 G14 is the lack of good USB-C charging without fucking up your battery, and maybe a touch screen would be cool
@@Killermonktr my 2022 has that issue as well, it really is a bummer. I'm a big fan of carrying battery packs for traveling, and while I can still use them, I do have to be more aware and unplug when I hit 90 percent or so. That and the crappy WiFi card were my only gripes though. I'm guessing if that was something they could have fixed via updates then they would have already lol
He says facial recognition is convenient (around 2:15) with the change - but is it actually better?
Speed of entry is a factor, but a MUCH bigger one for me is security: can the laptop's facial ID be tricked? Is it up to the perceived standard of Apple's system?
We know twins can access Apple phones; can properly printed pictures from a pro camera get into Windows?
Does it store the data on secured hardware that is resistant to attack?
If it was me designing systems I would have the option to select fingerprint AND facial scan at the same time.
If they want to get swanky add voice recognition too for a triple lock on entry.
I have a 2020 G14 with a 1650, that new look is giving me a head turn moment since I'm due to an upgrade this year.
New look is a downgrade in my opinion. Surprised you want another one considering my experience with the 2020 g14
My 2020 G14 with the 1650 Ti still works, and for my use case it really has no major problems, only some inconveniences. I even use my laptop more than my PC.
@@Mikazuki13 minor inconvenience like running out of battery in 2 hours? How about getting a fried computer if you plug in the charger and a powered USBC cable
5:36 I can't believe this isn't how it was always done. This is how basically all LED lighting has controlled brightness since LED lighting started becoming popular.
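A minimal sketch of the duty-cycle dimming idea being discussed here and earlier in the comments: the 960Hz figure is the dimming frequency commenters mention, and the 400-nit peak is just an example number, not a measured panel spec.

```python
# PWM dimming in a nutshell: the LED (or OLED pixel) is switched fully on and
# off very quickly, and perceived brightness tracks the fraction of each cycle
# it spends on (the duty cycle).

PWM_FREQ_HZ = 960                 # dimming frequency mentioned in the comments
period_ms = 1000 / PWM_FREQ_HZ    # ~1.04 ms per on/off cycle

def perceived_brightness(peak_nits: float, duty_cycle: float) -> float:
    """Time-averaged brightness for a given duty cycle (0.0 .. 1.0)."""
    return peak_nits * duty_cycle

for duty in (1.0, 0.5, 0.25):
    on_time_ms = period_ms * duty
    print(f"duty {duty:4.0%}: ~{perceived_brightness(400, duty):.0f} nits "
          f"(on for {on_time_ms:.2f} ms of every {period_ms:.2f} ms)")
```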
Hey, just letting you know you accidentally swapped the laptop names at 3:17
3:10 I think the laptops are mislabeled in the on screen graphic. G16 laptop and specs are to the left, but it says G14 above it.
hopefully they change that. If not, then ruh roh, LTT clearly didn't learn from their mistakes.
Wait, so they made it even worse? The half-soldered RAM with 1 DIMM slot was always a turn-off for me with the old G14s, and now it's only soldered. When I was shopping for a 5800H laptop I opted for the Lenovo Legion 5, which offered 2 DIMM slots, 2 M.2 slots, more USB (2x C, 4x A) and a 1Gb NIC.
As someone who also owns a Legion 5, these are very different products. The Zephyrus laptops are much lighter and more portable, so if you want to carry them around, use them for work as well as play etc., that's much more achievable
@@Drakshl I've been hands on several of the G14s at work as I do computer repair. Don't think I've managed to do a proper drag race against a 5800H/5900HS one yet because every one of them had a board failure. The slightly thicker chassis of the Legion still affords it better cooling potential, so it likely wins anyway. Having 32GB DR DDR4-3200 probably helps a little. My Legion does beat even AIO liquid cooled 10700K desktops in multicore perf though, let alone most Intel mobile systems up to 11th Gen.
They aren't wildly different except for the footprint and IO. Oh, and the price; the G14 was overpriced. I don't get the supposed "more portable", as if the Legion is in the desktop replacement class. It's not. I carry my 15" Legion to work every day, which includes at least a half mile of walking with it. It's lighter than the Dell Precision I used to carry. Heck, I come across a lot of relatively small people that carry around real behemoths, like 17" Alienwares. Petite little ladies. You put them all in a bag and throw it on your back.
The G14 is lighter, sure, but less functional for the work part, because you give up so much IO and it didn't have a webcam. It still has a bulky charger that is most likely overkill, but you're going to need it when gaming. The website says 180W, but I feel like every one someone brought in had the 240W. I don't game a lot on my Legion, but it has a 165Hz display and an RTX 3060. It's fine when I do game, but I've got a desktop with a 34" ultrawide and VR, which is where I prefer to do it anyway.
The only thing I don't like about my Legion is that it has a 300W power adapter, for basically no reason. It maxes out at 170W from the wall if pushed with full synthetic workloads on the CPU & GPU simultaneously. That being an unrealistic draw in daily use, 180W would've been enough. It's useless extra weight, whereas the bulk in the laptop promotes function and longevity. Might have been a trend where many OEMs just started shipping the extra-large adapter with the AMD systems, as I was pretty sure it was supposed to come with a 240W. The trackpad is kinda crap too. I always have a mouse on me anyway, and ports to spare.
The ASUS rep was definitely sweating watching Linus hold both laptops open like that…
considering the amount of ASUS bloatware they had installed and locked behind a protected windows directory on my last laptop from them, I doubt I'll ever go back to another system that has their name on it.
Is it not possible to just format it to get rid of all the bloatware? I was kind of considering buying one of these, but now I'm not so sure.
@@Hendlton It should be, unless they have some specific drivers that require installing such bloatware, and without it you can't use or reconfigure the functionality. It's usually the RGB stuff that does it, I think; I've seen it.
On the other hand if someone is so concerned about bloatware it would stop them from buying it - why even use windows? /hj
@@Hendlton It's possible even without the format, but you'll need to boot into a non-Windows OS off of something like a USB stick and then remove the directories from there. Once you've done that, you'll also need to delete the services in Windows to make sure they don't get reinstalled, as well as deleting the premade recovery drive and creating one of your own, or else you'll have to repeat this process if you ever have to recover/reinstall Windows again.
Look up G-Helper, it helps get rid of the Asus bloatware
2:03 What do you mean facial recognition is more convenient? Asus used to have a feature where it would cache your fingerprint when you turned the laptop on and use the cached data to log in; that's so much nicer. And plus, why not both? I'm sure it doesn't cost them more than a few dollars to add.
For the power I/O, Linus asked why they used a proprietary plug instead of USB-C, which would be a smart question if this were a phone or tablet. It's not. It's a laptop with laptop-level power draw. Most people understand that the power draw of a laptop exceeds USB-C's capabilities. He did provide a good explanation for the majority of people who probably didn't know already. The better question is why didn't ASUS go with a non-proprietary barrel plug? It appears to be the same size, and a barrel plug can definitely handle the power demands of that laptop without the added expense of engineering a proprietary solution.
I agree. That's the one thing HP, Dell and Lenovo have gotten right: they use a standard voltage and plug, so my old 2007 laptop power brick still works on a new laptop as a spare. Protip: Dells will run off HP power bricks but won't charge the battery, while HPs won't run off Dell power bricks at all.
My current work laptop can use USB-C for charging, but of course only up to 90W. Then they have a power brick with a proprietary plug to ensure I get enough power if needed (I think it is rated for 170W).
I wonder if the USB C port will be able to charge the device as well as the dedicated port. It would be cool to have the USB C port still draw the 20v (or whatever the closest voltage of the standard is) and act as a backup for just in case you forget your charger. Having a little note on the bottom saying that the USB C port can't draw the necessary wattage to run the laptop at full tilt would be a nice touch as well.
So much for the increased quality and no more daily releases
/s
I love that the presentation ends with Linus holding a blurry crab on display
Nice video, this is such a big deal! As the technology advances, the entry-level should not be left behind, people will "pay more for less" in specs if they can get a better experience -- besides, an iGPU from today can easily beat many old cards.
FYI the specs for the g14 and g16 at 3:20 are backwards in the edit from what Linus said.
I have a g14 with a 3060 from 2021. These new designs are really tempting.
idk about that one. i really liked the AniMe Matrix thing they had going
No upgradeable RAM though, that's a no-go for me
The power brick system is something I really like about HP, Dell, and Lenovo, as they've all settled on their own power voltage and barrel design and stuck with it for a long time, so I can use an old 2007 Dell power brick to power a new one and it just works. Sure, if it detects it's not enough wattage, Dell or HP run in a low-power limp mode and ThinkPads just give a POST error, but for me that's not that common. Whenever I see a Toshiba, Fujitsu, Acer or even Asus laptop come across my desk, it's never been consistent what they use, and I've got to use one of the universal bricks that has a selector switch for 10 different voltages and a million plugs, because I never have an adapter that would work with it. In a few years when people start getting rid of these and they start showing up at e-waste places, a lot of them are going to get scrapped even if they work, because the people testing them can't do so without their weird power brick, and at least here in the States, with the revised R2 laws, they won't be able to sell them even for parts "untested".
Edit: Protip, Dells will run off HP power bricks, but because their BIOS can't detect the wattage it won't charge the battery; it will otherwise function. HPs, at least the models I've tested, won't POST with Dell power bricks.
To elaborate on your protip: most Dells I have tested (mostly a variety of XPSes) vary between draining their battery down to 3% before using power from an HP charger, or not being able to use it at all (usually the higher-power XPSes), while yes, I have never gotten an HP to take power from a Dell charger.
@@Flimzes Good to know. My experience is mostly outside of an OS, just key-testing laptops for resale (does it POST and is it locked/Computraced, remove HDD, document specs, serial number, any cosmetic damage, etc.), so all I usually see is the Dell BIOS where it says battery health and state of charge (if it still has its battery) and what power supply is connected. Since I have to go through a lot per day, I've lined up 70% of my work shelf with HP power bricks with a small box of those big-to-small blue barrel adapters, and the rest are the Lenovo square type. If I know I'm going to be getting a lot of Lenovo stuff to process I'll add some more of those. The one nice thing I'll say about HP, at least the ones that use DDR3L, is that they'll still POST with regular DDR3 and just give a warning, whereas most Dells and Lenovos will just give out beep codes.
@@edison700 Dells will technically run without a recognized charger (center pin), but in addition to not charging, they're locked at minimum clocks that make them unusable aside from checking function. For instance, my 3793 with an i7-1065G7 normally runs at mid-high 3GHz at 15-18W TDP, but on a 'universal' Onn charger from Walmart with no center pin connection, it will run at something like 500MHz with a correspondingly ultra-low TDP of a few watts. My older Precision M6600 had the same logic and would run MUCH faster on battery than with an unrecognized charger.
Dell's 4.5mm charger/jack is notorious for the DC Jack breaking (the plastic breaks around the center pin contacts, resulting in unrecognized charger) if dropped or handled like an average kinda careless user.
For some reason I enjoy some LTT videos more than others (though nearly all I enjoy quite a lot), this is the one I think I have enjoyed the most
Fr, it has that old Linus vibe, which I used to get when I was a little younger and crazy about tech and things I can't afford.
Quick side note: I am not sure if Linus meant first on a GAMING laptop, but my Galaxy Book 3 Pro 360 has a 3K 120Hz display with variable refresh from 48-120Hz, and it works great in games with no screen tearing. It has no dGPU though and only relies on the built-in iGPU, so I am not really able to test it with anything too thicc.
The Ultra version does come with a 4070 though, and it's in the same form factor with the same display, so I am not sure what exactly he means by "the first", unless he means specifically in the gaming-focused segment rather than thin-and-light on-the-go workstations.
They could have put in a 40-42V USB-C input @ 5A and a capacitive divider inside the laptop that has very high efficiency, leaving 20V @ 10A = 200W with like 3-5W of power loss inside the laptop. It would need an expensive USB-C charger, but it is possible.
@@TheHalfBorg yes, you can also look it up as a switched-capacitor charger. Basically it uses MOSFETs to charge capacitors connected in series and then switches them to discharge connected in parallel. Over 90% efficiency; it's used in phones for the fastest battery charge stage because it generates less heat. I think that was the reason we got USB PD PPS mode, where you can request whatever voltage and current you want. The divider cuts the voltage in half and doubles the current, and with the PPS charging mode the fine voltage control and heat generation are moved to the power brick.
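The arithmetic behind that claim, as a rough sketch; the input voltage and the ~97.5% efficiency are assumed illustrative figures, not measurements of any real charge-pump part.

```python
# A 2:1 switched-capacitor ("charge pump") stage ideally halves the voltage
# and doubles the current, so the fine regulation (and most of the heat)
# stays in the USB PD PPS brick rather than inside the laptop.

v_in, i_in = 40.0, 5.0        # hypothetical high-voltage USB-C input
p_in = v_in * i_in            # 200 W delivered by the brick

efficiency = 0.975            # assumed ~97-98% for a good charge pump
p_out = p_in * efficiency
v_out = v_in / 2              # ideal 2:1 division
i_out = p_out / v_out

print(f"In:  {v_in:.0f} V x {i_in:.0f} A  = {p_in:.0f} W")
print(f"Out: {v_out:.0f} V x {i_out:.2f} A = {p_out:.0f} W")
print(f"Dissipated inside the laptop: ~{p_in - p_out:.0f} W")
```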
Would consider it if it could still charge via USB-C. I understand it being cut down if I don't plug it in with their specific plug, but it's just not for me if I can't also charge while on a trip with the same cable I'm using for everything else.
I'd like to mention: this lineup of Samsung's laptops (the Book3 series) has variable refresh rate! Makes sense, as the 3K panel is from Samsung 😅
Editor did a great job on this one
I love my G14 (2021) because it is not a MacBook. I've upgraded the RAM, repasted the thermal paste, cleaned the fans 4 times a year, and replaced the battery. It also feels substantial while being compact. I'm worried this new design is gonna take those things away; guess I have to wait till iFixit gets their hands on it.
I'm glad I watched this, the OLED VRR explainer was great and I was actually wondering about this, having seen the refresh-rate-change-flicker on my new phone. Thanks!
Sticking with the 2023 model I got a month ago. Good to know there is barely a difference.
Guys can we all just take a moment to acknowledge how insane technology is getting recently? Imagine going from a 2010 Windows 7 laptop to THIS.
I have a G14 from 2020 which is still going strong. Love the thing to death but it's really the thermals that limit these laptops. If I'm playing anything remotely demanding I usually manually max out the fans (which are much louder than a desk fan when at full power) in order to keep my temps as low as possible, but I still often reach 95C. I would be worried about them trying to fit the same power hungry SoCs into a thinner chassis...
As an aerospace engineer, I deal with convective heat transfer all the time. Really the only solution (without using liquid coolant and a radiator) is to simply move more air through the system. As the air temperature in your case gets closer to the temperature of the components, the heat transfer slows down. Liquid metal and vapor chambers do reduce the thermal 'resistance' for heat leaving the CPU, but oftentimes it is the air that is the limiting factor. Hopefully the solid-state fan tech will fix this, and perhaps make it a bit quieter.
This is an issue for all thinner laptops of the past 5-8 years or so; they all have grossly inadequate fans and heatsinks and, given encouragement from Intel and others, basically run at 100C steady state under load. It's nothing like the past, with cooling that could maintain 60-70C at max power; the only laptops coming close are the higher-end thick models that are sadly kind of niche now. It sucks, and I hate it, but the OEMs no longer care about cooling, and model revisions like this happen because the previous generation had to have such low power limits that it performed terribly with the higher-end hardware. Since the RTX 2000 generation, there's been little performance difference between a xx70 and xx90 mobile GPU, because most models can't handle the extra power and heat.
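For anyone curious, the relationship described in this thread is Newton's law of cooling; here's a minimal sketch with made-up numbers (the h, area and temperatures are placeholders, not measurements of any laptop heatsink).

```python
# Newton's law of cooling: Q = h * A * (T_surface - T_air).
# All values below are illustrative placeholders.

def convective_heat_w(h: float, area_m2: float, t_surface_c: float, t_air_c: float) -> float:
    """Heat carried away by the airflow, in watts."""
    return h * area_m2 * (t_surface_c - t_air_c)

h = 80.0      # W/(m^2*K), hypothetical forced-convection coefficient
area = 0.02   # m^2 of effective fin area, hypothetical
t_fin = 90.0  # fin temperature in deg C

for t_air in (25, 45, 65, 85):
    q = convective_heat_w(h, area, t_fin, t_air)
    print(f"air at {t_air:2d} C -> {q:5.1f} W removed")
# As the air inside the chassis heats up toward the fin temperature, the heat
# you can dump into it falls toward zero; moving more (cooler) air, or raising
# h with better fans, is the only way to claw that back.
```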
Were the graphics reversed (in relation to the B roll) on the side-by-side specs comparison ?
3:15 So is it what is shown in the screen or what Linus is saying?
The 2022 and 2023 G14 models did not have a fingerprint sensor either. It is IR face unlock. I am currently using the 2022 G14 with the RX 6700S and it has kept up well!
Just a heads up at 3:09 you have the laptop names flipped. The specs are on the right sides it seems but for what he's describing it seems flipped.
Defending Asus for replacing the fingerprint scanner with face recognition, but complaining day in and day out about Apple, which did that before. Don't get me wrong, I hate that Apple only offers Face ID, but I hate hypocrisy a lot more.
It's a gaming macbook and I'm all for it
Which graphics cards do they have in the video, especially the G16 model?
I'll never use a laptop again for work & gaming, but nonetheless this is always exciting to see, and that Asus sent some guys over to you, Linus, made it even better, and funnier when you confront them about design choices and whatnot haha xD
These videos in my opinion are better than the normal review videos, not that they're bad of course !
Can't wait to see what more neat tech everyone has to show this year ! :D
3:16 you guys said the right specs for the right models but then listed them incorrectly in your video with no correction that I can see
I see we're already giving Asus a pass for attempting to void peoples' warranties when installing a critical BIOS security update.
Well I don't think that's the case, with LTT dropping Asus as a sponsor as discussed in the last WAN show. They're still going to publish the current videos they've committed to, but won't be working with Asus again until the ongoing issues are solved.
@@dushtube It happened half a year ago. No way this video was in production back then.
Yeah as cool as these laptops are, I can't ever buy or recommend an Asus machine again. My 2022 G14 bricked itself such that the laptop no longer runs stably on battery. Forced BIOS update ruined MUX functionality so badly that even rolling back to every BIOS on their website doesn't fix it. The laptop had just ended warranty coverage by about 2 weeks when the update rolled out and I was told I was SOL because I didn't buy the extended 2-year warranty.
At 3:18 the screen refresh rates are mixed up:
It says 240Hz for the 2.5K, 120Hz for the 3K (after which Linus triumphantly lifts up the (not) 120Hz laptop and says "I think you'll know which I prefer")
I can actually see ASUS's logic on the slim powerjack™️. There's what Linus mentioned about the voltage, but also the proprietary connector looks like it will hold up to more strain than USB C. I wish they kept the barrel jack, but that was probably hard to fit on a thin and light laptop.
At least give us the option for convenience; the proprietary connector just means more money to Asus if you need to buy a new one. Totally NOT worth it.
"G14" & "G16" labels are reversed @ 3:09. (Specs are in front of the correct laptops; just the laptop 'name labels' are reversed.)