Honestly, even if it would've been dumb to see "USB10" in a year, they should've just gone up by 1 number with each incremental upgrade. Would've made things waaaay simpler.
@@achillesa5894 remember how the usb forum consists of companies that produce tech products? They don't want their products to feel outdated, so that customers still buy them. Being confusing is good for them, because only a small percentage of customers are gonna research the differences. So they try to keep every speed and feature set relevant for as long as they can. Also helps in cutting cost because not every usb port needs the full spec when they can just call them usb 3.2 or usb 4. So yeah, they're not trying to make sense. They're trying to maximize profits.
It's really good when they show one synthetic benchmark for general RT performance. Definitely shows how well the GPU performs in gaming and isn't misleading in any way. Gets a thumbs up from me.
I can see USB4 20/40/80Gbps being sensible, if that's how everything for consumers is marked. It's not deceptive like USB 3.1/3.2 Gen 1 (aka USB 3.0) were, and it tells you exactly what transfer speed the device/cable supports.
Folio Photonics are actually at $3/TB, with $1/TB roadmapped. I hope they succeed. There's a lot of archive data that needs a resurgence of optical media.
It was digitally altered. Go to the very very very last second of the video and it flashes back blue. They just tweaked it in post. Though… I don’t know why…
Spotted the new purple background. It really sucks that Intel couldn't just admit their first attempt was a good trial and that they would do better for the next generation of GPUs.
Just a half-baked thought: use a DP (data and power) thing like IP ratings. USB[generation number] D(for data)[speed number] P(for power)[power rating]. E.g. USB4 D5P4, where say 5 = 40Gbps and 4 = 240W.
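A quick sketch of how that IP-style rating could work in practice. All the code tables below are invented for illustration, apart from the comment's two anchors (data code 5 = 40Gbps, power code 4 = 240W):

```python
# Hypothetical IP-style USB rating: USB<gen> D<data code> P<power code>.
# Code tables are made up for illustration, except the two anchors from
# the comment above: data code 5 = 40Gbps, power code 4 = 240W.
DATA_CODES_GBPS = {3: 10, 4: 20, 5: 40, 6: 80}
POWER_CODES_W = {1: 15, 2: 60, 3: 100, 4: 240}

def label(gen: int, d: int, p: int) -> str:
    """Build the compact port/cable marking."""
    return f"USB{gen} D{d}P{p}"

def describe(gen: int, d: int, p: int) -> str:
    """Expand a marking into human-readable capabilities."""
    return f"USB{gen}: {DATA_CODES_GBPS[d]}Gbps data, {POWER_CODES_W[p]}W power"

print(label(4, 5, 4))     # USB4 D5P4
print(describe(4, 5, 4))  # USB4: 40Gbps data, 240W power
```

Like IP ratings, the point is that two digits carry the whole story, so a port and a cable can be compared at a glance.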
@@retroman-- I'm not here to grovel. I'm here for Tech News. And TechLinked is the best channel for it. Are you not better than starting an unnecessary reply thread?
They need to go after connector manufacturers for making the collars behind the plug so big that they won't fit. Not just USB, but audio and video cables too. Dunno how many times an HDMI or DisplayPort cable wouldn't fit my GPU because of the giant blob of plastic around the plug.
Oh, I'd hire you on the spot were that the headline of your cover letter. I get that they don't rehearse these videos to keep the improvisational feel, but maybe do a run-through of people's names in advance (or dub them in afterwards or just cut out the mispronounced takes). None of these names are even particularly hard to pronounce, especially when so many Chinese-Americans (and presumably Chinese-Canadians) already Anglicize their first names. (in case you can't tell, this is very personal for me, as I've lived my whole life sighing as people refer to me as "JillEdh" (understandable) "Barley" (okay, so you're not even trying)).
@@GSBarlev I went and looked up the application but unfortunately they're only taking Canadians right now. Naming is also important to me and it annoys me that in [current year] it's apparently still hip to do "let's mispronounce foreign name funny haha" joke
USB 3.0 = 5Gbps
USB 3.1 = 10Gbps
USB 3.2 = 20Gbps (2x2)
USB 4.0 = 40Gbps, DP etc
USB 4.1 = 80Gbps
There, I fixed the issue. Now everyone forward that to the USB spec certification team. Trust a certification team to make a mound out of a mole hill
Didn't it seem a lot more sensible that the Ethernet people just made a new one when we could shove 10x the data through the same cable? Did we really need a 5Gb/s to 10Gb/s gen? Or a 10 to 20? It's like HDMI with a new freakin standard every two weeks. "But this one does Dolby Vision 12-and-a-half bit!" Nobody needed that!
@@betablockrr I wasn't sure if it was an expression I hadn't heard before. I think your solution is what will actually happen with the naming scheme, or they'll just add the transfer rate in the title in parentheses.
The first number should represent the connector hardware/shape version. Second number should represent each optimized iteration of the connector "software?" Third number should be for each fix done to the "software?"
I found that my camera, the Canon R5, has a USB-C port which can be used to charge the official battery, but not via a USB-A to USB-C cable. The cable is required to be USB-C to USB-C; I understand that the cables have a printed circuit board that probably handshakes with the camera. I haven't looked it up, but the USB-C to USB-C cable is probably a higher spec than any USB-A to USB-C cable. Therefore I had to buy a new battery (good news: it only cost $50).
Regarding USB, there's a simple solution they'll never do:
1. Rename all USB variants now to a simplified naming scheme. (I'd say letters to make it clear it's new, but they used letters for the plugs already…) So USB 1.0 becomes v1, 1.1 becomes v2, 2.0 becomes v3, 3.1 Gen 1 becomes v5, and so on.
2. Make a chart indicating device vs host compatibility of modes. Basically, the fastest valid speed the host and device have will be what they most likely run at.
3. Make an app for Windows, Mac, Linux, iOS and Android that is capable of listing all modes the computer's ports and devices are capable of, highlighting the fastest for each. This should contain the chart and maybe even let you specify a theoretical host device to see what speed everything WOULD run at.
4. Sell preprinted label packs for convenience (label makers work too) so you can relabel your devices with a new universal version number.
The biggest issue will always be making past versions of USB not confusing. Until an upgrade addresses this, nothing they do will be satisfactory, not that this "attempt" even comes close…
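Point 1's renumbering is basically just a lookup table. A sketch of it, where the v1/v2/v3/v5 entries follow the comment and everything else (including v4, which the comment skips) is my own hypothetical continuation, not any official mapping:

```python
# Hypothetical sequential renumbering of historical USB variants.
# v1, v2, v3 and v5 follow the comment above; the other rows are
# illustrative guesses filling out the same idea.
RENAME = {
    "USB 1.0": "v1",
    "USB 1.1": "v2",
    "USB 2.0": "v3",
    "USB 3.0": "v4",        # guess: the comment jumps from v3 to v5
    "USB 3.1 Gen 1": "v5",
    "USB 3.1 Gen 2": "v6",
    "USB 3.2 Gen 2x2": "v7",
    "USB4": "v8",
    "USB4 Version 2.0": "v9",
}

def simple_name(marketing_name: str) -> str:
    """Map a legacy marketing name to the simplified version number."""
    return RENAME.get(marketing_name, "unknown")

print(simple_name("USB 2.0"))        # v3
print(simple_name("FireWire 800"))   # unknown
```

The preprinted labels in point 4 would just be this table on sticker paper.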
Making past revisions not confusing wouldn't be that hard. They just need to leave 1.0, 1.1, and 2.0 untouched; change 3.2 Gen 1 back to 3.0 (which it was called originally, before they renamed it twice); rename 3.2 Gen 2 to USB 3.1, which is what everyone was calling it anyway when they introduced it as 3.1 Gen 2 (and later renamed to 3.2 Gen 2); rename 3.2 Gen 2x2 to 3.2, as many people call it anyway; and rename USB4 to USB 4.0, so USB4 Version 2.0 can be USB 4.1. It was all perfectly fine and understandable until they started renaming existing standards with the introduction of 3.1. Pretty much anyone gets that a full number increase is a big improvement and a small number increase is a smaller revision; they should have just stuck to that.
Yeah, that ARM lawsuit seems weird. If Qualcomm bought the company, that typically means the IP too. I don't see what their legal standing is, but it's weird, so we'll see.
Don't "the new version number is only of interest to developers" me Benson. Any reasonably savvy consumer should be able to correspond a product's version number with its capabilities. It's well known at this point that USB's naming conventions are ass. This doesn't need to be complicated, but it's artificially made so.
One thing that blows my mind is, I'm hard pressed to know what hardware uses anything close to the speeds these are rated for. Even buying a USB 3.0 flash drive for the speed was almost pointless. The fastest speed I've seen is about 120MBps to an external hard drive. So what am I missing? What device pairings actually communicate anywhere near the speeds that justify purchasing quality and modern equipment and cables?
As far as that art competition goes... he won, right? That means the art was as good as or better than the human-generated art on display, which might be shitty in the context of an art contest, but it's much worse when you consider that AI-generated art will end up being good enough for many situations where art is needed. A lot of jobs are going to be lost because of this. There was a time when I thought art would be one of the safer fields, but that's over now.
That's just progress. Tons of jobs are lost to automation every day. On the upside, though, art will be way more accessible. Like, if you need some artwork or assets made for a product you're trying to make but you can't afford a decent artist, now you can just get an AI to do it for you.
And as usual people forget the key difference between an AI mashing together pieces of art found on Google and an actual artist is IMAGINATION. Sure you can get some pretty pictures with Midjourney but at the end of the day it's just a horrible amalgamation of artworks found on Google Images. Nothing will ever replace human artists, stop imagining up this doomsday scenario that doesn't exist
@@zwenkwiel816 There are still limits on what it can do. You can only get so descriptive with your prompt. Like I said before, an AI has no concept of what art is supposed to be. It just mashes together relevant pictures to form something barely cohesive. The key selling point in having a human artist is imagination and creativity, which an AI has neither of. I'd be worried once AI programs start to develop a consciousness
@@ToastedLobster that's only a matter of time and developing the right tools. They already have tools like Nvidia Canvas that let you just paint simple shapes and the AI will fill them in with stuff you choose. So you can control the composition and everything while the AI fills in the details. That program only really does landscapes, but there are also AIs that can translate footage from GTA 5 into photorealistic video and other crazy stuff like that. So if the user just inputs a simple sketch and can control some variables, you'll get the best of both worlds. Though personally I'm quite surprised how well things like Midjourney are already doing with just simple prompts; with some iteration you can get pretty specific stuff. Also, I don't think you get how this stuff works, cuz it's not just mashing together pictures. Zero pixels in the final image correlate to any of the images in its gigantic dataset.
Believe it or not, I'm okay with the current naming scheme for USB, because it scales. 3.0 means USB-A. 3.1 means Type-C but not insane speeds. 3.2 means better speeds and PD. 4.0 means better speeds with better PD and USB-C connectors on both ends. 4.1 adds speed and power to 4.0. We asked for this when we named the connectors on each end different things. With the current structure I can say a number and you can determine its specs almost down to the connectors, unlike USB 2.0.
In fact, glass-based data storage has already existed for >10 years, and it can hold terabytes at a time, with all the benefits of the long-lasting life of glass, aside from it breaking. The only hindrance is cost.
A lot of the license contracts have mandatory clauses that they need to enforce. If I had to guess, it will be settled out of court for an undisclosed amount, which will likely just be a "thanks for the increased business" handshake. Sometimes legal requirements are silly, but to maintain licenses and control, they are needed.
For me, the versioning would be Version.Speed.Variant: 4.80.2 or 4.80.B. This would allow even other variants (B) for different speeds (80) of the same generation (4).
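A Version.Speed.Variant label like that would at least be trivially machine-readable. A minimal sketch of a parser for the proposed format (the field names are my own):

```python
def parse_label(label: str) -> dict:
    """Split a proposed 'Version.Speed.Variant' USB label, e.g. '4.80.B'."""
    version, speed, variant = label.split(".")
    return {
        "generation": int(version),  # spec generation
        "speed_gbps": int(speed),    # advertised speed in Gbps
        "variant": variant,          # letter or number for sub-variants
    }

print(parse_label("4.80.B"))  # {'generation': 4, 'speed_gbps': 80, 'variant': 'B'}
```

Putting the speed directly in the label means nobody has to memorize which generation maps to which Gbps figure.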
i had just figured out that 3.0, 3.1 and 3.2 were Gen 1, Gen 2, and Gen 2x2 respectively, so i knew what they meant in terms of Bytes not bits, and now USB are leaning hard into making the speed the marketing, but with bits... i can't handle this, USE BYTES PLEASE ITS WHAT MOST PEOPLE KNOW
WAIT ITS WORSE THAN I EXPECTED
the version number is COMPLETELY IRRELEVANT, only the GEN and LANE numbers matter, like for PCIe:
Gen 1 is 625MB/s
Gen 2 is 1.25GB/s
Gen 3 exists, and is 2.5GB/s
and all of these can be multiplied by 2, hence Gen 2x2 being 2.5GB/s, WHICH MEANS THAT Gen 1x2 IS A THING at 1.25GB/s. Gen 3x1 and Gen 2x2 are the same speed despite being different, 2.5GB/s. Gen 3x2 is what USB4 40Gbps (5GB/s) is, and USB4 v2 80Gbps (10GB/s) is Gen 4x2, which can still use existing passive 40Gbps cables, which by necessity have the extra lanes from Gen 2x2 support. USB-IF has actually come up with a consistent and understandable naming scheme BUT HAS INTENTIONALLY OBSCURED IT FOR NO GALE-DAMN REASON
so actually:
2.0: 60MB/s (oh gales yep, still really slow)
3.0 Gen 1x1: 625MB/s
SATA v3: 750MB/s
3.2 Gen 1x2: 1.25GB/s
3.1 Gen 2x1: 1.25GB/s
3.2 Gen 2x2: 2.5GB/s
4.0 Gen 2x2 20Gbps: 2.5GB/s
4.0 Gen 3x2 40Gbps: 5GB/s
4.0 v2 Gen 4x2 80Gbps: 10GB/s
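The Gen-times-lanes arithmetic behind those figures is just "per-lane signaling rate × lane count, divide by 8 for bytes". A quick sketch using the nominal per-lane rates of 5/10/20/40 Gbps for Gen 1 through Gen 4, ignoring line-encoding overhead:

```python
# Nominal per-lane signaling rates in Gbps for USB 3.x / USB4 generations.
GEN_RATE_GBPS = {1: 5, 2: 10, 3: 20, 4: 40}

def link_gbps(gen: int, lanes: int) -> int:
    """Total link rate for a 'Gen <gen>x<lanes>' configuration."""
    return GEN_RATE_GBPS[gen] * lanes

def gb_per_s(gbps: float) -> float:
    """Convert Gbps to GB/s (8 bits per byte, encoding overhead ignored)."""
    return gbps / 8

print(link_gbps(2, 2))            # 20  -> USB 3.2 Gen 2x2
print(gb_per_s(link_gbps(3, 2)))  # 5.0 -> USB4 40Gbps in GB/s
print(gb_per_s(link_gbps(4, 2)))  # 10.0 -> USB4 v2 80Gbps in GB/s
```

Two numbers fully determine the speed, which is exactly why hiding them behind version labels is so maddening.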
what they should do:
USB 2.0
USB 3.0 Gen 1
USB 3.1 Gen 2
USB 3.2 Gen 2x2
USB 4x1
USB 4x2

what they should have done:
USB 2.0
USB 3.0
USB 4.0
USB 4x2
USB 5x1
USB 5x2
I don't like the magenta and purple background with orange on the right hand side. In fact, even without considering the contrast with orange I think you shouldn't mix two colours; as in, using orange and [orange mixed with white] or blue and light blue is okay, but these two colours clearly differ in the amount of blue (at least on my screen). This means the purple pixels look like highlights, and the background pulls too much attention. But this is just my humble opinion.
It seems pretty clear now that the members of the USB-IF don't actually know the definition of 'universal'. Their versioning is all over the place, they even retcon them, their same connectors are rated wildly differently with no visual delineation, they don't standardise protocols (a-la Switch), some devices/ports just aren't backward compatible for no reason, and they seem to have completely abandoned the sensible colour differentiation.
PCI Express 5 has double the bandwidth of PCI Express 4. But USB decides to go with USB4 Version 2.0, which has double the bandwidth of USB4 Version 1.0
USB is quite a large collection of standards, with a few of them being more consumer-facing: connectors, medium, bandwidth, power delivery, HDMI/DP/Ethernet/etc. over USB, daisy-chaining devices... many of them optional. I'm not sure how you'd come up with a simple naming scheme that covers both a couple-of-dollars device and a tens-of-thousands-of-dollars device.
5:27 "...force them to innovate, possibly create something even more powerful" Lmaoo, this is historically accurate. When the US unilaterally revoked its F-16 deal with Pakistan, the latter was forced to accelerate its military spending and innovate jointly with China to create the JF-17 Thunder aircraft. This strengthened ties between Pakistan and China, spoiled the US's chances at regional influence and created more competition in the aircraft market.
$5 a terabyte for optical disc backup? Nice, it'll still cost me $60-70 to back up all my drives, but that's cheaper than extra hard drives. Hope it hits the market soonish.
It randomly popped up in my recommendations and I like that you go right to the news without explaining everything. Oh yeah, also I was at 1.75x, which made it even better as a news highlight :D
So, that "1TB disc" is actually very old news, 7 years old, and Tom's Hardware was just super late to the party. They're called "Datafilm Discs" and are actually just thick, multi-layer BD XL discs with an extra focusing element on the laser to switch between layers. Problem is, the only prototype ever publicised was 7 layers (so theoretically 700GB) of their own custom pressed film per disc, with the 7th having focusing issues, and it spins slower than a 1x CD from the late 80s.
Ah, they anticipate reaching market by 2026. If they work, I'll buy at least 10 petabytes' worth. This will let me archive my entire family history and data bank and store it in multiple low-cost locations, where it could survive well until a better solution is available.
@@Whatsup_Abroad Yeah, they're not going to have a lot of time left on their patent by the time it's on the market. Their longevity is definitely up for debate though. The more layers you add, the shorter the actual lifespan of the disc is, yet they're claiming the same lifespan and durability as a Verbatim M Disc BDXL for half the price? I doubt they're gonna make that mark...
While USB4 20/40/80Gbit might be less confusing, the real issue, just like with USB 3, is when all they put on a box is "USB 4" and you have to guess what speed it is.
Inventors of revolutionary data transfer connections, absolutely at the forefront of technology, used worldwide by literally everyone. Their kryptonite: Naming their products.
No split screen in Halo is ridiculous. This is one of the many reasons I upgraded my PC instead of getting a new console. If my girlfriend and all my friends need their own $500 console, plus another screen, controller, copy of the game, etc., why not just have a PC at that point? Split-screen multiplayer is the best thing about consoles, and so few games support it anymore
@@tams805 yeah, I usually hear that argument or "they can't do it because of graphical limitations, performance, etc." Yeah, ok. My N64 with its 4MB of RAM could do 4-player split screen, but a modern system can't? Devs can figure it out in most games; the game companies just don't want to, because online-only makes more money.
The no-split-screen business is a nightmare for us parents who want our kids to be able to game at the same time on consoles. Two TVs, two consoles, two games for my twins. That's why they are almost exclusively on PCs.
Somewhere in the near future: USB hub with 80 Gbps ports $60 8 port ethernet switch with a single 10 Gbps port $300 Like come on, a single 40 Gbps QSFP+ module can cost like $500-1000, but then again, those have ranges of over 10km.
The Qualcomm/Arm issue feels a lot like the unauthorized family sharing problem. Yes, your parents are paying for the service, however you don't even live in the same city and can afford your own sub. Isn't that more privateering than adblock?
They won't stop until they invent a new language based around USB cables
Turing complete. On the same level as c++.
Its called slang
😂🤣 ROFL
USB++
Says the man who did an experiment that resulted in the Resonance Cascade.
Jokes aside, I feel like they're just making it hard for consumers without even trying. It's like they haven't tested the names or some shit.
I can't wait for devices to include USB 5x6.8 three-way HighFast V56 +2 Pro compatibility
Same
You mean USB 5x6.8 three-way HighFast v56+2 pro and Knuckles
@@emilybreneman2140 featuring Dante from the devil may cry series!
u mean USB 6.9 V69 Deez Nuts
But will it be backwards compatible with USB 4.2³ x2 Rev.8 HyperSpeed Type E ?
"Things get faster and they make less sense" is indeed not a bad summary of relativity 😂
And "Things get smaller and they make less sense" is not a bad summary of quantum theory.
"Things get slower and make less sense as that feeling grows of being crushed to death by a large, looming mass" works for General Relativity as well.
This is fun!
"Things start to stop and they make less sense" is also not a bad summary of near Absolute Zero behaviors.
Accurate
The USB Forum seems to have a habit of changing their versioning schemes in every major USB revision after 2.0.
80Gbps would be nice though.
TB4 is only 40Gbps AFAIK
@@handlemonium for sure, especially for eGPUs. Although they could have just named it 4.1
@@mcslender2965 I mean, why not USB 5? Are they afraid of running out of numbers or something?
@@Deliveredmean42 Yeah, with Intel reportedly working on an 80Gbps TB5 as of last year...
@@handlemonium since the next version of Thunderbolt is supposed to be 80Gbps, via leaks from Intel executives, bumping USB 4 to match does make some sense.
“Shut the front door!”
“I won’t.”
Never change, guys.
The thing that makes me most upset about USB is that Microcenter still stocks more 2.0 hubs than 3.0 hubs, and it is also very hard to find a USB-C hub that has a replaceable cable, or one that has more than a 6" cable. Their naming is awful, but more so, the adoption rate is very slow. It blows my mind that motherboards still come with 2.0 headers, or that only high-end motherboards have a USB-C front panel connector on them.
Because it's not really needed. USB-C is nice and all, but the vast majority of stuff I plug into my PC uses good old USB-A. The only thing I own that's fully USB-C on both ends is my Quest 2. Literally everything else uses USB-A, and the one USB-C port on my PC never really gets used at all...
USB 2.0 type A is still all you need for mice or keyboards and the like, and they use very little CPU/chipset bandwidth. So I prefer having at least some USB 2/A and 3/A connectors on both my rear and front IO. I have no devices that use a usb c connector as the host connection. Even my phone goes from type c to type a.
We need more USB-C hubs that aren't meant for laptops or tablets
Yep, as Alexander said, there's a trade-off here. USB 2 ports tie up less bandwidth compared to USB 3. Would you prefer multiple USB 2 ports or one USB 3 port? There's good reason for USB 2 ports to stick around for a bit longer.
Outside of fast USB storage (which isn't needed much because of the cloud), there is no real use for most people for USB 3.0. USB-C devices normally come with a Type-A to Type-C cable, so having more Type-C ports on the PC is useless
I think the color change would have been more funny if it was a single high hue pixel that was never acknowledged like a failed pixel.
Bring back the green pixel! 🦀
Change one per day for like 4 months, only the uncommitted to the channel would notice
@@Depl0rable10 jokes on you I'm colourblind and have no idea what you're on about
You should have painted one pink square every episode until they took over. It would freak people out.
Idk if they painted it. I thought they did until the last scene where Linus pops in. The BG is still blue, could be in post.
@@creekboi7 yeah it's just a colour grade in post
Actually, USB SS 40Gbps / USB SS 80Gbps is super consumer-friendly, as people know exactly what speeds to expect from the cables they buy, and if a cable doesn't meet the standard it's easier to complain and return it
But then how will the companies use USB 2.0 on their ssds with them knowing ppl don't read the finer details
@@dhruvakhera5011 what
Well no, it's terrible, because you can still plug anything in. So if they plug a USB 2.0 thing in, they go "why isn't it going at 80Gbps?" It's better to just have the generic standard, so people just go "4.2 is higher than 4.0, higher number better"
do consumers actually use USB-C 40Gbps speeds?
@@emeraldbonsai do you look at 1TB hard drives and think oh, how come I only have 931GB of space? No, because once you learn that the space the OS reports is approx 7% less (decimal TB on the box vs binary GiB in the OS), it stops being surprising
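For what it's worth, the gap between an advertised 1TB and the reported 931GB is purely a units thing, decimal versus binary. A one-line check (Python used just for the arithmetic):

```python
# A "1TB" drive holds 10**12 bytes; the OS reports capacity in binary
# gibibytes (2**30 bytes each), so the same drive shows up as ~931 "GB".
ADVERTISED_BYTES = 10**12
reported_gib = ADVERTISED_BYTES / 2**30
print(round(reported_gib, 1))  # 931.3
```

Same drive, same bytes, two counting systems.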
This is why I'm excited for LTT Labs and think it has incredible potential. LTT has enough clout and credibility that they could push better consumer-focused standards, like good naming/labeling of cable types, if they establish an "LTT Certified" badge the same way other companies like THX do with home theater equipment.
You're stark staring mad. Linus Tech Tips doesn't have either the clout or the credibility you think it does. The people who make the naming decisions aren't going to be in the least bit impressed by Linus trying to muscle in (not that he would, he isn't that kind of idiot).
THX is a technology, and the badge only indicates that the tech is used and licensed for the device. You're barking up the wrong tree.
Come on 343…. Split screen was the one thing I wanted so badly. I was super excited to finally get back to playing halo with my brother (who lives with me) on the same couch, being idiots and enjoying ourselves.
Same man, I can't even not buy it in protest it's on game pass 😂
I haven't understood USB for years. I legit assume it's that way so people buy old inventory. Someone is being paid off. USB is not consumer friendly.
Same, I have no idea what version of USB3 my motherboard has. I could look it up in the manual but that info's probably wrong.
Ditto. After USB 3, things started getting a little weird. Then everything became "USB," but only sometimes.
At this point I have absolutely no idea what any given port can or cannot do. Is it a charging port or a data port or a Thunderbolt port or a DisplayPort or a legacy USB port with a new connector, or all of the above? Does it do 5V, 9V, 12V, 15V, or no volts? Can this thing charge on 5V or does it need whatever the %$#^ this other charger does, since neither one lists their capabilities and requirements on the label, and all I have is a random Internet post (which may or may not be just as hopelessly confused) to go on?
If I'm a techie, how is anyone else supposed to ever figure any of this out?
You've lost it, USBF. You've just completely lost it, and are now ruining everything that used to be nice about USB.
So right as they cleaned up their USB naming scheme, they screwed it up again. Either way, I hope we see this show up in devices sooner rather than later. As for some older cables being compatible, at least Linus can do another video with his cable tester.
This is why forcing companies to use the same cable is a bad idea. This kind of stuff happens with HDMI too, and the end consumer gets confused. It needs to be simple: doesn't fit = doesn't work.
I'm certain that they are doing this on purpose to confuse customers.
@@pedrotski that's the most wrong takeaway you could have come up with...
@@pedrotski losing backwards compatibility or forcing consumers to buy the latest cable because of a different connection even if they don’t need the full bandwidth would be even worse.
@@pedrotski Yea so you can buy so many cables for cheap that fit and don't work. 👌
Remember the KFC computer that was supposed to keep your chicken warm? Didn’t really keep it very warm.
I think 40 series could do it. Make it happen tech news man.
It was actually a video gaming console that had a heating rack, not a computer. The KFConsole.
How do you know it didn’t keep it warm? It never came out
KFC had the right idea, it was just ahead of its time.
@@somethinglikethat2176 Gamers are totally that lazy and love chicken 💯
@@joshknight1620 And that's where they messed up, a console sits way too far from the gamer for it to be a useful chicken-warmer-thing, a desktop PC on the other hand, sits right there next to the average gamer!
Not to mention that splitscreen coop is actually in Halo Infinite and they just have it locked away. People have glitched into it and found that it has little to no issues. One big mystery. You can even have 4 players on the same system with the glitch.
honestly i wish the USB naming scheme was in the ballpark of "USB + (connector shape) + generation number + gimmick", for example "USB C 4.0 Fast Charge", telling you it's gonna be the rounded small connector, with 40Gbps speed & wired to allow for fast electrical charging
Or even better: "USB + (connector shape) + speed + gimmick", like USB C 80Gbps Fast Charge. It allows consumers to know what speed to expect without having to remember which generation corresponds to their desired speed.
Or just "USB C4" means it's 80Gbps fast-charge capable, and "USB C cheap shit knock-off" means it isn't.
Sure, but you still have to decipher logo hieroglyphs on laptops and such.
That will never catch on, it makes too much sense.
the problem when the connectors don't change but the capabilities do is you forget what the cable you have does. colour coding could help out here. like all USB 4 80gbs must have a colour collar on the plug.
Wait until USB 4 gets renamed to 4.2 and 4.2 gets renamed to 4.2 gen 2 and the other one to 4.2 gen 2x2 and maybe USB 4.2 gen 2x4.
Fairly sure it's USB 4.2 gen 1.8x3.6 when you actually go out and measure it. ;)
Just FYI, USB4 Gen 2x1, 2x2, 3x1 and 3x2 are all part of the USB4 official spec.
Honestly, even if it would've been dumb to see "USB10" in a year, they should've just gone up by 1 number with each incremental upgrade. Would've made things waaaay simpler.
Or just stick with the 3.0, 3.1, 3.2 scheme into 4.0, 4.1 etc. I'm not sure why they decided that wasn't good enough.
@@achillesa5894 remember how the usb forum consists of companies that produce tech products? They don't want their products to feel outdated, so that customers still buy them. Being confusing is good for them, because only a small percentage of customers are gonna research the differences. So they try to keep every speed and feature set relevant for as long as they can. Also helps in cutting cost because not every usb port needs the full spec when they can just call them usb 3.2 or usb 4.
So yeah, they're not trying to make sense. They're trying to maximize profits.
but marketing people would be out of job
@@angelodou but wouldn't it make people more tempted to upgrade, since their current hardware is outdated already?
Exactly. Just double bandwidth each revision, like PCIe. No one complains about PCIe.
Tech news is part of my daily routine at this point, i don't even search for it, it just pops up and i click it
I swear it's the same for me too
Kinda like my parents watching news
How is it daily when it's only 3 weekdays?
@@plplplplplpl7336 time flows differently while waiting for technews
Not sure "ACTIVE" is a good naming for something that's supposed to keep itself unchanged for a long period of time🤔
It's active because you can read the data without wearing the disc under normal use.
"Active Inertness'"
Thank you to whoever always adds the skippable tangent sections to TechLinked. It saves so much time
5:59 - for the jumpscare
Its really good when they show one synthetic benchamark for general RT performance. Definitely shows how well the gpu performs in gaming and isnt misleading in any way. Gets a thumbs up from me.
(comes up behind you and whispers) "it's okay, let them have this."
*benchmark
I genuinely do not understand how they keep screwing this up.
like come on, counting really isn't that hard...
Ask Windows
Ask Valve
tell that to Xbox
Look at HDMI...
Programming is ez
They are trolling us with those USB names. There cannot be another explantion.
*explanation
"Things get faster and they make less sense" seems to describe my life path quite accurately.
I can see USB4 20/40/80Gbps being sensible, if that's how everything for consumers is marked. It's not deceptive like USB 3.1/3.2 Gen 1 (aka USB 3.0) were, and it tells you exactly what transfer speed the device/cable supports.
Folio Photonics are actually at $3/TB, with $1/TB on the roadmap. I hope they succeed. There's a lot of archive data that needs a resurgence of optical media.
They're promising it by 2026. At this point it's a dream.
The background looks neat but it is weird that it no longer matches the logo colours.
I haven’t looked very closely, but isn’t where the right side of the T and top of the L meet, colored purple?
It was digitally altered. Go to the very very very last second of the video and it flashes back blue. They just tweaked it in post. Though… I don’t know why…
one of the best tech news episodes ever made. riley was in such a good mood today. linus showing up in the back was the cherry on top
So enjoyable
Spotted the new purple background. It really sucks that Intel couldn't just admit their first attempt was a good trial and that they'd do better with the next generation of GPUs.
After seeing the blue background for so many videos, I now understand how color blind people must feel.
@@Smumbo 😂😂
@@Smumbo I have not noticed any difference and am colorblind. So I don't feel better or worse than before.
I can't get enough of Riley and James talking about tech news. Thanks for yesterday's TalkLinked, and thank you for today's TechLinked ^_^
SuperSpeed USB4.0-v2.0-gen2x2... just what we needed.
Finally I’ve waited too long for tech news
Just a half-baked thought: use a DP (data and power) rating like IP ratings. USB[generation number] D(for data)[speed number] P(for power)[power rating]
Eg. USB4 DP54
Say 5 = 40Gbps and 4 = 240W
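That decoding could be sketched in a few lines; the digit-to-value tables below are invented for illustration (only "5 = 40Gbps" and "4 = 240W" come from the comment itself):

```python
# Hypothetical decoder for the IP-style "DP" rating suggested above.
# The digit-to-value tables are made up for illustration only.
DATA_GBPS = {1: 0.48, 2: 5, 3: 10, 4: 20, 5: 40, 6: 80}
POWER_WATTS = {1: 7.5, 2: 15, 3: 100, 4: 240}

def decode(label: str) -> str:
    """Turn e.g. 'USB4 DP54' into a plain-English description."""
    gen, code = label.split()                 # "USB4", "DP54"
    data, power = int(code[2]), int(code[3])  # digits after "DP"
    return f"{gen}: {DATA_GBPS[data]} Gbps data, {POWER_WATTS[power]} W power"

print(decode("USB4 DP54"))  # USB4: 40 Gbps data, 240 W power
```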
Everyone's bragging about their USB connectors until I bring my USB 5.56x45 into class
USB 4.20 will be the most popular revision of all time.
…until it’s superseded.
@@laustinspeiss superspeeded 😶
That Einstein joke was incredible 😂
I really hope the Active Disk tech takes off, JUST so that it makes optical drive bays come back in mainstream case design.
I only found out optical drives WEREN'T still ubiquitous until this year, while looking for cases for a new pc. I couldn't go without.
I honestly don't think it's worth the fan space. I'd rather run a usb from the back for it.
1:31 Shut the front door 😂😂😂😂
A day without TechLinked is a day wasted.
How are you not embarrassed to write a comment like that?
@@retroman-- Thanks for feeling embarrassed on my behalf
@@profoundpotato Great response
@@profoundpotato Great response, I never said I was embarrassed for you. Stop groveling to youtubers. Are you not better than that?
@@retroman-- I'm not here to grovel. I'm here for Tech News. And TechLinked is the best channel for it. Are you not better than starting an unnecessary reply thread?
They need to go after connector manufacturers for making the collars behind the plug so big that they won't fit. Not just USB, but audio and video cables also. Dunno how many times an HDMI or DisplayPort cable wouldn't fit my GPU because of the giant blob of plastic around the plug.
Alright, I’m applying to be a writer for your show, if only so that I can get Riley to say Leung correctly
Oh, I'd hire you on the spot were that the headline of your cover letter.
I get that they don't rehearse these videos to keep the improvisational feel, but maybe do a run-through of people's names in advance (or dub them in afterwards or just cut out the mispronounced takes).
None of these names are even particularly hard to pronounce, especially when so many Chinese-Americans (and presumably Chinese-Canadians) already Anglicize their first names.
(in case you can't tell, this is very personal for me, as I've lived my whole life sighing as people refer to me as "JillEdh" (understandable) "Barley" (okay, so you're not even trying)).
@@GSBarlev is your name hebrew? I'd pronounce it Gill-udd
@@Empyrean55 Got the etymology correct! It's geel-AHD bahr-LEHV. Though even my wife puts the accent on the first syllable of our last name.
@@GSBarlev I went and looked up the application but unfortunately they're only taking Canadians right now. Naming is also important to me and it annoys me that in [current year] it's apparently still hip to do "let's mispronounce foreign name funny haha" joke
They really need a naming scheme that easily tells you data transfer speeds, wattage, display resolution, and framerate.
Don't worry Riley, I noticed the new weird thing. Love that purple backdrop
So now we have Universal Serial Bus version 4.0 version 2.0
USB 3.0 = 5Gbps
USB 3.1 = 10Gbps
USB 3.2 = (2x2) 20Gbps
USB 4.0 = 40Gbps, DP etc
USB 4.1 = 80Gbps
There, I fixed the issue. Now everyone forward that to the USB spec certification team. Trust a certification team to make a mound out of a molehill.
You mean a mountain not a mound right?
@@Duaality. I did indeed 🤣🤣🤚
Didn't it seem a lot more sensible that the Ethernet people just made a new one when we could shove 10x the data through the same cable?
Did we really need a 5Gb/s to 10Gb/s gen? Or a 10 to 20? It's like HDMI with a new freakin standard every two weeks. "But this one does Dolby Vision 12-and-a-half bit!" Nobody needed that!
@@betablockrr I wasn't sure if it was an expression I hadn't heard before.
I think your solution is what will actually happen with the naming scheme, or they'll just add the transfer rate in the title in parentheses.
@@Duaality. just bad spelling is all 🤣
The first number should represent the connector hardware/shape version.
Second number should represent each optimized iteration of the connector "software?"
Third number should be for each fix done to the "software?"
USB Forum is trying to compete with the Fast and the Furious franchise for the most disparate naming scheme in a series
Now, lightning cables, next, light speed cables, then Quantum entanglement cable
God save us from even more confusing naming to come, can't wait for usb4 version 2.0 2x2 type-c!
I found that my camera, the Canon R5, has a USB-C port which can be used to charge the official battery, but not from a USB-A to USB-C cable. The cable is required to be a USB-C to USB-C type; I understand the cables have a printed circuit board that probably handshakes with the camera. I haven't looked it up, but probably the USB-C to USB-C cable is a higher spec than any USB-A to USB-C cable. Therefore I had to buy a new battery (good news: it only cost $50).
I cannot wait to get my first laptop with USB 4 stuff. Eat it USB-C.
Regarding usb, there’s a simple solution they’ll never do…
1. Rename all usb variants now to a simplified naming scheme. (I’d say letters to make it clear it’s new, but they used letters for the plugs already…) so usb 1.0 becomes v1, 1.1 becomes v2, 2.0 becomes v3, 3.1 gen 1 becomes v5, and so on.
2. Make a chart indicating device vs host compatibility of modes. Basically, the fastest valid speed the host and device have will be what they most likely run at.
3. Make an app for windows, Mac, Linux, iOS and Android, that is capable of listing all modes the computer’s ports and devices are capable of, highlighting the fastest for each. This should contain the chart and maybe even let you specify a theoretical host device to see what speed everything WOULD run at.
4. Sell preprinted label packs for convenience (label makers work too) so you can relabel your devices with a new universal version number.
The biggest issue will always be making past versions of usb not confusing. Until an upgrade addresses this, nothing they do will be satisfactory, not that this “attempt” even comes close…
Making past revisions not confusing wouldn't be that hard. They just need to leave 1.0, 1.1, and 2.0 untouched; change 3.2 Gen 1 back to 3.0 (which it was called originally, before they renamed it twice); rename 3.2 Gen 2 to USB 3.1, which is what everyone was calling it anyway when it was introduced as 3.1 Gen 2 (and later renamed to 3.2 Gen 2); rename 3.2 Gen 2x2 to 3.2, as many people call it anyway; and rename USB4 to USB 4.0, with USB4 Version 2.0 becoming USB 4.1. It was all perfectly fine and understandable until they started renaming existing standards with the introduction of 3.1. Pretty much anyone gets that a full number increase is a big improvement and a small number increase is a smaller revision; they should have just stuck to that.
Yeah that ARM lawsuit seems weird. If Qualcomm bought the company, that typically means the IP too. I don't see what their legal standing is, but it's weird so I'm not going to see it.
Would be pretty funny if ARM lost though
The subtitles saying it's Linus when it's not Linus off screen is infinitely more humorous than I thought it would be
Don't "the new version number is only of interest to developers" me Benson. Any reasonably savvy consumer should be able to correspond a product's version number with its capabilities. It's well known at this point that USB's naming conventions are ass. This doesn't need to be complicated, but it's artificially made so.
Not everyone can become a great artist; but a great artist *can* come from *anywhere* - Ratatouille (2007)
They just want to make it sound like USB 4 2.0 BLAZE IT
One thing that blows my mind is, I'm hard pressed to know what hardware uses anything close to the speeds these are rated for. Even buying a USB 3.0 flash drive for the speed was almost pointless. The fastest speed I've seen is about 120MBps to an external hard drive. So what am I missing? What device pairings actually communicate anywhere near the speeds that justify purchasing quality and modern equipment and cables?
As far as that art competition goes... he won, right? That means the art was as good or better than the human-generated art on display, which might be shitty in the context of an art contest, but it's much worse when you consider that AI-generated art will end up being good enough for many situations where art is needed. A lot of jobs are going to be lost because of this. There was a time when I thought art would be one of the safer fields, but that's over now.
That's just progress. Tons of jobs are lost to automation every day. On the upside, though, art will be way more accessible. Like if you need some artwork or assets made for a product you're trying to make, but you can't afford a decent artist, now you can just get an AI to do it for you.
@@zwenkwiel816 if you've ever seen the abilities of that automation, you'd know it's NOT happening that fast
And as usual people forget the key difference between an AI mashing together pieces of art found on Google and an actual artist is IMAGINATION. Sure you can get some pretty pictures with Midjourney but at the end of the day it's just a horrible amalgamation of artworks found on Google Images. Nothing will ever replace human artists, stop imagining up this doomsday scenario that doesn't exist
@@zwenkwiel816 There are still limits on what it can do. You can only get so descriptive with your prompt. Like I said before, an AI has no concept of what art is supposed to be. It just mashes together relevant pictures to form something barely cohesive. The key selling point of a human artist is imagination and creativity, and an AI has neither. I'd be worried once AI programs start to develop a consciousness
@@ToastedLobster that's only a matter of time and developing the right tools. Like they already have tools like Nvidia canvas that let you just paint simple shapes and AI will fill it in with stuff you choose. So you can control the composition and everything while the AI fills in the details.
This program only does landscapes really but there's also AI's that can translate footage from gta 5 into photorealistic video and other crazy stuff like that. So if the user just inputs a simple sketch and can control some variables you'll get the best of both worlds.
Though personally I'm quite surprised how well things like midjourney are already doing with just simple prompts. Like with some iteration you can get pretty specific stuff
Also I don't think you get how this stuff works cuz it's not just mashing together pictures. 0 pixels in the final image correlate to any of the images in its gigantic dataset.
Believe it or not, I'm okay with the current naming scheme for USB. Because they scale. 3.0 means USB-A. 3.1 means type c but not insane speeds. 3.2 means better speeds and PD. 4.0 means better speeds with better PD and USB-C connectors on both ends. 4.1 adds speed and power to 4.0. We asked for this when we named the connectors on each end different things. With the current structure I can say a number and you can determine its specs almost down to the connectors, unlike USB 2.0.
In fact, glass-based data storage has already existed for >10 years, and it can hold terabytes at a time, with all the benefits of the long-lasting life of glass, aside from it breaking. The only hindrance is cost.
A lot of the license contracts have mandatory clauses that they need to enforce. If I had to guess, it will be settled out of court for an undisclosed amount, which will likely just be a "thanks for the increased business" handshake. Sometimes legal requirements are silly, but to maintain licenses and control they are needed.
For me the versioning would be
Version.Speed.Variant
4.80.2 or 4.80.B
This would allow even other variants (B) for different speeds (80) of the same generation (4).
i had just figured out that 3.0, 3.1 and 3.2 were Gen 1, Gen 2, and Gen 2x2 respectively, so i knew what it meant in terms of Bytes not bits, and now USB are leaning hard into making the speed the marketing, but with bits... i can't handle this, USE BYTES PLEASE ITS WHAT MOST PEOPLE KNOW
WAIT ITS WORSE THAN I EXPECTED
the version number is COMPLETELY IRRELEVANT
only the GEN and LANE numbers matter, like for PCIe
Gen 1 is 625MB/s
Gen 2 is 1.25GB/s
Gen 3 exists, and is 2.5GB/s
and all of these can be multiplied by 2, hence Gen 2x2 being 2.5GB/s
WHICH MEANS THAT
Gen 1x2 IS A THING at 1.25 GB/s
Gen 3x1 and Gen 2x2 are the same speed despite being different, 2.5GB/s
and therefore, Gen 4x2, which is what USB 4 v2 80gbps (10GB/s) actually is, can use existing Gen 3x2 cables, which by necessity already have both lanes wired.
USB-IF has actually come up with a consistent and understandable naming scheme BUT HAS INTENTIONALLY OBSCURED IT FOR NO GALE-DAMN REASON
so actually:
2.0: 60 MB/s (oh gales yep, still really slow)
3.0 Gen 1x1: 625MB/s
Sata v3: 750MB/s
3.2 Gen 1x2: 1.25GB/s
3.1 Gen 2x1: 1.25GB/s
3.2 Gen 2x2: 2.5GB/s
4.0 Gen 2x2 20gbps: 2.5GB/s
4.0 Gen 3x2 40gbps: 5GB/s
4.0 v2 Gen 4x2 80gbps: 10GB/s
what they should do:
USB 2.0
USB 3.0 Gen 1
USB 3.1 Gen 2
USB 3.2 Gen 2x2
USB 4x1
USB 4x2
what they should have done:
USB 2.0
USB 3.0
USB 4.0
USB 4x2
USB 5x1
USB 5x2
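For anyone double-checking the bits-versus-bytes math in the rant above, a quick sketch (raw line rates only, ignoring encoding overhead; the Gen 4 entry assumes the new 80Gbps signaling works out to 40Gbps per lane):

```python
# Per-lane signaling rates in Gbps for each USB "Gen", following the
# breakdown in the comment above; Gen 4 is the USB4 Version 2.0 signaling.
GEN_GBPS = {1: 5, 2: 10, 3: 20, 4: 40}

def usb_speed(gen: int, lanes: int) -> tuple[int, float]:
    """Return (total Gbps, total GB/s) for a gen/lane combination."""
    gbps = GEN_GBPS[gen] * lanes
    return gbps, gbps / 8  # divide by 8 to go from bits to Bytes

for gen, lanes in [(1, 1), (2, 2), (3, 2), (4, 2)]:
    gbps, gbs = usb_speed(gen, lanes)
    print(f"Gen {gen}x{lanes}: {gbps} Gbps = {gbs} GB/s")
# Gen 1x1: 5 Gbps = 0.625 GB/s
# Gen 2x2: 20 Gbps = 2.5 GB/s
# Gen 3x2: 40 Gbps = 5.0 GB/s
# Gen 4x2: 80 Gbps = 10.0 GB/s
```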
Imagine someone invents an a.i to generate prompts so the prompts can be used to make a.i generated paintings
I don't like the magenta and purple background with orange on the right hand side. In fact, even without considering the contrast with orange I think you shouldn't mix two colours; as in, using orange and [orange mixed with white] or blue and light blue is okay, but these two colours clearly differ in the amount of blue (at least on my screen). This means the purple pixels look like highlights, and the background pulls too much attention.
But this is just my humble opinion.
They probably just need to hire a junior prompt engineer with 5 years experience to fix it.
"Things get faster, and they make less sense" this should make it into one of those platitude books of the future
It seems pretty clear now that the members of the USB-IF don't actually know the definition of 'universal'.
Their versioning is all over the place, they even retcon them, their same connectors are rated wildly differently with no visual delineation, they don't standardise protocols (a-la Switch), some devices/ports just aren't backward compatible for no reason, and they seem to have completely abandoned the sensible colour differentiation.
Just stick with what pci express does. Increment the major version number and double the speed. The details can just be part of the new spec.
PCI Express 5 has double the bandwidth of PCI Express 4. But USB decides to go with USB4 Version 2.0, which has double the bandwidth of USB4 Version 1.0
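The "double it each major version" idea really is a one-liner; the base rate and version numbers below are illustrative, not any official roadmap:

```python
# "Double the bandwidth each major version", the way the comments above
# describe PCIe behaving. Starting point here is USB4's 40 Gbps.
def bandwidth_gbps(version: int, base_version: int = 4, base_gbps: int = 40) -> int:
    return base_gbps * 2 ** (version - base_version)

print(bandwidth_gbps(4))  # 40  (USB4)
print(bandwidth_gbps(5))  # 80  (a hypothetical "USB5")
print(bandwidth_gbps(6))  # 160 (and so on)
```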
Tech Linked in purple!
This is huge for the eGPU crowd. Previously Thunderbolt/USB4 topped out at about RTX 3070-level spec. The question is: when will we see this in laptops?
USB is a quite large collection of standards, with a few of them being more consumer facing, like connectors, medium, bandwidth, power delivery, HDMI/DP/Ethernet/etc over USB, daisy chaining devices... Many of them optional.
I'm not sure how you'd come up with a simple naming scheme that covers both a couple-dollar device and a tens-of-thousands-of-dollars device.
by not making all those things optional
5:27 "...force them to innovate, possibly create something even more powerful"
Lmaoo this is historically accurate. When the US unilaterally revoked their F-16 deal with Pakistan, the latter was forced to accelerate its military spending and innovate jointly with China to create the JF-17 Thunder aircraft.
This strengthened ties between Pakistan and China, spoiled the US's chances at regional influence, and created more competition in the aircraft market.
When they replace the blue background with a purple background: 🤯🤯😱😱😮😮
"Shut the front door" my new favourite slur lol
$5 a terabyte of optical disc backup? Nice, it'll still cost me $60-70 to back up all my drives, but cheaper than extra hard drives. Hope it hits the market soonish.
It randomly popped up in my recommendations, and I like that you go right to the news without explaining everything. OH yeah, also I was at 1.75x, and that made it even better for news highlights :D
So, that "1TB disc" is actually very old news, 7 years old, and Tom's Hardware was just super late to the party.
They're called "Datafilm Discs" and are actually just thick, multi-layer BDXL discs with an extra focusing element on the laser to switch between layers. Problem is, the only prototype ever publicised was 7 layers (so theoretically 700GB) of their own custom pressed film per disc, with the 7th having focusing issues, and it spins slower than a 1x CD from the late 80s.
but do they sell them?
Ah, they anticipate reaching market by 2026. If they work, I'll buy at least 10 petabytes' worth. This will let me archive my entire family history and data bank and store it in multiple low-cost locations that could survive well until a better solution is available.
@@Whatsup_Abroad Yeah, they're not going to have a lot of time left on their patent by the time it's on the market. Their longevity is definitely up for debate though. The more layers you add, the shorter the actual lifespan of the disc is, yet they're claiming the same lifespan and durability as a Verbatim M Disc BDXL for half the price? I doubt they're gonna make that mark...
While USB 4 20/40/80 Gbit might be less confusing, the real issue, just like with USB 3, is when all they put on a box is "USB 4" and you have to guess what speed it is.
If they don't specify otherwise, it's the lowest. Still a dumb naming system, regardless.
Inventors of revolutionary data transfer connections, absolutely at the forefront of technology, used worldwide by literally everyone.
Their kryptonite: Naming their products.
Still trying to understand how a single usb-c cable would reach 80gbps while my chonky ethernet port can just reach 1gbps
Every tech person ever: Just call it USB 4.1
my "existing" usb4 cables? y'all have been buying new electronics during the pandemic? my cables are from back when the versioning schemes made sense.
No split screen in Halo is ridiculous. This is one of the many reasons I upgraded my PC instead of getting a new console. If my girlfriend and all my friends each need their own $500 console, plus another screen, controller, copy of the game, etc., why not just have a PC at that point? Split-screen multiplayer is the best thing about consoles, and so few games support it anymore
"Just get Xbox Game Pass, deeerrrrrr" - some gamer somewhere.
No, really I've already seen someone make that argument.
@@tams805 yeah, I usually hear that argument, or "they can't do it because of graphical limitations, performance, etc." Yeah, ok. My N64 with its 4MB of RAM could do 4-player split screen, but a modern system can't? Devs can figure it out in most games; the game companies just don't want to, because online-only makes more money.
"shut the front door!" got me lmao
i love that you told me about the name change, then said my existing usb4 stuff and my immediate response was that i dont have any usb4 stuff yet
I can't believe I'm saying this, but Riley's opening movements were my highlight for YouTube today!
Pretty sure Tetsuya Nomura is in charge of naming USB revisions at this point. Can’t wait for USB 358/2 Days Final Chapter Prologue
The no-split-screen business is a nightmare for us parents who want our kids to be able to game at the same time on consoles. Two TVs, two consoles, two games for my twins. That's why they're almost exclusively on PCs.
I just want usb to go by
USB (A, B, C, microB) - (data speed in Gbps), (power delivery in watts)
E.g. USB C - 10,100
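A throwaway sketch of that label format (one possible spelling of it; the function names are made up):

```python
# Illustrative formatter and parser for the "USB (connector) - (Gbps),(watts)"
# label format proposed above. Purely a sketch, not any real standard.
def make_label(connector: str, gbps: int, watts: int) -> str:
    return f"USB {connector} - {gbps},{watts}"

def parse_label(label: str) -> dict:
    _, connector, _, rest = label.split(" ", 3)  # "USB", "C", "-", "10,100"
    gbps, watts = rest.split(",")
    return {"connector": connector, "gbps": int(gbps), "watts": int(watts)}

print(make_label("C", 10, 100))        # USB C - 10,100
print(parse_label("USB C - 10,100"))
```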
lol "shut the front door"🤣🤣🤣
Somewhere in the near future:
USB hub with 80 Gbps ports $60
8 port ethernet switch with a single 10 Gbps port $300
Like come on, a single 40 Gbps QSFP+ module can cost like $500-1000, but then again, those have ranges of over 10km.
Linus said he's "never been so angry"! I can just see his tiny little fists shaking in rage.
1:40 - Actually, cables have two parameters, top bandwidth, and top wattage.
You better shut that front door, Riley! We're not paying to keep the outside cool!
The Qualcomm/Arm issue feels a lot like the unauthorized family sharing problem. Yes, your parents are paying for the service, however you don't even live in the same city and can afford your own sub. Isn't that more privateering than adblock?
4:23 THE PINK FLOYD REFERENCE ON HIS SHIRT