A nice thing about Ethernet cables is the fact that even if one breaks, you can simply cut it and replace the plug with a new one for, like, what? 20 cents? Maybe less, as long as you know how to do it. You can't repair USB cables this easily on your own.
You can literally just splice those ends together by twisting the wires and taping them. If ya wanna be cheap, a lighter + a grocery bag will suffice. The real bitch is those damn connector ends, which are very fragile and which most people don't wanna bother buying new ones of and soldering.
Eh, the tool you need to press on a jack costs like $300 or more. You can buy cheaper ones, but 99% of those are faulty, unreliable, and most likely won't get the job done, or only for a short time. A jack that doesn't need a special tool costs like $10-35, depending on Cat certification and quality.
@@freewayross4736 I've never had a problem using a basic crimper (the quality of what you'd get from Home Depot or Lowe's). I doubt both of mine would have added up to $100, and I can't remember the last time I had to redo an end due to a fault. I don't do terminations daily, but I've used it at least 100 times and with various RJ45 connector brands.
There is an alt mode for USB-C for passing analog audio out & mic in, and AFAIK some phones use this and give you just a very short USB-C to female-jack passive "dongle"
I know I'm not alone here, but holy crap that last point really hit home. The amount of times I've laid on the floor and unplugged everything out of the back of my PC and then had to plug everything back in again, in low-light conditions, is kind of absurd. If everything was the same connector I would definitely have to label it all.
I had to explain to my parents over the phone that my (younger) brother’s PC has to have the hdmi plugged into the graphics card not the motherboard. But first I had to explain to them what a graphics card was. I can’t even imagine having to tell them which of 15 identical ports to plug the display USB-C into.
I work at Best Buy. So many accessories still have USB A style ports it is insane. A lot of laptops do use usb-c for power but I barely sell any usb-c accessories.
For networking, at least, the SFP approach seems like a pretty neat compromise between a bunch of things here. Having a common interface for a transceiver that you can make whatever you want (from a DAC cable that only goes a meter, to a fiber transceiver that can go multiple kilometers) is super powerful.
Pity there's a vendor war with SFPs where one vendor's SFP slot will not talk to another vendor's SFP module. Not all vendors, but the major ones do this, yet the modules are all made by, like, 2 actual manufacturers.
Analog signals through USB, specifically audio, IS a thing. It's called audio adapter accessory mode. It uses the USB 2.0 datalines to transfer stereo audio.
Yeah, that's why there are Lightning-to-audio and Type-C-to-audio adapters that are nothing but an adapter with no real electronics of their own. While analogue audio might work a certain way, it can actually be replicated digitally without special converter chips. It's just *better* if done that way.
There was a 3.5mm jack in that first screen of connectors, followed by an explanation saying they were all digital 👀 I understand that with remotes and triggers it's technically digital, but NO I SAY, 99% of uses are analogue. I know you guys like to be accurate :)
There is also an analog standard for USB Type-C, as long as there is a DAC inside the device. This is an optional part of the spec and isn't used in budget devices, but most cheap adapters that you can find will say which phones they are compatible with, because those are the phones that work with that optional spec.
@@habilain Ah, they were used for BNC input at one point and are used in hi-fi for digital, so I'll let those slide, because it's replacing TOSLINK in modern hi-fi for digital signal.
@@xathridtech727 That's correct, though nearly all the time when you use a USB adapter, the adapter itself has a DAC built in. In early USB-C days, when phones started to remove the 3.5mm jack, it was common for them to have an analogue output rail, though now it almost exclusively requires a DAC. So when you plug USB-C analogue headphones that don't have a self-contained DAC into a new device, it spits out a "device not supported" error, as it's shorting the digital signal outputs. The devices are short-protected though, so it just shows an error message.
@Linus Tech Tips I highly doubt that the real LTT appreciates anything you’re doing. There’s also not a “Part 2” of this video. LTT never does “Part 2”s of any of their videos.
If every connector looks the same, then it should do all of it. E.g., if a laptop has just 4 Type-C ports, then all of those ports should be capable of display out (i.e. HDMI, DisplayPort), normal USB connections, and charging the laptop; everything could be handled by each of those 4 ports. That'd be much more user-friendly and awesome to have!
The keyboard & mouse connectors were also triggered by keystrokes/moves instead of "always listening in intervals" (aka engineered latency). And the game port had more ways of registering input than USB (that's why early USB controllers were shit / less capable). It took years and years before they were on the same level (and I'm not even sure they are now). Those standards were designed purposely for those specific applications. And maybe... just maybe... they're still just "better"
The enemy of the modern controller is XInput, which dumbed down all controllers into the Xbox 360 gamepad: 4 analog axes, 10 buttons, 2 triggers and an 8-direction D-pad. DirectInput, on the other hand, was much more advanced, with 128 buttons and 8 axes.
I have a video suggestion about Android Auto and Apple CarPlay. How does one really use it? Does your car need to be 'enabled' with it? Does your car need a screen? Do you connect your phone to your car via wire or wireless? Is it simply something that runs on your phone and audio is played through your car speakers? I don't really know how I'd get started and if my car supports it.
Usually, once a computer goes to standby, the DisplayPort connection gets terminated and removed from the display connection topology in hardware. Then, once the computer comes out of standby, the monitors connected via DisplayPort sometimes don't get properly detected, or the graphics hardware has to re-detect them, so there is either a noticeable delay or no picture at all.
Back to optical/fiber connectors. Yes, it does cost more, but you could have a fiber optic connector with power lines running alongside in a "one cable to ruin them all"
I remember when fiber optics first started becoming cheap and efficient enough to be used for low-volt signal transfers. I figured within ten years it would be used in everything, especially peripherals. Instead, wireless became the norm, which has much higher latency than even old-school metallic conductors lol.
Whenever my family asks me to plug in a console or a computer or anything, I remind them of these kids toys with blocks of certain shapes, like circle, rectangle or even a star, that you have to fit into the corresponding hole. It's the same thing, just different shapes. When everything's USB I need a new analogy!
Well, the idea is that once we reach the One Connector we'll also have reached the One Specification, or at the very least the One Connector will be compatible enough to not break.
If someone is determined enough, they can usually do just about anything. I work in tech support for a school district. I had a teacher plug the USB-C cable from her dock into a regular USB Type-A port. That was interesting.
One thing I'm surprised you didn't bring up is power delivery. USB-C can deliver enough power to run an integrated-GPU laptop, but good luck getting it to deliver the 230W required for a modern gaming one, or the utterly ridiculous power draw of desktop replacements.
One way would be to have a unique visible or physical shape and color on the ends to correspond to what it's meant to do, similar to the colors used for USB 1.0-3.0, before they threw that out of the window.
A great thing is that we got rid of large parts of the myriad of charge/power connectors for small stuff, like when I can use same cable to supply power to my phone, flashlight, laptop, Zigbee hub or microcontroller, including data when relevant.
Using one port for everything means you can do more things with limited ports. My laptop has a charging port, 3 USB ports, an HDMI port, a VGA port 💀, an ethernet port and headphone and microphone ports, even though I don't think any microphone even uses that port. It's either USB or XLR. If USB controlled all monitors, my laptop could have 5 USB ports, which means up to 5 monitors if keyboard and mouse can be plugged into a monitor. If my laptop also got powered by USB, that's 6 monitors and you can charge through the monitor.
The 3.5mm headphone jack STILL EXISTS (I have a smartphone that has that type of jack), as I love wired headphones (they give clearer sound, never need charging, are lighter, more cost-effective and very long-lasting too)
I thought this would be about power delivery. There's a good point about making everything under 100W just use USB power delivery. Might be more expensive though.
It seems to me the answer to the confuse-a-cable situation would be to have the system recognize the device that's been plugged in and give some kind of visual or audible notice to the user. For instance, say you plug a USB-C cable into the wrong port on a PC and then plug the other end into the monitor. The monitor could send a pilot signal out to the computer, which would have logic in it to identify what wants to communicate with it. If the port is not capable enough to support the peripheral, it would then send a signal back. This might be possible with both devices off, just using standby power. There could be lights on either the cable or the device that would come on briefly to let you know that the ports are not compatible. Plugging in and seeing a quick red light that lasts for three seconds on the back of the monitor or the back of the computer would give a quick indication that the ports are not compatible.

Additionally, if you're plugging in a device like a hard drive, a monitor, or a printer, and it has any form of screen or LED, that could instead be used to indicate to the user that there's a problem with the connection. The color of the light or the message on the device could give more information. For instance, if the LED is full RGB, the light could blink or change color, much as we see on other small devices, to tell the user whether the cable is the problem, compatibility is the problem, or other things that I can't think of at the moment. This would initially increase costs, but costs would be somewhat reduced by not having to support many different input/output standards.

Additionally, a standard for the color of the various ports, much like what is used for 3.5mm jacks on the back of PCs to differentiate between mic input and headphone output, could be used on USB ports to give another indication of the purpose of a specific port. For instance, a port designed for video speeds and below could be bright blue, while a port that is only fast enough for keyboards could be white. And we could revert to the old standards of pink, pale green and whatever the line-output color is to indicate USB ports that could be used for analog signals.
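The handshake proposed above could be sketched as a simple capability comparison. All names here are hypothetical illustrations of the commenter's idea, not any real USB mechanism (actual USB-C capability discovery happens over the CC wire via Power Delivery messages):

```python
# Hypothetical sketch of the proposed "pilot signal" compatibility check:
# the device lists what it needs, the source lists what its port supports,
# and an LED color summarizes the verdict for the user.

def check_link(port_caps, device_needs):
    """Return an LED color and the set of missing capabilities."""
    missing = device_needs - port_caps
    return ("green" if not missing else "red"), missing

# Example: a monitor asking for DisplayPort alt mode on a data-only port.
port = {"usb2", "usb3", "pd_60w"}        # what the PC port supports
monitor = {"usb2", "dp_alt_mode"}        # what the monitor needs

color, missing = check_link(port, monitor)
print(color, sorted(missing))  # red ['dp_alt_mode']
```

The same check against a capable port would come back green, which is all the commenter's three-second LED would need to show.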
Good script, covers a lot of points and presents a new, thoughtful way to look at the problem statement. Thanks for this one. Good luck for the next one.
Quest link compresses the video and sends it over as regular USB data, your computer doesn't need video output to use It. USB 3 (5 gbps) is all you need, even USB 2 can work.
Why not make an analogue specification for USB-C? Like, if there is a specific resistance between two specific pins, it uses all the pins for analogue signals, except the middle ones for data and the outer ones for ground. Bada bing, bada boom: 4 lanes of analogue with 4 for balancing/error correction.
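USB-C already does something in this spirit: terminations on the CC pins are distinguished by resistance, and pulling both CC pins down with Ra (roughly 1 kΩ or less) signals the analog Audio Adapter Accessory Mode, while Rd (about 5.1 kΩ) marks an ordinary sink. A rough sketch of that detection logic, with deliberately simplified thresholds:

```python
def classify_cc(cc1_ohms, cc2_ohms):
    """Simplified USB-C CC termination detection (thresholds approximate)."""
    def kind(r):
        if r < 1_200:       # Ra range: powered/audio accessory
            return "Ra"
        if r < 10_000:      # Rd is ~5.1 kOhm: a normal USB sink
            return "Rd"
        return "open"       # nothing pulling the pin down

    kinds = {kind(cc1_ohms), kind(cc2_ohms)}
    if kinds == {"Ra"}:     # Ra on BOTH pins -> analog audio accessory
        return "audio adapter accessory mode"
    if "Rd" in kinds:
        return "USB sink attached"
    return "nothing attached"

print(classify_cc(900, 1_000))    # audio adapter accessory mode
print(classify_cc(5_100, 10**9))  # USB sink attached
```

This is why a passive USB-C-to-3.5mm dongle can work at all: the phone measures those resistances and switches the pins over to analog.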
If the USB forum (or another group to replace them) can get their act together (it's too late with USB-C unless there are major changes) to make clear marketing and labeling of cable/port capabilities, and release products that make it easier for channels like this to test whether the capabilities live up to the standards claimed, I truly think that 10 years from now there would be only 2 types of ports: the "USB whatever" and an analog port. With how much power and back-and-forth communication there is, the main thing holding back cables is never being able to tell if the cable will work for any given thing, and when it doesn't, not knowing if it's the cable, the port, the other port, the software, or the device. If you have a way to know all ports and cables are compatible, then it's just down to a software or device issue, which is the same with any port.
Why do new keyboards and mice still come with type A... Almost everything has type C now. The same cannot be said for type A. When was the last time you saw a type A to type C dongle?
The solution for having different types of cables all using the same USB-C connector is to have a universal standard colour code for both the connector and the port for different cable types. You put the yellow connector into the yellow port, the green connector into the green port, etc.
It's awesome how a single port can be used for so many capabilities. It's just a matter of differentiating what each port and cable is able to do. Maybe the USB Consortium should standardize tier levels and give each a codename, something simple and short, so it can be printed both on the port and on the cable, with a standardized table for how to read it, instead of calling it something like USB 3.2 Gen 2x2 or adding a different logo for each capability. I don't know, maybe something like:

AA - USB Data 480Mbps
AF - USB Data 480Mbps + 15W PD
AG - USB Data 480Mbps + 30W PD
AH - USB Data 480Mbps + 65W PD
AI - USB Data 480Mbps + 100W PD
BG - USB Data 5Gbps + 30W PD
CH - USB Data 10Gbps + 65W PD
CI - USB Data 10Gbps + 100W PD
DU - USB Data 40Gbps + 100W PD + Display Output + PCIe Tunneling (Thunderbolt 3)
EU - USB Data 80Gbps + 100W PD + Display Output + PCIe Tunneling (Thunderbolt 4)
etc...

So if my grandparents buy a new phone and it has AF printed on the port, they'll just get a matching cable and charger. If an external GPU case needs DU, buy a laptop that provides DU, get a DU cable, done, simple. An 8K monitor needs EU? Just a matter of matching the requirements. If you use a lower-tier cable, charger, etc., it'll still work as the weakest link in the chain. For longer transmission you could use fiber optic cable; it would still handle all the capabilities except Power Delivery. For the fiber optic, maybe there's a new mechanism before going into Type-C, so if the cable breaks you can replace just the cable, which is cheap, and reuse the expensive part. Custom length, self-crimp, off-the-shelf FO, etc.
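If a tier scheme like that existed, matching gear (and working out what a mixed chain can actually do) would be a table lookup. A toy sketch using a few of the commenter's made-up codes — the codes and numbers are the comment's invention, not a real standard:

```python
# Hypothetical tier table from the comment above; not a real USB standard.
TIERS = {
    "AF": {"data_gbps": 0.48, "pd_w": 15},
    "CI": {"data_gbps": 10.0, "pd_w": 100},
    "DU": {"data_gbps": 40.0, "pd_w": 100},
}

def effective(*codes):
    """A chain of port + cable + charger degrades to its weakest link."""
    caps = [TIERS[c] for c in codes]
    return {
        "data_gbps": min(c["data_gbps"] for c in caps),
        "pd_w": min(c["pd_w"] for c in caps),
    }

# DU laptop port with a CI cable: data rate limited by the cable.
print(effective("DU", "CI"))  # {'data_gbps': 10.0, 'pd_w': 100}
```

The "weakest link" rule is the key design choice: any combination still works, it just works at the lowest common tier, which is how USB-C mostly behaves today, minus the labeling.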
Ethernet is likely to stay (different use-case: a multi-device network that isn't centered around a computer, with a lot of any-to-any traffic). Audio & hot-pluggable video links (RCA, and those DCI etc.) are likely to stay around too (a computer isn't necessarily the center of the setup). Why would stuff need to be computer-centric?
I mean, if you get an SSD like a Samsung T5 or T7, you are even running a hard drive off USB-C. It is odd that I have never had that video transfer issue you mention, though. I have had that with previous versions of USB, but never on USB-C.
If the connectors are all the same, it will be easy, not hard, for technically challenged people. I mean, you have to plug your monitor into your dedicated GPU, of course, while everything else goes on your mobo back plate.
It is incredible how many things use USB, given the fact that the connector is really not mechanically good (the connector itself is sturdy): there is basically no "hook" mechanism to prevent slipping, and the contact is (in certain devices) sheer luck...
I misunderstood the title. I thought he was going to talk about why shavers, toothbrushes and some other gadgets use USB but not others. I already know the answer, as the voltage requirements cannot be met by USB... at least not yet, but we are getting there. My television only uses 9 volts, which is far less than a 1980s CRT (cathode ray tube) that likely took nearly 30 volts. Just to clear the air, a device should use no more than 4.5 volts to run on USB. That leaves a buffer of .5 volts for fluctuation. I would like to hear someone else's perspective on the subject matter. Still an interesting video.
Other than audio and some PSU cables (not PC), we already have a very manageable lineup. Audio is a beast in itself, but power supplies for many devices could really use a slim-down in connector and supply unit standards. The internals of a PC are not commonly used "casual" cables, so it's fine how it is, I would say. Mainstream devices, besides some audio and PSU stuff, are already on a good base.
This is the problem we have with USB-C right now. Every connector looks the same but they aren't the same. Just because it fits doesn't mean it's compatible.
Yes and no. Assuming standards are followed, they *are* compatible, but they don't have the same capabilities. What I really dislike about the situation is that you can't tell what a specific cable can do just by looking at it.
@@samiraperi467 that's one thing. Every port on any cable should be Thunderbolt XY (or whatever) compatible by standard so that it doesn't matter which cable you grab for a given task.
But not only that. The protocols aren't even standardised. My Google phone doesn't charge on a Samsung cable or vice versa, and so on, no matter how much your smartphone could handle or the adapter could deliver. This was way easier with mini-USB. But now it is like before the days of charging via USB: every phone needs its own cable and adapter again. And this goes for everything where USB-C is involved.
This exactly. From display outputs to PCIE tunneling. You don't know what's what. Past that you don't know if the cable you are using even supports the bandwidth for those capabilities.
Even more problematic is power delivery. We already have different standards using the same connector that aren't compatible. In more extreme cases, like early USB-C to Type-A cables or the Nintendo Switch, that led to fried devices.
@Linus Tech Tips frig off
The USB cable itself should be colored, because you don't know what speed it can use. I got one that can handle 65 watts and one that can handle 100 watts. They are the SAME other than the color of the cable.
That last bit about tech support for a universal "one cable fits all" situation is so true it hurts.
just be glad your brand new toaster doesn't come with a cable but no brick
Mine didn't come with either 😢
@@Mrmonke6000 wireless toaster
Yeah, more e-waste
4090?
To be honest, I have too many single port adapters at home now
Ethernet is also galvanically isolated. For runs inside your house that doesn't add a lot of value, but for servers and apartment buildings, it does.
Isolated? Where? On the board, I guess, because definitely not inside the cable, so we could just use the same cable there. Because the USB-C cable is already being abused for literally everything anyway xD
@@LuLeBe It's isolated on the PCB. It's part of the Ethernet standard. I guess you could force a USB cable for that, but there are fewer twisted pairs inside a USB cable, so we would need a new standard for that.
Also, crimping it "in the field" would be a pain. One of the good things about Ethernet cables is that you can fit them through small holes and then crimp them with inexpensive and simple tools. For USB it's not so simple.
Actually, Ethernet has its place in houses. Run wire to remote access points to extend your network beyond what Mesh can do.
You can't daisy chain Mesh, all access points must connect directly back to the main gateway. So, it has to be in the dead center of a house. This causes issues in homes where you don't really have a choice where the main gateway lives.
On top of that, some people want to extend their network outside or have big houses. Mesh won't always cover a whole house because again you can't daisy chain.
@@OgdenM You can daisy chain mesh. Depending on the devices, you can daisy chain them in terms of WiFi only, or Ethernet daisy chaining. You couldn't do this in the past but we got past that. Do you think orbi or Google is going to have people running wires all over their house? They wouldn't sell any units if it was that complicated.
1:58 You put the HDMI plug in the wrong way.
I saw the animation and felt an internal pain of broken computer parts. When I rewatched the scene I knew why.
Yeah, another pain is that it floats from the bottom 😅
it can be done if you apply enough force
@@edfx by the same logic you could force a screwdriver in there 🤷
@@Ben4A or a brick
I would have liked if he talked more about USB potentially replacing regular power outlets, which is what is being alluded to in the thumbnail with the picture of the toaster with the USB-A cable.
YouTube is only clickbait nowadays 😭
Now you could use USB C 48V 5A to have 240W. Too low for a toaster (
Agree
Very bad idea for computer security
@@vadnegru there would have to be some plugs for stuff like heating elements, but wouldn't it be much safer if most of a house's wiring was low power?
and for completeness sake, the real hidden reason that DisplayPort, HDMI, USBC video, etc, compete is the avoidance of patents and licensing fees
usbc video is display port and hdmi tho.
@@gamagama69 There is such a thing as USB-C HDMI Alt Mode, but it kinda sucks cuz it doesn't support 4K@60FPS or HDR. I dunno why they gimped the standard that way when DisplayPort Alt Mode works just fine for those features.
@@MikeTrieu Are most USB-C to HDMI adapters just DP alt mode running HDMI? Cuz DisplayPort can just pass through HDMI passively.
isn't Display port free to use unlike HDMI ?
@@MikeTrieu HDMI over USB-C alt mode was specced out, but effectively no one used it. IIRC it's been dropped.
I think we're currently in a pretty sweet spot between one cable and lots of cables. And looking at the mess USB is making with their naming and supported protocols, it's probably a good thing one single cable isn't a reality (yet)
It's like the USB-IF is shooting itself in the foot by letting members get away with stretching the truth about standards compliance, loosening those standards and letting member greed take over the common goal of an actually universally compliant standard.
There won't be 1 cable to rule them all, because by the time that tech is developed, everything is likely going to be wirelessly connected.
Power hungry appliances like certain monitors or power supplies will likely never lose their cables.
@@Not_interestEd- Correct! Except all those wireless devices will need USB cables to charge them. And they will all require different levels of power delivery... Oh wait. We have that problem now.
@@finkelmana charging through wireless exists, although that would require a cable to do..... ehhhhhh........
I'd say if any cable will rule them all, it's gonna be USB-C
@@Not_interestEd- Wireless charging really is just a step above being a novelty. It is EXTREMELY inefficient; most of the power is lost. It also has no range. As for USB-C being the future of charging? Nope. USB-C cannot be *THE* universal charger. The wires inside simply do not have the mass to support higher amperages. Not to mention USB-C cables are far more expensive to make. It's fine for low-power devices, but for anything that takes real current, it's just not good enough.
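The amperage point is the real constraint, and it's why USB-PD raises voltage rather than current: for a fixed power, I = P / V, and standard USB-C cables are only rated for 3 A (5 A with an e-marked cable). A quick sanity check of what 240 W would demand at different bus voltages:

```python
def amps(power_w, volts):
    """Current required to deliver a given power at a given voltage."""
    return power_w / volts

# 240 W at the classic 5 V bus would need a wildly impractical 48 A,
# while USB-PD EPR's 48 V level brings it down to the cable-rated 5 A.
print(amps(240, 5))   # 48.0
print(amps(240, 48))  # 5.0
```

So the thin wires stay thin; it's the voltage, negotiated over the cable, that scales up.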
I guess having USB-C on everything for DC power delivery would be much better than having those stupid, always slightly different barrel plugs...
Love the animation of the HDMI going into its port upside-down.
The pain
Some are like that actually
(1:57) just noticed the connector is reversed when plugged in 😅
As a UX designer when you said "you don't know what it does" really hit home.
I want my fridge powered via USB C
Mine is! 🥶
It is not impossible. While older units had compressors rated up to 800W, modern ones are around 100W. Energy-efficient small units can do with compressors as small as 40W. However, most of the time we trade insulation for extra storage volume.
1:58 I'm upset that the port and cable aren't facing the same way
@@Doroga05 too
I had to do a double take because I couldn't believe what I saw, lol
@@destoru same.
The one thing I don't like about USB-C is how small and fragile it is, especially on the lower-end side. Like, cheap USB-A is pretty decent, durability-wise.
I agree. It's not just fragile, but the small size makes it harder to get a good grip. Pulling out a USB or HDMI plug is easy because you can get a good grip, so you pull it out straight. USB-C is very small, so sometimes people wiggle it side to side while pulling to get it out. That's not gonna be good for the plug or port in the long run.
I work in school IT support. We service Chromebooks and Lenovo laptops (for staff) that have USB-C as the charging port. We have numerous devices with bad USB-C ports that we cannot fix nor replace the USB-C port. It would be nice if the ports themselves were modular. That way if they go bad, we can easily swap out the port. We don't have tools, training, parts, and most importantly, time for those types of repairs. If the device is still under warranty, we have a repair partner who can usually do warranty work. However, if it is out of warranty or it doesn't cover that damage, we're out of luck.
Really? That hasn't been my experience, I've had far more USB-A ports/cables die on me than USB-C
USB-C ports and cable tips are pretty durable when compared to USB micro-B, mini-B, lightning, and mini-HDMI.
I will say that USB-C ports are difficult to repair, unless they are on their own daughterboard.
The one issue with using USB-C for charging is that it's the port that gets the most use, and it's usually the first to need repairs on laptops.
The funny thing I've noticed is that a lot of Android smartphones have a "proper" way to insert the USB-C charger, but the charging cables themselves don't show anything like that. So if I insert the USB-C cable on the wrong side, the phone will charge slowly instead of fast charging.
a nice thing about ethernet cables is the fact that even if it breaks, you can simply cut it and replace the plug with a new one for like what? 20 cents? maybe less, as long as you know how to do it, you can't repair USB cables this easily on your own
Literally can just splice those ends together by twisting wires and tape. If ya wanna be cheap, a lighter + a grocery bag will suffice.
The real bitch is those damn connector ends, which are very fragile, and most people don't wanna bother buying new ones and soldering them.
Eh, the tool you need to press on a jack costs like $300 or more. You can buy cheaper ones, but 99% of those are faulty, unreliable, and most likely won't get the job done, or only for a short time. A jack that doesn't need a special tool costs like $10–35, depending on the Cat certification and quality.
@@freewayross4736 I've never had a problem using a basic crimper (the quality of what you'd get from Home Depot or Lowe's). I doubt both of mine would have added up to $100, and I can't remember the last time I had to redo an end due to a fault. I don't do terminations daily, but I've used it at least 100 times and with various RJ45 connector brands.
@@volvo09 what are your plugs rated for? Cat5?
There is an alt mode for USB-C for passing analog audio out & mic in, and AFAIK some phones use this and give you just a very short passive USB-C to female jack "dongle".
I know I'm not alone here, but holy crap that last point really hit home. The amount of times I've laid on the floor and unplugged everything out of the back of my PC and then had to plug everything back in again, in low-light conditions, is kind of absurd. If everything was the same connector I would definitely have to label it all.
I had to explain to my parents over the phone that my (younger) brother’s PC has to have the hdmi plugged into the graphics card not the motherboard. But first I had to explain to them what a graphics card was.
I can’t even imagine having to tell them which of 15 identical ports to plug the display USB-C into.
en.wikipedia.org/wiki/Keep_Talking_and_Nobody_Explodes
@@son-tchori7085 well, at least tech support is lower stakes than that
"The one that says GRAPHICS CARD" on it"
Boss did the same, moved pc and plugged into the motherboard.
This is where I strongly recommend those plastic port plugs. They won't stop the more determined users, but they help avoid the wrong ports a little.
That AI art 1:20 lmaoooooo the keyboard what the helllllll
I work at Best Buy. So many accessories still have USB A style ports it is insane. A lot of laptops do use usb-c for power but I barely sell any usb-c accessories.
For networking, at least, the SFP approach seems like a pretty neat compromise between a bunch of things here. Having a common interface for a transceiver that you can make whatever you want (from a DAC cable that only goes a meter, to a fiber transceiver that can go multiple kilometers) is super powerful.
Pity there's a vendor war with SFPs, where one vendor's SFP slot will not talk to another vendor's SFP module. Not all vendors, but the major ones do this, yet the modules are all made by like 2 actual manufacturers.
Anyone else notice the HDMI goes in backwards at 1:58?
Was gonna mention this.😂
Analog signals through USB, specifically audio, IS a thing. It's called audio adapter accessory mode. It uses the USB 2.0 datalines to transfer stereo audio.
Yeah, that's why there are Lightning to audio and type-C to audio that are nothing but an adapter with no real electronics of their own.
While analogue audio might work a certain way, it can actually be replicated digitally without special converter chips. It's just *better* if done that way.
I really hate the fact that they removed audio jacks from some new devices; it makes it harder to use old sound systems as an amplifier with just one cable.
4:44 that feeling when a banger starts in your game
At 1:58 it cracked me up to see the illustration of an HDMI cable being plugged into the monitor in the wrong orientation.
There was a 3.5mm jack in that first screen of connectors, followed by an explanation saying they were all digital 👀 I understand that with remotes and triggers it's technically digital, but NO I SAY, 99% of uses are analogue. I know you guys like to be accurate :)
There is also an analog standard for USB Type-C, as long as there is a DAC inside the device. This is an optional part of the spec and isn't used in budget devices, but most cheap adapters that you can find will say what phones they are compatible with, because those are the phones that support that optional spec.
The composite cables (Yellow/Red/White) on the same screen are an old analogue TV cable, and I'm not aware of any uses of it that are non-analogue.
@@habilain ah they were used for BNC input at one point and are used in hifi for digital so I'll let those slide because it's replacing toslink in modern hifi for digital signal
@@xathridtech727 that's correct, though nearly all the time when you use a USB adapter, the adapter itself has a DAC built in. In early USB-C, when phones started to remove the 3.5mm, it was common for them to have an analogue output rail, though now it almost exclusively requires a DAC. So when you plug USB-C analogue headphones which don't have a self-contained DAC into a new device, it spits out a "device not supported" error, as they're shorting the digital signal outputs. The devices are short-protected though, so it just reads an error message.
@Linus Tech Tips I highly doubt that the real LTT appreciates anything you’re doing. There’s also not a “Part 2” of this video. LTT never does “Part 2”s of any of their videos.
I love that little dance at 4:40 x33
I remember we bought a bunch of servers and was dismayed to find they were USB 1.1. Servers were usually a version behind.
At least you could add add-on card with newer USB
@@vadnegru I think that’s what we ended up doing to a few of them.
4:40 lol 😂😂😂😂
If every connector looks the same, then it should do all of it. For example, if a laptop has just 4 Type-C ports, then all of those ports should be capable of display out (i.e. HDMI, DisplayPort), normal USB connection, and charging the laptop, so everything could be handled by each of those 4 ports. That'd be much more user-friendly and awesome to have!
4:40 Great b-roll moment
That HDMI connector being the wrong way around at 1:58 oof ouch. Having seen HDMI ports ripped apart by people..
The 3.5mm jack is essential, as it does its job perfectly, but phone manufacturers seem to not understand that.
Which is why Bluetooth is the alternative choice, being wireless.
1:58 MAGIC!!! The HDMI is upside down and connected perfectly hehehe
The keyboard & mouse connectors were also triggered by keystrokes/moves instead of 'always listening in intervals' (aka engineered latency)
And the game port: more ways of registering input than USB. (That's why early USB controllers were shit / less capable)
It took years and years before they were on the same level (and I'm not even sure they are now)
Those standards were designed purposely for those specific applications. And maybe... Just maybe... They're still just 'better'
The enemy of the modern controller is XInput, which dumbed down all controllers into the Xbox 360 gamepad: 4 analog axes, 10 buttons, 2 triggers, and an 8-direction d-pad. DirectInput, on the other hand, was much more advanced, with 128 buttons and 8 axes.
I have a video suggestion about Android Auto and Apple CarPlay. How does one really use it? Does your car need to be 'enabled' with it? Does your car need a screen? Do you connect your phone to your car via wire or wireless? Is it simply something that runs on your phone and audio is played through your car speakers? I don't really know how I'd get started and if my car supports it.
I feel like if a toaster used USB like in the thumbnail we would have to "waste" a power brick instead of plugging it straight in.
1:58 HDMI cable going in upsidedown, lol
Could have mentioned the DAC in a USB C to 3.5mm jack at the end of the vid too
I'd like to point out that the HDMI cable at 1:59 is going in backwards.
3:50 I just noticed linus has a bald spot lol
Why does DisplayPort suck sometimes? I have never used it, but I was about to buy a TV and use DP to connect it at 10 m.
Everything sucks sometimes, I guess
Usually, once a computer goes to standby, the DisplayPort connection gets terminated and removed from the display connection topology in the hardware, and once the computer goes out of standby, sometimes the monitors connected via DisplayPort do not get properly detected, or the graphics hardware has to do it, so there is either a noticeable delay, or no picture at all.
Back to optical / fiber connectors. Yes, it does cost more, but you could have a fiber optic connector with power lines running alongside in a "one cable to rule them all".
Don't forget about Powered USB or the 20 V USB, which are common on cash registers.
I remember when fiber optics first started becoming cheap and efficient enough to be used for low-voltage signal transfers. I figured within ten years it would be used in everything, especially peripherals. Instead, wireless became the norm, which has much higher latency than even old-school metallic conductors lol.
Whenever my family asks me to plug in a console or a computer or anything, I remind them of these kids toys with blocks of certain shapes, like circle, rectangle or even a star, that you have to fit into the corresponding hole. It's the same thing, just different shapes.
When everything's USB I need a new analogy!
well the idea is that once we reach the One Connector, we'll have also reached the One Specification, or at the very least the One Connector will be compatible enough to not break
You're speaking to a person with a wild, self-made display port to vga because I wanted to use the hdmi port for my switch
We as a civilization can't even agree on a single power plug, socket, voltage, and frequency standard.
How did that HDMI cable get plugged in upside down?!
Force. A large amount of Force. Maybe even The Force. The Tri-Force?
You haven't seen molex power connector plugged in into hdd upside down yet.
@@yaroslavpanych2067 (Shh, you're going to scare him.)
If one is so determined, you can usually do just about anything. I work in tech support for a school district. I had a teacher plug the USB-C cable from her dock into a regular USB type A port. That was interesting.
3:47 - you know USB just won't cut it when even Linus became so frustrated and activated his sharingan.
Would love to see you guys do a video on starlink and the future of alternative data sources!
One thing I'm surprised you didn't bring up is power delivery. USB-C can deliver enough power to run an integrated-GPU laptop, but good luck getting it to deliver the 230 W required for a modern gaming one, or the utterly ridiculous power deliveries for desktop replacements.
USB-C can deliver up to 240 W nowadays, I believe
@@adamdavies5599 If you can show me that in use I'll be pretty amazed
One way would be to have a unique visible physical shape and color on the ends to correspond to what a cable is meant to do, similar to the colors used for USB 1.0–3.0, before they threw that out of the window.
Like, headphones with a USB plug and without an analog/digital converter aren't a thing nowadays.
A great thing is that we got rid of large parts of the myriad of charge/power connectors for small stuff, like when I can use same cable to supply power to my phone, flashlight, laptop, Zigbee hub or microcontroller, including data when relevant.
Using one port for everything means you can do more things with limited ports. My laptop has a charging port, 3 USB ports, an HDMI port, a VGA port 💀, an ethernet port and headphone and microphone ports, even though I don't think any microphone even uses that port. It's either USB or XLR. If USB controlled all monitors, my laptop could have 5 USB ports, which means up to 5 monitors if keyboard and mouse can be plugged into a monitor. If my laptop also got powered by USB, that's 6 monitors and you can charge through the monitor.
3:18 Ethernet also electrically isolates every device, so ground differences etc. aren't a problem. Ethernet is robust.
5:18
"Grab the cable"
"Which one"
"The Chosen One!"
Hey LTT, are you guys going to ChatGPT for your scripts again?
Attenuation is not degraded speeds; it's degradation of signal strength.
The 3.5mm headphone jack STILL EXISTS. (I have a smartphone that has that type of jack, as I love wired headphones: they sound clearer, never need charging, are lighter, more cost-effective, and very long-lasting too.)
I thought this would be about power delivery. There's a good point about making things that are under 100 W just use power delivery. Might be more expensive, though.
It seems to me the answer to the confuse a cable situation would be to have the system be able to recognize the device that’s been plugged in and give some kind of a visual or audible notice to the user.
For instance, let’s say you plug a USB-C cable into the wrong port on a PC and then plug the other end into the monitor. The monitor could send a pilot signal out to the computer, which would have logic in it to identify what wants to communicate with it. If the port is not of a sufficient level to support the peripheral, it would then send a signal back. This might be able to be done with both devices off, just using standby power. There could be lights on either the cable or the device that would come on briefly to let you know that the ports are not compatible. Plugging in and seeing a quick red light that lasts for three seconds on the back of the monitor or the back of the computer would give a quick indication that the ports are not compatible. Additionally, if you’re plugging in a device like a hard drive, a monitor, a printer, etc., and it has any form of screen or LED, that could instead be used to indicate to the user that there’s a problem with the connection. The color of the light or the message on the device could give more information. For instance, if the LED is full RGB, the light could blink or change color, much as we see on other small devices, to tell the user whether the cable is the problem, the compatibility is the problem, or other things that I can’t think of at the moment.
This would initially increase costs, but they would be somewhat reduced by not having to support many different input/output standards.
Additionally, a standard color for the various ports, much like what is used for the 3.5 mm jacks on the back of PCs to differentiate between mic input and headphone output, could be used on USB ports to give another indication as to the purpose of a specific port. For instance, a port designed for video speeds and below could be bright blue, while a port that is only sufficiently fast for keyboards could be white. And we could revert to the old standards of pink, pale green, and whatever the line-output color is to indicate USB ports that could be used for analog signals.
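The "pilot signal" handshake described above could be sketched as simple comparison logic. This is purely a hypothetical toy model: the names (`PortCaps`, `check_link`) and the LED colour scheme are invented for illustration and are not part of any real USB mechanism.

```python
# Hypothetical sketch of the "pilot signal" idea: the peripheral advertises
# what it needs, the host port advertises what it offers, and an LED colour
# summarises the comparison. All names and thresholds are invented.

from dataclasses import dataclass

@dataclass(frozen=True)
class PortCaps:
    data_gbps: float
    watts: int
    display_out: bool

def check_link(host: PortCaps, device_needs: PortCaps) -> str:
    """Return an LED colour summarising compatibility."""
    if (host.data_gbps >= device_needs.data_gbps
            and host.watts >= device_needs.watts
            and (host.display_out or not device_needs.display_out)):
        return "green"          # fully compatible
    if host.watts < device_needs.watts:
        return "red"            # can't power the peripheral at all
    return "yellow"             # works, but degraded (slower link, no video)

monitor = PortCaps(data_gbps=10.0, watts=15, display_out=True)
usb2_port = PortCaps(data_gbps=0.48, watts=7, display_out=False)
print(check_link(usb2_port, monitor))  # → red
```

A real implementation would run over the existing CC-line negotiation at standby power, but the idea is the same: compare advertised capabilities and surface the result to the user immediately.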
Good script; covers a lot of points and presents a new, thoughtful way to look at the problem statement. Thanks for this one. Good luck with the next one.
Damn DVI is from the 20th century? I was in high school when this came out!
I still use it... like not on retro rigs even!
I'm still using DVI too, for a PC screen. Feels slightly more responsive than HDMI to me, but maybe it's placebo.
2:47 Whats that cable!!?? Ethernet!!!
My Quest 2 does use video over USB, actually. But it's been a pain to find the right cable, and also the right USB-C connector on a motherboard.
Quest Link compresses the video and sends it over as regular USB data; your computer doesn't need video output to use it. USB 3 (5 Gbps) is all you need; even USB 2 can work.
Ethernet cables can also be easily cut and crimped by someone running cables, you’d pretty much need a soldering iron for USB
*Who's that Pokémon?*
-"IT'S SPERMM!!!!"
*It's Ethernet!*
-"FuUÆcCK!"
Why not make an analogue specification for USB-C? Like, if there is a specific resistance between two specific pins, it uses all the pins for analogue signals, except the middle ones for data and the outer ones for ground. Bada bing, bada boom: 4 lanes of analogue with 4 for balancing/error correction.
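Funny enough, USB-C already does something along these lines for audio: Audio Adapter Accessory Mode is signalled by pulling both CC pins to ground through a small resistance (Ra). A toy model of that detection logic, using illustrative thresholds rather than the exact values from the Type-C spec:

```python
# Simplified model of USB-C CC-pin sensing. A sink pulling *both* CC pins
# low through Ra signals an analogue audio accessory; Ra on only one pin
# indicates a powered/e-marked cable. Threshold is approximate, and the
# real spec distinguishes more states (Rd presence, debug accessory, etc.).

RA_MAX_OHMS = 1200  # approximate upper bound for Ra

def sense_cc(cc1_ohms: float, cc2_ohms: float) -> str:
    cc1_ra = cc1_ohms <= RA_MAX_OHMS
    cc2_ra = cc2_ohms <= RA_MAX_OHMS
    if cc1_ra and cc2_ra:
        return "audio-accessory"   # both CC pins pulled low -> analogue mode
    if cc1_ra != cc2_ra:
        return "powered-cable"     # Ra on one side only
    return "normal"

print(sense_cc(1000, 1000))  # → audio-accessory
```

So a resistor-signalled "analogue mode" is not far-fetched at all; the spec just limits it to headset audio rather than four full analogue lanes.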
Riley dancing at his desk made my day 😂
If the USB forum (or another group to replace them) can get their act together (it's too late with USB-C unless there are major changes) to make clear marketing and labeling indications of cable/port capabilities, and release products to make it easier for channels like this to test capabilities and make sure they are up to the standards claimed, I truly think that 10 years from now there would only be 2 types of ports: the "USB whatever" and an analog port. With how much power and back-and-forth communication there is, the main thing holding back cables is never being able to tell if the cable will work for any given thing, and when it doesn't, not knowing if it's the cable, the port, the other port, the software, or the device. If you have a way to know all ports and cables are compatible, then it's just down to a software or device issue, which is the same with any port.
it bothers me so much that the graphic of the HDMI cable being plugged in had the cable plugged in upside down
Why do new keyboards and mice still come with type A... Almost everything has type C now. The same cannot be said for type A. When was the last time you saw a type A to type C dongle?
The solution for having different types of cables all using the same USB-C connector is to have a universal standard colour code for both the connector and the port for different cable types. You put the yellow connector into the yellow port, the green connector into the green port, etc.
I used a CAT6 cable to make a 30 foot usb cable. It might not meet spec, but it works
1:18
what monitor is that??? ITS BEAUTIFUL
Universal* Serial BUS
\*Some restrictions may apply*
1:58 makes me wanna scream
It's awesome how a single port can be used for so many capabilities. It's just a matter of differentiating what each port and cable is able to do.
Maybe the USB consortium should standardize tier levels and give each a codename, something simple and short, so it can be printed both on the port and on the cable.
Make a standardized table for how to read it, instead of calling it something like USB 3.2 Gen 2x2 or adding a different logo for each capability.
I don't know, maybe something like:
AA - USB Data 480Mbps
AF - USB Data 480Mbps + 15W PD
AG - USB Data 480Mbps + 30W PD
AH - USB Data 480Mbps + 65W PD
AI - USB Data 480Mbps + 100W PD
BG - USB Data 5Gbps + 30W PD
CH - USB Data 10Gbps + 65W PD
CI - USB Data 10Gbps + 100W PD
DU - USB Data 40Gbps + 100W PD + Display Output + PCIe Tunneling (Thunderbolt 3)
EU - USB Data 80Gbps + 100W PD + Display Output + PCIe Tunneling (Thunderbolt 4)
etc...
So if my grandparents buy a new phone and "AF" is printed on the port, they'll just get a matching cable and charger.
If an external GPU case needs DU, buy a laptop that provides DU, get a DU cable, done, simple.
An 8K monitor needs EU? Just a matter of matching the requirements.
If you use a lower-tier cable, charger, etc., it'll still work as the weakest link in the chain.
For longer transmission you can use a fiber optic cable; it'll still handle all the capabilities except Power Delivery.
For the fiber optic, maybe there's a new mechanism before going into Type-C, just so if the cable breaks, you can replace the cable only, which is cheap, and reuse the expensive part.
Custom length, self-crimp, off-the-shelf FO, etc.
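Those two-letter tiers would essentially be a lookup table: first letter for data rate, second for extras. A quick sketch of decoding them (the codes are the hypothetical ones proposed above, not any real USB-IF naming):

```python
# Decode the hypothetical two-letter tier codes proposed above:
# first letter = data rate, second letter = power/extra capabilities.

DATA_TIERS = {"A": "480Mbps", "B": "5Gbps", "C": "10Gbps",
              "D": "40Gbps", "E": "80Gbps"}
EXTRA_TIERS = {"A": None, "F": "15W PD", "G": "30W PD", "H": "65W PD",
               "I": "100W PD",
               "U": "100W PD + Display Output + PCIe Tunneling"}

def decode(code: str) -> str:
    """Expand a two-letter tier code into its full capability string."""
    data = DATA_TIERS[code[0]]
    extra = EXTRA_TIERS[code[1]]
    return f"USB Data {data}" + (f" + {extra}" if extra else "")

print(decode("AF"))  # → USB Data 480Mbps + 15W PD
print(decode("DU"))  # → USB Data 40Gbps + 100W PD + Display Output + PCIe Tunneling
```

The appeal of a scheme like this is that matching a device to a cable becomes a simple string comparison instead of decoding logos and "Gen 2x2" branding.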
I don't know why you don't have more upvotes
anyone else notice how at 1:58 the hdmi cable is the wrong way? yet plugs in......
Current limit. At most it's 0.9 A at 5 V, or 4.5 W. Typical is 0.5 A, or 2.5 W.
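Those numbers are just P = V × I for legacy 5 V USB ports:

```python
# Power budget of legacy USB ports: P = V * I at the fixed 5 V rail.
volts = 5.0
print(volts * 0.9)  # USB 3.x high-power port: 4.5 (watts)
print(volts * 0.5)  # USB 2.0 typical limit:   2.5 (watts)
```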
Ethernet is likely to stay (different use-case: multi-device network that isn't centered around a computer, a lot of any to any).
Audio & hot-pluggable video links (RCA, and those DCI etc.) are likely to stay around too (a computer isn't a given to be the center of the setup).
Why would stuff need to be computer centric?
Thank you for breaking down cable lengths from metric to bananas so I, an American, can understand the scale
I mean, if you get an SSD like a Samsung T5 or T7, you are even running a hard drive off USB-C. It is odd that I have never had that video transfer issue you mention, though. I have had that with previous versions of USB, but never on USB-C.
1:00 Respect
USB-C / Thunderbolt 3 cables can have chips in their connectors. Ethernet is so simple in comparison, with twisted pairs in an RJ45.
I like the use of bananas as measuring references
If the connectors are all the same, it will be easy, not hard, for technically challenged people. I mean, you have to plug your monitor into your dedicated GPU of course, while everything else goes on your mobo back plate.
Can't wait for the labs to help me buy USB cables that actually goddamn work.
It is incredible how many things use usb, given the fact that the connector is really not mechanically good (the connector itself is sturdy), there is basically no 'hook' mechanism to prevent slipping, and the 'contacts' are (in certain devices) sheer luck...
I misunderstood the title. I thought he was going to talk about why shavers, toothbrushes, and some other gadgets use USB but not others. I already know the answer, as the voltage requirements cannot be met by USB... at least yet, but we are getting there. My television only uses 9 volts, which is far less than a 1980s CRT (cathode ray tube) that likely took nearly 30 volts. Just to clear the air, a device should use no more than 4.5 volts to run on USB. That leaves a buffer of 0.5 volts for fluctuation. I would like to hear someone else's perspective on the subject matter. Still an interesting video.
1:59 How does that connector work?
Other than audio and some PSU cables (not PC), we already have a very manageable lineup. Audio is a beast in itself, but power supplies for many devices could really use a slim-down in different connectors and supply-unit standards. The internals of a PC are not commonly used "casual" cables, so it's fine how it is, I would say. Mainstream devices, besides some audio and PSU stuff, are already on a good base.
There’s like 5 million barrel jack connectors, sometimes even on devices that don’t necessarily need them and can use USB-C just fine 😭
The HDMI cable being the wrong way at 1:58 bothers me more than it should.
that edit of James coming out of the printer is so silly but it made me laugh out loud 😅
Thanks for the video!
Omg a million yeses for the Who's That Pokémon cut away before the sponsor
Ultimately, I do have a single connector for everything. I have a dock connected via USB-C which breaks out the Ethernet and DisplayPort.
And charges laptop and data for km and my usb soundbar. Blew my mind 18 months ago 😂 and maybe 4 monitors