the "jitter" in the elevator scene is actually an artifact of the source engine. They always jitter during elevator scenes because of how they didn't attach players to moving platforms, so they're falling and hitting the ground ~30x/second (the tickrate for L4D2)
Yup, been looking for people pointing that out in the comments. Not their obligation to know that, but still is an "unfair" call to the GPU (not that it could be considered good anyways, given the other problems it has lol).
@@dirtypink3197 Haven't played L4D2 in a long time but the last time I played the 360 version it was a buggy mess*. The last update completely ruined it. PC version still worked great the last time I played it. Never payed enough attention in the elevator to notice jittering (usually too busy healing up and prepping for the next section, especially when playing with randos or NPCs that like to disturb the witch). *A lot of Falling through floors, elevator just didn't open half the time or levels would otherwise be incomplete-able, Rochelle or Zoey would end up in a weird slanted state or constantly falling down and getting up again, voting didn't work and would either pass the vote no matter what or not pass the vote no matter what (whatever if felt like).
*CORRECTION* 17:30 - That jitter isn't due to the GPU/WHDMI but due to Left 4 Dead 2 thanks to the Source Engine's physics actually moving you and your AI-controlled teammates in an elevator. If you pay attention to the HUD elements in the bottom left or your weapons/items in the middle right, there is no jitter. Cool bit of outmoded tech, though! Thanks LTT for covering this!
Yeah fr And then I'm all "well, is your GPU at least not busted tf loosened solder and all" on 6800XT mining cards idk man, I seen some like, tech pro tier stuff out there on the used market with all the materials even the sticker for CPU and such, and then you get those really weird guys who literally didn't clean the cheetoh grease off their laptop and GPU. Guys please be like this guy. Keep your materials safe for resale.
After I got upto 300K trading with expert cathie wood .I bought a new House and I'm now able to send my kids to a better school in the states When someone is straight forward with what he or she is doing people will always speak up for them.
I had 4 of these units as TX RX pairs back in the day without the Graphics card. Their job was to pass 1920 X 1080 Pixel 30 Hz Music video from the DJ booth laptop HDMI to the ceiling projectors, where hard writing was impossible due to distance. They worked well with no line-of-sight problems until CAT 5 HDMI extenders were possible. All my clients were then moved to Network cable HDMI distribution systems. They had full IR remote passthrough. International Music Services (Malta)
Was the antenna / signal polarized? The way they have it is perpendicular how a card would sit in a tower. I wonder that's what caused signal degradation?
The jitter in 17:35 is because of the elevators being "real" in L4D2 and the engine struggles to keep the models on the elevator floor when its moving.
This definitely had its use in audio industry, especially with audio mixing studios for movies! There were strict regulations of noise levels, so all of the audio equipment including PC's were in different rooms
@@xx1simon1xx it is done by cables it is still the most reliable connection! This is just one step closer to perfect sound isolation and quite safe option, since if it stops working you replace it... the cables are usually in a network that goes through walls (which are acoustically treated) so if one breaks its expensive to replace them
You can use separate audio instead that would give you MUCH better sound anyway. And don't go into price, nobody on industries care about price so much so about actual qualities. You imply that yourself already
The reason for all of this jitter and weird 1080i vs. 1080p is NOT that the receiver is deinterlacing. The GPU is transmitting at 1080i and the Receiver is receiving at 1080i. The MONITOR is the one deinterlacing. It's using a Bob-deinterlacing strategy which means it takes every field and stretches the height to be full height and displays every now-resized field. So every odd field (every 2nd field, now full height frame) will feel displaced by 1px downward.
Absolutely. Was looking for someone who had commented that, L4D2 I a great game but on hell of a glitchy/buggy mess, specially on elevators lol. (Still, the GPU ain't no good).
@@cURLybOi Around 14:53 you can hear Jake say that he can see interlacing on the screen capture but Linus said he couldn't see it on the actual display. There seems to be something off with the display they were using for testing. They have a weird setup using a special screen that has HDMI passthrough. A normal capture setup is video output -> capture card -> passthrough to monitor. The setup in this video is video output -> monitor that has HDMI out -> passthrough to capture card. I think the monitor is causing interlacing on its output to the capture card as well as the "jelly" effect that they saw around 15:18. The jelly effect isn't visible on the captured output which means it's likely an artifact of the LCD panel itself rather than the video signal. It would have been nice if they tested with another screen to verify but I also understand why they didn't devote any more time to troubleshooting strange video artifacts on an obsolete video card from 12 years ago.
You guys should have gone to the Nvidia control panel to set 1080p, I think its defaulting to 1080i as safe resolution because 1080i monitors were a thing back then.
I agree to this, i used a 32inch samsung TV, and it defaulted to 1080i when i set it to 1080, but you can force it to 1080p and that's what i did and what i am using right now
Agree, there was many problems in this video, many things weren’t properly looked into, or even gone deeper into. They quickly threw it to the side as being crap without checking to see if the error was on there end, which it was. Wish this got a more thorough look.
18:35 "I Have no idea where we would find drivers for something like that at this point" Maybe in the CD that came with the box that has the drivers for it?
@@kiloneie definitely done "by amateurs". I understand that they want to have fun while making a video, but most of us are interested in the technical details and quality testing of such a card - because this is the only opportunity to see something like this...
@@goranjosic dude they literally discontinued any work on wireless video output after that, its not worth it for consumers, its too much work and too pricey. its not that they are experts, but the stream quality for sure sucked ass.
The artifacts in the Crab Rave video are actually part of the video here on YT and therefore not a problem of the compression. And compression of the stream is impossible while maintaining the low latency, because compression of a FHD steam would introduce 20-30 milliseconds of latency.
this old style LTT video is actually way better than the newer ones... it feels raw and a lot more personal... not too many things going on on the screen
I have GPU with similar cooler, 影馳 海外版 Galaxy GTX 550 Ti HQV (but my GPU only have 1 DVI and 1 DP), this tall boi even longer than GIGABYTE R9 270 WindForce 2X OC
4.5 x 10⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹$
A couple things: L4D2 is almost always like that, especially with elevators, due to how the engine works all of the collisions for the models. Also, with the video for Crab Rave, you can still see a lot of compression with a 3080 - both at 1080p and 4k. The only good way to show how it is, was CS:GO. With that said, 5GHz Wi-Fi being used a lot now compared to when it was released, that might also be affecting the latency, and could be causing at least some of the artifacting due to the now busy/noisy signal band.
Please keep doing more of these minimalist videos. It's not to say that your modern sets are bad or anything, but this is a very pleasant change of pace. :)
Seeing a video formatted like this feels so nostalgic. Great job calling back to the feel of LTT of 10+ years ago. Hell, watching this in 360p almost makes me feel NCIX still exists... lol
Fun fact: WHDI was made by Amimon, which was acquired by Vitec (now Vivendum) and folded into their subsidiary, Teradek, which still produces new wireless devices today, albeit only for professional camera departments.
I think the jitter you point out at 17:40 is actually just the game. Coach is moving up and down, and you are in an elevator. The game just isn't quite updating the characters vertical position and the player camera at the same time.
This video seems very rushed. Linus didn’t test the range of a WIRELESS GPU, I was sure he would do it. He pointed out the flaw of the engine as a flaw of the product. He wasn’t bothered to check the wireless usb and didn’t dig into settings of the card at all. It sucks becasue it was such an interesting product. Take your time and make good full videos. We will wait
I even would have liked to have seen them test it on multiple (or at least one other) monitor to verify that there was no image degradation on the (presumably old) monitor they were using in this video. I agree, they could have run more tests on this thing, even if it still made for an interesting video.
@@dnevill I think the point is, if you're going to test it and bother making a video, at least test it and do so properly instead of shit talking it immediately and ending on that.
There's gotta be a way to send a progressive signal, cuz that is all deinterlacing artifacts. I use a special deinterlaced for old consoles for similar reasons. Either in the GPU drivers or Windows, switching the resolution specifically has gotta help maybe to 900p or something. Or even the wireless card itself maybe has something. The interlacing is definitely half the issue, especially on an LCD panel.
This is quite impressive. As someone who had multiple 480s, 470s at the time and keeping up with tech, I don’t ever remember hearing about this product. Kinda reminds me of my dell widi dock that couldn’t even penetrate the other room even though it was advertised as. Range test would have been interesting.
I just LOVE when LTT takes a ride in their timemachine and goes back to revisit old tech. I feel that young and upcoming pro tech stars can gain from understandning a bit more of the evolution of tech. I'm old enough to have tinkered with most of it myself and I often meet younger generations distanced from "basic" tech. From a technical problemsolving perspective, peeling off that extra layer, helps alot with the insight and understanding of what is actually going on. Well done
With VR being reasonably popular im surprised it hasnt already. Wireless setups already exist and work "decently", dedicated hardware should be a net improvement.
The tech already exists in the modern world, is just not built into the GPU, cuz thats kinda dumb. There are alot of wireless HDMI solutions that can just plug into any GPU.
Because it introduces constant latency, which can get worse depending on what you're doing; it could be fine for watching movies or just browing the web but you can forget about gaming
The answer is its more common than you think. Its built into windows, you just need to find a display that supports casting. Which is basically all smart TVs.
HDCP gives me bad memories of back when I bought a new HD TV about a decade ago and I tried to connect the TV to my laptop and use the TV as the only monitor while the lid of my laptop was down and the laptop-screen turned off while the laptop kept running. Suddenly the TV screen turns green and fluorescent. I tried to change the HDMI cable but I kept getting the same problems. Turns out HDCP doesn't work that well for consumers that tries to follow the law yet it's easy for criminals to circumvent. Just like every other copy protection in other words.
A while back my brother owned a HDMI wireless transmitter and receiver combo. The transmitter box device is where the HDMI devices plug into and it wirelessly transmits the signal to a receiver box which connects to TV via HDMI cable. According to my brother it worked pretty good when outputting his PlayStation 4 to the TV in his living room. Though he lived in a small apartment at the time.
I wanted that card so bad for my photography business. The idea of being able to cast my computer to another screen without tons of rewiring was just so cool. Unfortunately, the price was prohibitive at the time when compared to a regular card. Still, it was cool to see this one!
@williandossantos5200 at the time, a mini pc wasn't an option. They didn't exist. A laptop was an awkward solution due to our lack of surface space. We had room for one table, and it was dedicated to our portfolio and sample albums. The solution ended up being a long hdmi cord and a wireless keyboard and mouse. This gpu would have eliminated the ugly long cord. It was super cool at the time! But it way too expensive and unreliable to justify the purchase.
17:30 I don't think thats from the wireless video, I think that's just Source being Source. The characters are bouncing because you're in an elevator that is actually physically moving your through the level. Notice that nothing else is jittering, only the characters are.
This thing deserves major cred for just working like that. The folks that create the HP Wireless printing crap software could learn some lessons here. If it could be done that seamlessly 10+ years ago, you can make it work that seamlessly now. Great vid LTT!
Hey Linus, have a little more respect for the person who lent you this card. Obviously he was keeping it as new as possible in the box. Don’t peel the peels, don’t shake everything. I know you think the video is the only thing that matters, but there are people who are limited to what they have and these things matter! I hope you compensated the lender for you making money off his card!
Now I wonder how this card would perform for both DVD and Blu Ray playback. Imagine if it was used in a theater setup, and the latency shown persisted throughout entire movies.
Watching your videos, it's the second time that I have the impression that you guys dont fully comprehend the difference between interlaced and progressive. It is not a difference of resolution. At that time, you would buy this to broadcast on a tv screen ( most of the time interlaced) and this GPU probably came setup interlaced out of the box. I'm quite conviced that you can set it up in progressive video output in the settings. And if so, as you are using it on a progressive monitor, would make everything fine. Dig in. :)
Considering they've discussed on multiple videos the difference between interlaced and progressive, including how it works on different technologies, I'm pretty sure they comprehend it better than the average comment section know-it-all.
@@Gamerzsociaty Just use network streaming like moonlight for anywhere in your house. That's going to be much better than something like this. They've laid out pretty clearly why nobody makes stuff like this anymore (too expensive/tied to the GPU).
This reminds me of the Ati All in wonder cards of the late 90s' early 2000's. They were basically a capture card built into a graphics card. Would be cool to see some retro reviews of these card if you can find them.
I found one at Goodwill like a week ago, new in box and still plastic wrapped. I debated buying it but really had no reason to. I always wanted the 9800Pro all in a wonder. Able to play bideo games and capture TV ! Young me would have loves that.
@@Gatorade69 Oh Jesus I forgot about that. You know, if I didn't have such sentimental value attached to my hardware, I'd be sorely tempted to just give away some my shit, idk what Salvation Army or how Goodwill or whatever would even price a GTX 980 or a 5700XT.
@@pandemicneetbux2110 The ram I'm running in my PC I also found at Goodwill... Yeah. Found unused new in the box 32gb of Corsair Vengeance DDR4 3200 ram for 5$. I don't know how it ended up in a Goodwill but it was a perfect upgrade from my 16gb of 3200 DDR4. Still probably my best find, that day had good stuff. Also found a like new Wii, 2 Wiimotes and a wii zapper. Got all that for 30$ and I got homebrew software running on the wii. Before that the only tech finds besides DVD-RW drives was an AMD Athlon X2 4200+ which if it was back in the day would have been a perfect upgrade for my system.
These kind of videos, when they are all like "omg did you see that? so baaad" I'm like "... what? Where? ... Looks fine to me" 🤣 Would be nice if they could point it out in the edditing what the heck they were talking about, but I think that even the editors don't know
That's cause RUclips destroys the picture as well. If you saw the picture in person you'd understand. Even 4k RUclips looks atrocious compared to 1080p offline.
Reminds me of some of the glorious insanity Asrock got into (and still gets into to some extent, see some of their Threadripper and Epyc offerings) in the motherboard space.
Would you mind telling me more about those boards? I really love weird Hardware and especially Threadrippers, but don’t have the knowledge to know about every gimmick, so you telling me about some quirky stuff so I can google it, that would be amazing! :)
@@rolux4853 if i remember correctly, ASRock sells for example an Mini ITX epyc board (where the CPU socket is like a quarter if the entire Board...0) or a m.2 slot VGA Graphics Card In genersl they do pretty much whatever niche they find
@@rolux4853 Well they made a full featured (multiple NVME slots, a full 4 PCIE x16 slots, quad channel memory) mATX X399 board (I think it was a TaiChi), and I think they came out with some even smaller (lie miniDTX) Epyc boards more recently under their "Asrock Rack" professional line (Linus has a video on one of them). Bitwit did a build with the x399 board but AFAIK never did a followup with the benchmarks like he promised.
Thinking about it, for the time this made a lot of sense to me, given that airplay wasn't a thing and running cables to your big screen from a desktop setup is probably something most didn't wanna do.
the 460 was like the nokia 3310 of gpus. you can overclock and overvoltage and overheat the thing without breaking it. If it was a motor engine, you may forget about oil change and oil levels and that thing will still work. I used one before and it survived 9 whole years of neglect and abuse, until finally the metals decayed and disintegrated. The only problem with it which is true to all electronic device is the half-life of the metals used in the circuitry, capacitors, connectors, etc. which range from only just a few hours to a few weeks. On average you really have to change hardware in 3-5 years as the metals will degrade on its own over time. The silicon chips wont, but whatever metallic element they used for circuitry, will.
It wouldn't, HDMI is digital, it either works or it doesn't. The real problem here is the monitor, which is the one deinterlacing the signal badly. It's perfectly fine 1080i, it's just that interlaced images suck on most modern LCDs, due to differences in how they work compared to CRTs (and lazy deinterlacing techniques).
I wonder if the Windows display setting defaulted to 60i instead of 60p? Because that sure sounds like it is the case. The monitor is doing the de-interlacing in this case, which might cause visual artifacts. But the video playback test leads me to believe there is more to this. From amimon's website, creator of WHDI: WHDI™ takes the uncompressed HD video stream and breaks it into elements of importance. The various elements are then mapped onto the wireless channel in a way that give elements with more visual importance a greater share of the channel resources, i.e. they are transmitted in a more robust manner. Elements that have less visual importance are allocated fewer channel resources, and therefore are transmitted in a much less robust way. Allocation of channel resources can include, for example, setting power levels, spectrum allocation and coding parameters. So it seems as though that the data might be uncompressed, but data is definitely being altered in ways. The nowadays heavy use of the 5 GHz band might also make this tech less useful today.
The computer I'm watching this on was built in 2011. i5 2500k, 8G ddr3, EVGA GTX-560, and no SSD, just a 500G hard drive. It's since been upgraded to an Msi 970, 24G of very miss matched ram, and a 250G SATA SSD. I have the CPU overclocked to 4GHz but it'll turbo up to 5. It still reaches the minimum specs for most newer games. And is still very smooth and capable in general. Better than the laptops most people I know have.
@@grafando Originally it was a pirated copy of win 7 ultimate but I did the free upgrade to win 10 a few months after it released and got legitimized in the process.
I'm still using a GPU from 2011... Windows also seems to think it's old and AMD/ATI no longer support it. I use old drivers that you need to sort of force Windows into using and for ages it would keep trying to "upgrade" them to some god awful generic drivers with crap resolution even though the card does 1080p perfectly and could handle 4k30fps.
I actually really need this GPU. I have a workstation in a server rack that is located outside the office. At the moment I have a 15m hdmi cables going from the rack to my desk.
@@leviathanx0815 RDP isn't the same as physically being connected to a pc nowhere close. Getting to bios or keyboard shortcuts are way easier on a pc you're connected to vs RDP.
@@monkeyoperator1360 you know, in the most cases you have Average Joe in front of your PC. You don't want him to wreck his installation just for cheaping out. Chrome Remote Desktop is another quick alternative. In the end it depends on your needs which software works best for your use-case.
You know you can technically upgrade that GPU with a more modern chip for some extra power, shouldn't need much change. That would be sick to see a modern GPU fitted with a reverse engineered version of this.
Cool tech. Wish you guys had tested its maximum usable range and whether it actually could go through walls since that was a feature explicitly mentioned earlier in the video. I could imagine some sort of business use for this where video needs to be transmitted over a distance with no wiring capability. Maybe in a large booth at a convention or to a monitor out in a fab shop. That could justify the cost that doesn't seem justifiable for you average home user. Definitely wouldn't mind seeing more about this card. Maybe send it another channel with more of an incentive to go in depth on it after you get it back, Mr. Owner.
Love these dives into old obscure tech that not many people got to play with, I vote you look at the Ageia Physx cards that were around for a little while with ASUS / BFG and Ghost Recon (few other games too) before Nvidia purchased the company and tech.
Gotta remember, back in the day this was new a 100ft hdmi cable cost 300+ dollars and the wireless HDMI boxes were in their infancy, they started at 300+ dollars. This would be a solid setup for a lot of projector setups or perhaps big meeting room TVs. Some of the jitter may be down to a lack of proper drivers or firmware issues.
This is actually a very good idea for the time imo, disregarding performance. The idea of not needing to feed super long display cables for environments like large meeting rooms and/or eliminating the need for any external cable management is a great concept. Would also be good for standing desks since setting up display cables to be able to use the height adjustment is a pain in the butt, unless you’re fine with dangling and sagging cords. Obviously today the add on device is better, but i bet this inspired it. It seems almost like a proof of concept device to me.
I wonder if all the additional wireless signals going around today caused some kind of interference with the older device? Perhaps back in its day that card was able to perform better.
Can you revisit this? Because just like with wifi signals the signal can be polarized and orientation matters A LOT. Baby computer houses got out of fashion after Intel 386 (decade before this card) and your test rig keeps the card upright and thus the antennas are horizontal. If the card would be in a laid down position in a tower (as probably intended) then all the four antennas would be perpendicular to what you have! From that perspective it's kind of a miracle that the card performed as good as it could (because the receiver circuits might be perpendicular to the antennas). Can you confirm this?
I used to troubleshoot 5G wireless In people’s homes and if this is using that same standard, you might actually benefit from keeping the devices farther away than that. Attenuation is one thing, but I’ve experienced this paradoxical thing where some minor distance improves video streaming on wireless tv set-top boxes and throughput on wireless devices. I have no idea why this is the case, but it seems like some kind of error correction algorithm expects a few feet of separation instead of being right on top of a device.
Did you ever try to swap out the HDMI cable? I don't think it would make a difference, but still worth a shot. Maybe it also would be better if the TV was at least 10 feet away from the receiver, since it was probably designed to actually be used at a distance. Otherwise, very entertaining video, and cool to see this old tech in a video!
I agree. Testing with known good HDMI cable and having sensible distance would have been good thing to test. Wireless technology typically has some minimum distance for which the transmission power is optimized and the transmission is "too hot" for closer distances causing signal saturation at the receiver.
Antenna on the GPU and the receiver are not phased correctly, GPU is intended to be in a case, so all 5x Antenna would be vertical not horizontal on a test bnch - aka vertical = omni and in phase with the Antenna in the receiver.
Will there be a labs video about the Hotspots on the Radeon cards? Der8auer has made a great video showing the problem but I am sure the Labs team can explain/show more of the details.
This would have been before 5GHz WiFi I think so when this was new it probably worked a lot better considering the 5GHz band was pretty much free. Nowadays the 5GHz spectrum is almost as bad as 2.4GHz was when this card came out and the technology they are using was probably never designed to deal with that. Maybe you could test this in a faraday cage/RF chamber?
460 was the only time I ever tried SLI - 2 x 460 got both better performance and was cheaper than a 480 (I can't recall if that was general or I got a good deal at the time mind). Worked right through until I replaced it with a 760.
SniperTeamTango everyone hated the 460/ 560 cards, too cheap ! Even the old GTX 380 it was based on was not a gamers favorite. ATi did sold better cards back then, that Changed with the GTX 1080 launch.
I had a simpler version of this that just transmitted video over the 2.4 ghz wifi and it produced so much wireless noise that no other devices could connect to my router. You mention that this was 5ghz, so you could also check if it impacts other wireless performance in other devices.
Being an engineer who worked on this and seeing it turn on in an instant when everyone is doubtful must be one of the best feelings in the world
8
@@memerified inch
@@mreli_ deep
@@terra3673😈
@@terra3673 In
10 years is a long time for shipping... I wonder why it took so long?
COVID
north america is wide
Border Customs
@@crby101 true
its cuz the shipper was a dad so he had to get the milk
the "jitter" in the elevator scene is actually an artifact of the source engine. They always jitter during elevator scenes because of how they didn't attach players to moving platforms, so they're falling and hitting the ground ~30x/second (the tickrate for L4D2)
Yup, been looking for people pointing that out in the comments. Not their obligation to know that, but still is an "unfair" call to the GPU (not that it could be considered good anyways, given the other problems it has lol).
Would explain why if you "take a break" in L4D2 while in an elevator will sometimes make you clip through, and fall to the ground underneath it.
@@El_Mouse that doesn't happen anymore, for well over a year already
@@dirtypink3197 Haven't played L4D2 in a long time but the last time I played the 360 version it was a buggy mess*. The last update completely ruined it. PC version still worked great the last time I played it. Never payed enough attention in the elevator to notice jittering (usually too busy healing up and prepping for the next section, especially when playing with randos or NPCs that like to disturb the witch).
*A lot of Falling through floors, elevator just didn't open half the time or levels would otherwise be incomplete-able, Rochelle or Zoey would end up in a weird slanted state or constantly falling down and getting up again, voting didn't work and would either pass the vote no matter what or not pass the vote no matter what (whatever if felt like).
@@dirtypink3197 A year you say, in a 15 year old game?
*CORRECTION* 17:30 - That jitter isn't due to the GPU/WHDMI but due to Left 4 Dead 2 thanks to the Source Engine's physics actually moving you and your AI-controlled teammates in an elevator. If you pay attention to the HUD elements in the bottom left or your weapons/items in the middle right, there is no jitter. Cool bit of outmoded tech, though! Thanks LTT for covering this!
Should pin this if they ever look at old videos comments
I was looking for this :D
@@davidwatkins9545their ego won't let em
gay
@@ultrae you aint wrong
Props to the guy who kept that card in such good condition!
Yeah fr
And then I'm all "well, is your GPU at least not busted tf loosened solder and all" on 6800XT mining cards idk man, I seen some like, tech pro tier stuff out there on the used market with all the materials even the sticker for CPU and such, and then you get those really weird guys who literally didn't clean the cheetoh grease off their laptop and GPU. Guys please be like this guy. Keep your materials safe for resale.
After I got upto 300K trading with expert cathie wood .I bought a new House and I'm now able to send my kids to a better school in the states When someone is straight forward with what he or she is doing people will always speak up for them.
Her good reputation already speaks for her last month i invested over $100,000 with her and I've already made over $250,000 profit.
+1518
310
I had 4 of these units as TX RX pairs back in the day without the Graphics card. Their job was to pass 1920 X 1080 Pixel 30 Hz Music video from the DJ booth laptop HDMI to the ceiling projectors, where hard writing was impossible due to distance. They worked well with no line-of-sight problems until CAT 5 HDMI extenders were possible. All my clients were then moved to Network cable HDMI distribution systems. They had full IR remote passthrough. International Music Services (Malta)
I miss old-school video clubs
That's a nice story.
Was the antenna / signal polarized? The way they have it is perpendicular how a card would sit in a tower. I wonder that's what caused signal degradation?
damn i bet thats quite a project configuring all these devices to communicate properly especially in a live setting.
What was the usb for?
The jitter in 17:35 is because of the elevators being "real" in L4D2 and the engine struggles to keep the models on the elevator floor when its moving.
yes
Correct, I suspect that will be edited out once they realise :)
This definitely had its use in audio industry, especially with audio mixing studios for movies! There were strict regulations of noise levels, so all of the audio equipment including PC's were in different rooms
So.. why not run a cable from the other room?
@@xx1simon1xx it is done by cables it is still the most reliable connection! This is just one step closer to perfect sound isolation and quite safe option, since if it stops working you replace it... the cables are usually in a network that goes through walls (which are acoustically treated) so if one breaks its expensive to replace them
you can just get a long ass cable 😆
You can use separate audio instead that would give you MUCH better sound anyway.
And don't go into price, nobody on industries care about price so much so about actual qualities. You imply that yourself already
@@bocahdongo7769 You missed the point.
The reason for all of this jitter and weird 1080i vs. 1080p is NOT that the receiver is deinterlacing. The GPU is transmitting at 1080i and the Receiver is receiving at 1080i. The MONITOR is the one deinterlacing. It's using a Bob-deinterlacing strategy which means it takes every field and stretches the height to be full height and displays every now-resized field. So every odd field (every 2nd field, now full height frame) will feel displaced by 1px downward.
@anunoriginaljoke5870 He's posted this after watching the video to make himself look knowledgeable and clever .....
@Anunoriginaljoke no because the receiver isn't what's deinterlacing
@Anunoriginaljoke No, they said it was the receiver... The monitor interlacing never even came up in the video.
can i connect this gpu to a quest 2?
@@pdbsstudios7137 no lmao
Coach jittering is caused by the engine working our collisions as the models all travel down the elevator. Happens every time
Absolutely. Was looking for someone who had commented that, L4D2 I a great game but on hell of a glitchy/buggy mess, specially on elevators lol. (Still, the GPU ain't no good).
really? it also looks like bob deinterlacing method, where the interlaced fields just rapidly switch
@@cURLybOi Around 14:53 you can hear Jake say that he can see interlacing on the screen capture but Linus said he couldn't see it on the actual display. There seems to be something off with the display they were using for testing. They have a weird setup using a special screen that has HDMI passthrough. A normal capture setup is video output -> capture card -> passthrough to monitor. The setup in this video is video output -> monitor that has HDMI out -> passthrough to capture card. I think the monitor is causing interlacing on its output to the capture card as well as the "jelly" effect that they saw around 15:18. The jelly effect isn't visible on the captured output which means it's likely an artifact of the LCD panel itself rather than the video signal.
It would have been nice if they tested with another screen to verify but I also understand why they didn't devote any more time to troubleshooting strange video artifacts on an obsolete video card from 12 years ago.
Or maybe, the jitter is to simulate the fact that elevators aren't 100% smooth?
@@2slogan yes yes of course 😂😂 that's what the devs tell you
Only Linus would put a 12900K and GTX 460 together
And Windows 10 no less.
Should he buy a Phenom II just for benchmarking?
@@lurch1539gt 210: *you underestimate my powers*
@Lurch Well yeah.... XOC guys just need an output and often prefer compact cards that can run entirely from the PCIE slot.
I disagree
“Sry guys my gpu disconnected”
And the best part is no one is going to believe you
"Did your mom walk between you and your GPU and blocked the signal?"
-A cod lobby in the early 2010s of a paralel universe.
@@RAEJDER awesome burn! Actually made me lol. Haven't heard a good yo mamma joke for a long while.
"sry guys one of my antennas are loose"
You guys should have gone to the Nvidia control panel to set 1080p, I think its defaulting to 1080i as safe resolution because 1080i monitors were a thing back then.
I agree to this, i used a 32inch samsung TV, and it defaulted to 1080i when i set it to 1080, but you can force it to 1080p and that's what i did and what i am using right now
Agree, there was many problems in this video, many things weren’t properly looked into, or even gone deeper into. They quickly threw it to the side as being crap without checking to see if the error was on there end, which it was. Wish this got a more thorough look.
It's 1080i because they're at 60 fps. If they ran 30 fps they could probably run 1080p
18:35 "I Have no idea where we would find drivers for something like that at this point"
Maybe in the CD that came with the box that has the drivers for it?
5GHz wifi has also become hugely more used since those days, I wonder if that could affect it and be causing the artifacting?
Read more above comments, it's the monitor and them not actually understanding what they are talking about, neither of them are experts.
@@kiloneie definitely done "by amateurs". I understand that they want to have fun while making a video, but most of us are interested in the technical details and quality testing of such a card - because this is the only opportunity to see something like this...
@@goranjosic my thoughts exactly
@@goranjosic dude they literally discontinued any work on wireless video output after that, its not worth it for consumers, its too much work and too pricey. its not that they are experts, but the stream quality for sure sucked ass.
@@goranjosic nobody watches LTT for technical details lmao
The artifacts in the Crab Rave video are actually part of the video here on YT and therefore not a problem of the compression. And compression of the stream is impossible while maintaining the low latency, because compression of a FHD steam would introduce 20-30 milliseconds of latency.
this old style LTT video is actually way better than the newer ones... it feels raw and a lot more personal... not too many things going on on the screen
agreeded
Huh, I'm not the only one who thinks this.
@@rikogarza1729 eh I kinda like em
@@rikogarza1729 They are a youtube channel, they need sponsors, if that's what you are talking about.
@@rikogarza1729 you want videos or not? Merch is not enough.
It's funny that at the time someone would probably say "this card is huge" but compared to the stuff today it looks tiny.
I have a pretty small graphics card and it's about the same length as this 💀
But it was really good for 60fps gaming at medium settings back in 2011
My 4080 is almost as big as my keyboard lol
Not really the 260 was a real monster though
I have GPU with similar cooler, 影馳 海外版 Galaxy GTX 550 Ti HQV (but my GPU only have 1 DVI and 1 DP), this tall boi even longer than GIGABYTE R9 270 WindForce 2X OC
Great. Nvidia will now release a RTX 4090 wireless for $4500
45000$
4.5 x 10⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹$
4085 ti*
$4090
@@m4heshd a Dollar per Mhz sounds like something they would try to pull off and justify.
A couple things:
L4D2 is almost always like that, especially with elevators, due to how the engine works all of the collisions for the models. Also, with the video for Crab Rave, you can still see a lot of compression with a 3080 - both at 1080p and 4k. The only good way to show how it is, was CS:GO.
With that said, 5GHz Wi-Fi being used a lot now compared to when it was released, that might also be affecting the latency, and could be causing at least some of the artifacting due to the now busy/noisy signal band.
Please keep doing more of these minimalist videos. It's not to say that your modern sets are bad or anything, but this is a very pleasant change of pace. :)
At 15:08 Linus says “1080p-ness” I think even HE didn’t see the pun
@linus
Bro
There you are. The second I heard him say that I went to the comment section. I knew I wasn't the only one! LOL
Seeing a video formatted like this feels so nostalgic. Great job calling back to the feel of LTT of 10+ years ago. Hell, watching this in 360p almost makes me feel NCIX still exists... lol
Fun fact: WHDI was made by Amimon, which was acquired by Vitec (now Vivendum) and folded into their subsidiary, Teradek, which still produces new wireless devices today, albeit only for professional camera departments.
I think the jitter you point out at 17:40 is actually just the game. Coach is moving up and down, and you are in an elevator. The game just isn't quite updating the characters vertical position and the player camera at the same time.
This video seems very rushed. Linus didn’t test the range of a WIRELESS GPU, I was sure he would do it. He pointed out the flaw of the engine as a flaw of the product. He wasn’t bothered to check the wireless usb and didn’t dig into settings of the card at all. It sucks becasue it was such an interesting product. Take your time and make good full videos. We will wait
I even would have liked to have seen them test it on multiple (or at least one other) monitor to verify that there was no image degradation on the (presumably old) monitor they were using in this video. I agree, they could have run more tests on this thing, even if it still made for an interesting video.
I would like to see Lazy Vide Game Reviewer reviewing this.
It's 11 years old. Does it really need to be thoroughly tested? Lol
@@dnevill I think the point is, if you're going to test it and bother making a video, at least test it and do so properly instead of shit talking it immediately and ending on that.
@@MidnightMarrow he still thanks you for the view.
There's gotta be a way to send a progressive signal, cuz that is all deinterlacing artifacts. I use a special deinterlaced for old consoles for similar reasons. Either in the GPU drivers or Windows, switching the resolution specifically has gotta help maybe to 900p or something. Or even the wireless card itself maybe has something. The interlacing is definitely half the issue, especially on an LCD panel.
I'd guess it's detecting the Monitor to only support 1080i and therefore sending an interlaced signal.
I agree, when I backup some old dvd I have the same awful issues, until I properly deinterlace them
This is quite impressive. As someone who had multiple 480s, 470s at the time and keeping up with tech, I don’t ever remember hearing about this product. Kinda reminds me of my dell widi dock that couldn’t even penetrate the other room even though it was advertised as. Range test would have been interesting.
I just LOVE when LTT takes a ride in their timemachine and goes back to revisit old tech. I feel that young and upcoming pro tech stars can gain from understandning a bit more of the evolution of tech. I'm old enough to have tinkered with most of it myself and I often meet younger generations distanced from "basic" tech. From a technical problemsolving perspective, peeling off that extra layer, helps alot with the insight and understanding of what is actually going on. Well done
If they made this 10+ years ago, I’d love to say a take on it with more modern (and likely better) tech.
With VR being reasonably popular im surprised it hasnt already. Wireless setups already exist and work "decently", dedicated hardware should be a net improvement.
as they epxlained better tech exists already. Especially 60GH wireless but also Miracast over 2,4ghz works well enough
The tech already exists in the modern world, is just not built into the GPU, cuz thats kinda dumb. There are alot of wireless HDMI solutions that can just plug into any GPU.
I have so much respect for the fact that you guys still have that fps_doug Pure Pwnage poster after all these years. (14:06)
TIL AntVenom watches LTT. Nice.
@@atticusnari I think all YTer watches LTT... except themselves...
Antvenom has been watching and commenting on LTT videos for many years at this point. It's nice to still see him comment occasionally
Hi Ant
i use linux
This needs to be a thing now that wireless tech has come so far. An adapter for the monitor and a wireless GPU would be amazing
Idk why this isn’t way more common nowadays. Would be nice if my computer could stream to other devices as easily as a modern phone can.
i have no problem bluetoothing my pc to my tv
Because it introduces constant latency, which can get worse depending on what you're doing; it could be fine for watching movies or just browing the web but you can forget about gaming
Windows supports something similar
Wireless displays exist
The answer is its more common than you think. Its built into windows, you just need to find a display that supports casting. Which is basically all smart TVs.
Windows supports Miracast if your wifi card supports it
I loved how this just worked out the box. These old tech unboxings are great. Always nice to see Linus genuinely enthusiastic about earlier tech.
HDCP gives me bad memories of back when I bought a new HD TV about a decade ago and I tried to connect the TV to my laptop and use the TV as the only monitor while the lid of my laptop was down and the laptop-screen turned off while the laptop kept running. Suddenly the TV screen turns green and fluorescent. I tried to change the HDMI cable but I kept getting the same problems. Turns out HDCP doesn't work that well for consumers that tries to follow the law yet it's easy for criminals to circumvent. Just like every other copy protection in other words.
can u explain?
Story of all DRM. A hassle for honest consumers, not a problem for pirates.
A while back my brother owned a HDMI wireless transmitter and receiver combo. The transmitter box device is where the HDMI devices plug into and it wirelessly transmits the signal to a receiver box which connects to TV via HDMI cable. According to my brother it worked pretty good when outputting his PlayStation 4 to the TV in his living room. Though he lived in a small apartment at the time.
17:45 isnt this just the elevator vibrating ingame?
correct, it is been like that since the game launched and any L4D2 enthusiast will know exactly thats what happens in the elevator scene
correct, i've played thousands of hours of l4d and can confirm this is a source engine problem, nothing related to the GPU
@@pirriu exactly
Lmg editors and writers clearly don't play l4d
I would habe loved to see how far away it actually works.
and with win 7 and right drivers from the disk for both gpu and usb
I wanted that card so bad for my photography business. The idea of being able to cast my computer to another screen without tons of rewiring was just so cool. Unfortunately, the price was prohibitive at the time when compared to a regular card. Still, it was cool to see this one!
Just buy a penpc, minipc or such and used remote desktop or similar
@williandossantos5200 at the time, a mini pc wasn't an option. They didn't exist. A laptop was an awkward solution due to our lack of surface space. We had room for one table, and it was dedicated to our portfolio and sample albums. The solution ended up being a long hdmi cord and a wireless keyboard and mouse. This gpu would have eliminated the ugly long cord. It was super cool at the time! But it way too expensive and unreliable to justify the purchase.
@@williandossantos5200 besides, where's the fun in that!
17:30 I don't think thats from the wireless video, I think that's just Source being Source. The characters are bouncing because you're in an elevator that is actually physically moving your through the level. Notice that nothing else is jittering, only the characters are.
I figured this would either lag a lot or be compressed to hell. Looks like it was the latter with occasional lag spikes.
@@gptcdc3 gotta get them internet points amiright?
@@gptcdc3 The beauty of chapters & 2x speed.
No way you watched the video
You may want to test it somewhere with less 5GHz WiFi signals. Probably when it was released, they weren't expecting the 5GHz band to be occupied
This thing deserves major cred for just working like that. The folks that create the HP Wireless printing crap software could learn some lessons here. If it could be done that seamlessly 10+ years ago, you can make it work that seamlessly now. Great vid LTT!
This just made me remember a printer that if it was moved, I had to reinstall the driver for it to work!
the crazyness of HP printers
@@belther7 yeah HP printer software just sucks - at least the wireless side of it. It should just work like this thing did! Lol
Hey Linus, have a little more respect for the person who lent you this card. Obviously he was keeping it as new as possible in the box. Don’t peel the peels, don’t shake everything. I know you think the video is the only thing that matters, but there are people who are limited to what they have and these things matter! I hope you compensated the lender for you making money off his card!
Now I wonder how this card would perform for both DVD and Blu Ray playback. Imagine if it was used in a theater setup, and the latency shown persisted throughout entire movies.
Watching your videos, it's the second time that I have the impression that you guys dont fully comprehend the difference between interlaced and progressive. It is not a difference of resolution. At that time, you would buy this to broadcast on a tv screen ( most of the time interlaced) and this GPU probably came setup interlaced out of the box. I'm quite conviced that you can set it up in progressive video output in the settings. And if so, as you are using it on a progressive monitor, would make everything fine. Dig in. :)
I kept wondering the same thing. Surely there would be a setting in there somewhere to setup interlaced or progressive if the box was advertising it.
Considering they've discussed on multiple videos the difference between interlaced and progressive, including how it works on different technologies, I'm pretty sure they comprehend it better than the average comment section know-it-all.
@@jjones2582 Verified with only 33 subs, interesting
@@gamin546 - Verified before they created a limit. The weird part: why do I have 33 subs with no content. 🙂
@@jjones2582 The same reason I have 100+ with no content.
Slap a DJI Transmission onto an RTX4090 for a DIY setup
Slap a DJI Transmission onto an RTX4090 for a DIY setu
Slap a DJI Transmission onto an RTX4090 for a DIY setup
Slap a DJI Transmission onto an RTX4090 for a DIY setup
@Zetsubou Slap a DJI Transmission onto an RTX4090 for a DIY setup
@jimmified nice pfp
I love seeing these unboxings of old tech. It could have been interesting to test the 100 ft. range and test it through walls.
Linus do more videos like this, your genuine enthusiasm really sells it
Imagine this gpu would exist today you literally could put your pc anywhere
I know right! Like given how common stuff like Chromecast and streaming is, I'm surprised this isn't more common plus imagine an RTX 3090 with this
Even on the moon?
You can probably have the smallest pc ever or even laptops,
Maybe steam deck🤨🤨🤨🤨🤨
Nah im playing with myself
@@Gamerzsociaty Just use network streaming like moonlight for anywhere in your house. That's going to be much better than something like this. They've laid out pretty clearly why nobody makes stuff like this anymore (too expensive/tied to the GPU).
@@rikogarza1729 true but like considering how expensive GPUs are now a days, it should just come included.
This reminds me of the Ati All in wonder cards of the late 90s' early 2000's. They were basically a capture card built into a graphics card. Would be cool to see some retro reviews of these card if you can find them.
Think I may still have one of these laying around 😂
I found one at Goodwill like a week ago, new in box and still plastic wrapped. I debated buying it but really had no reason to. I always wanted the 9800Pro all in a wonder. Able to play bideo games and capture TV ! Young me would have loves that.
@@Gatorade69 Oh Jesus I forgot about that. You know, if I didn't have such sentimental value attached to my hardware, I'd be sorely tempted to just give away some my shit, idk what Salvation Army or how Goodwill or whatever would even price a GTX 980 or a 5700XT.
Now all cards have that feature built in.
@@pandemicneetbux2110 The ram I'm running in my PC I also found at Goodwill... Yeah.
Found unused new in the box 32gb of Corsair Vengeance DDR4 3200 ram for 5$. I don't know how it ended up in a Goodwill but it was a perfect upgrade from my 16gb of 3200 DDR4. Still probably my best find, that day had good stuff. Also found a like new Wii, 2 Wiimotes and a wii zapper. Got all that for 30$ and I got homebrew software running on the wii.
Before that the only tech finds besides DVD-RW drives was an AMD Athlon X2 4200+ which if it was back in the day would have been a perfect upgrade for my system.
If this had only come out half a decade later, this would have been an awesome addition to VR headsets
These kind of videos, when they are all like "omg did you see that? so baaad" I'm like "... what? Where? ... Looks fine to me" 🤣 Would be nice if they could point it out in the edditing what the heck they were talking about, but I think that even the editors don't know
or maybe make a comparision with the cable, it could also be an effect. Also not sure why did they not go to settings and set the resolution.
Right? It didn't look that bad at all, especially for 10 years old
That's cause RUclips destroys the picture as well. If you saw the picture in person you'd understand. Even 4k RUclips looks atrocious compared to 1080p offline.
Reminds me of some of the glorious insanity Asrock got into (and still gets into to some extent, see some of their Threadripper and Epyc offerings) in the motherboard space.
Would you mind telling me more about those boards?
I really love weird Hardware and especially Threadrippers, but don’t have the knowledge to know about every gimmick, so you telling me about some quirky stuff so I can google it, that would be amazing! :)
@@rolux4853 Well, iirc they made a mini itx threadripper board... the socket took up so much space it used SODIMM memory, instead of regular DIMMs
@@rolux4853 if i remember correctly, ASRock sells for example an Mini ITX epyc board (where the CPU socket is like a quarter if the entire Board...0) or a m.2 slot VGA Graphics Card
In genersl they do pretty much whatever niche they find
@@rolux4853 Well they made a full featured (multiple NVME slots, a full 4 PCIE x16 slots, quad channel memory) mATX X399 board (I think it was a TaiChi), and I think they came out with some even smaller (lie miniDTX) Epyc boards more recently under their "Asrock Rack" professional line (Linus has a video on one of them). Bitwit did a build with the x399 board but AFAIK never did a followup with the benchmarks like he promised.
I love how insane ASRock was early on. Like I have their X79 Extreme11 and it has literally everything on it for the time period.
1080p-ness. Nice.
Haha I came to see if anyone else noticed.
Thinking about it, for the time this made a lot of sense to me, given that airplay wasn't a thing and running cables to your big screen from a desktop setup is probably something most didn't wanna do.
Heh heh 1080Penis 15:02
I love how jaded we’ve become with certain tech that we’re amazed when it works at all.
I'm still amazed when standby on a laptop works xD (of course it still doesn't and actually got worse lol)
@@Bozebo lmao tell me about it
I am surprised that the setup works so well. As close as the antennas are arranged, a neutralization of the radio waves is possible.
the 460 was like the nokia 3310 of gpus. you can overclock and overvoltage and overheat the thing without breaking it. If it was a motor engine, you may forget about oil change and oil levels and that thing will still work. I used one before and it survived 9 whole years of neglect and abuse, until finally the metals decayed and disintegrated.
The only problem with it which is true to all electronic device is the half-life of the metals used in the circuitry, capacitors, connectors, etc. which range from only just a few hours to a few weeks. On average you really have to change hardware in 3-5 years as the metals will degrade on its own over time. The silicon chips wont, but whatever metallic element they used for circuitry, will.
I loved the older format and style of this video!
The slow VGA driver lag in Windows !
Should have tried it with a fresh hdmi cable. Not sure if it would have made a difference but would have at least given it a chance
It wouldn't, HDMI is digital, it either works or it doesn't. The real problem here is the monitor, which is the one deinterlacing the signal badly. It's perfectly fine 1080i, it's just that interlaced images suck on most modern LCDs, due to differences in how they work compared to CRTs (and lazy deinterlacing techniques).
@prodbyfaith Digital is just a special case of analog. If the wires are kinked, it could matter to high speed digital.
@@LeeMcc_KI5YPR If it's kinked so bad that the throughput is affected, it would either have horrible artifacts or just cut off completely.
@@RezTechTube It could also result in less bandwidth available, causing the device to switch to 1080i instead of 1080p for example
@@bastienx8 HDMI, especially then, probably was not smart enough to do anything like that. after all, HDMI is exactly DVI in this instance
12:33 I'm so sorry to hear that Linus.
😂
I wonder if the Windows display setting defaulted to 60i instead of 60p? Because that sure sounds like it is the case. The monitor is doing the de-interlacing in this case, which might cause visual artifacts. But the video playback test leads me to believe there is more to this.
From amimon's website, creator of WHDI:
WHDI™ takes the uncompressed HD video stream and breaks it into elements of importance. The various elements are then mapped onto the wireless channel in a way that give elements with more visual importance a greater share of the channel resources, i.e. they are transmitted in a more robust manner. Elements that have less visual importance are allocated fewer channel resources, and therefore are transmitted in a much less robust way. Allocation of channel resources can include, for example, setting power levels, spectrum allocation and coding parameters.
So it seems as though that the data might be uncompressed, but data is definitely being altered in ways. The nowadays heavy use of the 5 GHz band might also make this tech less useful today.
Yeah I have had that happen before
The set very much looks ye ol' blue NCIX Tech Tips set. Good memories, and glad Riley made the jump.
I'm distressed by the idea of something from 2011 being "old."
The computer I'm watching this on was built in 2011. i5 2500k, 8G ddr3, EVGA GTX-560, and no SSD, just a 500G hard drive. It's since been upgraded to an Msi 970, 24G of very miss matched ram, and a 250G SATA SSD. I have the CPU overclocked to 4GHz but it'll turbo up to 5.
It still reaches the minimum specs for most newer games. And is still very smooth and capable in general. Better than the laptops most people I know have.
It's 12 years ago now 🥺
@@melty4204 What OS are you running?
@@grafando Originally it was a pirated copy of win 7 ultimate but I did the free upgrade to win 10 a few months after it released and got legitimized in the process.
I'm still using a GPU from 2011... Windows also seems to think it's old and AMD/ATI no longer support it. I use old drivers that you need to sort of force Windows into using and for ages it would keep trying to "upgrade" them to some god awful generic drivers with crap resolution even though the card does 1080p perfectly and could handle 4k30fps.
Linus: "What makes it a wireless GPU?"
No wires
0:54 little stoned there bud?
I actually really need this GPU. I have a workstation in a server rack that is located outside the office. At the moment I have a 15m hdmi cables going from the rack to my desk.
you know, there are several remote desktop software on the market... and even included in windows given you don't use a home edition
@@leviathanx0815 you can just install some parts of it if you know what you're doing
@@leviathanx0815 RDP isn't the same as physically being connected to a pc nowhere close. Getting to bios or keyboard shortcuts are way easier on a pc you're connected to vs RDP.
@@monkeyoperator1360 you know, in the most cases you have Average Joe in front of your PC. You don't want him to wreck his installation just for cheaping out. Chrome Remote Desktop is another quick alternative. In the end it depends on your needs which software works best for your use-case.
I'd stick to the HDMI or some kind of Ethernet based solution lol
This style of LTT reminds me a lot of the style I'm trying to go for. I like it! There should definitely be more videos like this one.
I heard US mail is slow but damn not 10 years to deliver a GPU
US mail handing off to Canadian mail = gong show
They have royal India mail here ! Ebay !
Honestly even if its a little bit less efficient it would be cool to see wireless gpus more often
You should really have tried a different HDMI cable just to be sure. Also an actual measurement of the latency would have been nice.
You know you can technically upgrade that GPU with a more modern chip for some extra power, shouldn't need much change. That would be sick to see a modern GPU fitted with a reverse engineered version of this.
Cool tech. Wish you guys had tested its maximum usable range and whether it actually could go through walls since that was a feature explicitly mentioned earlier in the video.
I could imagine some sort of business use for this where video needs to be transmitted over a distance with no wiring capability. Maybe in a large booth at a convention or to a monitor out in a fab shop. That could justify the cost that doesn't seem justifiable for you average home user. Definitely wouldn't mind seeing more about this card. Maybe send it another channel with more of an incentive to go in depth on it after you get it back, Mr. Owner.
Love these dives into old obscure tech that not many people got to play with, I vote you look at the Ageia Physx cards that were around for a little while with ASUS / BFG and Ghost Recon (few other games too) before Nvidia purchased the company and tech.
You have to manually select 1080p in Nvidia control panel or it'll default to 1080i
0:34 Today I learned the plural of antenna is antennas when referring to tech and antennae when referring to insects.
Same
Gotta remember, back in the day this was new a 100ft hdmi cable cost 300+ dollars and the wireless HDMI boxes were in their infancy, they started at 300+ dollars. This would be a solid setup for a lot of projector setups or perhaps big meeting room TVs. Some of the jitter may be down to a lack of proper drivers or firmware issues.
The jitter is from interlacing. This is literally the best this image could look at 60 Hz.
This is actually a very good idea for the time imo, disregarding performance. The idea of not needing to feed super long display cables for environments like large meeting rooms and/or eliminating the need for any external cable management is a great concept. Would also be good for standing desks since setting up display cables to be able to use the height adjustment is a pain in the butt, unless you’re fine with dangling and sagging cords. Obviously today the add on device is better, but i bet this inspired it. It seems almost like a proof of concept device to me.
I wonder if all the additional wireless signals going around today caused some kind of interference with the older device? Perhaps back in its day that card was able to perform better.
Can you revisit this? Because just like with wifi signals the signal can be polarized and orientation matters A LOT. Baby computer houses got out of fashion after Intel 386 (decade before this card) and your test rig keeps the card upright and thus the antennas are horizontal. If the card would be in a laid down position in a tower (as probably intended) then all the four antennas would be perpendicular to what you have! From that perspective it's kind of a miracle that the card performed as good as it could (because the receiver circuits might be perpendicular to the antennas). Can you confirm this?
That's.... Not how a wireless signal works with a unidirectional antenna.
I used to troubleshoot 5G wireless In people’s homes and if this is using that same standard, you might actually benefit from keeping the devices farther away than that. Attenuation is one thing, but I’ve experienced this paradoxical thing where some minor distance improves video streaming on wireless tv set-top boxes and throughput on wireless devices. I have no idea why this is the case, but it seems like some kind of error correction algorithm expects a few feet of separation instead of being right on top of a device.
LTT guys being excited kind of gets me excited. Contagious, man.
18:41 err, the driver disk in the box?
Did you ever try to swap out the HDMI cable? I don't think it would make a difference, but still worth a shot. Maybe it also would be better if the TV was at least 10 feet away from the receiver, since it was probably designed to actually be used at a distance.
Otherwise, very entertaining video, and cool to see this old tech in a video!
I agree. Testing with known good HDMI cable and having sensible distance would have been good thing to test. Wireless technology typically has some minimum distance for which the transmission power is optimized and the transmission is "too hot" for closer distances causing signal saturation at the receiver.
I imagine that the IR sensor was for when you paired it with a TV. Using one remote, super fancy!
Antenna on the GPU and the receiver are not phased correctly, GPU is intended to be in a case, so all 5x Antenna would be vertical not horizontal on a test bnch - aka vertical = omni and in phase with the Antenna in the receiver.
I'd love to see this sort of idea with current tech, I think it could be much better if implemented correctly.
Will there be a labs video about the Hotspots on the Radeon cards? Der8auer has made a great video showing the problem but I am sure the Labs team can explain/show more of the details.
labs is just a gimmick word till the website is up
@@infinite-tech I have no idea what to call their engineering department :D
Just dont buy a ref card, problem solve xD
@@BNHardwarereviews Some other companies also are affected, not only ref cards
You think LTT will do a better job than der8auer? 😂
Ahhh L4D2. So much nostalgia.
it would have been very interesting to see if the performance changes if all your 5ghz gear was off to free up the spectrum.
This would have been before 5GHz WiFi I think so when this was new it probably worked a lot better considering the 5GHz band was pretty much free. Nowadays the 5GHz spectrum is almost as bad as 2.4GHz was when this card came out and the technology they are using was probably never designed to deal with that.
Maybe you could test this in a faraday cage/RF chamber?
I'm surprised you guys didn't give the wireless aspect a challenge. Like how far would it actually reach?
And could it actually "punch" through walls
Pretty sure 460s were the most popular card of the era... Very unlike this long forgotten brother xD
Well... It was 315$ more costly (416$ by today's standards according to usinflationcalculator).
460 was the only time I ever tried SLI - 2 x 460 got both better performance and was cheaper than a 480 (I can't recall if that was general or I got a good deal at the time mind). Worked right through until I replaced it with a 760.
@@Saitir942 I got one of these super cursed 460 2win cards from EVGA it barely ever worked and was a total disaster but when it did work it was epic
SniperTeamTango
everyone hated the 460/ 560 cards, too cheap ! Even the old GTX 380 it was based on was not a gamers favorite.
ATi did sold better cards back then, that Changed with the GTX 1080 launch.
@@lucasrem uhhhh, sure thing mate. Nvidia definitely won the 400s/500s era battle with ATI. The 7000s vs 600s was the first time AMD gained ground.
This is the kind of content I used to love LTT for.
Please go back to your roots more often!
they just need to get over the latency bottleneck that is wifi and we have the next stage to technological gaming advancement
the post office or what? because 10 years is quite a lot of latency. xD
I'm not gonna lie that fan shroud looks pretty cool
Yeah prefer this style over curvy lines and overblown rgb any day of the week
I had a simpler version of this that just transmitted video over the 2.4 ghz wifi and it produced so much wireless noise that no other devices could connect to my router. You mention that this was 5ghz, so you could also check if it impacts other wireless performance in other devices.
2:18
I suspect the reason we never heard about this was the fact that it's much easier to send control inputs wirelessly than video output.