@@florianlucs7229 what if they hit the die, a memory chip, or some power management circuitry? If it's a multilayer PCB, they could have mushed some traces together internally. There's a lot more than soldering traces back together.
I found Easy way split GPU to both system runs on windows 10 I split my GPU into two and me and my bro used 1 GPU to play multiplayer game Borderlands 1,2, pre Sequel multiplayer we made two users in my desktop and run space desk and install persec on second user and use login to second user and connected second user to laptop and then i switch to first user he use mouse and keyboard on laptop to play game using parsec and I use desktop to play form desktop his laptop is so low it has no fan celeron also has no 4 gb ram and hdd Hard drive we played all game smooth using this method no vm not complex it's simple and
Or you could do what I did, get a 64-Core Epyc Rome, 256GB of DDR4 ECC, 3x Titan X Pascal and 10 1.2TB Enterprise SSD to run 12x Gamers off one rack mounted system. Awesome video Colin (and Linus). This tech is incredibly powerful for distributed computing, and I can't wait to see what the future holds.
Does this actually work? Meaning using multiple GPUs and doing this? Me and some friends often meet up for league of legends and games like that. Usually with 5 people. Instead of them all bringing their fulltower they could grab any ultrabook and another brings a second gpu and we could chop it up. Like I have a 3080. My friend has a 3090. We could take 2 streams from his gpu and one from mine and it still works?
I was about to buy one of those GPU servers to start up a VM provider, I almost bought up the one with six 2070 thinking I'd host atleast 6 clients for now... That was when splitting GPUs were not really a great experience yet, I ended up just waiting on it, stocking up on gold while the world started brunning down. I kinda regret it considerin the GPU price right now, so now I have to wait for them to go down again, and I'll do this except split them in two!
I feel that as a kid linus' experiences with being broke and wanting to game coupled with his experience as an actual company owner has really helped his want/need to get around arbitrary bs like this, it's really really solidly good for gamers.
well yea, but i just thought of a better idea for home. i mean my wife sometimes wahts to play civ6, but not htat often and when she palys, she likes to do it on her abit older and slower laptop. well i havent had the need to upgrade it because she does not play that much. but she could log in to the VM on her laptop and run the game there. This gave me a good idea.!
@@AndresUffert2 Exactly. It still looks a bit too fiddly for me to bother but I can really see this being a great way to make a home server (even more) worthwhile.
I mean, looking at performance numbers, I’d expect that any midrange card would still significantly outperform this VM. Plus, if you’re playing competitive games like CS:GO, the latency would likely be unbearable.
do you see them running 4 games in 1 instance playable for 4 players and the game there running is cs-go an extremely easy game to run...just go buy 4 haswell systems and GTX 970's cheaper then 1 3090 the only reason your even seeing a video like this is cause linus has been paid to make it by NZXT not because its a good idea or good value
Oh most definitely. Exam browsers as well. Took a PSI remote exam and their exam browser detected that I had VirtualBox installed. Even though the service was disabled and all. Only thing that worked was uninstalling it.
I gamed in a VM with PCIe passthrough for a year until some of my most played games stopped working because of anti cheat and there were some quirks that were annoying too
this might be an interesting one for households with a kid or two kids that are looking to move from console to pc. Having a hub pc for the household is an interesting concept.
I'm surprised nvidia hasn't jump on this would encourage people to buy more expensive gpus. I own a rtx 3080 but would happily waste the extra money buying a 3090 if this was possible with out the work arounds
I found Easy way split GPU to both system runs on windows 10 I split my GPU into two and me and my bro used 1 GPU to play multiplayer game Borderlands 1,2, pre Sequel multiplayer we made two users in my desktop and run space desk and install persec on second user and use login to second user and connected second user to laptop and then i switch to first user he use mouse and keyboard on laptop to play game using parsec and I use desktop to play form desktop his laptop is so low it has no fan celeron also has no 4 gb ram and hdd Hard drive we played all game smooth using this method no vm not complex it's simple and
One of the biggest introductions of latency is queue submission and having to re-instantiate parts of the pipeline. Now I haven't investigated how para-virtualization works but at least each game submitting their own queue submissions is entirely unavoidable. This is because these submissions are specific to the geometry being rendered, which obviously is different for each game. This is why dividing a video card across users will never have good scaling and you will always see a bigger reduction than half the framerate. There is major other factors to such as much more cache invalidations because the geometry in between frames is different,as well as more memory contention due to memory access patterns being now less optimally scattered from the different workloads.
@@devintolliver92 GPU tech is advancing by keeping or even extending the rendering pipeline, absolute performance is going to increase over time but relative performance will stay stagnant
Does this mean that if you wanted to run this setup for say 2 users, you'll have to contend for a bit less than half gaming performance for both instances, instead of exactly half, just to account for the overhead of this workload not being as predictable and easy to calculate as normal gaming? In that case I'd love to see how it fares with a benchmark running at the same time on both instances once setup properly, just to find out roughly how much overhead is expected for each configuration.
I think it's worth mentioning that some anticheat software (Vanguard, etc.) rejects running the game they are bundled with in a VM. Maybe there are workarounds for the particular game, maybe not, but don't expect to be able to play every AAA competitive game with this setup.
@@MikkoRantalainen not all AAA games are played competitive, but I get your answer is directed at Alejandro's "but don't expect to be able to play every AAA _competitive_ game with this setup." For just evening match or 2 in games that are really restrictive with anticheat, then most likely you already have dedicated machine and no need to share GPU. The GPU sharing seems more of use in maybe setting up as some sort of couch machine/home media machine or for someone else than yourself in same household (to lower latency) to play something together.
Those anti-cheats also have a tendency to reject Parsec's input, or even sometimes it's screen capture, most of the time. Even without being in a VM, Vanguard is not playable remotely
@@krisavi With "AAA competitive game" I was thinking on games that value competitive integrity in a way that makes them resort to this kind of anticheat software. An "AAA competitive game" doesn't necessarily have to be a first-person shooter, where reaction times are paramount, or similar fast-paced games, but yeah, both of you are right in that such a setup won't offer the best experience on such games even if it worked. What I think it is more important is that games with this kind of anticheat won't run at all. You can't even choose to play private, custom matches with your friends to have a good time, where some extra latency doesn't really matter.
I think leaving the host VM preview open is hindering the performance a bit during the demo. It should be closed and the VM should be started from the menu option instead of opening the preview for a more realistic result
@@wakannnai1 I don't *think* the 3090 is able to max out 16 PCIe lanes, we have yet to see GPU's hit those limits. Typically those limits are only reached with enterprise hardware (networking, storage, etc.). The biggest bottleneck is undoubtedly Windows overhead (as Windows is running 4 times on the same CPU/GPU), with the second likely being storage (though that should not effect in-game performance). Either way -- like Linus said it is never going to compare to running on bare metal as long as you have to connect your display and peripherals over a network (no matter how fast your network is). If Nvidia opens up the driver (unlikely) to allow VM assignment to the GPU outputs that would drastically improve performance.. BUT Hyper-V would no longer work for that as it works differently than other hypervisors. You cannot assign specific hardware/ports to a VM only allocate their resources. Though it's possible that is the reason this works at all.
In a shared GPU environment like this, the host always takes precedence, thus if the host requires virtualization, it will be done first then the vm's.
I was doing academic work in the virtualization high ability field circa 2011... For years there was a flopping between hardware and para in terms of what provides better performance and what will be the future. At one point around 2015 it seemed hardware will be it for good. I personally always felt para was a more elegant and cooperative design, though back then not available for Windows.
I think the idea behind the laptop is good, but the build quality needs some work, especially the hinge. Check out how much the screen shakes compared to the other laptops.
I doubt you would see a difference in performance on Windows Server, but you would gain the ability to directly map hardware to a VM…which granted, isn’t really the goal here.
I'd think that the host OS being Windows server might help the VM performance, but then I'd expect compatibility issues playing games directly on the host.
@@brychaus you're probably right about host gaming issues, but for a home-server use case, the three (or more maybe?) VMs would be enough for the home. Who has more than 2 friends anyways lol?
@@christopherprevost763 Yeah I see what you mean, but the host got so much better performance in LTT's testing that I would want to use the host to take advantage of the performance.
@@brychaus Host will always have the least overhead for hyper-v since guests have the hypervisor as additional overhead. I'm going to give this a shot over the next week or so with one VM as a proof-of-concept for my home setup.
The sped up, step-by-step, screen cap of the vm being spun up in windows, is actually very genius, and should be a standard for content like this-- that's technical and takes many steps-- maybe even queued up a bit earlier during talking points. It's almost like the first time I had seen a teardown on GN where the unscrewing etc was chopped up and played back normal speed. Kudos to the editor of this vid.
It should be noted that having these instances virtualized means you will loose some performance due to the overhead involved in virtualization. How much of an overhead depends on things such as how efficient the hypervisor is.
This is actually a great option for trying out windows 11 from a windows 10 gaming PC. Parsec makes it a much more native feeling experience than hyper-v alone.
“You’ll need a windows license for each VM” well, not exactly, the host computer needs 1 Enterprise license and each client must also run Enterprise . Professional doesn’t cover delivering a remote windows desktop…and oddly, MS cares how many computers run the desktop, not how many instances of Windows are running.
@@jammiewins sure, that was my point. Don’t license the VMs, doing so doesn’t actually make it valid. Either way you are just trying to fly under the radar.
@@finnderp9977 correct, and in that scenario you basically have two licensing options. Licensing named users, or licensing devices. And contrary to what is logical, licensing the device means licensing the client device, not the server or the VMs it is running.
@@finnderp9977 All you need is a valid license for the operating system you're using as guest. Windows 10 has a limit of 32 virtual processors allowed.
I would love to see how this would run on and AMD RX 6k 16GB series card cause i get the TI is powerful but i want to see the power that AMD can output on the similar work load
"You should get 4 different Windows licenses. Definitely don't just spin up a new one when your license expires." *bites lip nervously* Thanks for telling me what NOT to do Linus, I will definitely NOT do this if I try this build.
Every time I hear people say activate your windows, I laugh at all the shop computers and test benches running the same non activated version of windows 10 pro that I did in a single night on like 5 towers 🙃
microsoft doesn't deserve your money 1 time, let alone 4. For an inferior OS filled with adware, they don't deserve $100. Linux is free and the only disadvantage is lack of third party support, which microsoft have done nothing to deserve.
@@itdepends604 to be fair, Windows licenses are practically free with any prebuilt computer... I wish they'd just move to a freemium model and let me disable ads, automatic updates, etc on a pro license. Linux gives you choice but the average consumer isn't great at dealing with tech issues by themselves.
@@DelphinusMAch1 windows licenses are NOT free with prebuilts. While OEM's obviously get bulk discounts, it still adds around an extra $50 to the cost of the computer. Almost all of the problems with linux are software support. Apart from this, linux is actually more stable then windows. People are obviously going to find it harder to sort linux issues if they're used to windows, but linux issues generally aren't any trickier to solve then windows issues. So I completely stand by what I said. I am not paying a useless company to use my already payed for software, when they are already getting payed by their adware anyway. My VM's will stick with unregistered windows.
16:23 "Windows License For Each VM" This is where gaming on Linux can really take off. Spooling up 4-8 instances of a game with just one box could bring LAN parties back to mainstream.
Amazing timing for this video. My GF and I have just started to play some games side by side in the evening. I‘m playing on an R7 5800X + 3080 Ti tower, she‘s using a Surface Pro 4. I was almost thinking about getting some used hardware for a few hundred to get her a better experience. Now I can just share my system. I’ll definitely try with my gaming buddy later on, he’s rocking a GTX 560 Ti… Thanks so much!
It would be interesting if instead of splitting the VRAM up evenly if it could detect if multiple clients had the same game open and have a shared memory pool so they could share the vram.
@@marcogenovesi8570 From a quick google, the passthrough functionality works on AMD too (and they never blocked it off). But it might need to be done in another way than in this video.
So they're steaming the signal to their laptops, but the processing is being done by the desktop in the background!? Holy crap that is awesome for lan gaming sessions!
I see what you did their….. ”steaming” 1 game to 2 laptops from a desktop. 😂 (I’m sure that was probably a typo, but it actually works out since they are indeed using steam….also totally not trying to hate or anything but I’d write “lan” = LAN . Since “Ian” looks exactly like my buddys name “ian” as capital i’s and lowercase L’s look identical.
I've seen forum posts asking why ignore Aster, a response iirc was 'windows could potentially patch that out' yet here we are on the cusp of W10 going EOL. I wonder why Microsoft doesn't want to support native multiseat, their licencing keys are under 10 dollars constantly on sale, and The Linux Challenge overlooked multiseat too. Pls Anthony, show them how linux does multiseat + proton.
Bit Aster seems to have issues running multiple of the same instances at once where it is not allowed, like it is often the case with games. Or am I wrong?
@@lugaidster ah thanks. Tried Sandboxie before and it didn't work for the games I wanted to play, so I used the HyperV approach with GPU paravirtualision.
The idea of running multiple game instances on one machine reminds me of the days of split screening locally on older game consoles as a kid. Would be a killer config for LAN parties.
This is an intriguing idea for my wife and I (both architects WFH) to have one workstation two architects. She’s remoting into her office and I work locally so this really could work even with a 1060 & 8700k.
More then what you really wants always, its a cool way but fanboys and competitive play is out of window. This is why virtualization works in datacenter because its whole other technique but still with lag. Always!
Most of the input delay comes through the streaming process, even on a wired connection with the "low input lag" feature in parsec enabled it's still slightly noticeable.
That little bit of overhead is surprisingly small, and less impactful than I'd imagined. But yeah ima just save myself the money and throw another RX 580 in my rig when I decide I need solutions to problems that don't exist
@@waderyun.war00034 I may be wrong but i think hes talking about the imaginary problem being nvidia's lock on splitting gpu, not "1 pc multiple gamers"
I've been researching this on and off for forever. This was the main thing keeping me from switching to a fully virtualized setup. VFIO time, maybe! Time to actually watch the rest of the video to see if it's practical.
You can do that with Looking Glass or VFIO is also an option. SR-IOV(the splitting of gpu) is somewhat of a hack right now, unfortunately nvidia are still buttholes about it. But it you can make it work and even share basically hardware connection to it with Looking Glass. Or you can do VFIO passthrough if you have a few gpus laying around(lol)
Technically wouldn't there be three instances of the pc running- like the original and then the two virtualizations- maybe it is holding back a little in case the original instance needs juice? I honestly do not know anything.
I'm curious - could there be a 'comparison' in cost between having four individual systems (with similar performance) to having this sort of setup with four virtual machines? I mean, is it actually cost-effective to run one powerful machine over four not-so-powerful ones? What about the cost to run the systems (power draw, etc)?
From some quick math and assumptions, assuming you get the GPU's at MSRP and buy all new parts, the expensive system (3080 Ti, 5950x, 32GB RAM) and 4x cheap systems (GTX 1650, i3 10100, 16GB RAM) would be within a couple hundred dollars of each other in either direction depending on exact parts chosen. If we swap the 3080 Ti for a 3080 however and go off MSRP the expensive system would be much cheaper and I don't expect performance would change too much. As for power consumption they probably wouldn't be terribly off from each other but this is really hard to estimate without actually building these systems and getting exact power numbers (not enough to just add max power limits of the parts together as they probably won't push to the complete max and PSU chosen makes a difference). Of course there's multiple ways to go about this, for the cheap system I chose parts that made sense to me but someone else could choose cheaper or more expensive parts. If you go used the story is completely different and 4x cheap systems would most likely come out way cheaper than a single expensive system but power consumption would probably favor the expensive system in that scenario. If you have any questions about any specifics on the systems or rationale feel free to ask
Way cheaper if you don't all game at the same time. Have two people game during the day and sleep during the night Have two people game during the night and sleep during the day 2x performance for cheaper price.
@@taylervest6594 depending on how many, that might require a much beefier host, in terms of available PCIe lanes more than anything. I've run into this problem with my 3900x based server, add two GPUs, 10gb network and a disk controller there's not enough lanes. I'm upgrading to first gen Epyc for this reason (I can also consolidate some of my other gear), 128 lanes all the way.
I've been doing this for years on linux, using the DRI subsystem to shard a single GPU between multiple users on multiple seats. Works best if you either use integrated graphics or toss an ultra-low-end graphics card in to get more display outputs. You *can* drive multiple window systems off a single GPU, but it gets a bit unreliable compared to one-gpu-per-seat. You can usually get suitable GPUs for about $15 each. And without the overhead of a full VM stack for each copy, the performance hit is much less.
Is there anywhere I could read about this? If I understand right, your main GPU supplies the processing power, but the extra, crappy GPU supplies an output port, yes?
@@prydzen this is actually how this project that linus featured works. Basic Display Driver/Hyper-V Display for, display and for Graphics its your GPU.
I would love to see technical videos like this, but for running more basic things on Linux. This takes me back to when I was learning how to use Windows as a kid.
This really makes for some nice use cases. I have two kids, and using parsec on a pair of raspberry pi’s to play some local games with them sounds amazing.
So, since Nvidia prevents partitioning on consumer cards, would this be theoretically easier on a 6900? And would that work for splitting a hardware port per virtualization?
The GPU isn't being split in a traditional manner, Windows is doing all the work. So long as the drivers are set up right, either one works. I recommend Nvidia though, since it's encoder is higher quality and lower latency, making it better for Parsec
the amd card? yeah i looked into this because i want a home virt server and sadly instead of the hardware being locked on amd it’s just not there, they have had vgpu gpus but they cost way to much for a card with crappy performance
Be aware if you're using other VMs already. When enabling Hyper-V, other VMs like VirtualBox will not work anymore. So if you use the same PC for work and fun don't panic on monday morning
Windows 11 comes with Windows Hypervisor Platform which enables VMWare Workstation to coexist with Hyper-V w/o disabling one of them beforehand. IDK if VirtualBox supports it.
@@Skyline_NTR I ran into this issue when I activated Hyper-V on Win 10 in late 2021. Maybe if it's supported, at least some additional magic is required to make it work. I did not dive too deep into this issue because it's my productive system and I did not want to mess any further.
I'm already doing this with Windows Remote desktop and RDP Wrapper, it needs a bit a tweaking in the windows groupe policies and registry but once it's set up it's working great, I've only tested with one extra user gaming at the same time as the host though
So I just got done trying to set this up, but was having issues with connecting any extra controllers, such as my Xbox controller and USB steering wheel. Have you found any good way to connect these through RDP, or anything?
@@paularie2202 for some reason I wasn’t able to get that to work. But what I was able to do (for anyone else that reads this) was start a RDP session and then use Parsec to connect my peripherals. This allowed for a seamless connection, but also made sure that I could still use my host computer for other activities at the same time.
One of the best and most helpful video I've ever seen! I bought very good PC, but was unable to share it with my girlfriend which was stuck on old laptop. Now, we can play together with very good performance! Thank you Linus :D
can u tell how much ram do u put into the VM, and wich GPU do u have? Also if u are using 2 separate ssd for each machine (host and VM) or just HDD for the VM
I've put 32GB to VM, and my GPU is RTX 3090 Ti. Also, I don't have separate HDD, my whole PC is running 2TB SSD M.2 NVMe, from which I put 200GB to VM. However take note, that host computer has always CPU and GPU priority over VM, so when I play Apex Legends on ultra details on my 32:9 144 Hz, she got only 30 FPS in Cities Skylines on her FullHD montor. But I've tested it on Leagues of Legends, and there is no problem in running 4 instances of game (host + 3VMs) in 240 FPS.
This is an awesome way to manage such sad times with GPU "shortages". My concern would be if Nvidia sees this as a thing and limits or nukes it's possibility all together then push it to newer GPUs as a "feature" just so it can be -further- up sold.
@@ikkuranus oh I didn't see that, okay maybe it does work in windows 10, I swore it didn't use to as this was a big controversial thing when I first heard about Nvidia disabling gpu splitting on the lowerend models
GPU-P was also introduced in Windows 10 20H1 I think so it’s technically possible since them, I hope in next OS updates of Windows 10 and 11 they add a GUI configuration for GPU-P so anybody can set it up by themselves without even needing to watch and store tutorials anytime they want to create one
You guys should make a video about the new 3D printed OLED. A research group of scientists at the University of Minnesota Twin Cities has succeeded in 3D-printing the entirety of a flexible organic light-emitting diode (OLED) display for the first time. I know it's only 1.5" and only 64 pixels. But still looks promising.
@@ezicarus8216 So what? Knowing that they can already do it should be interesting enough. Plus, things can scale way faster today. Specially if we already have the tech, just making it cheaper. Just like they are already using sodium-ion on batteries. It's way faster now.
@@ezicarus8216 Sorry, did I ever said about mass production (tech being ready =/= viable on the market)? I did said about things that can go faster now days. Even the batteries are about to get on the market, only because now they look like a good option. We had solar energy while we used coal, but the latter was cheaper and easier. Back to 3D Oled printing... stil impressive and can be a cheaper option. Never said that IT WILL be. You just went way too emotional about this, buddy. Chill = )
@@ezicarus8216 Sure, pal. I just said that this looks interesting and would be a nice option for cheaper screens. Learn to control your emotions, big guy. Have a good one.
@@ezicarus8216 should =/= demand. Not a promotion, just an idea. Linus have some videos about "old tech" being crafted in a curious/cool way. You are way over your emotions. I feel sorry for how triggered you are over a simple idea for a video. Please, do yourself a favor and play outside more often. Cheers!
@@ezicarus8216 Oh boy. I think it's better if you get some help. I meant it. Been here on LTT since 2009. You are the first salty, childsh, spoiled, triggered boy I have had the displeasure to find. You did it... you are that one guy nobody likes. Congrats! I really hope you find the help you need, hit the books (you desperately need some interpretation), go outside and grow up to be a better non trashy human being. The year just started, you still can make this your resolution. I wil be reporting you and blocking your profile. Again, have a good one. = )
GeForce Now means very little to Nvidia’s balance sheets. Quadro card and workstation sales on the other hand are huge. Anybody doing anything related to machine learning and AI probably runs CUDA on an Nvidia GPU, and they’d rather we didn’t buy the much cheaper GeForce cards to run all our compute instances.
@@benjaminoechsli1941 GeForce NOW is and always was free... it's just that you have to wait in the queue for 10-20 minutes to get 1 hour of playtime. Paid plan skips queue, gets you 6h sessions, 120fps and RTX. However, if you want to play singleplayer games you don't have to wait with free subscription. When I had a slow computer I used to play HITMAN there, with a free subscription. Never had to wait. Multiplayer games had queues, though.
@@mediumplayer1 Correct, but the point of this video is multiplayer 100-200+ fps RTX gaming, which would require a paid subscription if you don't want queue times.
Frankly i'm more interested in the para virtualization than I am in anything else. My understanding is that para virtualization amounts to virtual machines sharing a single copy of the windows kernel rather than each VM having its own copy of the windows kernel. Which makes the para virtual machine overhead smaller then the overhead cost of having each virtual machine run its own copy of windows. (As I said "my understanding", which is hazy, and I could be wrong) Does the para virtualization only work on Windows 11? Can I run the para virtualization software on Windows 10 if I don't care about sharing a GPU?
Could cs go have been memory bound? You have a great GPU and CPU, could it be the ram speed being overwelmed? Could you try a cpu with quad channel mode?
@@TitanBoreal Moonlight should be able to do something similar. It's an open source implementation of Nvidia's game streaming tech. I was using it wirelessly on the opposite end of my house to play games on my jank laptop before.
@@TitanBoreal my team is actually working on a parsec competitor focused on general consumer use where each window is seperate rather than the entire desktop, will be a few more months until we get it out but is planned to be free for personal use.
Talking about how Nvidia locks down this functionality while ignoring how this might work on an AMD GPU is a great way to give credit to the "LMG is owned by Nvidia" people
The performance theories part is pretty simple in my opinion: 1 Virtualization is not 100% effective 2 the GPU not only needs to run games, it needs to run windows, and stuff
The most problematic part is moving all that extra data between the systems. 4K image at 300 fps would require pushing 3840x2160x4x300 bytes per second for a single client or 10 GB/s. Unless you have RDRAM networking in all devices and GPU drivers to use that, that will eat 10 GB/s out of your memory bandwith and PCIe bus. Multiple by N clients and you could up being memory bandwidth limited no matter what CPU and GPU you have. That Threadripper with quad channel RAM might actually help here even though you end up sacrificing some single core CPU performance.
@@MikkoRantalainen The frames remain in VRAM, rather than being moved to System RAM. Parsec encodes the frames in VRAM, so only the encoded frames need transferring out, a nice and cool 5-50Mbps per Parsec host (per VM, in this case) depending on the configuration. If the full frames were being transferred around, we (I'm one of Parsec's devs) wouldn't be able to achieve the performance that we do.
@@kodikuu Yeah, this is a great example why you want your GPU to include hardware encoder even if you had multiple extra CPU cores available for the task.
@@MikkoRantalainen Parsec doesn't use software encoding to begin with, it will fail without hardware. It will also only encode on the GPU it's running on, which must also be where the display is connected, and what Parsec is running on
I'm guessing the 3080 Ti is splitting the memory into 4x3GB. 3GB is enough to run Doom Eternal and Halo Infinite without vram limitations but 3 is a minimum, there just might be a purpose for the 3090 having 24GB after all.
That's a nice way to do this stuff, props to Parsec! I used to do it with Aster, it's very light on resources and disk (Because everyone is using only 1 windows and the same installed programs but everyone with a different user) and it does not have this lag/frame drops/compression as it is all run on host, no vms, the only downside afaik of Aster is that only 1 person can run Steam 100%, because to run 2 or more steams with this method you need to use Sandboxie and some games(VAC/EAC) won't work inside sandboxie but evething else works, ubi, origin, riot games.. Guess I'll try this Parsec mode to run Steam games that have this problem.
@@sntg_p It's really that good. My cousin lives in a house near mine and we have a lan connection between them, I use a dummy plug in my GPU and he uses SteamLink app in his Smart TV and we play Payday 2 really smooth even with average hardware (1070, 2700x, 16gb ram)
And it's all auto, turn on PC it autologs both accounts and opens Steam at his side to connect with SteamLink. If you're going to try this method remember to setup the CPU cores at Aster because if you let Windows manage it some games will try to use 0, 1, 2 at same time and leaving the rest idle.
I have to say, even though this looks like a good idea, this is not the correct aproach. The scaling wasnt correct because you have to consider that you are running about 3 windows os instances on the scaled scenario, the host and the 2 vms. The correct aproach would not use vm at all, just enveloping the game to limit resources and streaming the game window to the clients, that would give you the native performance.
windows may have issues managing more than one game as the foreground app will get priority over the background apps. also youd have to direct inputs to each app. the optimal setup would be using kvm with multiple dedicated gpu's. Or if we limit the framerate and resolution of the host and guests to allow more even distribution of cu's. hyper v gpu-pv likely will never be 100% efficient.
Problem is having 2 windows open AND in focus. I did try it a year ago and didn't succeed. I got 2 separate keyboards and mice running on a sigle PC but Windows just doesn't allow you to interact with multiple windows at once
@@taylervest6594 Little bit more educated on the matter now, you can use rdpwrap to patch windows to allow more than one instance of rdp connection, change the registry to allow 60+ fps rdp, and then just use Remote Desktop Connection, i dont have the hardware to test that but it should eliminate all issues.
Having grown up with token ring terminals in our school computer lab I've always been surprised that the idea of having one main computer and then a bunch of dumb terminals never caught on for home use.
Hey Linus & your crew! How is this compared to unraid? Any performance difference? Also possible to locally access the stream (got get rid of latency) and for example have one PC for an isolated Gaming installation and an Isolated work installation? Very interesting topic 👍🏻
This is actually incredible. I remember trying to replicate the 2 Gamers 1 CPU build you did ages ago while in high school. And now I can try and do the same thing again but with my current build and only one GPU. Also that wasnt hacks, its a very common spot to just spray and they got lucky. Pain.
Any idea if this would work on AMD video cards, or is it nvidia exclusive? It could be interesting to have this as an option for when friends want to play games when they're over but don't have their computers or something of that nature as well.
Nice work Colin! I've seen other "TechTubers" doing very similar videos, though that was prior to Parsec publishing a script to make it easier. Given your recent Linux Gaming Challenge, what would be super sweet is to see a video of how to run Windows VMs with GPU para-virt enabled on a Linux Desktop, so that one may use the Linux desktop for "work" and then after work is done, boot up some Windows VMs for "play"... with friends! :-)
It's possible to do with any kind of graphics, AND without configuring VM's at all using software called Nucleus Co-op, which sadly has no Linux release as the devs don't see enough interest atm.
The fps reduction for the VMs could be due to additional usage from having to send the connection through the network, something the host doesn't have to do. May also be windows prioritizing the host in the case of a bottleneck somewhere
There is probably also some overhead in the virtualisation layer. Depending on what kind of drivers are installed in the VMs this *could* have an impact. I distinctly remember that Citrix Xen Server (a virtualization solution that also utilizes para-virtualization) required additional guest drivers to be installed on the VMs. It'd work without them however certain features were disabled (like controlling the VM from the Hypervisor - i.e. shutting it down gracefully and such) and performance was worse. Though that was compute only - so no graphics cards involved.
I’m interested to know how this works for VR. My wife and I both have a Quest 2 and I stream my games wirelessly with Virtual Desktop but she just plays games natively on the Quest. It would be good to both be able to play PCVR together without having to buy a new PC with GPU’s you can’t get anywhere.
@@JackStratton Makes me want to try it even just with my 3060ti. Maybe 90fps per quest is asking for too much so 72fps would be ideal. Much better hardware would be pretty neat if you had the spec for 90fps each and well depends on the games too
Nvidia has quietly removed some of the concurrent video encoding limitations from its consumer graphics processing units, so they can now encode up to five simultaneous streams. So now you can have 6 players (one local + 5 VM). But I think you definitely have to have 24Gb VRAM for this :)
Here is almost certainly why the scaling isn't perfect: context switch. When you only have a single process on the CPU, the process runs at rate 100. When there are two processes, there is now a non-zero context switch overhead, so the rate becomes (100 / 2 - context switch overhead) < 50. Same story for GPU.
This, but I think it's more because the video output is still being rendered and outputted by the GPU. So the GPU is rendering/outputting every other frame (one for each VM). Then you've got the context switching between the two VMs on top reducing total framerate
Parsec provides some statistics on this Typical values for latency is: Decoding: 3-5ms, Encoding: 2-5ms Network: 1ms So a total of around 10ms just for running parsec - surely theres some more latency and i wouldnt do it as a comp. player but most games run fine
It helps to know that Parsec has quite a heavy GPU overhead in addition to what ever the game requires. That said, I LOVE parsec and use it to game on my main PC on my old laptop. It’s a great gaming experience. I’ve also used a dummy hdmi and sent my physical output to big screen, while remotely connecting to the secondary monitor. That was my karaoke setup so big screen has video output, and my laptop is great for the control software.
If they had really cut that GPU in half they could probably still re-sell both halves for higher than the MSRP given the state of the world right now.
"Radeon RX6900XT $1399"
*not working, for parts only
I wonder how much they could actually get for that GPU since it looks like a quite old Radeon card with those 2 DVI ports.
No problem. Just solder the traces back together and there you go.
@@florianlucs7229 what if they hit the die, a memory chip, or some power management circuitry? If it's a multilayer PCB, they could have mushed some traces together internally. There's a lot more than soldering traces back together.
I found Easy way split GPU to both system runs on windows 10
I split my GPU into two and me and my bro used 1 GPU to play multiplayer game Borderlands 1,2, pre Sequel multiplayer we made two users in my desktop and run space desk and install persec on second user and use login to second user and connected second user to laptop and then i switch to first user he use mouse and keyboard on laptop to play game using parsec and I use desktop to play form desktop his laptop is so low it has no fan celeron also has no 4 gb ram and hdd Hard drive we played all game smooth using this method no vm not complex it's simple and
Or you could do what I did, get a 64-Core Epyc Rome, 256GB of DDR4 ECC, 3x Titan X Pascal and 10 1.2TB Enterprise SSD to run 12x Gamers off one rack mounted system. Awesome video Colin (and Linus). This tech is incredibly powerful for distributed computing, and I can't wait to see what the future holds.
Your videos are always fascinating and fun to watch. I thought about your content first when Linus mentioned the topic of this video 🙂 Keep it up 👍
Finally a way to play AoE2 with friends like the old days
Does this actually work?
Meaning using multiple GPUs and doing this?
Me and some friends often meet up for league of legends and games like that. Usually with 5 people. Instead of them all bringing their fulltower they could grab any ultrabook and another brings a second gpu and we could chop it up.
Like I have a 3080. My friend has a 3090. We could take 2 streams from his gpu and one from mine and it still works?
so thats why we are short on gpus, you bought all
I was about to buy one of those GPU servers to start up a VM provider, I almost bought up the one with six 2070 thinking I'd host atleast 6 clients for now...
That was when splitting GPUs were not really a great experience yet, I ended up just waiting on it, stocking up on gold while the world started brunning down.
I kinda regret it considerin the GPU price right now, so now I have to wait for them to go down again, and I'll do this except split them in two!
I feel that as a kid linus' experiences with being broke and wanting to game coupled with his experience as an actual company owner has really helped his want/need to get around arbitrary bs like this, it's really really solidly good for gamers.
Sharing the cost of 1 top tier GPU with the boys would definitely be easier and cheaper than finding 4 mid-range GPUs
well yea, but i just thought of a better idea for home.
i mean my wife sometimes wahts to play civ6, but not htat often and when she palys, she likes to do it on her abit older and slower laptop. well i havent had the need to upgrade it because she does not play that much. but she could log in to the VM on her laptop and run the game there.
This gave me a good idea.!
@@AndresUffert2 Exactly. It still looks a bit too fiddly for me to bother but I can really see this being a great way to make a home server (even more) worthwhile.
I mean, looking at performance numbers, I’d expect that any midrange card would still significantly outperform this VM. Plus, if you’re playing competitive games like CS:GO, the latency would likely be unbearable.
do you see them running 4 games in 1 instance playable for 4 players and the game there running is cs-go an extremely easy game to run...just go buy 4 haswell systems and GTX 970's cheaper then 1 3090 the only reason your even seeing a video like this is cause linus has been paid to make it by NZXT not because its a good idea or good value
Isn’t it sad, that’s the current situation we live in.
I was all into virtualizing my gaming PC until I learned how much anti-cheat software hates VMs.
Oh most definitely. Exam browsers as well. Took a PSI remote exam and their exam browser detected that I had VirtualBox installed. Even though the service was disabled and all. Only thing that worked was uninstalling it.
yup. it's frustrating
That's something Linux users are used to.
@@tuxi04 absolutely, been trying to find a way around this issue for four years because of college :(
I gamed in a VM with PCIe passthrough for a year until some of my most played games stopped working because of anti cheat and there were some quirks that were annoying too
this might be an interesting one for households with a kid or two kids that are looking to move from console to pc. Having a hub pc for the household is an interesting concept.
The nightmare every time someone cuts out...
I'm surprised nvidia hasn't jump on this would encourage people to buy more expensive gpus.
I own a rtx 3080 but would happily waste the extra money buying a 3090 if this was possible with out the work arounds
yea it's far from that I am sure but the principle is interesting for sure.
I did this myself and my kids and I can play No Man's Sky together without issue.
I've had this working for months with no practical reason other than it's cool.
To share a GPU with friends, it requires two things:
1) a GPU
2) friends
I only have a GPU
You have a GPU? Lucky.
I can be the friend
same here lol
I have a gpu too!
i have neither :
Linus' GPU is so big, he has decided to split it to 2 and share it with others.
Such a generous man.
those communists!
I found Easy way split GPU to both system runs on windows 10
I split my GPU into two and me and my bro used 1 GPU to play multiplayer game Borderlands 1,2, pre Sequel multiplayer we made two users in my desktop and run space desk and install persec on second user and use login to second user and connected second user to laptop and then i switch to first user he use mouse and keyboard on laptop to play game using parsec and I use desktop to play form desktop his laptop is so low it has no fan celeron also has no 4 gb ram and hdd Hard drive we played all game smooth using this method no vm not complex it's simple and
If you can make them to read my message it would be helpful for may new easy way video will come
This should be the discription of the video!
@Danny Blue that's expected from a Canadian lol
Anti-cheat software: Oh no you don't.
Scalpers: Oh no you don't.
lol
I remember you mentioning this project a few weeks ago, so I’m really excited to see it finally happen. Cool use of Paraec, too.
@K A L I S A !!!!!!! get out of my swamp
Parsec?
Yeah that app to share access to a pc
Parsec is far from perfect. But it beats not being able to play games with your friends
Use Aster.
This would have been great for your "Leftist PC" build, it really is our GPU.
Hah! Yeah
?
Linus should contact AZAN and ask him to let Linus set this up on his PC
@@mcslender2965 ?
Commie Jarrod…the Regime Loves you!
I remember Linus mentioning this when the 30 series was originally preparing to launch. I'm glad he was able to get it running!
I LITTERALLY WAS TRYING TO DO THIS YESTERDAY! Thank you so much Linus for making this video and James Stringer for making this possible.
Give this method a try, it works better, Aster multiseat
@@marcogenovesi8570 video name or link, pls.
One of the biggest introductions of latency is queue submission and having to re-instantiate parts of the pipeline. Now I haven't investigated how para-virtualization works but at least each game submitting their own queue submissions is entirely unavoidable. This is because these submissions are specific to the geometry being rendered, which obviously is different for each game. This is why dividing a video card across users will never have good scaling and you will always see a bigger reduction than half the framerate.
There is major other factors to such as much more cache invalidations because the geometry in between frames is different,as well as more memory contention due to memory access patterns being now less optimally scattered from the different workloads.
We're getting there. One day this might be as easy as running a N64 emulator. Tech is advancing fast imo.
@@devintolliver92 GPU tech is advancing by keeping or even extending the rendering pipeline, absolute performance is going to increase over time but relative performance will stay stagnant
cpu scheduling is catching up and theres another gen of gpus coming
Does this mean that if you wanted to run this setup for say 2 users, you'll have to contend for a bit less than half gaming performance for both instances, instead of exactly half, just to account for the overhead of this workload not being as predictable and easy to calculate as normal gaming?
In that case I'd love to see how it fares with a benchmark running at the same time on both instances once setup properly, just to find out roughly how much overhead is expected for each configuration.
Or you could just use the program Aster multiseat to remove virtualization from the equation.
I think it's worth mentioning that some anticheat software (Vanguard, etc.) rejects running the game they are bundled with in a VM. Maybe there are workarounds for the particular game, maybe not, but don't expect to be able to play every AAA competitive game with this setup.
1-2 frame extra latency will take care of playing competitive esports games already.
@@MikkoRantalainen not all AAA games are played competitive, but I get your answer is directed at Alejandro's "but don't expect to be able to play every AAA _competitive_ game with this setup." For just evening match or 2 in games that are really restrictive with anticheat, then most likely you already have dedicated machine and no need to share GPU. The GPU sharing seems more of use in maybe setting up as some sort of couch machine/home media machine or for someone else than yourself in same household (to lower latency) to play something together.
Those anti-cheats also have a tendency to reject Parsec's input, or even sometimes it's screen capture, most of the time. Even without being in a VM, Vanguard is not playable remotely
@@krisavi With "AAA competitive game" I was thinking on games that value competitive integrity in a way that makes them resort to this kind of anticheat software. An "AAA competitive game" doesn't necessarily have to be a first-person shooter, where reaction times are paramount, or similar fast-paced games, but yeah, both of you are right in that such a setup won't offer the best experience on such games even if it worked.
What I think it is more important is that games with this kind of anticheat won't run at all. You can't even choose to play private, custom matches with your friends to have a good time, where some extra latency doesn't really matter.
@@krisavi couldn't agree more.
I think leaving the host VM preview open is hindering the performance a bit during the demo. It should be closed and the VM should be started from the menu option instead of opening the preview for a more realistic result
There's also going to be some overhead with the streaming/encoding. The main bottleneck here is going to be the limited PCIe bandwidth though.
Also the host always has priority in performance. So its better to do 3VMs than 2VMs + 1Host
@@wakannnai1 I don't *think* the 3090 is able to max out 16 PCIe lanes, we have yet to see GPU's hit those limits. Typically those limits are only reached with enterprise hardware (networking, storage, etc.). The biggest bottleneck is undoubtedly Windows overhead (as Windows is running 4 times on the same CPU/GPU), with the second likely being storage (though that should not effect in-game performance).
Either way -- like Linus said it is never going to compare to running on bare metal as long as you have to connect your display and peripherals over a network (no matter how fast your network is). If Nvidia opens up the driver (unlikely) to allow VM assignment to the GPU outputs that would drastically improve performance.. BUT Hyper-V would no longer work for that as it works differently than other hypervisors. You cannot assign specific hardware/ports to a VM only allocate their resources. Though it's possible that is the reason this works at all.
@@wakannnai1 most gpus work fine on pcie 2x16 so i dont imagine that would be happening on a modern system.
In a shared GPU environment like this, the host always takes precedence, thus if the host requires virtualization, it will be done first then the vm's.
How many takes did Linus need to keep a straight face during "definitely don't just spin up a new one whenever your license expires."
That was a real informative warning.
I mean you could probably run wine with Linux
Whatever you do, DEFINITELY don't build a KMS server and have your DNS point everything to it.
@@JosephHalder Yeah... at that point you could just use MAS and force a HWID activation through that. Way less of a headache than selfhosting a KMS
I was doing academic work in the virtualization high ability field circa 2011... For years there was a flopping between hardware and para in terms of what provides better performance and what will be the future. At one point around 2015 it seemed hardware will be it for good. I personally always felt para was a more elegant and cooperative design, though back then not available for Windows.
Linus should get a sticker for the back of his laptop that says something like “Full disclosure I am invested in Framework”
How about an "I'm Invested In" sticker above the Frameworks logo?
@@benjaminoechsli1941 nah let it actually say the word framework I want that shit to succeed and badly
I think the idea behind the laptop is good, but the build quality needs some work, especially the hinge. Check out how much the screen shakes compared to the other laptops.
Two questions I'd love to see a follow up on:
1) amd hardware
2) would windows server offer lower overhead / higher performance?
I doubt you would see a difference in performance on Windows Server, but you would gain the ability to directly map hardware to a VM…which granted, isn’t really the goal here.
I'd think that the host OS being Windows server might help the VM performance, but then I'd expect compatibility issues playing games directly on the host.
@@brychaus you're probably right about host gaming issues, but for a home-server use case, the three (or more maybe?) VMs would be enough for the home. Who has more than 2 friends anyways lol?
@@christopherprevost763 Yeah I see what you mean, but the host got so much better performance in LTT's testing that I would want to use the host to take advantage of the performance.
@@brychaus Host will always have the least overhead for hyper-v since guests have the hypervisor as additional overhead. I'm going to give this a shot over the next week or so with one VM as a proof-of-concept for my home setup.
The sped up, step-by-step, screen cap of the vm being spun up in windows, is actually very genius, and should be a standard for content like this-- that's technical and takes many steps-- maybe even queued up a bit earlier during talking points. It's almost like the first time I had seen a teardown on GN where the unscrewing etc was chopped up and played back normal speed. Kudos to the editor of this vid.
Really cool to see you trying this out. It's also possible to do this in Linux using VGPU unlocker.
Doesn't work with Ampere cards right now, so not entirely true.
@@Grimbakor That's true, not as wide spread, but it's another option.
It should be noted that having these instances virtualized means you will loose some performance due to the overhead involved in virtualization. How much of an overhead depends on things such as how efficient the hypervisor is.
Yeah, that's why people use Aster Multiseat, there isn't any virtualization taking place. This method sucks.
This is actually a great option for trying out windows 11 from a windows 10 gaming PC. Parsec makes it a much more native feeling experience than hyper-v alone.
If you ever use Moonlight. That to me feels even more native than Parsec also much smoother in my opinion.
"You need a Windows license for each VM"
I didn't know Linus is a comedian
It's probably more like a "I have to say this for legal reasons" situation. ^^
@@StitchExperiment626 same thing
Linus - "Here's how to do this thing Nvidia hates people doing on the cheap"
Nvidia - "Oh heck naw! Get to work people. You got work to do"
Sad but true
Also Linus. "Definitely don't just spin up a new VM every time your windows licence expires"
How about limiting the Fps in your Host system, should result in more evenly distributed Performance across the vm's i think?
Linus Tech Tips before: “No GPUs were harmed.”
Linus Tech Tips today: “GPUs were harmed.”
Colin: *Tries to saw a graphics card in half*
Everyone: *Sweating profusely*
It's a nearly 13-year-old GPU, no worries
@@cyberspectre8675 In this market, someone could probs still benefit from it.
Imagine you want to play a round of that game and having Linus testing his VM on your team.
This works on Windows 10 also (20H2 and higher) Craft computing did a video on it.
“You’ll need a windows license for each VM” well, not exactly, the host computer needs 1 Enterprise license and each client must also run Enterprise . Professional doesn’t cover delivering a remote windows desktop…and oddly, MS cares how many computers run the desktop, not how many instances of Windows are running.
If you're just using this for you and your friends when they come over you'll be fine. The laptops that are using the VMs all have licences anyway.
@@jammiewins sure, that was my point. Don’t license the VMs, doing so doesn’t actually make it valid. Either way you are just trying to fly under the radar.
I think only way MS officially lets you run Windows 10/11 in VM would be EA agreement with VDI permissions?
@@finnderp9977 correct, and in that scenario you basically have two licensing options. Licensing named users, or licensing devices. And contrary to what is logical, licensing the device means licensing the client device, not the server or the VMs it is running.
@@finnderp9977 All you need is a valid license for the operating system you're using as guest. Windows 10 has a limit of 32 virtual processors allowed.
I would love to see how this would run on and AMD RX 6k 16GB series card cause i get the TI is powerful but i want to see the power that AMD can output on the similar work load
"You should get 4 different Windows licenses. Definitely don't just spin up a new one when your license expires." *bites lip nervously*
Thanks for telling me what NOT to do Linus, I will definitely NOT do this if I try this build.
Every time I hear people say activate your windows, I laugh at all the shop computers and test benches running the same non activated version of windows 10 pro that I did in a single night on like 5 towers 🙃
microsoft doesn't deserve your money 1 time, let alone 4. For an inferior OS filled with adware, they don't deserve $100. Linux is free and the only disadvantage is lack of third party support, which microsoft have done nothing to deserve.
@@itdepends604 to be fair, Windows licenses are practically free with any prebuilt computer... I wish they'd just move to a freemium model and let me disable ads, automatic updates, etc on a pro license. Linux gives you choice but the average consumer isn't great at dealing with tech issues by themselves.
@@DelphinusMAch1 windows licenses are NOT free with prebuilts. While OEM's obviously get bulk discounts, it still adds around an extra $50 to the cost of the computer.
Almost all of the problems with linux are software support. Apart from this, linux is actually more stable then windows. People are obviously going to find it harder to sort linux issues if they're used to windows, but linux issues generally aren't any trickier to solve then windows issues.
So I completely stand by what I said. I am not paying a useless company to use my already payed for software, when they are already getting payed by their adware anyway. My VM's will stick with unregistered windows.
16:23 "Windows License For Each VM"
This is where gaming on Linux can really take off.
Spooling up 4-8 instances of a game with just one box could bring LAN parties back to mainstream.
You do realize that was a joke? You don't actually need to pay for Windows these days...
when he said, "you DEFINITELY shoudn't spin up new instance when it expires" there is hidden wink there
@@shiverwind Yeah that's what I get for watching everything at 2X.
Amazing timing for this video. My GF and I have just started to play some games side by side in the evening. I‘m playing on an R7 5800X + 3080 Ti tower, she‘s using a Surface Pro 4. I was almost thinking about getting some used hardware for a few hundred to get her a better experience. Now I can just share my system. I’ll definitely try with my gaming buddy later on, he’s rocking a GTX 560 Ti… Thanks so much!
I liked the format of the LTT store promo, less intrusive and not repetitive - if you could do it like that every time it'd be cool
It would be interesting if instead of splitting the VRAM up evenly if it could detect if multiple clients had the same game open and have a shared memory pool so they could share the vram.
Aster multiseat < Basically does this in real time with ALL of your hardware.
I’ve been waiting for this one since the short clip, thanks.
Hyper-V is pretty damn easy to set up but stilll impressive for someone that has never touched it.
Could you do this with a AMD card aswell, would love to see what the performance would be like on team red
My biggest question after this video
Thats what I was wondering. My 5700 xt would be nice for two 1080p games side by side
@@Tekgnome. thats my exact question. It's definitely strong enough to handle it. Would love to co op Total Warhammer 2 with my gf.
@@marcogenovesi8570 From a quick google, the passthrough functionality works on AMD too (and they never blocked it off). But it might need to be done in another way than in this video.
@@marcogenovesi8570 That's completely wrong, it definitely works with AMD, Intel GPUs as well.
Running a LAN party off 1 or 2 machines when your friends don’t have PCs is pretty sick
So they're steaming the signal to their laptops, but the processing is being done by the desktop in the background!? Holy crap that is awesome for lan gaming sessions!
I see what you did their…..
”steaming” 1 game to 2 laptops from a desktop. 😂
(I’m sure that was probably a typo, but it actually works out since they are indeed using steam….also totally not trying to hate or anything but I’d write “lan” = LAN . Since “Ian” looks exactly like my buddys name “ian” as capital i’s and lowercase L’s look identical.
I have been using Aster for years on Windows 10. It allows spliting outputs. I highly recommend trying it out.
Yeah, I used it too and I'm very surprised they nevet heard about it, because it's more than capable for gaming.
I've seen forum posts asking why ignore Aster, a response iirc was 'windows could potentially patch that out' yet here we are on the cusp of W10 going EOL.
I wonder why Microsoft doesn't want to support native multiseat, their licencing keys are under 10 dollars constantly on sale, and The Linux Challenge overlooked multiseat too.
Pls Anthony, show them how linux does multiseat + proton.
Bit Aster seems to have issues running multiple of the same instances at once where it is not allowed, like it is often the case with games. Or am I wrong?
@@lugaidster ah thanks. Tried Sandboxie before and it didn't work for the games I wanted to play, so I used the HyperV approach with GPU paravirtualision.
@@lugaidster GTA 5 also didn't run neither duplicated nor in sandboxie
The idea of running multiple game instances on one machine reminds me of the days of split screening locally on older game consoles as a kid. Would be a killer config for LAN parties.
This is why I want build a PC like this
Damn makes me think this was their plan all along. Make us buy two copies of the games 😂😂😂
This is an intriguing idea for my wife and I (both architects WFH) to have one workstation two architects. She’s remoting into her office and I work locally so this really could work even with a 1060 & 8700k.
What is the actual input delay though when using two separate machines through virtualization? Because that matters a lot when playing fps games.
Depends on what hypervisor your using I use proxmox on my home server and can’t notice any perceptible input or output latency
More then what you really wants always, its a cool way but fanboys and competitive play is out of window. This is why virtualization works in datacenter because its whole other technique but still with lag. Always!
Most of the input delay comes through the streaming process, even on a wired connection with the "low input lag" feature in parsec enabled it's still slightly noticeable.
This method sucks due to the delay, I'd recommend aster multiseat instead.
@@vladger6097 How does Multiseat resolved any of that? You're still dealing with network latency for both display and input over a network.
That little bit of overhead is surprisingly small, and less impactful than I'd imagined.
But yeah ima just save myself the money and throw another RX 580 in my rig when I decide I need solutions to problems that don't exist
unfortunatly this doesnt work on rx 580s :(
They do exist when you have 4 or 5 children trying to game at the same time this could save a lot of wasted energy 5 pc's can become 1 pc
@@waderyun.war00034 and if your kid has a school project that need good computer, they can weild a PC that would be 5 smaller PC if budget is equal
@@waderyun.war00034 I may be wrong but i think hes talking about the imaginary problem being nvidia's lock on splitting gpu, not "1 pc multiple gamers"
@@acatch22 little of column a, little of column b
I've been researching this on and off for forever. This was the main thing keeping me from switching to a fully virtualized setup. VFIO time, maybe!
Time to actually watch the rest of the video to see if it's practical.
ah fuck it's hyper-v specific.
Use Aster!
Check out Craft Computing's misadventures, if you didn't already
You can do that with Looking Glass or VFIO is also an option. SR-IOV(the splitting of gpu) is somewhat of a hack right now, unfortunately nvidia are still buttholes about it. But it you can make it work and even share basically hardware connection to it with Looking Glass. Or you can do VFIO passthrough if you have a few gpus laying around(lol)
this is nothing even remotely distant to VFIO.
It's great to see Jonathan in the background of this, cracking up playing while Linus delivers the script.
00:00 strong red shirt jeff vibes
First thing i did when a saw the intro was look for this comment.
Technically wouldn't there be three instances of the pc running- like the original and then the two virtualizations- maybe it is holding back a little in case the original instance needs juice? I honestly do not know anything.
Iv been waiting for a follow up video like this!
I'm curious - could there be a 'comparison' in cost between having four individual systems (with similar performance) to having this sort of setup with four virtual machines? I mean, is it actually cost-effective to run one powerful machine over four not-so-powerful ones? What about the cost to run the systems (power draw, etc)?
From some quick math and assumptions, assuming you get the GPU's at MSRP and buy all new parts, the expensive system (3080 Ti, 5950x, 32GB RAM) and 4x cheap systems (GTX 1650, i3 10100, 16GB RAM) would be within a couple hundred dollars of each other in either direction depending on exact parts chosen. If we swap the 3080 Ti for a 3080 however and go off MSRP the expensive system would be much cheaper and I don't expect performance would change too much. As for power consumption they probably wouldn't be terribly off from each other but this is really hard to estimate without actually building these systems and getting exact power numbers (not enough to just add max power limits of the parts together as they probably won't push to the complete max and PSU chosen makes a difference).
Of course there's multiple ways to go about this, for the cheap system I chose parts that made sense to me but someone else could choose cheaper or more expensive parts. If you go used the story is completely different and 4x cheap systems would most likely come out way cheaper than a single expensive system but power consumption would probably favor the expensive system in that scenario. If you have any questions about any specifics on the systems or rationale feel free to ask
Way cheaper if you don't all game at the same time.
Have two people game during the day and sleep during the night
Have two people game during the night and sleep during the day
2x performance for cheaper price.
You could use one system with multiple gpus and assign them to the various vms. This might be cheaper depending on the gpu that was purchased.
@@taylervest6594 depending on how many, that might require a much beefier host, in terms of available PCIe lanes more than anything.
I've run into this problem with my 3900x based server, add two GPUs, 10gb network and a disk controller there's not enough lanes. I'm upgrading to first gen Epyc for this reason (I can also consolidate some of my other gear), 128 lanes all the way.
@@morosis82 i have 2 gpus but not the 10gb networking with my 3900x though i think that setup would be fine with my mobo asus pro ws x570 ace.
Just 40 seconds in and I'm allready hooked. This looks interesting.
I've been doing this for years on linux, using the DRI subsystem to shard a single GPU between multiple users on multiple seats. Works best if you either use integrated graphics or toss an ultra-low-end graphics card in to get more display outputs. You *can* drive multiple window systems off a single GPU, but it gets a bit unreliable compared to one-gpu-per-seat. You can usually get suitable GPUs for about $15 each. And without the overhead of a full VM stack for each copy, the performance hit is much less.
Is there anywhere I could read about this? If I understand right, your main GPU supplies the processing power, but the extra, crappy GPU supplies an output port, yes?
@@CogoGaming display driver and graphics driver are not the same basically. Look in windows hardware and you see the same. They are different.
@@prydzen this is actually how this project that linus featured works. Basic Display Driver/Hyper-V Display for, display and for Graphics its your GPU.
I would love to see technical videos like this, but for running more basic things on Linux. This takes me back to when I was learning how to use Windows as a kid.
No GPUs were harmed in the making of this video.
Are you sure?!
@@Sinaeb dunno, ask linus
Liesss
Mm until overheating
I was just hoping that that GPU was DOA or something
This really makes for some nice use cases. I have two kids, and using parsec on a pair of raspberry pi’s to play some local games with them sounds amazing.
You should test fixing the frame rate on each VM to get more consistent performance without rinsing the GPU
So, since Nvidia prevents partitioning on consumer cards, would this be theoretically easier on a 6900? And would that work for splitting a hardware port per virtualization?
The GPU isn't being split in a traditional manner, Windows is doing all the work. So long as the drivers are set up right, either one works. I recommend Nvidia though, since it's encoder is higher quality and lower latency, making it better for Parsec
the amd card? yeah i looked into this because i want a home virt server and sadly instead of the hardware being locked on amd it’s just not there, they have had vgpu gpus but they cost way to much for a card with crappy performance
This has been working fine for months on my 6900XT graphics card.
Finally! I've been longing for this guide for some time now.
Be aware if you're using other VMs already. When enabling Hyper-V, other VMs like VirtualBox will not work anymore. So if you use the same PC for work and fun don't panic on monday morning
Not even with nested virtualization?
Windows 11 comes with Windows Hypervisor Platform which enables VMWare Workstation to coexist with Hyper-V w/o disabling one of them beforehand. IDK if VirtualBox supports it.
Apparently it's also a W10 feature (as of 2020). So at least VMware workstation and virtualbox should work just fine with Hyper-V on.
@@Skyline_NTR I ran into this issue when I activated Hyper-V on Win 10 in late 2021. Maybe if it's supported, at least some additional magic is required to make it work. I did not dive too deep into this issue because it's my productive system and I did not want to mess any further.
I'm already doing this with Windows Remote desktop and RDP Wrapper, it needs a bit a tweaking in the windows groupe policies and registry but once it's set up it's working great, I've only tested with one extra user gaming at the same time as the host though
So I just got done trying to set this up, but was having issues with connecting any extra controllers, such as my Xbox controller and USB steering wheel. Have you found any good way to connect these through RDP, or anything?
RDPGamepad works great for xbox controller, I don't know if it would make a steering wheel usable though
@@paularie2202 for some reason I wasn’t able to get that to work. But what I was able to do (for anyone else that reads this) was start a RDP session and then use Parsec to connect my peripherals. This allowed for a seamless connection, but also made sure that I could still use my host computer for other activities at the same time.
One of the best and most helpful video I've ever seen! I bought very good PC, but was unable to share it with my girlfriend which was stuck on old laptop. Now, we can play together with very good performance! Thank you Linus :D
can u tell how much ram do u put into the VM, and wich GPU do u have?
Also if u are using 2 separate ssd for each machine (host and VM) or just HDD for the VM
I've put 32GB to VM, and my GPU is RTX 3090 Ti. Also, I don't have separate HDD, my whole PC is running 2TB SSD M.2 NVMe, from which I put 200GB to VM.
However take note, that host computer has always CPU and GPU priority over VM, so when I play Apex Legends on ultra details on my 32:9 144 Hz, she got only 30 FPS in Cities Skylines on her FullHD montor.
But I've tested it on Leagues of Legends, and there is no problem in running 4 instances of game (host + 3VMs) in 240 FPS.
This is an awesome way to manage such sad times with GPU "shortages". My concern would be if Nvidia sees this as a thing and limits or nukes it's possibility all together then push it to newer GPUs as a "feature" just so it can be -further- up sold.
They already are, which is why it doesn't exist in windows 10. They probably will try to block it
@@dratrav if it don't work in windows 10 then why did it say it supports 10 pro in the easy gpu-p script? 11:46
@@ikkuranus oh I didn't see that, okay maybe it does work in windows 10, I swore it didn't use to as this was a big controversial thing when I first heard about Nvidia disabling gpu splitting on the lowerend models
@@ikkuranus that’s why the script exists, to unlock the functionality, they’ll probably get sued for this
@@pineapplepie4929 The scripts don't unlock anything, GPU partitioning is a feature built into Hyper-V. The scripts just set up the VMs for you.
GPU-P was also introduced in Windows 10 20H1 I think so it’s technically possible since them, I hope in next OS updates of Windows 10 and 11 they add a GUI configuration for GPU-P so anybody can set it up by themselves without even needing to watch and store tutorials anytime they want to create one
linus i was just looking to KVM and i found this video of yours, havent seen it whole yet but it might be a blessing
You guys should make a video about the new 3D printed OLED. A research group of scientists at the University of Minnesota Twin Cities has succeeded in 3D-printing the entirety of a flexible organic light-emitting diode (OLED) display for the first time.
I know it's only 1.5" and only 64 pixels. But still looks promising.
@@ezicarus8216 So what? Knowing that they can already do it should be interesting enough. Plus, things can scale way faster today. Specially if we already have the tech, just making it cheaper. Just like they are already using sodium-ion on batteries. It's way faster now.
@@ezicarus8216 Sorry, did I ever said about mass production (tech being ready =/= viable on the market)? I did said about things that can go faster now days. Even the batteries are about to get on the market, only because now they look like a good option. We had solar energy while we used coal, but the latter was cheaper and easier. Back to 3D Oled printing... stil impressive and can be a cheaper option. Never said that IT WILL be.
You just went way too emotional about this, buddy. Chill = )
@@ezicarus8216 Sure, pal. I just said that this looks interesting and would be a nice option for cheaper screens. Learn to control your emotions, big guy. Have a good one.
@@ezicarus8216 should =/= demand. Not a promotion, just an idea. Linus have some videos about "old tech" being crafted in a curious/cool way. You are way over your emotions. I feel sorry for how triggered you are over a simple idea for a video. Please, do yourself a favor and play outside more often. Cheers!
@@ezicarus8216 Oh boy. I think it's better if you get some help. I meant it. Been here on LTT since 2009. You are the first salty, childsh, spoiled, triggered boy I have had the displeasure to find. You did it... you are that one guy nobody likes. Congrats!
I really hope you find the help you need, hit the books (you desperately need some interpretation), go outside and grow up to be a better non trashy human being. The year just started, you still can make this your resolution. I wil be reporting you and blocking your profile. Again, have a good one. = )
Geforce Now is probably the reason they don't let you split GPU's you don't even need a gpu with geforce now just a decent ISP
Right. Why would I pay a subscription for GeForce Now when I can take a slice of my friend's rig for free?
GeForce Now means very little to Nvidia’s balance sheets.
Quadro card and workstation sales on the other hand are huge. Anybody doing anything related to machine learning and AI probably runs CUDA on an Nvidia GPU, and they’d rather we didn’t buy the much cheaper GeForce cards to run all our compute instances.
@@benjaminoechsli1941 GeForce NOW is and always was free... it's just that you have to wait in the queue for 10-20 minutes to get 1 hour of playtime. Paid plan skips queue, gets you 6h sessions, 120fps and RTX. However, if you want to play singleplayer games you don't have to wait with free subscription. When I had a slow computer I used to play HITMAN there, with a free subscription. Never had to wait. Multiplayer games had queues, though.
@@mediumplayer1 Correct, but the point of this video is multiplayer 100-200+ fps RTX gaming, which would require a paid subscription if you don't want queue times.
You should try this with Nucleus Co-op. Then you won't need virtualization or sandboxing, which drastically improves performance.
I just realized something... Linus being the face of LTT means he's the LTT mascot
Flow
He really does ooze "mascot" vibes. Love it.
Imagine someone at one of their events just walking around in a big Linus costume
Frankly i'm more interested in the para virtualization than I am in anything else. My understanding is that para virtualization amounts to virtual machines sharing a single copy of the windows kernel rather than each VM having its own copy of the windows kernel. Which makes the para virtual machine overhead smaller then the overhead cost of having each virtual machine run its own copy of windows. (As I said "my understanding", which is hazy, and I could be wrong)
Does the para virtualization only work on Windows 11? Can I run the para virtualization software on Windows 10 if I don't care about sharing a GPU?
I'm wondering about this as well. Windows 11 is kinda shitty and I don't want to have to install it if I can avoid it.
Sounds like a great way for worms, ransomware, and other viruses to propagate LOL. gotta think like a black hat hacker ;)
Could cs go have been memory bound? You have a great GPU and CPU, could it be the ram speed being overwelmed? Could you try a cpu with quad channel mode?
Seconded. We need a second video on this topic. Perhaps even give Radeon a chance?
I want to see something like this on Linux. I wonder how much the lower load of the OS would help things.
Inform Nucleus Co-op devs that you would be amongst the people that want a port.
Now that's a good intro and how to properly "use" power tools, Linus.
I wonder how well this would work using Steam's Remote Play in the VMs vs Parsec, since they both operate in similar ways.
I'm triying it rigth now and they are allmost the same. I'm serching alternatives to Parsec so if you have any idea please coment.
@@TitanBoreal Moonlight should be able to do something similar. It's an open source implementation of Nvidia's game streaming tech. I was using it wirelessly on the opposite end of my house to play games on my jank laptop before.
@@TitanBoreal my team is actually working on a parsec competitor focused on general consumer use where each window is seperate rather than the entire desktop, will be a few more months until we get it out but is planned to be free for personal use.
Talking about how Nvidia locks down this functionality while ignoring how this might work on an AMD GPU is a great way to give credit to the "LMG is owned by Nvidia" people
With an NZXT sponsorship I expected fire.
The performance theories part is pretty simple in my opinion:
1 Virtualization is not 100% effective
2 the GPU not only needs to run games, it needs to run windows, and stuff
The most problematic part is moving all that extra data between the systems. 4K image at 300 fps would require pushing 3840x2160x4x300 bytes per second for a single client or 10 GB/s. Unless you have RDRAM networking in all devices and GPU drivers to use that, that will eat 10 GB/s out of your memory bandwith and PCIe bus. Multiple by N clients and you could up being memory bandwidth limited no matter what CPU and GPU you have. That Threadripper with quad channel RAM might actually help here even though you end up sacrificing some single core CPU performance.
Didn't think bout PCI express bus
@@MikkoRantalainen The frames remain in VRAM, rather than being moved to System RAM. Parsec encodes the frames in VRAM, so only the encoded frames need transferring out, a nice and cool 5-50Mbps per Parsec host (per VM, in this case) depending on the configuration. If the full frames were being transferred around, we (I'm one of Parsec's devs) wouldn't be able to achieve the performance that we do.
@@kodikuu Yeah, this is a great example why you want your GPU to include hardware encoder even if you had multiple extra CPU cores available for the task.
@@MikkoRantalainen Parsec doesn't use software encoding to begin with, it will fail without hardware. It will also only encode on the GPU it's running on, which must also be where the display is connected, and what Parsec is running on
I'm guessing the 3080 Ti is splitting the memory into 4x3GB. 3GB is enough to run Doom Eternal and Halo Infinite without vram limitations but 3 is a minimum, there just might be a purpose for the 3090 having 24GB after all.
Yes a 3090 with 24gb will be a better option .. but does gpu p support amd gpus?
@@Geoforcee100 GPU P?
The Parsec team is truly a treasure
That's a nice way to do this stuff, props to Parsec! I used to do it with Aster, it's very light on resources and disk (Because everyone is using only 1 windows and the same installed programs but everyone with a different user) and it does not have this lag/frame drops/compression as it is all run on host, no vms, the only downside afaik of Aster is that only 1 person can run Steam 100%, because to run 2 or more steams with this method you need to use Sandboxie and some games(VAC/EAC) won't work inside sandboxie but evething else works, ubi, origin, riot games.. Guess I'll try this Parsec mode to run Steam games that have this problem.
It would be cool if they can test this setup with Aster! I remember sharing my computer with my brother one time a PSU died.
@@sntg_p It's really that good. My cousin lives in a house near mine and we have a lan connection between them, I use a dummy plug in my GPU and he uses SteamLink app in his Smart TV and we play Payday 2 really smooth even with average hardware (1070, 2700x, 16gb ram)
And it's all auto, turn on PC it autologs both accounts and opens Steam at his side to connect with SteamLink. If you're going to try this method remember to setup the CPU cores at Aster because if you let Windows manage it some games will try to use 0, 1, 2 at same time and leaving the rest idle.
I have to say, even though this looks like a good idea, this is not the correct aproach. The scaling wasnt correct because you have to consider that you are running about 3 windows os instances on the scaled scenario, the host and the 2 vms. The correct aproach would not use vm at all, just enveloping the game to limit resources and streaming the game window to the clients, that would give you the native performance.
windows may have issues managing more than one game as the foreground app will get priority over the background apps. also youd have to direct inputs to each app. the optimal setup would be using kvm with multiple dedicated gpu's. Or if we limit the framerate and resolution of the host and guests to allow more even distribution of cu's. hyper v gpu-pv likely will never be 100% efficient.
Problem is having 2 windows open AND in focus. I did try it a year ago and didn't succeed. I got 2 separate keyboards and mice running on a sigle PC but Windows just doesn't allow you to interact with multiple windows at once
The whole basis for this is Nvidia drivers won't let you run multiple instances on one GPU. In any normal manner anyway.
@@taylervest6594 Little bit more educated on the matter now, you can use rdpwrap to patch windows to allow more than one instance of rdp connection, change the registry to allow 60+ fps rdp, and then just use Remote Desktop Connection, i dont have the hardware to test that but it should eliminate all issues.
Having grown up with token ring terminals in our school computer lab I've always been surprised that the idea of having one main computer and then a bunch of dumb terminals never caught on for home use.
Well, I saved up so long for a GPU and still can't afford it. :)
Hey Linus & your crew! How is this compared to unraid? Any performance difference? Also possible to locally access the stream (got get rid of latency) and for example have one PC for an isolated Gaming installation and an Isolated work installation? Very interesting topic 👍🏻
This is actually incredible. I remember trying to replicate the 2 Gamers 1 CPU build you did ages ago while in high school. And now I can try and do the same thing again but with my current build and only one GPU.
Also that wasnt hacks, its a very common spot to just spray and they got lucky. Pain.
Any idea if this would work on AMD video cards, or is it nvidia exclusive? It could be interesting to have this as an option for when friends want to play games when they're over but don't have their computers or something of that nature as well.
I have this working on my AMD 6900 XT.
You're talking about NVIDIA not allowing you to do some stuff. But what about using AMD GPU?
Nice work Colin! I've seen other "TechTubers" doing very similar videos, though that was prior to Parsec publishing a script to make it easier. Given your recent Linux Gaming Challenge, what would be super sweet is to see a video of how to run Windows VMs with GPU para-virt enabled on a Linux Desktop, so that one may use the Linux desktop for "work" and then after work is done, boot up some Windows VMs for "play"... with friends! :-)
Quite interesting topic. I would like to see if it's possible to with AMD card and/or on Linux.
It's possible to do with any kind of graphics, AND without configuring VM's at all using software called Nucleus Co-op, which sadly has no Linux release as the devs don't see enough interest atm.
@@jokroast6912 I use nucleus regularly however its a bit trickier to run 2 legit instances of steam for example.
The fps reduction for the VMs could be due to additional usage from having to send the connection through the network, something the host doesn't have to do. May also be windows prioritizing the host in the case of a bottleneck somewhere
Also the fact that they are running 4 windows installs on same hardware
I would think cpu calls through the host and vms adds extra latency.
There is probably also some overhead in the virtualisation layer. Depending on what kind of drivers are installed in the VMs this *could* have an impact. I distinctly remember that Citrix Xen Server (a virtualization solution that also utilizes para-virtualization) required additional guest drivers to be installed on the VMs. It'd work without them however certain features were disabled (like controlling the VM from the Hypervisor - i.e. shutting it down gracefully and such) and performance was worse. Though that was compute only - so no graphics cards involved.
Props to LTT for using an ultrawide in their intro animation.
I’m interested to know how this works for VR. My wife and I both have a Quest 2 and I stream my games wirelessly with Virtual Desktop but she just plays games natively on the Quest. It would be good to both be able to play PCVR together without having to buy a new PC with GPU’s you can’t get anywhere.
I was thinking the same thing. Be great to see it in action
I don't see why it wouldn't work. Running two Oculus apps
@@JackStratton Makes me want to try it even just with my 3060ti. Maybe 90fps per quest is asking for too much so 72fps would be ideal. Much better hardware would be pretty neat if you had the spec for 90fps each and well depends on the games too
@@tech360gamer problem is, it's 72hz per eye, so 144hz.
Nvidia has quietly removed some of the concurrent video encoding limitations from its consumer graphics processing units, so they can now encode up to five simultaneous streams. So now you can have 6 players (one local + 5 VM). But I think you definitely have to have 24Gb VRAM for this :)
hey can you help me with running counterstrike on my vm, its partitioned but complains the drivers could be corrupt or something
How do I get the dummy hdmi to work. So far its just black screen for me similar to how a headless display is
Here is almost certainly why the scaling isn't perfect: context switch. When you only have a single process on the CPU, the process runs at rate 100. When there are two processes, there is now a non-zero context switch overhead, so the rate becomes (100 / 2 - context switch overhead) < 50. Same story for GPU.
This, but I think it's more because the video output is still being rendered and outputted by the GPU. So the GPU is rendering/outputting every other frame (one for each VM). Then you've got the context switching between the two VMs on top reducing total framerate
Would've been great to test input latency even if it's only on local network
Parsec provides some statistics on this
Typical values for latency is: Decoding: 3-5ms, Encoding: 2-5ms Network: 1ms
So a total of around 10ms just for running parsec - surely theres some more latency and i wouldnt do it as a comp. player but most games run fine
It helps to know that Parsec has quite a heavy GPU overhead in addition to what ever the game requires. That said, I LOVE parsec and use it to game on my main PC on my old laptop. It’s a great gaming experience. I’ve also used a dummy hdmi and sent my physical output to big screen, while remotely connecting to the secondary monitor. That was my karaoke setup so big screen has video output, and my laptop is great for the control software.