there *is* some rudimentary support for MxGPU/SR-IOV in AMDGPU, but it doesn't seem to have all the working parts to enable it completely yet, so it's in a non-working state. it does, however, target *all* of their SR-IOV-capable GPUs from the S7150(x2) through the Instinct series, so at this point it's just a wait for either AMD engineers or someone else to finish gluing things together. with the lack of good documentation though, I wish anyone else luck on getting anything past the S7150(x2) working.... - β
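For anyone who wants to poke at this themselves, a minimal sketch of checking what the kernel actually exposes. The PCI address is an example; substitute your own from `lspci`.

```shell
# Sketch: probing whether a GPU exposes SR-IOV on Linux.
BDF=0000:0b:00.0   # hypothetical slot for the Instinct card

# 1) Does the device advertise the SR-IOV extended capability at all?
lspci -s "$BDF" -vvv | grep -A4 'SR-IOV'

# 2) If it does, the kernel exposes these sysfs knobs:
cat /sys/bus/pci/devices/$BDF/sriov_totalvfs   # max virtual functions the card supports
cat /sys/bus/pci/devices/$BDF/sriov_numvfs     # currently enabled VFs (0 by default)

# 3) Enabling VFs is just a write -- but it only succeeds if the host
#    driver implements the SR-IOV callbacks, which is exactly the part
#    that appears to be missing here:
echo 2 > /sys/bus/pci/devices/$BDF/sriov_numvfs
```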
I'd like to see Steam Deck streaming to other devices like tablets or handhelds. Shared-screen multiplayer games like NFL Blitz or Overcooked would be an awesome stream from device to device, rather than having to use the internet. Obviously there are PC games with couch co-op that aren't really on console, but I was just using those games as a reference for why I would enjoy an "ad hoc" experience. If something like this is possible with the Steam Deck, I'd like to see it in action.
I would use those Instinct cards as OpenCL compute monsters (if you can get the ROCm or legacy OpenCL drivers to work right; I have had much better luck with AMD's old legacy drivers than with the ROCm crap). BOINC, anyone? Oh, you're using them for VM gaming stuff... GL with that🤣
Some time ago I tried splitting my Vega 56 in two, using just one VM alongside the host for playing games; well, my GTX 1050 2GB actually had more performance than that solution. Also, I wonder if modding AMD's drivers (changing the hardware IDs in the driver INF files) to trick them into thinking you are using a Vega 56/64 GPU would help?
You can. These can be used as-is, or by flashing them with a modified retail BIOS. But the whole point of these 'Cloud Gaming' projects that Jeff builds is to partition the GPUs using paravirtualization.
What part of the country are you in? We should swap local brews. We seem to enjoy similar tastes and there are a few local beers that I think you would really appreciate.
Hey Jeff, I'm thinking of getting these MI25s. I want to build a PC to use Ansys Mechanical (just as a hobby). Is Ansys able to use this card, or should I use a Tesla?
I thought AMD was always the open-source hero etc... maybe not this time. They'll come around, if we're loud enough, that is. Kinda like the Zen 3 BIOS updates on B350/X370 boards... they'll come around... eventually... maybe.
Miyconst dabbled with some Radeon datacenter cards from the GCN architecture and managed to make them run by flashing the BIOS of another card. Those cards also seem to come with a mini HDMI port that is blocked by the PCIe bracket; maybe a Vega 64 BIOS, or a comparable workstation BIOS with video outputs, could make this thing work and even output from the HDMI. At least the HDMI worked for Miyconst and the GCN cards after the flashing.
I was actually just talking about this. Miyconst got S9300 x2s to act like a pair of R9 Nanos. The main thing is: if you can't split them, can you pass them through 1-to-1? If they can be seen in Windows and everything works, then perhaps a modded Vega BIOS flash is a possible solution.
Oh, my bad, I didn't remember the video well enough. Anyway, there is still hope for the MI25 to work for gaming, and if you guys manage to get it working, it can be a good alternative to the Tesla M40 at these low prices. BTW @Miyconst, can you share the link to the Tesla M40 slideshow from that stream you made for it? Your page is down and I can't find it anymore. I'll need it soon, as I bought a Tesla M40 (and I'm also going with your take on cooling with the Kraken G12, but in black). Thanks in advance!
@@Prophes0r Nope, he just made the whole card work like a normal graphics card that can even output from the mini DP port that AMD hides on their datacenter cards for whatever reason.
I'm wondering if you could cross-flash the MI25 to a Vega 64. I know Vega 56s and 64s can have their firmware changed. You might at least be able to pass through one card per VM.
I'm quite surprised that you couldn't get those cards running. I've had a Radeon MI8 for a few months and used it without a problem after I flashed a Radeon Nano BIOS onto it.
The point isn't to use it as a regular GPU though. Jeff is trying to use it as a split paravirtualized GPU, which requires drivers/software. And it's THOSE that don't seem to exist in the public space.
Did you give the NimeZ drivers a try on a Windows instance? Honestly though, it's almost never a good idea to purchase an AMD PRO card for gaming; the "PRO" drivers are always out of date, and they don't give you compatible installers for running the gaming drivers (which work perfectly fine and fix most of the PRO issues after modding).
Planned obsolescence in its best suit, making an appearance!!! This is making hardware with e-waste in mind, as soon as possible!!! @AMD you are being bad, very very bad....
That's exactly what enrages me: people having to buy inefficient, old, and weak hardware rather than being able to share resources from a powerful system, which probably couldn't be properly utilized by a single user anyway. Basically partial e-waste as well.
@@yasirrakhurrafat1142 Exactly. If this video had a different outcome, more people would see AMD in a better light and would probably consider new AMD cards when they could afford one... This way, without the ability to use this hardware and without any way of knowing that this card can actually play games, no one will touch any AMD card and will just default to GeForce. Calling it a Radeon and not being able to play on it is a bad move. At least NVIDIA doesn't call their datacenter cards GeForces, which mitigates to some degree the fact that those cards are not (easily) usable as gaming cards... Let us play with these, AMD... Make them only usable with AMD Ryzen integrated GPUs if you want some more money out of us, but let us use this instead of letting it end up in a landfill...
@@NiPPonD3nZ0 Damn dude, reading your comment... it makes me feel as gut-wrenched as I felt typing my previous reply. Thankfully there exists a solution to one-OS multiseat, in the form of software. And you seem to know about that Hyper-V trick, I assume. I'm hoping Intel's entrance brings some competition to this monopoly, and chaos in the form of better and more available GPU features from all companies.
Any updates on this MI25 issue? I am trying to get hold of parts for my Intel R2312WT server system. I already got two P4s for $89-ish each, as they got cheap as heck. The P40 also got cheap as heck on Taobao and eBay ($230 on eBay). Weirdly enough, after the MI25 (and a bunch of Vega 10 cards) was retired from ROCm, it now sells for $89.99. Is it now a good time to get a used MI25? Or should I stay with Tesla P4s, or get a Tesla P40? My hypervisor is probably going to be Proxmox, but I am kinda open to the whole NVIDIA GRID licensing rabbit hole via ESXi.
I think AMD may see this cutting into their bottom line. It wouldn't surprise me if a Chinese company sucks up all these cards when they are really cheap and makes a Frankenstein (or Frankenstein's monster) card. I am still planning on buying one and seeing if I can get it to run on ROCm. Was there ever a response to the shout-out (or taunt) to AMD?
@@CraftComputing Please do, it would be so cool to see how to do that. These things are so available and cheap; we need to know if there is a way to get some kind of use out of them.
@@CraftComputing I tried that and unfortunately it did not work. Additionally, the system would not complete POST, so I had to remove the card and change these BIOS settings: "PCIe/PCI/PnP Configuration" -> "Slot N PCI-E OPROM" to "Legacy", and "Boot" -> "Boot mode select" to "Both". Then I reinstalled the card, could get to the EFI shell, and using a USB stick formatted as FAT32 I used amdvbflash.efi (v2.93) to flash back to the Instinct MI25 BIOS. Before I reflashed the card back to an MI25, I had configured my BIOS to NOT load the OPROM from the card, so that it could complete the boot process and I could try it in a VM. Everything was detected correctly by the VM (Win11), but when the latest AMD drivers tried to start the card, it failed and Device Manager showed Code 43 on the device.
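For anyone repeating the procedure above, a rough sketch of the EFI-shell flash sequence. Flag syntax is from amdvbflash 2.93; adapter index and filenames are placeholders, and flashing the wrong image can brick the card.

```shell
# From the EFI shell, with amdvbflash.efi and the ROM images on a FAT32 USB stick:

amdvbflash.efi -i                   # list adapters and confirm the card's index (assume 0 here)
amdvbflash.efi -s 0 backup.rom      # save the current vBIOS before touching anything
amdvbflash.efi -p 0 mi25.rom -f     # program the MI25 image back; -f forces past subsystem-ID mismatches
```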
Here is hoping that someone working at such "company with deep pockets" does a whoopsie and "regrettably discloses drivers and documentation to an unknown third party"
As I mentioned... my DM's are open.
@@CraftComputing Shakka, my Walls are open
Craft, his wallet empty
@@dhgodzilla1 That made me smile more than it should have.
With custom drivers like nimez for AMD out there with such deep levels of access to the inner workings of AMD drivers, I'd say it's worth a shot reaching out to the team that makes them to see if they can get the instinct cards working. They've pulled off a lot of miracles before, like backporting rsr to older GPUs, and enabling video encoding on APUs where it shouldn't even be physically present on the die.
AMD's biggest weakness was never their hardware (when it comes to GPUs); it's their software support.
Sadly, Nvidia seems to be doing their best at catching up to them, if not beating them.
@@Nightykk Yeah, AMD was far worse in the 00's (or actually, ATi)
... Ironically the X800 unreleased beta drivers for Windows 98 (unreleased cause of dropped support) actually managed to be extremely stable on that OS.
That's why most Linux gamers prefer AMD GPU. Open source support is strong 💪
I do compute work, and a couple of years back I purchased an AMD workstation Pro card. I knew it wasn't good at gaming, but it was competitive for compute. The problem I ran into, a year or two after I purchased the brand-new card, was discovering that Keras had dropped support for OpenCL and gone TensorFlow-only; the PlaidML backend no longer worked. AMD also dropped support for the card in ROCm. In the end, AMD chose to no longer support the card with its drivers, and companies are no longer supporting it either. I have no idea how to use AMD cards for compute without custom-coding everything, so I begrudgingly went with NVIDIA.
I bought 2 RX Vegas on launch due to Raja Koduri's promise that they support SR-IOV. Then I found out that they don't; not even the Frontier Edition card did. Obviously, this really pissed me off.
That guy was full of shit
I was just recently eyeing MI25s on eBay, thank you for making this video! This is VERY disappointing, since I would love to get some GPU performance for my VMs. Everything available at this point seems to be a compromise of some sort, and I'm just waiting for a "whole" solution (3D acceleration in a VM & video encode/decode acceleration, without recurring licensing fees or hacks). Trying to wrap my mind around the pros and cons of what's currently available is confusing.
My hope is that Intel's Arctic Sound-M will eventually provide SR-IOV and drivers for a reasonable price and encourage competition.
Your best bet is the Radeon Pro V340
I've been waiting for either Jeff or Wendel to give these a go. On paper they seem to be VEGA cards that can do SR-IOV.
That is a bummer for sure.. When you teased this I was hopeful that the caveat you threw in was something we could overlook but it seems that was not the case. Love these videos - I haven't had a gaming VM in years now, this makes me want to build something!
Always love your journey in these projects, even though your road always seems pretty bumpy, I feel like you will still get there!
This is why you need to switch to ESXi. Xen is OK for VDI, but NVIDIA and AMD really developed for ESXi and Horizon. You should be able to get the VIBs needed to get these working directly from VMware. Also, you can snag an unlimited Horizon license as part of the $200/yr VMUG Advantage program to do all of this legally. You would be surprised how well Blast Extreme and even PCoIP work for gaming; you just gotta enable relative mouse mode.
Done this quite a few times with GRID and AMD products, both professionally for CAD VDI workstations and personally for gaming VDIs, so I know it works!
This kind of defeats the purpose. I run a Citrix farm for a couple hundred engineers running Revit, AutoCAD, Bentley, etc. (with, oddly, some Horizon clients as well, because... west coast, don't ask), and a) ESXi is... I mean, it's just another class, but with that class comes compatibility concerns and feature limitations with the free version, or pricey licenses with the paid version; b) the drivers don't come from VMware, they come from NVIDIA, and THAT is where things get sucky. If we are talking "designed for VMware" then we are really talking GRID (a regular passthrough GPU is a whole different beast, but that isn't where NVIDIA has put their money behind VMware), and GRID means licensing. OK, older cards like K2s (which I don't consider viable for gaming in 2022) had different licensing, and at this point you can get the bundle from NVIDIA. So you are back to HEFTY license fees for EXPENSIVE products to use anything current.
tl;dr: "designed for VMware" really only applies to GRID, and GRID is pricey; its drivers are locked behind a paywall and require a network license server even if you can get your hands on them some other way. Standalone doesn't require software licenses, but is really not meant for VMware.
@@PaulFulbright I guess my answer is twofold. One: I recommended ESXi because these enterprise-class GPUs are designed to run with enterprise-class hypervisors, and nobody is running Proxmox... and along with that, the VIBs that are required as the "hypervisor" host driver don't exist for hypervisors other than ESXi or Citrix. Yes, I know the free version of ESXi doesn't include vCenter, which is required to modify and assign vGPU profiles. However, to make a YouTube video, it's not hard to google and find a license key that's full-featured and doesn't expire; or, if he wants to make it more legit, I'm sure he could justify $200 annually for a much more robust hypervisor, especially with how much enterprise gear he tries to cobble together on Proxmox that would generally just work on ESXi.
As for GRID, I agree. There's a reason the latest I've messed around with on the NVIDIA side for vGPU is the K1/K2 and the odd GRID K340 & K520 (which are clocked higher), along with an M40 and M4, due to the obvious licensing costs; there isn't an easy way around them, other than just passing each individual GPU through, which doesn't work for using templates to automatically build VMs, as it requires manual assignment. However, Jeff is attempting to get AMD GPUs working, which AMD advertises as having no license fees. On AMD's driver support page, I did find the VIB for ESXi located under the blanket "Instinct MI Series" selection, not under the MI25 itself, even though it is the correct driver and is compatible with 7.0 U3. VMware also has the Windows drivers available on their support site, under the Horizon vGPU compatibility matrix. AMD does have Linux drivers for the MI25 under the MI25 heading itself, which would make a full setup for Linux gaming; or go over to VMware's site and get the Windows drivers too. Also, technically, AMD's generic driver downloader is supposed to work to grab the latest vGPU profile drivers.
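If that "Instinct MI Series" VIB really is the right host driver, getting it onto an ESXi host is the standard esxcli flow. A hedged sketch; the bundle filename and datastore path below are placeholders, not AMD's actual naming.

```shell
# Put the host in maintenance mode, install the AMD host-driver bundle, reboot.
esxcli system maintenanceMode set --enable true
esxcli software vib install -d /vmfs/volumes/datastore1/amdgpu-host-driver-bundle.zip
# (use -v with a .vib path instead of -d if AMD ships a bare VIB rather than an offline bundle)
reboot

# After the reboot, confirm the VIB is present and the host claimed the card:
esxcli software vib list | grep -i amd
```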
Nice new look Jeff, it suits you!
I have that same "save icon" coaster! 🤪
I tried passing through my WX 7100 in Hyper-V; it'd blue-screen the host every time. I also tried GPU-P; it wouldn't work. Eventually I got two Quadro RTX 4000s, and everything has been going smoothly under both GPU-P and DDA. I'll never consider an AMD graphics card for server use again. I'll also say I'm really glad I have CUDA; self-hosting DALL-E Mini is amazing.
It's weird to say never, versus just picking whichever is the better choice at the time.
Does it even work for Linux gaming with the open-source drivers? Virtualization aside, just single-user gaming?
If not, then it is such a shame!
He didn't really address that but based on what Jeff did say, it should work fine for that. The Mesa driver is leagues ahead of the closed source nVidia driver on Linux in terms of features and stability and if I heard correctly, para-virtualization is what disabled the hardware encoder.
@@mordacain3293 Because if it works fine, then it can be a great deal as a Linux gaming GPU, even if virtualization doesn't work.
@@jacekjagosz I hear ya, but a quick look at MI25 prices on ebay would lead me to think the RX6600 would be a better deal in most countries. The 6600 performs better on average than a Vega 64 / MI25 at less than half the power consumption. Navi cards work perfectly on every distro I've tried (currently typing this from my Fedora gaming rig with a 6900XT).
If that works great, then Games on Whales may be usable.
There is a single mini DP port on this guy (or the ones I've seen, at least) that is non-working by default, but reportedly it can be used by flashing both BIOS chips with those from a WX 9100. With the eBay price on these down to $90, I've considered buying one and trying this out for a traditional gaming PC. The lack of a cooler is cause for pause, but in my case I already have a Raijintek GPU cooler for Vega; for others, the on-package HBM memory may make it simple enough to strap an off-the-shelf CPU cooler to it and use some thermal adhesive to get heatsinks on the VRMs. I haven't tried it myself, so I don't know how well it would actually work, but it seems doable.
There are 3d printed parts for the card up on eBay, to adapt various sizes of fans
I'd love to get my hands on one of these to experiment with, I am fairly sure I could get it working with some trickery and grab the output using Looking Glass.
Well, I know this comment is from some time ago, but they're down to $90 on eBay right now, and some have reported success flashing both BIOSes with an external tool so the card operates the same as a Radeon Pro WX 9100, but with only one mini DisplayPort on board. With the RAM being on-package, making it possible to rig a CPU cooler for both, this thing begs for *modification*.
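The "external tool" route, sketched with flashrom and a cheap CH341A SPI programmer clip — an assumption on my part, since the reports don't say which programmer was actually used. Both BIOS chips on the card would need this treatment, one at a time, and the WX 9100 image filename is a placeholder.

```shell
# With the CH341A clipped onto one of the card's SPI flash chips (card powered off):

flashrom -p ch341a_spi -r chip_backup.bin   # read the existing image first
flashrom -p ch341a_spi -v chip_backup.bin   # verify the read was stable before writing anything
flashrom -p ch341a_spi -w wx9100.rom        # write the WX 9100 image
```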
I heard these can be BIOS flashed to a WX9100 and the onboard mDP will work just fine.
Really? What I've heard is that the original BIOS's mini DP works with Linux.
@@lewisliu7002 yup works great
The WX vBIOS has a TDP of 230W, so you are leaving 70W of performance on the table. Some people have gotten the FE vBIOS working, but others haven't gotten video out with it, so I haven't tested it.
I can't say I didn't see this coming. Not in the details, but simply AMD + Cloud gaming.
I have had some luck with low-end WX 2100s @720p in emulation gaming... but the Quadros and 1650s perform MUCH better on the encode end.
Oh, and here I thought there was going to be an obscure work around that was going to get Moonlight to just work. Well maybe like an Infinite Improbability Drive, you just have to let it use AI to figure out how to do it on its own as a side project.
Glad to see more tech YouTubers calling out the makers of tech for the BS they do. Some of it is tilting at windmills, but if you knock the windmills down, people listen and pay attention.
Just in case you were wondering: these older datacenter cards aren't as fast at running Stable Diffusion as current, moderately priced 16GB consumer cards like the RTX 4070 and RX 7900 GRE, either.
I would love to hear how the MI25s are for being passed through completely to the VM (making that a 3 gamer system).
Do it.
Maybe the upcoming Intel A770 could be a potential killer here, as it was rumored to have SR-IOV support, but I haven't heard anything about the software side.
Sorry you couldn't get the cards to work. Man that was really a word salad of acronyms in your description of what went wrong.
On a side note, nice haircut. I was beginning to think you were hiding that you are going bald with wearing hats in your recent videos. I guess maybe you were just waiting to get a good haircut?
I have a 6900XT and a 6800XT virtualized in Proxmox. Runs like a boss. Just need to pass through the BIOS (dump your own via GPU-Z) and then everything is workydorky without a noticeable performance loss. I am on a 3900X on a Strix X570-E. I then have 2x Win11 with Parsec on the machines and... BOOOOM.
However, that's not paravirtualization...
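For anyone wanting to replicate a setup like the one above, a minimal Proxmox VM config sketch might look like this (the VM ID, PCI address, and ROM file name are all examples, not taken from the comment):

```
# /etc/pve/qemu-server/101.conf (excerpt) -- example IDs and paths
bios: ovmf
machine: q35
cpu: host
# Pass the whole GPU to the guest. romfile points at the vBIOS dumped
# with GPU-Z (placed in /usr/share/kvm/) so the guest driver can
# initialize the card cleanly after a host reset.
hostpci0: 0000:0b:00,pcie=1,x-vga=1,romfile=vega-dump.rom
```

This is plain 1:1 passthrough, which is exactly the "not paravirtualization" caveat in the reply.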
I was actually about to pull the trigger on one of these because of your last video..
Glad I saw this. If there's an update or you get this working, PLEASE let me know. I'm trying to find a decent, license-free solution for a cloud gaming setup. Currently using an R9 290X and RX 470 for my cloud gaming server.
Would you be willing to try passing through the GPU directly?
I'm interested in everything you said, your last question above everything.
Nothing could go wrong... Famous last words!
I think the end game for GPU virtualization should be virtual GPU drivers which allow the host to own the GPU and accelerate the virtual devices, so the machines can share GPU resources the same way we do with CPUs.
Linux containers, such as Docker or Proxmox LXC containers, do exactly that. Doesn't help if you want a Windows VM, unfortunately.
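As a concrete sketch of that container route (image name and command are just examples; this assumes the host already has the amdgpu/ROCm stack installed):

```shell
# Containers share the host's GPU by mounting its device nodes; the
# host driver owns the card, no SR-IOV or paravirtualization needed.
docker run -it --rm \
  --device=/dev/kfd \
  --device=/dev/dri \
  --group-add video \
  rocm/rocm-terminal rocminfo
```

The same `/dev/kfd` and `/dev/dri` mapping is what Proxmox LXC GPU sharing boils down to as well.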
This is the same type of card that Google built Stadia on, so there’s a way to get it to work. You probably need to build some kind of middleware driver to make it work.
A few weeks ago I heard Google was doing some sort of upgrade to Stadia with Nvidia GPUs?
Looking sharp!
There are drivers on AMD's website, but for Linux only. These cards seem to be supported by the AMDGPU driver but don't have a proprietary driver anymore (not that you'd want it).
It is also possible this would work on Windows with some INI modding. You'd have to try it out.
For documentation you'd probably have to look at older ROCm documentation, which has been purged every few releases to match currently supported hardware. (This would mostly tell you how to install drivers and use OpenCL and related software.)
Is there some way you could render on one of these and pass the hardware encode off to the likes of a low-profile single-slot Nvidia T400 elsewhere in the machine? Is there a slot one could fit other than the CPU bays? They have the full Ampere NVENC and can handle several 4K streams at once (a lot of people have been using them for Plex servers).
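If that render/encode split worked, the streaming side might look roughly like this ffmpeg sketch (the capture source, bitrate, and target address are assumptions; h264_nvenc is the encoder the T400's NVENC exposes):

```shell
# Grab frames rendered on the primary GPU and hand the encode to the
# NVIDIA card's NVENC, streaming the result over UDP.
# All values here are example settings.
ffmpeg -f x11grab -framerate 60 -video_size 1920x1080 -i :0.0 \
  -c:v h264_nvenc -preset p4 -b:v 20M \
  -f mpegts udp://192.168.1.50:5000
```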
AMD really dropped the ball here. Nvidia may have loosened up their restrictions a little but it isn't perfect. Your move Intel.
DEFINITELY have eyes on Intel right now.
Intel isn't cheap in the server market either.
@@s.i.m.c.a I'm guessing you are unaware of the new Arc GPUs.
Surprised he hasn't done a revisit flashing these with FE or WX vBIOS.
Do a review of the AMD MI60 next; it has 32GB. Do Stable Diffusion performance tests too, with ONNX.
@LTT in ads lol
He paid a premium. Couldn't say no.
LTT is officially aware and OK with it
there *is* some rudimentary support for MxGPU/SR-IOV in AMDGPU, but it doesn't seem to have all the working parts to enable it completely yet so it's in a nonworking state.
it does, however, target *all* of their SR-IOV-capable GPUs from the S7150(x2) into the Instinct series though, so at this point it's just a wait for either AMD engineers or someone else to finish gluing things together.
with a lack of good documentation though, I wish anyone else luck on getting anything past the S7150(x2) working....
- β
I'm sure you can find some folks to tweak some drivers and make it (partially at first) run.
I'd like to see Steam Deck streaming to other devices like tablets or handhelds. Shared-screen multiplayer games like NFL Blitz or Overcooked would be awesome to stream from device to device rather than having to use the internet. Obviously there are PC games with couch co-op that aren't really on console, but I was just using those games as a reference for why I would enjoy an "ad hoc" experience. If something like this is possible with the Steam Deck, I'd like to see it in action.
You can flash them with a USB BIOS chip flasher to the WX9100 bios and you should have better luck with the cards in that state.
It also allows you to use the micro display port on the back for video out.
I would use those Instinct cards as OpenCL compute monsters (if you can get the ROCm or legacy OpenCL drivers to work right; I've had much better luck with AMD's old legacy drivers than the ROCm crap). BOINC, anyone? Oh, you're using them for VM gaming stuff. GL with that🤣
Some time ago I tried splitting my Vega 56 into two, using just one VM plus the host for playing games; well, my GTX 1050 2GB actually had more performance than that solution.
Also, I wonder if modding the drivers from AMD (changing the hardware ID in the driver INI files) to trick them into thinking you are using a Vega 56/64 GPU would help?
Don't some of these have a mini DP socket? Can't you run your monitor (and thus games) directly from there?
Looks like you've lost some weight. You're looking healthy, man, keep it up!
But it does have video output. Look at the back: it has a mini DisplayPort inside the bracket.
I was about to ask "Can you use this in a normal PC and pass it to a VM?" but I cropped it out because, well, no drivers. What a shame.
You can. These can be used as-is, or by flashing them with a modified retail BIOS.
But the whole point of these 'Cloud Gaming' projects that Jeff builds is to partition the GPUs using paravirtualization.
For a minute there I thought it was Captain Foley from Trekyards from the thumbnail 🤣
Does it work if forwarded to a VM without partitioning? Like, I could use even one full card for remote gaming.
You could, but if you did you’d be better off just buying cheaper consumer GPUs instead.
@@rodti_ what card is better for $200?
What part of the country are you in? We should swap local brews. We seem to enjoy similar tastes and there are a few local beers that I think you would really appreciate.
Hey Jeff, thinking of getting these MI25s. I want to build a PC to use Ansys Mechanical (just as a hobby). Can Ansys use this card, or should I use a Tesla?
Just flash the WX9100 vBIOS and apply the Vega FE PowerPlay tables and you have a cheap Vega FE with no fan and one mini DisplayPort out.
Best ad segment EVAR!
Lmao, the ltt store scene killed me.
I thought AMD was always the open source hero etc... maybe not this time... They'll come around, if we're loud enough that is. Kinda like the Zen 3 BIOS updates on B350/X370 boards... they'll come around... eventually... maybe.
Would it be useful at all to reflash the BIOS of one of these to show up as a "vanilla" Vega 64 and have the software treat it as such?
Haircut FTW!
Does this MI25 not have a hidden mDP video out?
Does the mDP video out work with the default BIOS?
Miyconst dabbled with some Radeon datacenter cards from the GCN architecture and managed to make them run by flashing the BIOS of another card.
These cards also seem to come with a mini HDMI port that is blocked by the PCIe bracket. Maybe a Vega 64 BIOS, or a comparable workstation BIOS with video outputs, can make this thing work and even output over HDMI; at least HDMI worked for Miyconst and the GCN cards after the flash.
I was actually just talking about this. Miyconst got S9300 X2s to act like a pair of R9 Nanos. The main thing is, if you can't split them, can you pass them through 1 to 1? If they can be seen in Windows and everything works, then perhaps a modded Vega BIOS flash is a possible solution.
Yeah, I managed to make Instinct MI8 and some other GPUs work with consumer BIOS, but the output was DisplayPort, not HDMI.
Oh, my bad, I didn't remember the video well enough. Anyway, there is still hope for the MI25 to work for gaming, and if you guys manage to get it working, it can be a good alternative to the Tesla M40 at these low prices.
BTW @Miyconst, can you share the link to the Tesla M40 slideshow from that stream you made? Your page is down and I can't find it anymore. I'll need it soon, as I bought a Tesla M40 (and I'm also going with your take on cooling, with the Kraken G12 but in black). Thanks in advance!
@@Miyconst Correct me if I'm wrong, but you were using the whole card right?
You weren't trying to partition them and pass through just a piece?
@@Prophes0r nope, he just made the whole card work like a normal graphics card that can even output from the mini DP port that AMD hides on their datacenter cards for whatever reason.
I'm wondering if you could cross-flash the MI25 to a Vega 64. I know Vega 56s and 64s can have their firmware changed. You might be able to at least pass through one card per VM.
Marc Wolfe flashed them first to WX9100 and then to Vega Frontier Edition, and uploaded streams of various first-person shooters on his YouTube channel.
Enterprise people - please share your documentation!
I'm quite surprised that you couldn't get those cards running. I've had a Radeon MI8 for a few months and used it without a problem after I flashed a Radeon Nano BIOS onto it.
The point isn't to use it as a regular GPU though.
Jeff is trying to use it as a split paravirtualized GPU, which requires drivers/software.
And it's THOSE that don't seem to exist in the public space.
@@Prophes0r I remember that one of the eBay sellers was also offering the drivers when I was looking at MI25s.
With those specs it sounds like a beastly GPU.
Can you try gaming on a Tesla P100? There are very few reviews of this card and I don't know why.
Bro nice haircut
Since going full Linux I'm done with nVidia
I'm curious how these cards will perform in Topaz Video Enhance and Topaz Gigapixel.
WAIT... encoding is disabled when you use MxGPU? ...What about 1-to-1 passthrough?
Could you flash it to a WX9100? Also, some of these have a single mini DisplayPort output. Does yours?
Could I use this as a GPU for a budget build? No cloud gaming, just as a graphics card.
Works fine with my 6900XT, using Hyper-V
(angry linux noises)
@@marcogenovesi8570 Sunshine works well for hosting a remote connection to Linux as far as I know...
I haven't personally tried splitting up AMD GPUs in Linux. Only 1 to 1 pass-through.
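For reference, 1-to-1 passthrough on Linux usually just means binding the card to vfio-pci before amdgpu claims it. A config sketch (the 1002:6860 ID is what Vega 10 MI25 boards commonly report, but verify it against your own lspci output):

```
# /etc/modprobe.d/vfio.conf -- reserve the GPU for VFIO passthrough.
# Check the exact vendor:device ID first:  lspci -nn | grep -i vega
options vfio-pci ids=1002:6860
softdep amdgpu pre: vfio-pci
# Then rebuild the initramfs (e.g. update-initramfs -u) and reboot.
```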
Did you give the NimeZ drivers a try on a Windows instance? Honestly though, it's almost never a good idea to purchase an AMD PRO card for gaming; the "PRO" drivers are always out of date, and they don't give you compatible installers for running the gaming drivers (which work perfectly fine and fix most of the PRO issues after modding).
Really like your haircut and t-shirt!
Gotta wait for a driver leak on a forum, I guess.
1:03 *16GB
For 64GB you need MI200 ;)
You should look into flashing these with a Vega 64 BIOS.
Due to the memory size, a WX9100 BIOS would be better; it's pretty much the same card.
Planned obsolescence in its best suit, making an appearance!!! This is making hardware with e-waste in mind as soon as possible!!! @AMD you are being bad, very, very bad...
That's exactly what enrages me: people having to buy inefficient, old, and weak hardware rather than being able to share resources from a powerful system, which a single user probably couldn't properly utilize anyway. Basically partial e-waste as well.
@@yasirrakhurrafat1142 Exactly. If this video had a different outcome, more people would see AMD in a better light and would probably consider new AMD cards when they could afford one. This way, without the ability to use this hardware and without any way to know this card can actually game, no one will touch any AMD card and will just default to GeForce. Calling it a Radeon and not being able to play on it is a bad move. At least Nvidia doesn't call their datacenter cards GeForces, which mitigates to some degree the fact that those cards are not (easily) usable as gaming cards... Let us play with these, AMD... Make them only usable alongside AMD Ryzen integrated GPUs if you want to get some more money out of us, but let us use this and don't make it end up in a landfill...
@@NiPPonD3nZ0 Damn dude, reading your comment's receptiveness makes me feel gut-wrenched, exactly as I felt typing my previous reply.
Thankfully there exists a solution to one-OS multiseat, in the form of software.
And you seem to know about that Hyper-V trick, I assume.
I'm hoping Intel's entrance brings some competition to this monopoly, and chaos in the form of better and more available features from all companies' GPUs.
The Lenovo support page has drivers for the AMD Radeon Instinct MI25 for Linux, updated in 2021. Might that driver work for this purpose?
You can convert the MI8 to a Fury, so this is a bummer.
that also doesn't have encode/decode capabilities...
AMD has the Radeon Pro V series (V340 is the MI25) which supports MXGPU… And decided to shoot themselves in the foot by not selling them directly.
Hi there, have you tried flashing to the WX9100 vBIOS and seeing if the mDP port on the MI25 works?
Nice haircut
Any updates on this MI25 issue? I am trying to get hold of parts for my Intel R2312WT server system. I already got two P4s for $89-ish each, as they got cheap as heck. The P40 also got cheap as heck on Taobao and eBay ($230 on eBay). Weirdly enough, after the MI25 (and a bunch of Vega 10s) got retired from ROCm, it now sells for $89.99.
Is it now a good time to get a used MI25? Or should I stay with Tesla P4s, or get a Tesla P40?
My hypervisor is probably going to be Proxmox, but I'm kinda open to the whole Nvidia GRID licensing rabbit hole via ESXi.
You can flash the WX 7100 vBIOS to them; I'm using one for Stable Diffusion model training. No idea how virtualization fares.
C'mon AMD. Put your development where your PR is.
When Craft Computing becomes Crap Computing. Keep dunking on AMD and Nvidia for shady BS practices. They deserve it.
Unfortunately Nvidia is just too good and even has situations like this locked down... Vegeta voice: Damn it, he's powerful.
With amdvbflash you can flash the WX9100 BIOS to these, just FYI.
Did you try AMD's OpenCL driver?
My question is: will it work well without splitting the GPU? Just as normal passthrough?
And your overlord Linus commands you to buy the jacket, Jeff.
Whoa! Solid haircut!
I think AMD may see this as cutting into their bottom line. It wouldn't surprise me if a Chinese company sucks up all these cards when they are really cheap and makes a Frankenstein (or Frankenstein's monster) card. I am still planning on buying one and seeing if I can get it to run on ROCm. Was there ever a response to the shout-out (or taunt) to AMD?
Can one GPU feed the other, so that one does encoding while the other renders the game frames?
If it's for VMs, why not run a virtual machine on it and see how it games?
What, in your opinion, is the best card for CAD and distributed gaming?
curious about new drivers for support... :)
I've heard of people flashing vega56 BIOS onto these and using them. They work fine for mining at least as-is.
This is when you call Wendell to come on in and see what's up. Or get Tech Jesus to go to AMD and ask WTF; then things will get sorted.
Seems quite a bit easier to get Linux drivers for these.
Could I request a video on ARM servers? ;)
I wonder if it's got hardware support in ESXi.
Thanks for screwing up so we don't have to. You at least got a video out of it.
Can you flash them with a Vega 64 BIOS? So you can at least use them in some way?
Probably. And at this point, I might as well, since I can't use them for anything else :-/
@@CraftComputing Please do, it would be so cool to see how to do that. If these things are this cheap and available, we need to know if there's a way to get some kind of use out of them.
@@CraftComputing I tried that and unfortunately it did not work. Additionally, the system would not complete POST, so I had to remove the card and change these BIOS settings:
- "PCIe/PCI/PnP Configuration" -> "Slot N PCI-E OPROM" to "Legacy"
- "Boot" -> "Boot mode select" to "Both"
Then I reinstalled the card, got to the EFI shell, and using a USB stick formatted as FAT32 I could use amdvbflash.efi (v2.93) to flash back to the Instinct MI25 BIOS.
Before I reflashed the card back to an MI25, I had configured my BIOS to NOT load the OPROM from the card, so that it could complete the boot process and I could try it in a VM. Everything was detected correctly by the VM (Win11), but when the latest AMD drivers tried to start the card, it failed and Device Manager showed Code 43 on the device.