It's nice to see young dudes playing around with the old homelab stuff. But that 100 W idle is a no-go for labs.
I have light bulbs over 100w
Big ouch on that
@@delresearch5416 wdym?? it's like $11USD for a cheapo 8-14w LED bulb.. incandescent bulbs are just being wasteful lol
@@delresearch5416 but is that bulb on 24/7? (Your power company hopes so!)
There is an Ubuntu Server distribution you could install instead of disabling the GUI login on Ubuntu Desktop. The Server version won't install any GUI components, so it's smaller.
I wonder if there was some limitation with the blade that didn't allow Ubuntu Server versions. He did mention there was some specific version/build of Ubuntu that worked. But I did have the same question myself.
@@krisclem8290 Specific as in it was 22.04 LTS and not another version. Anyway, based on the video I had the impression (might be wrong though) that he only has next-next-finish type experience with Linux. If it doesn't work that way, it's on to the next distro.
Hey, I have one for mining, and the AMD drivers that work for it are 6.1.10 with OpenCL 21.50.2
I doubt they had reason to optimise power consumption at idle, since you're meant to be using it for its proper purpose. Another comment mentioned 80 W at idle; that's crazy. I'm assuming this comes down to drivers/configuration.
Have you been bothered by its power consumption?
@@affieuk The CPU/GPU has no P-states; it's meant to be doing work, not idling.
@@backwardsminer8470 Hey, that's so cool. So the device shows up as an OpenCL device?
What do you mine with it, and what hash rate does it get?
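One quick way to check whether the APU is actually exposed as an OpenCL device (assuming whatever driver stack the miner above used is installed) is clinfo:

    sudo apt install -y clinfo
    clinfo | grep -E 'Platform Name|Device Name'   # lists the OpenCL platforms/devices the runtime can see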
Dude I would love to try running some deep learning on it
Well now I want one, but I don't know why.
I think i am going to buy some and build a selfhosted cloud gaming rig
It got no games... No, wait.
Yep, same... I want one too.. don't need it
Same, except I know why
@@tntflo_V2 But the iGPU doesn't work and it doesn't have a PCIe slot for a GPU 😢
You will probably need to patch the kernel to add support for the APU. It's mostly just adding the IDs so the driver loads for that device. The ps4-linux patches are pretty similar.
You just have to rebuild the kernel and include the appropriate kernel module for the iGPU in the chip.
It just needs a slightly weird, modified version of the driver, and I'm almost positive it has been included as a build option for several years; it's just not shipped as the default because it is so uncommon.
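For anyone curious, the general shape of that workflow is roughly the following; this is only a sketch, and the actual device IDs / patch for this particular APU aren't shown here:

    # sketch: rebuild a mainline kernel so amdgpu can bind to this APU's iGPU
    git clone --depth 1 https://git.kernel.org/pub/scm/linux/kernel/git/stable/linux.git
    cd linux
    # the patch itself would add the iGPU's PCI device ID to the pciidlist table in
    # drivers/gpu/drm/amd/amdgpu/amdgpu_drv.c so the existing driver loads for it
    cp /boot/config-"$(uname -r)" .config   # start from the running kernel's config
    make olddefconfig
    make -j"$(nproc)"
    sudo make modules_install && sudo make install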
More info? Or where to look to solve it?
The iGPU is hardware disabled
@@callowaysutton This is just a lie. I got it running games with hardware acceleration using Fedora's 6.2.0-fc38 (or whatever) kernel and the most recent Mesa drivers with literally one letter changed in them before compilation. The iGPU works and renders games.
It's not official, but I modded my drivers and got it working.
Very cool video Logan! I think the real value in something like this is getting to tinker with it. For $100 it's a great learning opportunity, even if it's not the most practical board for your use case.
Definitely! Thanks for watching :)
I'd get one to play around with if I had $100 to burn right now. Would probably turn it into an emulation machine once I got the GPU working. It uses like twice as much power at idle as my janky old Xeon-based NAS, so not really great for networking. But it would be great for emulation with the GPU working. I bet the driver is already out there somewhere.
I've seen a couple of videos about these, and no one had anything good to say about them aside from mining, and even that wasn't great. Kind of, sort of supported in Linux, but they use a ton of power even at idle (80 W), especially considering the specs. If what I've read is accurate, one of those blades eats as much power as my EPYC 7302 Proxmox server does under load (around 120 W), which is kind of crazy considering you can grab, say, an N100 that would probably be as performant for a decent number of tasks at like 15 W for $150-$200 (depending on where you get it and what model).
If you can get one cheap, it's not bad as an interesting bit of hardware to keep as an oddity, though.
Thanks for watching! Settled on much of the same conclusion. Unless this can really provide you some kind of application specific benefit or you're using the whole chip all the time, I figured the idle power consumption was quite poor just from the heat it kicks out, and setting it up was really janky haha. Maybe a bargain for some, not for most, still glad I got to test it for myself :)
With RISC-V or ARM you can get a server capable enough for most home uses at around 2 W full load. I had a Pine A64+, which is actually fast enough for most home server usage and can even run many services at the same time, and it would only draw 2 W max. Four cores, but only 2 GB of RAM, which was a problem, and the power supply also had problems, but that was an old Raspberry Pi kit power supply. Back then the entire server cost around €30 new.
These days you can get boards that also have NVMe and such built in, with 8 cores at much higher speeds and up to 32 GB of RAM, which still only use around 2-4 W at max power. For many such low-power servers the PCB and SBC design quality have a big effect on power usage, as does how you clock and volt them. The Pine A64+ ran like that with stock settings, though. I couldn't do much about the power supply: 2 W was kind of its minimum stable output, so once it got any load it would draw 2 W from the wall even if the board needed far less; even when the Pine only drew about half a watt over the cable, the old Raspberry Pi PSU still pulled 2 W from the wall.
Those newer 8-core chips are similar in price to the board in this video for the 16 GB version, and with current prices perhaps even similar for the 32 GB version.
So they are really worth checking out, and they have a working GPU.
They even have an NPU, and while it's only around 8 TOPS on its own, it is super efficient on small AI models when using the NPU.
If the use case for that APU is to mine crypto, then there has to be driver support for it under Linux, I'd think. Probably just not in the mainline Linux kernel; I bet the seller of these had to recompile the kernel, and I bet it shipped with a Linux version that already had that APU support in a precompiled kernel…
Thanks for watching! That is quite possible. The driver situation has been really weird: it seems like some support for compute and 2D rendering on this graphics core was added to the mainline amdgpu driver a couple of years ago and then subsequently removed. I haven't run across anyone else who has gotten this to successfully rasterize 3D graphics, so it's possible the chip has been partially disabled in firmware as well? Haven't gotten anywhere with a BIOS update, but still hopeful and looking for more info on these!
@@TwoGuyzTech maybe find out if the company that sold them shipped it with their own version of some Linux OS… just an idea.
Excellent video!
Definitely hoping to see this device return to the channel! Hopefully there’s a solution to getting the graphics to work!
@@TwoGuyzTech Craft Computing shared a new BIOS for this
The reason you don't get any serious graphics is that Ubuntu can't auto-detect this machine's GPU, so it throws you into VESA mode. As someone mentioned here in the comments, you can install graphics drivers.
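If you want to confirm what the install actually fell back to, a couple of standard checks (nothing specific to this board):

    lspci -k | grep -iA3 vga   # shows 'Kernel driver in use:' for the display controller
    sudo lshw -C display       # reports the bound driver, or an UNCLAIMED device if none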
I figured that would be the case. This thing would work a lot better as a cheap gaming machine than a server. Was there even an attempt to install Windows? I bet the LTS embedded version has support for this device.
A few things about the specs on this thing that you got wrong: only 24 of the CUs on the APU are enabled, so even though the hardware has 36, 12 of them are disabled. You can get it running games in Fedora (at least that's what I used) using modded Mesa drivers to get it recognized as a NAVI10 part; you could probably get it working as Navi 22 with a kernel patch, but that's beside the point. I'm using Fedora's 6.2.0-fc38 kernel and the most recent Mesa driver.
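For reference, the usual Mesa-from-source workflow looks roughly like this; it's only a sketch, and the specific one-character edit described above isn't reproduced here:

    git clone https://gitlab.freedesktop.org/mesa/mesa.git
    cd mesa
    # make the local edit to the driver's device handling here, then build and install
    meson setup build/ -Dgallium-drivers=radeonsi -Dvulkan-drivers=amd
    ninja -C build/
    sudo ninja -C build/ install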
Given the prices of i5-8500 micro PCs on eBay, I rate this "a neat toy" at best.
I guess this could be interesting for people that might need that GPU for AI inference, rendering, decoding or other GPU accelerated tasks
I have a somewhat unrelated question. You mentioned using ip addr to get the IP address; what's the difference between using that and ifconfig?
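Short answer: ifconfig comes from the older net-tools package, which many distros no longer install by default, while ip is part of iproute2 and also covers links, routes and namespaces. For just reading an address they're interchangeable:

    ip addr show   # iproute2, present by default on modern distros
    ifconfig -a    # net-tools, often has to be installed separately these days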
It is even cheaper now, like $42. But the idle power draw is just not good; 80-100 W at idle is crazy bad.
The "ASRock" $15,000 server? Well, they are worthless now. You can get them basically for free with chassis, PSUs, cooling, and all 12 cards. Even for home labbing, k8s, and experiments, it is not worth it, due to the crazy idle power usage.
Um, did you try Windows 11 on it for 3D acceleration? Would be amazing if it worked well.
Is it good for running a local LLM?
I'm curious about this myself. From the small amount of digging I've done into running LLMs locally, on the AMD side only a few chips are supported right now, so you'd have to make sure this one is on that list; otherwise it'll fall back to CPU instead of GPU. I'm tempted to buy one to see if this is possible.
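If anyone does try it, the usual first checks on the AMD side look roughly like this (assuming a ROCm install; whether this APU is supported at all is exactly the open question, and the override shown is just the common workaround people try on unsupported parts, with no guarantee it does anything here):

    rocminfo | grep -i gfx                        # list the GPU agents ROCm can see and their gfx targets
    HSA_OVERRIDE_GFX_VERSION=10.3.0 ollama serve  # spoof a supported gfx target for the ROCm runtime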
I don't remember the file you have to edit, but you can make the shell login page display the machine's IP at boot. I always have to Google it, but I do it on my servers so I can grab the IP from the display when needed without logging in every time, because I don't use those machines often and they don't have static IPs.
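The file is usually /etc/issue; agetty (the default console getty on most systemd distros) expands escape codes in it, including \4 for the machine's IPv4 address, so something like this works:

    echo 'IP: \4' | sudo tee -a /etc/issue   # agetty replaces \4 with the IPv4 address at the login prompt
    # it can also be pinned to one interface, e.g. \4{enp1s0}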
Check out craft computing, he did a video on this as well :) he also annoyingly found out that there were features disabled on the gpu
For the SSH part, if you don't want to reboot right away, you could enable SSH and start it with systemctl enable --now ssh
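On Ubuntu the whole thing is typically just:

    sudo apt install -y openssh-server
    sudo systemctl enable --now ssh   # enabled at boot and started immediately, no reboot needed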
What about Proxmox and passing GPU to a VM?
Windows Cluster Server would work great for a few of them.
Great video. I wish we could run ChimeraOS on this board for remote gaming. With a chip like that it should rock!
Dude, what's the best way to go for a Jellyfin server setup at the moment? I've got a lot of old parts here and I'm thinking of using them: a Ryzen 9 5900X but sadly no other AMD parts, plus an i7-6800K Skylake on a very old Z170 Pro Gaming board with 16 GB of RAM, a 1 TB Samsung 990 Pro and an Arctic Liquid Freezer (the i7 should handle 4K movies well, I'm sure?). I was thinking of going with about 6 IronWolfs in RAID 50, I don't know, maybe about 30 or 40 GB of media for the Jellyfin server. For the GPU I'd just go with anything that fits and helps with transcoding. I just want to be able to stream 1080p and 4K movies here at home, no multi-streaming, only one at a time. I'm building 2 more kind-of servers at the moment, and for this media server I'd love to mostly use old parts I already have.
Really unfortunate it's stuck at 8 GB; the only use case I can think of for it is distributed computing projects like Folding@home.
I wonder how this thing would do as an inference server. An all-in-one unit with 16 GB of GDDR6 at this price could be VERY compelling if it will run Stable Diffusion, Llama 3 8B, etc.
That's basically what this is, except it's a CPU with a GPU integrated. Just SSH into the original install and enable the drivers that came with it.
It's an inference card you can log into and set up to do tasks in clusters, for $100 or less.
Others in this price range are 4 GB single-core "nodes".
I wonder if you would have any luck using the M.2 slot as an OCuLink port with adapters. The only problem with that is you would then need to run the OS over USB.
You could set up PXE boot via TFTP as the boot device instead. Chances are, if you are using this PC-on-a-card, you already have a home lab setup, so you might as well throw a small TFTP server on a VM host ^^
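A minimal sketch of that using dnsmasq in proxy-DHCP mode, so your existing router keeps handing out addresses and dnsmasq only supplies the boot info (the subnet and paths below are placeholders, and you still have to copy the bootloader plus kernel/initrd into the TFTP root):

    sudo apt install -y dnsmasq pxelinux
    # write a minimal config: TFTP on, no DNS, proxy-DHCP for the existing subnet
    printf '%s\n' 'port=0' 'enable-tftp' 'tftp-root=/srv/tftp' \
        'dhcp-range=192.168.1.0,proxy' 'dhcp-boot=pxelinux.0' \
        | sudo tee /etc/dnsmasq.d/pxe.conf
    sudo systemctl restart dnsmasq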
We love your videos Logan!
haha thank you so much :)
Definitely the kind of thing I would have picked up for my FX-8370 box if I had never made the jump to this Ryzen 5 3600. It would eliminate a primary point of failure for a junker NAS that is so old it can't even handle containers. This is very cool.
Bought two of these for $50 each off eBay a few weeks ago, just getting around to messing with them. Kinda hoped to use one as a dedicated Ollama LLM server (what with GDDR6 unified RAM, memory bandwidth should be better than most PCs) and the other inside a funky wooden box as a music workstation (running FL Studio under Bottles/Wine) but that depends on GPU drivers being usable.
Same chip that's in my Tesla, but all 8 cores are enabled and it has 16 GB of shared memory.
Searching for this, I see lots that have 16 GB of VRAM. I think there are also some of these Oberon chips in compute clusters for render farms. As for the drivers, someone mentioned OpenCL in the comments, and that driver does work according to TechPowerUp, but don't expect to run games on it, because things like shaders may be disabled if these were a bad batch of Tesla/PS5 chips.
I think the GPU chip might be fried. I was installing Proxmox on an old computer and it got through the BIOS correctly, but after starting the install procedure the screen went off and I couldn't continue. I thought the computer was too old to work with Proxmox, but then I found a GPU card for $10, swapped it in for the computer's GPU, and it worked. I tried the old GPU in another computer and got the same black screen when loading the OS.
The GPU wasn't fried; it's just not supported by the driver, so you need to boot with a special GRUB parameter / modded drivers.
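Whatever the parameter turns out to be, adding a kernel parameter is the standard GRUB routine (the edit below shows the mechanics only; the flag itself is left as a placeholder, not a real one):

    sudoedit /etc/default/grub   # add the parameter inside GRUB_CMDLINE_LINUX_DEFAULT="..."
    sudo update-grub             # regenerate grub.cfg (Debian/Ubuntu wrapper for grub-mkconfig)
    sudo reboot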
For me it sounded interesting, but the high idle power usage kind of killed it; idle power needs to be great for most of my use cases. Even my 8-year-old 4-core webserver drew only 2 W under full load. Later, due to a faulty SSD, it drew a lot more before I pulled it out: there was one SSD that drew something like 3-5 W even under very low load, which is insanely bad for an SSD and even more so in a server that normally draws less than 2 W under full load.
The GPU not working isn't nice either, since that prevents offloading things to it; even though I don't often need that, it's still a problem that it's missing.
I think that could be a nice solution for building a cheap Blender render cluster.
How come you decided to install a full desktop environment if you are not planning to use the GUI? You could have installed Ubuntu Server in that case.
I do that too, usually because it saves the 5 minutes of Etcher writing a new image to a flash drive. Besides, Ubuntu can turn into a headless server with a single command, and back into a desktop at will.
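The "single command" here is presumably the systemd default-target switch:

    sudo systemctl set-default multi-user.target   # boot to a text console (headless)
    sudo systemctl set-default graphical.target    # switch back to the desktop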
@@JonVB-t8l Why not burn yourself a Ventoy USB and skip all the Etcher writing... just slap your ISO files on it and it handles the rest... even Windows installs (but who needs that anymore) ;)
Streaming video? Come on, this thing has access to GDDR6. Do some in-memory database benchmarks.
It would likely blow everything commercial out of the water. I'd use it as a caching proxy.
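If anyone wants a quick number for that idea, redis-benchmark (it ships alongside Redis) makes an easy baseline; note that GDDR6 trades latency for bandwidth, so treat this as an experiment rather than a guaranteed win:

    sudo apt install -y redis-server redis-tools
    redis-benchmark -t set,get -n 100000 -q   # -q prints one requests-per-second figure per test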
Nice video, but they are not available in Europe, at least not for that price. :(
@TwoGuyzTech Could you try installing a cloud gaming hosting service when you get the GPU to work?
I really want to get one to see how it works with Folding@home 😅
The worst part of this board is the power consumption; 100 W at idle is wild.
Yes, that's even worse than four Nvidia consumer GPUs, and Nvidia GPUs have terrible power draw at idle; just look at the RTX 4050 and RTX 4060 drawing almost 20 W at idle (16-18 W).
It's even worse compared to servers optimized for efficiency. I had an old server, now around 6 years old, with 4 cores and 2 GB of RAM; that was more normal back then, before computers got more cores and RAM, when 2 cores was the norm.
Also note that server cost me €20; I mistakenly said €30 somewhere else in the comments, but I found out it was around €20 new.
It was also a server optimized for low power usage and could run all the basic home server uses, generally even several at once.
But it only drew 2 W at max load.
These days the prices of such things have gone up insane amounts (when I got it, such hardware was more for early adopters, but already stable). The newer hardware on the market has boards with 8 cores at much higher speeds, AI accelerators (NPUs), other special accelerators, encoders and a GPU built in, now with up to 32 GB of RAM and NVMe by default, and they still only use around 2-4 W under full load.
The power usage of such boards depends a lot on design quality and tuning, which is why the range is fairly broad at 2-4 W max load (that's for one specific chip; other chips might even use 6 W). A well-made and well-tuned board might run great at only 2 W max, but some boards try to compensate for worse SoC decisions by overclocking the chip, and small overclocks past the sweet spot can make a drastic difference, while other boards are just designed right and get similar performance at around 2 W.
These boards are much more expensive than my old Pine A64+ was, but they still cost about as much as the board in the video does for the 16 GB versions, and the 32 GB versions might be around as expensive as what those cost now.
Why didn't you install Ubuntu Server or Fedora Server?
Why didn't you just install Ubuntu Server, which has all those settings already enabled?
Great find! I really hope you will go further with it. I understand you feel like going down the crypto route will be a hassle but I believe it will yield more fruit than just getting Jellyfin working in the way you implemented it here! Either way, pretty cool!
Still hard to beat an ASUS PRIME N100I-D D4-CSM, which is around $100 or less new, with a 15 W Intel N100 chip and plenty of performance included. But yes, thanks for an interesting project; tinkering is where the fun is at.
If the driver works, it would be a remote desktop (Linux) monster, since GPU acceleration for apps and streaming of the session would get a boost.
Install Windows on it?
I don't think Windows runs on gene-mutated bananas, by which I mean this doesn't seem to be a PC. It's some sort of something, but it's damn far from a Windows-compatible something. Linux runs on anything from washing machines to wrist watches; Windows doesn't.
$100 for an AMD RX 6700????
I'd love to see SteamOS working on this
I keep seeing these videos for NAS and home servers. My question is why? Why would a person want or need one?
Netflix costs around $23 + up to 2x $8. That makes $39 for three people, or if you have a group of six friends, $78 per month or $936 per year. I'm not from the US, but those prices don't even include tax, right?
So if you, or one of your family or friends, is a technical enthusiast who likes stuff like this, and you pay for your own server together, you might be far better off. Over 10 years of Netflix you'd be paying $9,360. Netflix is getting fewer and fewer good triple-A movies and more trash productions, you need multiple subscriptions to stream your favorite movies or shows since there are so many streaming services now, and if you cancel a subscription everything is gone, or the provider takes it off. The data just doesn't belong to you. Of course Netflix is easy, and a server like this takes a huge amount of time if you want to do things right, but I love doing this type of stuff, so I go with a self-hosted solution.
Although it is an interesting item I would not really want to buy one. There are similar computers that are essentially a console in a PC style case. They have that same 4700S with the GDDR 6 being the only system RAM. While these devices appear to have "acceptable" performance you can get a better computer and one that is upgradeable. I admit I am not a fan of soldered down RAM either. I only tolerate it on a graphics card since they generally work well for their purposes. It is just that 8 GB is not a lot anymore and that will show when the RAM is full and Windows makes use of the swap file that can make some computers slower than molasses in January!
OMG if you could leverage the GPU on these cards it would make one hell of a Plex server LOL!!
Better you than me :) But cool to see.
This might be quite good for a Minecraft server, since there's no need for a GPU.
Almost seems like an FPGA
Cool find. Rather you than me, though!
channel is two guyz tech
but the neon tetra fish is nowhere to be found!
50$
run hashcat lol
A GPU rig where you can't use the GPU. Bruh
ruclips.net/video/4dFmwjDCdLw/видео.html
Ubuntu is garbage and unusable, and it has the Snap spyware, which ships extremely old versions of software.
So much unnecessary work; you could easily have gone with Ubuntu Server.
Moist