AI Greatness? We Slammed a GPU in a QNAP!
- Published: 9 Apr 2024
- NVIDIA has published a software package that lets just about anyone run a private ChatGPT on their own data. While you can run this on a laptop, we wanted to get the models a little closer to where data resides in many small-business settings: the NAS. So we cracked open a QNAP NVMe NAS and dropped in an A4000 to do the dirty work.
There's a bit more to the story, though. Chat with RTX runs on Windows, so we used QNAP's Virtualization Station to create a Windows VM, then went about installing the packages. This video has all of the detail; the full guide on how we did this is linked below.
QNAP has subsequently added the A4000 to the compatibility list.
Full Guide on QNAP Chat with RTX Config -
www.storagereview.com/review/...
#qnap #nas #ChatWithRTX #ai - Science
Liking the new clear audio - can focus on your topics now - thanks!
We always appreciate input ;)
@@StorageReview A good bit of noise reduction software in post can work wonders.
@@leadwhite1249 People like the background lab noise typically.
@@StorageReview You don't want to remove it entirely - it adds character and authenticity to the set. But you could dial it down - I've stopped watching some videos before when the noise was too aggressive.
I've had a Tesla P4 sitting happily in my QNAP for years. Had to recompile the kernel for mdev/vfio support, but after that hassle it all now works nicely in QEMU. You need to edit the domain XML file for passthrough. Also managed to pass the GPU through to Docker, which runs an Ollama container.
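For anyone attempting the same XML edit, here is a minimal sketch of the kind of libvirt `<hostdev>` entry used for PCIe GPU passthrough to a QEMU VM. The PCI bus/slot/function values below are placeholders, not the commenter's actual setup; find your card's address with `lspci` first.

```xml
<!-- Sketch of a libvirt PCI passthrough stanza, placed inside the
     <devices> section of the domain XML. The address 0000:01:00.0
     is a placeholder; substitute the GPU's address from lspci. -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```

For the Docker route, Ollama's official image can use the GPU via something like `docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama`, assuming the NVIDIA Container Toolkit is installed on the host.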
At times you gotta get creative to make things work, but glad you're having a good experience.
Chat with RTX no longer does YouTube retrieval. :(
Why does your thumbnail editor hate you?
Jordan did this one to himself.
Chat with RTX will work all the way down to a 3050 8 GB.