Dual 3090Ti Build for 70B AI Models

  • Published: 15 Mar 2024
  • In this video, I take you through my exciting journey of upgrading my computer setup by adding an additional Nvidia RTX 3090Ti, with the ultimate goal of running highly demanding 70B local LLM models and other GPU-intensive applications. For those who share a passion for pushing the boundaries of AI research and computational power, you know how crucial having the right hardware can be. That's exactly why I embarked on this upgrade mission.
    After extensive research and monitoring the market for the best deals on GPUs, I stumbled upon a golden opportunity at my local Micro Center. To my surprise, they had refurbished Nvidia 3090 and 3090Ti Founders Edition cards on offer at prices that undercut even the second-hand market. This was a deal too good to pass up, especially for a high-performance enthusiast like myself looking to bolster my system's capabilities for handling some of the most compute-intensive tasks out there.
    In this detailed build log, I'll show you every step of the process, from the decision-making to the installation and eventual performance testing. We'll explore why the Nvidia 3090Ti is a game-changer for anyone interested in deep learning, AI model training, and running sophisticated algorithms that demand significant GPU resources.
    Furthermore, I'll share insights on how to spot great deals on high-end hardware, the importance of considering refurbished components, and tips for ensuring your system is ready to take on the challenges of next-generation computing. Whether you're a seasoned AI developer, a deep learning enthusiast, or simply someone fascinated by the capabilities of modern technology, this video is packed with valuable information.
    Join me as I boost my computer's performance to new heights, making it capable of running 70B local LLM models and beyond. Don't forget to like, share, and subscribe for more content on AI, technology, and high-performance computing builds. Your support helps me bring more of these in-depth guides and tutorials. Let's dive into the world of high-end computing together!
  • Entertainment

Comments • 32

  • @UpNorth937
    @UpNorth937 21 days ago

    Great video!

  • @i6od
    @i6od 18 hours ago

    I saw a Reddit post of a guy running 4 P100 16GB cards for under $1,300, getting 30 tokens a second with vLLM on 70B Llama 3, lol. I'm so happy to see other builds like dual 3090s too. So far I've managed to pick up one Titan RTX, and I'm hoping to shoot for a 3090 or another Titan RTX.

    • @OminousIndustries
      @OminousIndustries  15 hours ago

      It's been very cool to see the use cases of older cards for local LLM setups. I want to grab a Tesla P40 at some point and put it in a restomod LLM PC, if for nothing more than the cool factor of how it looks.

  • @mcdazz2011
    @mcdazz2011 2 months ago +1

    One of the best things you can do in the short term, is to clean the front air filters. I can see one at 11:48, and there's a fair amount of dust between the filter and the fan. You'll get better air intake just by cleaning them, which will help with any heat generated in that case (which is a BIG heat trap).
    Longer term, definitely look at getting a new case with better air flow.
    The way it is at the moment, that case is going to act like an oven and you'll likely find that the CPU/GPUs might thermal throttle and rob you of performance.
    Thermaltake make some pretty big cases (on wheels if that's your thing), so you might like the Core W100 or Core W200.

    • @OminousIndustries
      @OminousIndustries  2 months ago

      Excellent advice. Ironically enough, I recently purchased a Thermaltake View 71 to transfer all the components into. I am excited to do the swap.

  • @cybermazinh0
    @cybermazinh0 2 months ago

    The video is very cool; the 3090 case could be very beautiful.

    • @OminousIndustries
      @OminousIndustries  2 months ago +1

      Thanks very much! I am going to be swapping everything over into a Thermaltake View 71 case very soon.

    • @jamesvictor2182
      @jamesvictor2182 17 days ago

      Unlike the inside of that case!

  • @mixmax6027
    @mixmax6027 1 month ago +1

    How'd you increase your swap file? I have the same issues with 72B models running dual 3090s

    • @OminousIndustries
      @OminousIndustries  1 month ago +1

      These instructions should work, though I have only used them on Ubuntu 22.04: wiki.crowncloud.net/?How_to_Add_Swap_Space_on_Ubuntu_22_04#Add+Swap
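For reference, the linked wiki boils down to a few standard commands. A minimal sketch for Ubuntu 22.04, assuming an 8 GiB swap file at /swapfile (both the size and the path are assumptions; size it to the spillover you see when loading 70B weights). It defaults to a dry run that only prints the commands, since the real steps need root:

```shell
#!/bin/sh
# Sketch: add a swap file on Ubuntu 22.04. DRYRUN=1 only prints the
# commands; clear it and run as root on the target machine to apply.
SWAPFILE=/swapfile
SIZE=8G
DRYRUN=1

run() { echo "+ $*"; [ -n "$DRYRUN" ] || "$@"; }

run fallocate -l "$SIZE" "$SWAPFILE"    # reserve the file
run chmod 600 "$SWAPFILE"               # swap must not be world-readable
run mkswap "$SWAPFILE"                  # format it as swap
run swapon "$SWAPFILE"                  # enable it immediately
# Persist across reboots via /etc/fstab:
run sh -c "echo '$SWAPFILE none swap sw 0 0' >> /etc/fstab"
```

After applying for real, `swapon --show` or `free -h` should report the extra swap; it can be removed later with `swapoff /swapfile` followed by deleting the file.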

  • @mbe102
    @mbe102 2 months ago

    What is the aim for using opendalie? Is it just... for fun, or is there some monetary gain to be had through this?

    • @OminousIndustries
      @OminousIndustries  1 month ago

      Personally I just use it for fun. Some people use these uncensored image models to generate NSFW images that they then release on Patreon, etc., to make some money, but that is not in my wheelhouse.

  • @jamesvictor2182
    @jamesvictor2182 17 days ago

    I am awaiting my second 3090 ti, probably going to end up water cooling. How has it been for you with heat management?

    • @OminousIndustries
      @OminousIndustries  17 days ago

      I have not seen crazy temps while running local Llama models. I did render something in KeyShot Pro that made the cards far too hot, but for any LLM stuff it hasn't been too bad at all.

  • @atabekkasimov9702
    @atabekkasimov9702 2 months ago +1

    Do you plan to use NVLink with the new Ryzen setup?

    • @OminousIndustries
      @OminousIndustries  2 months ago +2

      It is something I would like to add once I swap over to a Threadripper. I have seen conflicting opinions on how much it helps, but I would like it for "completeness" if nothing more.

  • @M4XD4B0ZZ
    @M4XD4B0ZZ 1 month ago +1

    OK, so I am very interested in local LLMs and found that my system is way too weak for my liking. But I really have to ask: what are you doing with this technology? I have no "real" use case for it and wouldn't consider buying two new GPUs for it. What are actual beneficial use cases for it? Maybe coding?

    • @OminousIndustries
      @OminousIndustries  1 month ago +2

      I have a business that utilizes LLMs for some of my products, so it is a 50/50 split between business-related research and hobbyist tinkering. The requirements to run LLMs locally depend heavily on the type and size of model you want to run. You don't need a large-VRAM setup like this to fool around with them; I just went for this so that I could run larger models like 70B. Some of the smaller models would run fine on an older card like a 3060, which can be had without breaking the bank. Some of the model "curators" post the VRAM requirements for their models on Hugging Face, bartowski being one who lists the requirements.
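As a rough illustration of how those VRAM requirements scale with model size and quantization, a back-of-the-envelope sketch in Python. The 1.2x overhead factor for KV cache and runtime buffers is an assumption; real usage varies with context length and backend:

```python
def vram_gb(params_billion: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    """Weights-only footprint times a fudge factor for KV cache and buffers."""
    weight_gb = params_billion * bits_per_weight / 8  # 1B params at 8 bits ~ 1 GB
    return weight_gb * overhead

# A 70B model at 4-bit quantization comes to roughly 42 GB, which is why
# a dual 3090 Ti build (2 x 24 GB = 48 GB) can just hold it, while a 7B
# model at 4 bits (~4 GB) runs comfortably on a 3060.
print(round(vram_gb(70, 4), 1))   # -> 42.0
print(round(vram_gb(7, 4), 1))    # -> 4.2
```

This is only a sizing heuristic; the per-model requirements posted alongside quantized uploads are the numbers to trust.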

    • @M4XD4B0ZZ
      @M4XD4B0ZZ 1 month ago

      @@OminousIndustries Thank you for the insights, really appreciate it.

    • @OminousIndustries
      @OminousIndustries  1 month ago

      @@M4XD4B0ZZ Of course!

  • @MikeHowles
    @MikeHowles 8 days ago

    Bro, use nvtop. You're welcome.

    • @OminousIndustries
      @OminousIndustries  8 days ago

      I'm going to install that tonight for my Intel GPU build; I previously hadn't found a monitor for that GPU on Linux.

  • @codescholar7345
    @codescholar7345 1 month ago

    What CPU and motherboard? What is the temperature of the cards? Thanks!

    • @OminousIndustries
      @OminousIndustries  1 month ago

      The CPU is an i7-12700K and the mobo is an MSI PRO Z690-A. I purchased them as a Micro Center bundle about a year ago. I have not seen the card temps get over about 75°C when using text-gen-webui. I was using KeyShot Pro for something and decided to use both cards to render the project, and they got far too hot, so cooling is the first priority to upgrade.

    • @codescholar7345
      @codescholar7345 1 month ago

      @@OminousIndustries Okay, thanks. Yeah, there's not much space in that case. I have a bigger case; I'm looking to get another 3090 or 4090 and possibly water-cool them. Would be nice to get an A6000, but that's too much right now.

    • @OminousIndustries
      @OminousIndustries  1 month ago

      @@codescholar7345 I have a Thermaltake View 71 to swap them into when I get the time. The A6000 would be awesome, but yeah, that price could get you a dual-4090 setup. A water-cooling setup would be very cool and a good move for these situations.

  • @emiribrahimbegovic813
    @emiribrahimbegovic813 8 days ago

    Where did you buy your card?

    • @OminousIndustries
      @OminousIndustries  8 days ago

      I got it at Micro Center, they were selling them refurbished. Not sure if they still have any in stock. They also had 3090s.

  • @skrebneveugene5918
    @skrebneveugene5918 12 days ago

    What about Llama 3?

    • @OminousIndustries
      @OminousIndustries  11 days ago

      I tested a small version of it in one of my more recent videos!

  • @m0ntreeaL
    @m0ntreeaL 8 days ago

    BIG price... I guess 200 bucks too high.