AI and Python - Ollama for Local LLM AI Usage

  • Published: 11 Sep 2024
  • RSVP for Classes at - www.SiliconDoj...
    Notes and Code at - github.com/Sil...
    Support Classes at - donorbox.org/etcg
    Find All Classes at - www.elithecomp...
    LinkedIn at - / eli-etherton-a15362211
    Ollama allows you to run LLMs locally and get the power of AI without having to deal with OpenAI or other API services. You can run Ollama at the command line, or, through its Python module, interact with it from your own Python scripts.
    ollama.com
    Ollama is a framework that lets you run a huge number of different LLMs, from tiny ones up to the new 405B-parameter model from Meta. You don't need an expensive new computer. I've run the Phi3 model from Microsoft on a 2012 MacBook Pro with 4 GB of RAM. (It's slow... but it works...)
    We'll learn how to install and run Ollama on your system, and then how to connect your Python scripts to it (see the sketches after the class outline below).
    The class will go over:
    - What is Ollama
    - What are LLMs
    - Installing Ollama and Pulling Models
    - Running Ollama at the command line
    - Connecting Python Scripts to Ollama
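    As a taste of that last topic, here is a minimal sketch of connecting a Python script to Ollama through its Python module (the ollama package). It assumes Ollama is installed and running, and that a model such as phi3 has already been pulled with "ollama pull phi3"; the model name and prompt are just examples.

    # Minimal sketch: chat with a locally pulled model via the ollama Python package.
    # Assumes "pip install ollama" and that "ollama pull phi3" has already been run.
    import ollama

    response = ollama.chat(
        model="phi3",  # any model you have pulled locally
        messages=[
            {"role": "user", "content": "Explain in one sentence what an LLM is."},
        ],
    )

    print(response["message"]["content"])  # the model's reply text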
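    Ollama also serves a local HTTP API (by default on port 11434), so a script can talk to it without the module at all. A minimal sketch using the requests library, under the same assumption that the phi3 model has been pulled:

    # Minimal sketch: call Ollama's local HTTP API directly, without the ollama module.
    # Assumes the Ollama server is running on its default port and phi3 has been pulled.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "phi3",   # example model name; use any model you have pulled
            "prompt": "Write a haiku about running LLMs locally.",
            "stream": False,   # ask for the whole answer in a single JSON object
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])  # the generated text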

Comments • 14

  • @enilenis
    @enilenis 29 days ago +8

    Congrats Eli, on outliving Susan Wojcicki! You won.

  • @TransmentalMe
    @TransmentalMe 29 days ago +5

    Holy cow, I feel like I haven't seen one of your videos in years. YouTube's recommendation feed is garbage, but it's nice to see you still making content. I went from a nobody helping family with printers in KY to an engineer in SV thanks to the inspiration from your content. Awesome to see you covering AI.

  • @mohcinelayati7765
    @mohcinelayati7765 29 days ago +1

    Amazing, the quality of these classes.
    I never had this in my lifetime.
    Thank you, Eli

  • @BiblePeopleAlive
    @BiblePeopleAlive 16 days ago

    You have lots of endurance. Well done, Eli, keep it up. Unfortunately, people have a limited attention span and watch other short, intriguing clips instead.

  • @vertigoz
    @vertigoz 28 days ago

    Greetings from Portugal, very interesting presentation

  • @karthikkrishnamoorthy670
    @karthikkrishnamoorthy670 29 days ago +1

    The resemblance between Ollama's models and ChatGPT depends on the specific LLM you run through Ollama. Generally, Ollama can run models that are comparable to ChatGPT in some tasks, especially with advanced models like Meta's 405B. However, the overall performance and similarity to ChatGPT will vary based on the model's size, architecture, and training data.
    In percentage terms, if you’re using a state-of-the-art model in Ollama, it might achieve around 70-85% of ChatGPT's capabilities, but it won't fully replicate ChatGPT's performance, particularly in nuanced or complex tasks.

    • @0_1_2
      @0_1_2 23 days ago

      Okay, Thanks ChatGPT 🤨

    • @karthikkrishnamoorthy670
      @karthikkrishnamoorthy670 23 days ago

      @@0_1_2 You're welcome zero and uh one oh sorry I missed the underscore earlier 😁

  • @TRXST.ISSUES
    @TRXST.ISSUES 29 days ago

    You’re the man Eli

  • @jantechniczek1625
    @jantechniczek1625 29 days ago +1

    May the Schwartz be with yaaa ;)

  • @marcsawyer8709
    @marcsawyer8709 29 days ago

    Good to see your videos again. It has been a minute. Get a better camera, LED lighting, and a clip-on mic. Try a Canon M50. You will thank me later.

  • @Saa42808
    @Saa42808 20 days ago

    Hey man, I have known you probably since 2010 or so. You don't have ADHD, right? You look like you do, though. There is nothing wrong with that; you are successful anyway.

  • @piedpiper404
    @piedpiper404 29 days ago

    Eli....longest time...😊