Run your own ChatGPT Alternative with Chat with RTX & GPT4All

  • Published: 19 Jun 2024
  • Find a Lenovo Legion Laptop here: lon.tv/ro8xj (compensated affiliate link) - You can now run ChatGPT-alternative chatbots locally on your PC and Mac! In this video I look at two "turnkey" solutions: Nvidia's Chat with RTX and GPT4All. See more AI - • ChatGPT and AI - and subscribe! lon.tv/s
    Links:
    Chat with RTX: www.nvidia.com/en-us/ai-on-rt...
    GPT4All: gpt4all.io/index.html
    VIDEO INDEX:
    00:00 - Intro
    01:16 - Chat with RTX
    05:47 - GPT4All Setup
    08:02 - GPT4All Example
    11:38 - GPT4All Model Change
    14:18 - GPT4All On a Mac
    16:23 - ChatGPT Comparison
    17:16 - Conclusion
    Visit my Blog! blog.lon.tv
    Subscribe to my email lists!
    Weekly Breakdown of Posted Videos: lon.tv/email
    Daily Email From My Blog Posts! lon.tv/digest
    See my second channel for supplementary content: lon.tv/extras
    Follow me on Amazon too! lon.tv/amazonshop
    Join the Facebook group to connect with me and other viewers!
    lon.tv/facebookgroup
    Visit the Lon.TV store to purchase some of my previously reviewed items! lon.tv/store
    Read more about my transparency and disclaimers: lon.tv/disclosures
    Want to chat with other fans of the channel? Visit our Facebook Group! lon.tv/facebookgroup, our Discord: lon.tv/discord and our Telegram channel at lon.tv/telegram!
    Want to help the channel? Start a Member subscription or give a one time tip!
    lon.tv/support
    or contribute via Venmo!
    lon@lon.tv
    Follow me on Facebook!
    / lonreviewstech
    Follow me on Twitter!
    / lonseidman
    Catch my longer interviews and wrap-ups in audio form on my podcast!
    lon.tv/itunes
    lon.tv/stitcher
    or the feed at lon.tv/podcast/feed.xml
    We are a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for us to earn fees by linking to Amazon.com and affiliated sites.
  • Science

Comments • 47

  • @Razor2048
    @Razor2048 3 months ago +7

    I wish the makers of these GPT models, as well as Stable Diffusion, would at least offer the option to use shared system memory. On a PC, a video card can by default allocate, in addition to its dedicated VRAM, up to 50% of system memory for use by the GPU. For VRAM-throughput-intensive tasks this carries a large performance penalty, but it would still be good to have, as many people would rather a task run slowly than not at all. It would especially help with things like Stable Diffusion, where generating higher-res images can easily use 24-32 GB of VRAM, which most consumer cards lack, but nothing stops them from allowing a slower shared-memory pass to redo a generation at a higher resolution.

    • @dwalinfundinson
      @dwalinfundinson 3 months ago

      They can. You're looking for LM Studio.

  • @vsdfsdfsdfwew
    @vsdfsdfsdfwew 3 months ago

    How did you start it? After installing, no icon or link appears.

  • @cr-pol
    @cr-pol 2 months ago

    A follow-up video about any updates to these two GPT tools would be interesting, to see if they:
    1: got better at understanding
    2: stopped making things up

  • @normcfu
    @normcfu 3 months ago

    I don't know a lot about AI models, and just watching your video I learned some stuff. Question: do these models take advantage of multiple CPUs in your PC, or are they single-threaded outside the GPU?

    • @zivzulander
      @zivzulander 3 months ago +2

      They are multithreaded. CPU utilization will be very high across cores when running on the CPU instead of the GPU.

    • @biglevian
      @biglevian 3 months ago

      In GPT4All you can even tell it how many threads it can use.
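
      For reference, a minimal sketch of pinning the thread count through the GPT4All Python bindings (pip install gpt4all); the model file name is just an example, and the desktop app has a comparable CPU Threads option in its settings:

      # Assumes the gpt4all Python package; the example model file is downloaded on
      # first use if it isn't already in the default model directory.
      from gpt4all import GPT4All

      model = GPT4All(
          "Nous-Hermes-2-Mistral-7B-DPO.Q4_0.gguf",  # example model file name
          device="cpu",   # run inference on the CPU rather than the GPU
          n_threads=8,    # cap how many CPU threads inference may use
      )

      with model.chat_session():
          print(model.generate("In one sentence, what is a local LLM?", max_tokens=80))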

  • @JohnPMiller
    @JohnPMiller 3 months ago +3

    Can a user provide a random seed to get repeatable results?
    Also, I'm disappointed that Nvidia didn't think it was worth their effort to support my RTX 2070.

  • @enermaxstephens1051
    @enermaxstephens1051 2 months ago

    But when was that Hermes model last updated? If you ask it, it says 2019. Probably false, but I'd still be curious how up to date all this is. It might be better to use a much newer model downloaded from Hugging Face.

  • @adriangpuiu
    @adriangpuiu 2 months ago

    I downloaded and installed Chat with RTX, but the YouTube URL option is missing from the drop-down. :) If anyone has the older zip file and could share it online, that would be great and highly appreciated.

  • @alexmodern6667
    @alexmodern6667 3 months ago

    Thank you. Very impressed with the common-sense, well-illustrated ideas.

  • @drumbyte
    @drumbyte 3 months ago

    To load Llama 13B on a computer with less video memory: go to the installation folder -> ChatWithRTX_Offline_2_15_mistral_Llama\RAG -> edit the llama13b.nvi file and change the value on line 26 to 7 (see the sketch after this thread for what that edit might look like), then run the installation and it will install Llama 13B. :)

    • @richardtschuggi2834
      @richardtschuggi2834 1 month ago

      Perfect! Thank you a lot. With 11 GB of VRAM it would be a pity to miss out on the bigger model.
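
    A minimal sketch of the kind of llama13b.nvi edit described above, assuming the VRAM requirement is stored in a MinSupportedVRAMSize-style attribute (that name is an assumption, not something confirmed in the comment). Back up the file before changing it:

    # Hypothetical illustration: lower the VRAM threshold in llama13b.nvi so the
    # Chat with RTX installer offers the Llama 13B model on cards with less VRAM.
    # The attribute name "MinSupportedVRAMSize" is assumed; inspect your own file first.
    from pathlib import Path
    import re

    nvi_path = Path(r"ChatWithRTX_Offline_2_15_mistral_Llama\RAG\llama13b.nvi")
    text = nvi_path.read_text(encoding="utf-8")

    # Swap the existing threshold (e.g. 15) for 7 so cards with ~8-11 GB qualify.
    patched = re.sub(
        r'(name="MinSupportedVRAMSize"\s+value=")\d+(")',
        r"\g<1>7\g<2>",
        text,
    )

    nvi_path.write_text(patched, encoding="utf-8")
    print("Patched", nvi_path)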

  • @nrnoble
    @nrnoble 3 months ago +1

    It is already interesting to see how AI tech is impacting YT. Many YT channels now use AI to write scripts combined with cloned-voice narration. It does not appeal to me because the videos look and sound manufactured, as if they were created on an assembly line. It is like the content creators feed the AI the parameters for the video, and the AI assembles it ready to be posted on YT. I am sure it is not that automated yet.

    • @RobertDunn310
      @RobertDunn310 3 months ago +1

      This is actually a great opportunity for human creators to stand out even more. I don't see anything wrong with using AI to help you draft a script, but I do hate hearing that robotic text-to-speech voice.

    • @LonSeidman
      @LonSeidman  3 months ago +1

      I agree completely - it’s already very obvious what’s AI generated vs created by a real thinking human.

    • @nrnoble
      @nrnoble 3 months ago

      @@LonSeidman I hope YT implements a way to filter out AI-generated content.

    • @nrnoble
      @nrnoble 3 months ago

      @@RobertDunn310 It's not the AI scripts that bother me, it's the cloned voices that sound fake. The combination lets them flood YT with frequent videos that I currently can't filter out when searching for a topic. It's a matter of personal preference. Overall, the use of AI is not a bad thing. I use it daily for getting info.

  • @CF542
    @CF542 3 months ago +1

    I'm not sure I see the point of these chat models.

    • @Lysander-Spooner
      @Lysander-Spooner 3 months ago

      To dispense agitprop to a generation too lazy to do any real research. Someone reading the results without seeing the original video would have a completely false understanding of what was said. Imagine what happens when this is applied to government actions, politics, and the news.

  • @Ockv74
    @Ockv74 3 months ago +1

    ❤❤❤

  • @nrnoble
    @nrnoble 3 months ago

    Currently we are at the early dawn of AI, much like the late '70s with microprocessor computers (Apple I, Commodore PET, etc.) or the early days of the internet in the late '80s (dial-up modems). Over the next decade AI will explode in terms of abilities, and unfortunately it will be misused by those who want to use it for unethical, illegal, and dangerous purposes, much like the technologies that preceded it.

  • @ericB3444
    @ericB3444 3 months ago +3

    LonBall, who needs an offline chatbot unless it’s the apocalypse and you have absolutely zero connection to the online products?

    • @Kevin-oj2uo
      @Kevin-oj2uo 3 months ago +14

      Privacy concerns. I would much prefer a local AI.

    • @ericB3444
      @ericB3444 3 months ago +1

      @@Kevin-oj2uo My dog Ragula and I don't think people are worried about what you're asking your chatbot, because there are a lot of other problems in the world that are bigger than that.

    • @Kevin-oj2uo
      @Kevin-oj2uo 3 months ago

      @@ericB3444 lol

    • @Kevin-oj2uo
      @Kevin-oj2uo 3 months ago

      @@ericB3444 lol 😂

    • @dwalinfundinson
      @dwalinfundinson 3 months ago +9

      Why pay a subscription for something your own PC can do? That's the point of having a computer... to compute.

  • @villageidiot8718
    @villageidiot8718 3 months ago

    This would be great if you could trust the output.