NVIDIA Chat With RTX - Create a FREE Personal AI Chatbot on Your GPU

  • Published: 5 Oct 2024

Comments • 78

  • @SurfacedStudio
    @SurfacedStudio  7 months ago +4

    ⭐ Just some clarifications in case I wasn't clear enough in the video :)
    Chat With RTX does *not* run ChatGPT locally; it uses *the same type of* AI model - a GPT LLM. The training dataset is different, and so are the responses.

  • @c6jones720
    @c6jones720 7 months ago +4

    I used it with an RTX 3080 + 64GB RAM. To be honest, it seems OK when you use it with PDF text documents and ask it questions on the text content. It doesn't seem quite as good for the YouTube video transcripts I've tried it with, but I suppose that's a bit hit and miss. I don't know how good it would be with numerical data sets.

    • @mariogross
      @mariogross 6 months ago +1

      How long does it take to process a PDF with many pages? Any benchmarks? What about processing a YouTube video?

    • @c6jones720
      @c6jones720 6 months ago

      I tried it with PDFs of Frankenstein, 20,000 Leagues Under the Sea, and Snow Crash - a few hundred pages each. I tried them all together and then separately; obviously it works better separately. These took less than 5 minutes to process and you could ask questions about the text, but it completely loses context after one or two questions if you try to follow up on the last conversation. I tried it with some French homework, and it could do some translation and inference but struggles with how text is formatted on a page. The YouTube thing probably depends on how good the transcript is @@mariogross

  • @sunilrayapudi2948
    @sunilrayapudi2948 7 months ago +4

    Nice and Clean Video. Keep going 🙌

  • @Niter88
    @Niter88 7 months ago +6

    Hahaha, finally some use for my RTX

    • @SurfacedStudio
      @SurfacedStudio  7 months ago +1

      Haha! And this is a pretty good one IMO. Though I have to say a lot of RTX powered games look fantastic!

    • @Niter88
      @Niter88 7 months ago +1

      @@SurfacedStudio Yeah, it's good to see things other than games - it is a pretty powerful tool

  • @marcus_rigonati
    @marcus_rigonati 5 months ago +2

    I don't have the option for a YouTube URL, and no matter how much I search I cannot find anyone talking about it. Is it only me who has this problem?

  • @BodySculptTV
    @BodySculptTV 7 months ago +2

    After watching your video I tried this "Chat with RTX" demo; not too impressed with it. It runs perfectly on a 14900K with 48GB of RAM and an RTX 4090. Unfortunately it did not impress me much with its responses. It seems Llama 2 and Mistral 7B were educated by car mechanics. Every single question about cars and their engine specifications gave me correct answers, but its answers about the specifications of an Intel i9-14900K were all wrong. It said it had a total of 20 threads and 10 cores; it also said its turbo boost was 5GHz, with a TDP of 95W and UHD 630 graphics. Why it would know more about cars than CPUs makes me wonder. I'll continue to play with it, but I can't trust its answers 100%.

    • @SurfacedStudio
      @SurfacedStudio  7 months ago +1

      Oh I've found plenty of inaccuracies with it - as I have with all other chatbots, including ChatGPT ;)
      It once told me I should heat my oven to 60 degrees to bake my pizza... ☃️
      All (current) AI LLMs suffer from false positives and *any* information you get out of them should be double checked if you're planning to rely on it in any sort of meaningful way. They're great for ideas, rephrasing or summarising text, writing content outlines and much much more, but asking specific technical questions is likely going to give you some inaccuracies...
      ChatGPT does a better job of this mostly because of the much larger dataset it has consumed for information :)

  • @aketo8082
    @aketo8082 14 days ago

    Nice. But at the moment it's not useful for German documents. Hopefully they improve ChatRTX.

  • @BlakeCDMedia
    @BlakeCDMedia 2 months ago

    Does it retain the information after you've fed it the data once? I'm thinking about making a model for a specific software and just feeding it a ton of info. Thx

  • @centerfresh8472
    @centerfresh8472 7 months ago

    So, basically it's a platform to teach people how to train AI? If so, it's a great initiative from Nvidia.

    • @SurfacedStudio
      @SurfacedStudio  7 months ago

      I don't think there's much 'teaching' involved. You point it at a folder / YT video or playlist and it just consumes the info into the model. But it's working pretty well for a tech demo

  • @Vaios1981
    @Vaios1981 4 months ago

    Everything you feed it from your hard drive folders is kept locally on your computer? I know you already answered that... am asking just in case there were any concerning updates since your upload.

    • @SurfacedStudio
      @SurfacedStudio  4 months ago +1

      Nothing I've heard changes my understanding of how this works. Everything runs locally on your graphics card. No data is shared or uploaded anywhere :)

    • @Vaios1981
      @Vaios1981 4 months ago

      @@SurfacedStudio Thank you for your time and response :) I just got my new RTX 4060 and I will try this as well.

  • @starvinglawstudent
    @starvinglawstudent 7 months ago

    Wondering if this could replace the bots on Zendesk, and whether an API can be created.

    • @SurfacedStudio
      @SurfacedStudio  7 months ago +1

      It already runs on a local server and you can easily run this in the cloud and expose the API if you wanted to - assuming your compute resources have the required GPU hardware to run it
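      As a rough sketch of that idea, a thin HTTP wrapper around a locally running model might look like the snippet below; the localhost port and the request/response shapes are assumptions, not Chat With RTX's actual interface, so they would need to match however your local instance really accepts requests.

      # Hypothetical sketch: a thin FastAPI layer that forwards prompts to a model
      # server already running on this machine. The port and payload shape are
      # assumptions, not Chat With RTX's real API.
      import requests
      from fastapi import FastAPI
      from pydantic import BaseModel

      app = FastAPI()

      class Query(BaseModel):
          prompt: str

      LOCAL_MODEL_URL = "http://127.0.0.1:8000/chat"  # placeholder endpoint

      @app.post("/ask")
      def ask(query: Query):
          # Forward the prompt to the locally hosted model and relay its answer.
          resp = requests.post(LOCAL_MODEL_URL, json={"prompt": query.prompt}, timeout=120)
          resp.raise_for_status()
          return {"answer": resp.json()}

      Run it with uvicorn (e.g. "uvicorn wrapper:app --host 0.0.0.0") and only this thin layer is exposed; the model itself stays on the local GPU.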

    • @starvinglawstudent
      @starvinglawstudent 7 months ago +1

      @@SurfacedStudio I figured dual 3090 in SLI should be enough. Thanks!!

  • @MarkAnthonyFernandez-q7y
    @MarkAnthonyFernandez-q7y 6 months ago

    I can't find it in my Start menu, which is very annoying. Got any fix for it?

    • @SurfacedStudio
      @SurfacedStudio  5 months ago

      You can try to locate where it installed and launch it directly. It usually adds a shortcut onto your Desktop. Mine points to C:\Users\surfa\AppData\Local\NVIDIA\ChatWithRTX\RAG\trt-llm-rag-windows-main\app_launch.bat
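      If the Desktop shortcut is missing as well, a small script along these lines can hunt for the launcher under %LOCALAPPDATA%\NVIDIA and start it - a sketch that assumes the install lives under that folder, which may differ between versions:

      # Sketch: search for Chat With RTX's app_launch.bat under %LOCALAPPDATA%\NVIDIA
      # and run it. Assumes an install layout like the path above; adjust if yours differs.
      import os
      import subprocess
      from pathlib import Path

      root = Path(os.environ["LOCALAPPDATA"]) / "NVIDIA"
      matches = list(root.rglob("app_launch.bat"))

      if matches:
          print(f"Launching {matches[0]}")
          # Launch the batch file from its own folder so relative paths inside it resolve.
          subprocess.run(["cmd", "/c", str(matches[0])], cwd=matches[0].parent)
      else:
          print("app_launch.bat not found - check the install location manually.")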

  • @ay-mikey
    @ay-mikey 7 months ago

    But can you load it with every lotto number that's been drawn over a year and get it to make a prediction on which are the most likely numbers to come out?

    • @SurfacedStudio
      @SurfacedStudio  7 months ago

      I don’t think that’s how probability works ;)

    • @zimizi
      @zimizi 6 months ago

      @@SurfacedStudio Unless there is a repeating pattern because it's rigged... which happened in the US before - on a certain date there were certain numbers.
      But LLMs are pretty bad at math, I believe, so anyway

  • @yashag
    @yashag 7 months ago

    Which GPU are you using? Will it work on a 3050?

    • @SurfacedStudio
      @SurfacedStudio  7 months ago +1

      It works on 30 and 40 series graphics cards. I've had it run on a 3090 and a 4080

    • @mariogross
      @mariogross 6 months ago

      Any big difference in their performance? @@SurfacedStudio

  • @uerisc
    @uerisc 7 months ago

    Nice video! Do you know if it's possible to change the AI model it uses (Mistral 7B int4)? It's the only model available to me.

    • @NinjaTaskers
      @NinjaTaskers 7 months ago

      I would like to know this as well

    • @SurfacedStudio
      @SurfacedStudio  7 months ago

      I think it's the only model available that works with Chat With RTX at this stage. In a few clips I had 2 models but I was using an early access version. In the official release I only have the one :)

    • @NinjaTaskers
      @NinjaTaskers 7 months ago

      @@SurfacedStudio Nooope! I found the solution. To install Llama 13B you need to have a GPU with 16GB VRAM or... just rewrite some file in the installer folder. Worked for me
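      For anyone curious, the kind of edit being described probably looks like the sketch below - the file name (llama13b.nvi) and the MinSupportedVRAMSize field are assumptions about how the installer gates the model, so verify them against your own installer folder before changing anything:

      # Hypothetical sketch: lower the VRAM gate in the extracted installer so the
      # Llama 13B option shows up. The file name and attribute are assumptions -
      # inspect your installer folder first and back the file up before editing.
      from pathlib import Path

      nvi_file = Path(r"ChatWithRTX_installer\RAG\llama13b.nvi")  # hypothetical path
      text = nvi_file.read_text(encoding="utf-8")
      patched = text.replace('name="MinSupportedVRAMSize" value="15"',
                             'name="MinSupportedVRAMSize" value="8"')
      nvi_file.write_text(patched, encoding="utf-8")
      print("Patched." if patched != text else "Pattern not found - edit the file manually.")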

  • @doriandoesstuff
    @doriandoesstuff 6 months ago

    the official site has removed the download button

    • @SurfacedStudio
      @SurfacedStudio  6 months ago

      Can you share a screenshot? I can still see a big 'Download Now' button

  • @SonnyDali
    @SonnyDali 7 months ago

    Is there a way to run it using an RTX 2070?

    • @SurfacedStudio
      @SurfacedStudio  7 months ago

      The specs clearly state 30 or 40 series RTX so I doubt it as your hardware likely doesn’t have the required tensor cores, but you can always give it a shot

  • @jasonhession3710
    @jasonhession3710 7 months ago

    Is it available for Mac download?

    • @VideoBee_YT
      @VideoBee_YT 7 months ago +6

      Which part of "Only for 30 and 40 series RTX GPUs" did you not understand?

  • @ramizshould
    @ramizshould 7 months ago

    Hi there bro, NVIDIA Chat With RTX looks nice. I'm interested to see more of it in the near future. By the way, how are you today, man?

    • @SurfacedStudio
      @SurfacedStudio  7 months ago

      I'm good, thank you! How have you been? :)

  • @mimmog2251
    @mimmog2251 7 months ago

    Cool stuff, Thanks.

  • @Nicat_eldostu
    @Nicat_eldostu 7 months ago +1

    👍azerbaijan

  • @Akash_Kakad
    @Akash_Kakad 6 months ago

    6:28
    How to say goodbye in Hindi...
    idk which language this is, but I think RTX made a new language: Hindi 2.0 😂😂

    • @adamadamadamadam
      @adamadamadamadam 6 months ago

      Seems to be Hindi, but instead of goodbye it's something like "The villagers have their own relationship with Durga." 😂

  • @theshyguy3
    @theshyguy3 6 months ago

    Might wanna delete the video... there is no download link on the website

    • @SurfacedStudio
      @SurfacedStudio  6 months ago

      Can you share a screenshot? I can still see a big 'Download Now' button on their website

    • @theshyguy3
      @theshyguy3 6 months ago

      @@SurfacedStudio Where would you like me to send you the screenshot? No, I'm not joining your Discord

    • @SurfacedStudio
      @SurfacedStudio  5 months ago

      Share on imgur? Also happy for an email :)

  • @ramizshould
    @ramizshould 7 months ago

    Outstanding!!

  • @brust3871
    @brust3871 7 months ago +4

    In Hindi 😂 not even close to goodbye

  • @ramizshould
    @ramizshould 7 months ago

    😎😎🥰🥰😀😀

  • @shgrdyisthenickname8908
    @shgrdyisthenickname8908 7 months ago

    50 Gigs ????!!!!

    • @SurfacedStudio
      @SurfacedStudio  7 months ago +1

      Yup, about the size of your average game these days 😅

    • @SuryaPrasad92
      @SuryaPrasad92 7 months ago

      Yeah, downloading the package right now - the compressed file is approx. 35GB

  • @ShinChven
    @ShinChven 7 months ago +1

    You cannot run GPT locally on your own machine. Your statement is completely wrong. Those models are alternatives to GPT, but they are not GPT.

    • @SurfacedStudio
      @SurfacedStudio  7 months ago +6

      "GPT" is just a term describing a type of AI model and how it works. It's a more specific term than "LLM". Yes you can run a GPT model on your local machine. You can't run (the) ChatGPT (TM) on your local machine if that's what you mean. ChatGPT is just a type of GPT that happens to be only offered online ;)

    • @ShinChven
      @ShinChven 7 months ago

      @@SurfacedStudio You surely don't know what GPT is. Do your homework first.

    • @SurfacedStudio
      @SurfacedStudio  7 months ago +3

      Right back at ya: blog.stackademic.com/understanding-the-difference-between-gpt-and-llm-a-comprehensive-comparison-1f624c713507

    • @ShinChven
      @ShinChven 7 months ago

      @@SurfacedStudio "GPT, or Generative Pre-trained Transformer, is a class of natural language processing (NLP) models developed by OpenAI."
      In the post you sent me, I see this statement.

    • @SurfacedStudio
      @SurfacedStudio  7 months ago +4

      Yup. It's exactly what I said in the video and in my comments :) That statement doesn't in any way contradict the ability to run a GPT model on your local computer.
      I think we're talking two different languages here... ;)