Local AI models! Code with Ollama and Phi3.

  • Published: 21 Nov 2024

Comments • 21

  • @RajaLakshman
    @RajaLakshman 3 days ago

    How to install this on my IIS Server so that all users in org can connect to it?

  • @AbhinavKumar-tx5er
    @AbhinavKumar-tx5er 4 months ago

    Hi, a very helpful video. Can you share the URL of the video where you created the dashboard for this?

    • @alexthecodewolf
      @alexthecodewolf 4 months ago

      Hey, thank you - are you referring to the video that shows you how to chat with a database? ruclips.net/video/hw6oTjw9_Ro/видео.html

  • @TellaTrix
    @TellaTrix 5 months ago +1

    I would love to learn more about AI with .NET and local LLMs, my favorite topic.

  • @user-fqlt
    @user-fqlt 5 months ago

    Thanks for the very good content. Your channel is underrated and you deserve a million subs.

    • @alexthecodewolf
      @alexthecodewolf 5 months ago +1

      Thank you, that really means a lot. Really trying to grow this channel into something big and deliver content that simplifies convoluted topics.

  • @goodtimeforever2035
    @goodtimeforever2035 5 months ago

    Hi @Thecodewolf,
    Could you please explain how to set up a local laptop to run the .NET code? As I'm not a coding expert, any useful suggestions you can provide would be greatly appreciated. If you can outline the necessary steps, it would make it easier for us to follow along and perform the demo ourselves.

    • @alexthecodewolf
      @alexthecodewolf 5 months ago

      Hey, thanks for watching and providing feedback. This tutorial made some assumptions about coding knowledge with .NET or other languages, but I probably should have mentioned that up front. If you want to get started with .NET to recreate the code sample, you can download Visual Studio here, which is the code editor I'm using and which also installs .NET for you: visualstudio.microsoft.com/
      You can then either clone the sample app from GitHub and just open it up in Visual Studio and follow along with the tutorial, or you can create a new project using Visual Studio and copy in the code. File -> New Project -> C# console app. Hope this helps a little bit!
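      [Editor's note: in case it helps anyone following along, here is a minimal sketch of the kind of console app described above, assuming Ollama is installed, `ollama pull phi3` has been run, and the server is listening on its default http://localhost:11434 endpoint. It is an illustration, not the exact sample from the video.]

      ```csharp
      // Minimal sketch: ask a local Phi-3 model a question through Ollama's REST API.
      // Assumes `ollama pull phi3` has been run and Ollama is on its default port.
      using System;
      using System.Net.Http;
      using System.Net.Http.Json;
      using System.Text.Json;

      using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

      Console.Write("Ask Phi-3 something: ");
      var question = Console.ReadLine();

      // /api/generate takes a model name and a prompt; stream = false returns one JSON object.
      var response = await http.PostAsJsonAsync("/api/generate", new
      {
          model = "phi3",
          prompt = question,
          stream = false
      });
      response.EnsureSuccessStatusCode();

      // The reply JSON carries the generated text in its "response" field.
      using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
      Console.WriteLine(doc.RootElement.GetProperty("response").GetString());
      ```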

  • @Ajmal_Yazdani
    @Ajmal_Yazdani 5 months ago

    Hi, thanks @The Code Wolf. Is function calling supported with Phi3 or any other local LLM?

    • @alexthecodewolf
      @alexthecodewolf 5 months ago +1

      Yep, local AI models support function calling.

    • @Ajmal_Yazdani
      @Ajmal_Yazdani 5 months ago

      @@alexthecodewolf Thanks. Can you showcase something or share a code article?

    • @alexthecodewolf
      @alexthecodewolf 5 months ago +1

      @@Ajmal_Yazdani You can find some examples in the semantic kernel docs at learn.microsoft.com/en-us/semantic-kernel/agents/plugins/using-the-kernelfunction-decorator?tabs=Csharp
      I would recommend using Semantic Kernel for local AI dev. Semantic Kernel and more involved AI flows with local functions, plugins, services, etc. are a topic I plan on covering here in the near future. Semantic Kernel is a big topic, so I'm working on the best way to present that.
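      [Editor's note: for a rough idea of the pattern those docs describe, a Semantic Kernel plugin is just a class whose methods carry the [KernelFunction] attribute. The plugin below is a made-up example, and the registration lines are only sketched as comments because connector and package names vary between Semantic Kernel versions.]

      ```csharp
      // Sketch of a Semantic Kernel plugin the model can call as a native function.
      // The plugin name and logic are hypothetical examples.
      using System.ComponentModel;
      using Microsoft.SemanticKernel;

      public class OrderLookupPlugin
      {
          [KernelFunction, Description("Returns the shipping status for an order number.")]
          public string GetOrderStatus(
              [Description("The order number to look up")] string orderNumber)
          {
              // A real plugin would query a database or API; this returns canned data.
              return orderNumber == "1001" ? "Shipped" : "Processing";
          }
      }

      // Registration sketch (connector/package names differ between SK versions):
      // var builder = Kernel.CreateBuilder();
      // builder.Plugins.AddFromType<OrderLookupPlugin>();
      // // then add a chat completion service pointed at the local Ollama endpoint
      // // and invoke with function calling enabled in the execution settings.
      ```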

  • @justinkrawczak4929
    @justinkrawczak4929 5 months ago

    Have you tried to connect your local LLM to talk to an on-premise network database without providing the schema in the instructions? I'm wondering how well it would function or if this is even feasible. Asking for a friend :)

    • @justinkrawczak4929
      @justinkrawczak4929 5 months ago

      Also, I want to add that this is great content. Useful real-world examples like this will be golden as more people leverage AI in the enterprise landscape.

    • @alexthecodewolf
      @alexthecodewolf 5 months ago

      Thanks for the feedback! I'm not sure if I understand your original question though - are you asking about essentially just giving the AI a connection string or something so it can access the database directly?

    • @justinkrawczak4929
      @justinkrawczak4929 5 months ago

      Yes, exactly that. @@alexthecodewolf

    • @alexthecodewolf
      @alexthecodewolf 5 months ago +1

      @@justinkrawczak4929 As far as I know it's not possible to just give an AI a connection string and have it understand/explore a database. I think the closest solution I've seen to this is to add some kind of layer between the AI and the database, such as using a RAG solution where you connect the AI to a search service that has crawled your database, or using vector search capabilities to provide the AI with database data. I have a couple videos on my channel that explore these types of topics, but I don't know of any "Direct access" solutions.
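      [Editor's note: as a very rough sketch of what such a retrieval layer can look like in its simplest form, the code below uses Ollama's /api/embeddings endpoint with an embedding model such as nomic-embed-text to rank some pre-extracted row text against the question, then passes only the best match to Phi-3. The row text, model choices, and ranking are illustrative assumptions, not the approach from any particular video.]

      ```csharp
      // Sketch: rank database-derived snippets against a question with embeddings,
      // then ground the prompt with the best match. Row text and models are hypothetical.
      using System;
      using System.Linq;
      using System.Net.Http;
      using System.Net.Http.Json;
      using System.Text.Json;
      using System.Threading.Tasks;

      using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

      // Pretend these snippets were extracted from database rows ahead of time.
      string[] rows =
      {
          "Order 1001: 3x coffee mugs, shipped 2024-11-02",
          "Order 1002: 1x desk lamp, pending payment",
          "Order 1003: 2x notebooks, delivered 2024-10-28"
      };

      var question = "Which order is still waiting on payment?";
      var questionVec = await EmbedAsync(question);

      // Pick the row whose embedding is closest to the question's embedding.
      string bestRow = rows[0];
      double bestScore = double.MinValue;
      foreach (var row in rows)
      {
          var score = Cosine(questionVec, await EmbedAsync(row));
          if (score > bestScore) { bestScore = score; bestRow = row; }
      }

      // Ask the chat model, grounding it with only the retrieved row.
      var chat = await http.PostAsJsonAsync("/api/generate", new
      {
          model = "phi3",
          prompt = $"Using only this data: \"{bestRow}\", answer: {question}",
          stream = false
      });
      using var doc = JsonDocument.Parse(await chat.Content.ReadAsStringAsync());
      Console.WriteLine(doc.RootElement.GetProperty("response").GetString());

      // Calls Ollama's embeddings endpoint (requires `ollama pull nomic-embed-text`).
      async Task<double[]> EmbedAsync(string text)
      {
          var res = await http.PostAsJsonAsync("/api/embeddings",
              new { model = "nomic-embed-text", prompt = text });
          using var json = JsonDocument.Parse(await res.Content.ReadAsStringAsync());
          return json.RootElement.GetProperty("embedding")
                     .EnumerateArray().Select(e => e.GetDouble()).ToArray();
      }

      static double Cosine(double[] a, double[] b)
      {
          double dot = 0, magA = 0, magB = 0;
          for (int i = 0; i < a.Length; i++)
          {
              dot += a[i] * b[i]; magA += a[i] * a[i]; magB += b[i] * b[i];
          }
          return dot / (Math.Sqrt(magA) * Math.Sqrt(magB));
      }
      ```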

    • @AbhinavKumar-tx5er
      @AbhinavKumar-tx5er 3 months ago

      @@justinkrawczak4929 I have done a similar thing. In my case I joined the tables with a JOIN statement (without giving the model the table schema), stored the output as JSON, and then asked the questions against that JSON. I am getting about 90% accurate results. Wondering how I can fine-tune it further.
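      [Editor's note: a stripped-down sketch of that JSON-grounding approach could look like the code below. The joined rows are faked in memory here, where a real version would load them from the database with ADO.NET or EF Core, so treat it only as an illustration.]

      ```csharp
      // Sketch: serialize joined query results to JSON and ask the local model about them.
      // The rows are hypothetical stand-ins for real database output.
      using System;
      using System.Net.Http;
      using System.Net.Http.Json;
      using System.Text.Json;

      var joinedRows = new[]
      {
          new { OrderId = 1001, Customer = "Dana", Total = 42.50, Status = "Shipped" },
          new { OrderId = 1002, Customer = "Lee",  Total = 18.00, Status = "Pending" }
      };

      // Flatten the joined result set to JSON so the model can reason over it directly.
      var json = JsonSerializer.Serialize(joinedRows);

      using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };
      var response = await http.PostAsJsonAsync("/api/generate", new
      {
          model = "phi3",
          prompt = $"Here is order data as JSON: {json}\nQuestion: which orders are still pending?",
          stream = false
      });

      using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
      Console.WriteLine(doc.RootElement.GetProperty("response").GetString());
      ```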

  • @AbhinavKumar-tx5er
    @AbhinavKumar-tx5er 3 months ago

    Hi, how can we set the temperature here?

    • @alexthecodewolf
      @alexthecodewolf 3 months ago

      Hey, you can find information on how to do that here: github.com/ollama/ollama/blob/main/README.md#customize-a-prompt
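      [Editor's note: that README section does it through a Modelfile (`PARAMETER temperature ...` plus `ollama create`). If you are calling the REST API from .NET as in the sketches above, the request body can also carry an options object, roughly like this; the values are just examples.]

      ```csharp
      // Sketch: pass sampling options such as temperature in the Ollama request body.
      using System;
      using System.Net.Http;
      using System.Net.Http.Json;
      using System.Text.Json;

      using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };
      var response = await http.PostAsJsonAsync("/api/generate", new
      {
          model = "phi3",
          prompt = "Summarize what a Modelfile is in one sentence.",
          stream = false,
          options = new { temperature = 0.2 }   // lower values make output more deterministic
      });

      using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
      Console.WriteLine(doc.RootElement.GetProperty("response").GetString());
      ```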

    • @AbhinavKumar-tx5er
      @AbhinavKumar-tx5er 3 months ago

      @@alexthecodewolf Thanks :)