Create Anything with LLAMA 3.1 Agents - Powered by Groq API

  • Published: Feb 1, 2025

Comments • 20

  • @engineerprompt
    @engineerprompt  6 months ago

    If you want to learn Advanced RAG techniques, check out my course here: prompt-s-site.thinkific.com/courses/rag
    If you are looking for advising or consulting on AI/LLM, get in touch: calendly.com/engineerprompt/consulting-call

  • @aimaniaco
    @aimaniaco 6 months ago +1

    Great stuff! Thanks for sharing this! Glad to see new OSS models performing so well!
    I wonder if tuning the prompts a bit for the 8B model, to force it to use the tools only, would help with hallucinations! But I guess that's on us, the viewers, to experiment. Thanks again. Great job!

  • @ThaLiquidEdit
    @ThaLiquidEdit 6 months ago +1

    I really like your videos! They are very interesting and up to date. They come in pretty handy in addition to my machine learning course.

  • @limjuroy7078
    @limjuroy7078 6 months ago +1

    After watching your video, I suddenly had an idea, but I don't know if it makes any sense.
    My idea is to have multiple tools: one tool answers user questions about the uploaded documents (the RAG process, e.g., load, chunk, embed, store in a vector DB, re-rank), another tool answers questions that are not covered by the vector DB (using the LLM's trained knowledge), and the final tool answers by searching the internet when neither the LLM nor the knowledge base has the information.
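    The three-tool idea above can be sketched as a simple dispatcher. This is a minimal illustration only: the tool names and the routing flags are hypothetical, the tool bodies are stubs, and a real agent would expose these functions to the LLM and let it choose one via function calling rather than routing by hand.

    ```python
    # Minimal sketch of routing a question to one of three tools:
    # 1) RAG over uploaded documents, 2) the model's own trained knowledge,
    # 3) web search. All tool bodies are placeholders.

    def rag_tool(question: str) -> str:
        # Placeholder for: load -> chunk -> embed -> vector search -> re-rank
        return f"[RAG] answer for: {question}"

    def llm_knowledge_tool(question: str) -> str:
        # Placeholder for answering from the model's trained knowledge
        return f"[LLM] answer for: {question}"

    def web_search_tool(question: str) -> str:
        # Placeholder for a web search + summarize step
        return f"[WEB] answer for: {question}"

    def route(question: str, in_vector_db: bool, llm_knows: bool) -> str:
        """Pick a tool: documents first, then model knowledge, then the web."""
        if in_vector_db:
            return rag_tool(question)
        if llm_knows:
            return llm_knowledge_tool(question)
        return web_search_tool(question)

    print(route("What does section 3 of the PDF say?", in_vector_db=True, llm_knows=False))
    # -> [RAG] answer for: ...
    ```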

  • @wesleymogaka
    @wesleymogaka 4 months ago

    Great! Can I use this code structure with other models like OpenAI/Claude?

  • @konstantinlozev2272
    @konstantinlozev2272 6 months ago +1

    When are the quantised instruct versions of Llama 3.1 8B coming up?

    • @engineerprompt
      @engineerprompt  6 months ago +1

      There were some issues with the quants in llama.cpp. Hopefully that will be resolved soon.

    • @konstantinlozev2272
      @konstantinlozev2272 6 months ago

      @@engineerprompt Thanks! Have you tried good ol' Mistral 7B on function calling? In my experience, it is better at function calling than Llama 3 8B, although it definitely falters too...
      It is also definitely better at JSON output.

  • @jwickerszh
    @jwickerszh 6 months ago

    Interesting; at least there are improvements with 3.1, so perhaps a fine-tune of the 8B for function calling is possible?

    • @engineerprompt
      @engineerprompt  6 months ago

      Yes, and it seems like prompting can also help. I have seen some work on it.
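      One way to nudge the model toward tool-only behavior, as discussed above, is a strict system prompt combined with forcing a tool call. Below is a hedged sketch of an OpenAI-style chat payload (the format Groq's API is compatible with); the system prompt wording, the model name, and the `get_weather` tool are illustrative assumptions, not taken from the video.

      ```python
      # Sketch of a request payload that nudges Llama 3.1 to rely only on tools.
      # No network call is made here; this only builds the request body.

      SYSTEM_PROMPT = (
          "You are a function-calling assistant. Answer ONLY by calling one of "
          "the provided tools. If no tool fits, say you cannot help. Never "
          "invent tool arguments."
      )

      def build_payload(user_message: str) -> dict:
          return {
              "model": "llama-3.1-8b-instant",  # example model name; check Groq's docs
              "messages": [
                  {"role": "system", "content": SYSTEM_PROMPT},
                  {"role": "user", "content": user_message},
              ],
              "tools": [{
                  "type": "function",
                  "function": {
                      "name": "get_weather",  # illustrative tool
                      "description": "Get the current weather for a city.",
                      "parameters": {
                          "type": "object",
                          "properties": {"city": {"type": "string"}},
                          "required": ["city"],
                      },
                  },
              }],
              # "required" forces a tool call in the OpenAI-style API;
              # confirm support in the Groq documentation.
              "tool_choice": "required",
          }

      payload = build_payload("What's the weather in Paris?")
      ```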

  • @mycloudvip
    @mycloudvip 6 months ago

    Question: Can you kindly share how I can add those highlights to my subtitles? We need something like that for our language class. Thanks

    • @engineerprompt
      @engineerprompt  6 months ago +1

      I use Descript for it.

    • @mycloudvip
      @mycloudvip 6 months ago

      @@engineerprompt Thanks for the quick reply! I will look into that asap! Kudos for your awesome content! @MyCloudVIP Chicago

  • @AlpMercan14
    @AlpMercan14 5 months ago

    I have a problem. I am using the Llama 3.1 tool function. The problem is it hallucinates when I do not supply the necessary required info. How can I make it ask me follow-up questions about required parameters?
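    One common workaround for the problem above is to validate the model's tool-call arguments against the tool's JSON schema before executing, and to turn missing required fields into a follow-up question instead of running the tool with hallucinated values. This is a minimal sketch; the `BOOKING_SCHEMA` example and function names are hypothetical.

    ```python
    # Sketch: check a tool call's arguments against the tool's parameter schema.
    # If required parameters are missing, return a follow-up prompt (which you
    # would feed back to the model or the user) rather than executing the tool.

    def missing_required(args: dict, schema: dict) -> list:
        """Return the required parameters the model failed to supply."""
        required = schema.get("required", [])
        return [p for p in required if args.get(p) in (None, "")]

    def handle_tool_call(args: dict, schema: dict) -> str:
        missing = missing_required(args, schema)
        if missing:
            return "Follow-up needed: please provide " + ", ".join(missing)
        return "OK to execute tool"

    BOOKING_SCHEMA = {
        "type": "object",
        "properties": {
            "date": {"type": "string"},
            "city": {"type": "string"},
        },
        "required": ["date", "city"],
    }

    print(handle_tool_call({"city": "Berlin"}, BOOKING_SCHEMA))
    # -> Follow-up needed: please provide date
    ```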

  • @NetZeroEarth
    @NetZeroEarth 6 months ago +3

    Can you make the notebooks shareable please 🙏? Currently we have to request permission to access them.

    • @engineerprompt
      @engineerprompt  6 months ago

      Sorry for that. Now you should have access.

  • @RedCloudServices
    @RedCloudServices 5 months ago

    Imagine when LLMs can create their own tools to accomplish a task, rather than relying on a human-curated list of static tools.

  • @Livanback
    @Livanback 6 months ago

    Is it like agents? Function calling?