2 LLMs Talking With Each Other with Ollama Locally - Xplore TerminaLLM

  • Published: 13 Dec 2024

Comments • 10

  • @TheSalto66 2 days ago

    Fahd Mirza, during inference, when a prompt is turned into an LLM response, what is the difference between role=user and role=system?

    • @TheSalto66 2 days ago

      In other words, if I write "role":"system", "content":"you are a fish", or I use "role":"user", "content":"you are a fish", why will the effect be different? (See the sketch after the comments.)

  • @tollington9414 2 days ago

    Very interesting!

  • @user-wr4yl7tx3w 2 days ago

    It is kind of risky. The prompt says, "You are an LLM inside a folder. You are allowed to run whatever Python code you'd like and do whatever fun things you want to ...

  • @irokomause8311 2 days ago

    I appreciate your hard work.
    Can I use it with an Anthropic API key?

  • @A_Me_Amy 2 days ago

    I've tried to do something like this for a while. Well, something like it. But I'm not a programmer, not that it matters. I have some interesting ideas. But I think God hates me, and I want to do things the way I want, and I hate the secret restrictions in so many things, and only think about how the universe hates me and how I want to watch people burn in hell.
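
Regarding @TheSalto66's question about role=system versus role=user: the following is a minimal sketch, not taken from the video, assuming a local Ollama server on its default port 11434 and a locally pulled model tag such as "llama3.2" (swap in whatever model you actually use). It sends the same "you are a fish" text once as a system message and once as a user message so the two effects can be compared side by side.

import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # default Ollama chat endpoint

def chat(messages):
    """Send a non-streaming chat request and return the reply text."""
    payload = {"model": "llama3.2", "messages": messages, "stream": False}
    response = requests.post(OLLAMA_URL, json=payload, timeout=120)
    response.raise_for_status()
    return response.json()["message"]["content"]

# A "system" message is standing instruction/persona context: the model's chat
# template places it ahead of the conversation, so it frames every later turn.
print(chat([
    {"role": "system", "content": "You are a fish."},
    {"role": "user", "content": "Introduce yourself."},
]))

# The same text as a "user" turn is ordinary conversational input: the model
# may answer it, push back on it, or drift away from it in later turns.
print(chat([
    {"role": "user", "content": "You are a fish."},
    {"role": "user", "content": "Introduce yourself."},
]))

In practice both can steer the model, but system content tends to persist across the conversation because the chat template keeps it at the top of the context, whereas a user turn is just one message among many.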