100% Local OpenAI Swarm Agents with Ollama in 7 mins!

  • Published: 23 Oct 2024

Comments • 45

  • @RetroGameHound
    @RetroGameHound 3 days ago +5

    the model name needed to be 'llama3.2' for me before it would work

  • @20windfisch11
    @20windfisch11 2 days ago +1

    You pin the current date in the search tool and only supply the topic as a function parameter. Would it be possible to use functions with more than one parameter, or should I work around that in the prompt, i.e. the agent creates, for example, a JSON string with the parameters and I parse that in the function used as a tool?
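
    (A note on this: Swarm builds the tool schema from the Python function's signature, so a function with more than one typed parameter should work without the JSON workaround; whether a small local model fills both arguments reliably is a separate question. A minimal sketch with illustrative names, not the video's code:)

      from swarm import Agent

      def get_news_articles(topic: str, date: str) -> str:
          """Fetch news articles about `topic` published on `date` (YYYY-MM-DD)."""
          # Replace this stub with the real DuckDuckGo (or other) search call.
          return f"Results for {topic} on {date}"

      news_agent = Agent(
          name="News Assistant",
          model="llama3.2",
          instructions="Answer news questions with get_news_articles.",
          functions=[get_news_articles],
      )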

  • @markmcandrews1703
    @markmcandrews1703 1 day ago

    Loved the video and subscribed. I could not find a link to the script that you built out though, would love to try it out. Thank you :)

  •  3 days ago

    I was gone for a while, what did I miss? 😉👍👍 Will look at these Swarm Agents

  • @IslandDave007
    @IslandDave007 16 hours ago

    Great video!
    How can I use a .env file in Python with Swarm() instead of setting OS environment variables?
    I know with openai you can pass api_key to it, but Swarm does not take that parameter.
    Trying to get this to work with the Mistral Large model, so I need the api_key and base_url parameters.
    Thanks!

    • @IslandDave007
      @IslandDave007 12 hours ago

      I found that you have to use the client= parameter of Swarm() after creating a new OpenAI() client with the api_key parameter, etc.
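
      (A minimal sketch of that approach, assuming python-dotenv and illustrative .env variable names; the model string is just an example for a Mistral-style OpenAI-compatible endpoint:)

        import os
        from dotenv import load_dotenv  # pip install python-dotenv
        from openai import OpenAI
        from swarm import Swarm, Agent

        load_dotenv()  # reads LLM_API_KEY and LLM_BASE_URL from a local .env file

        llm_client = OpenAI(
            api_key=os.environ["LLM_API_KEY"],
            base_url=os.environ["LLM_BASE_URL"],  # the provider's OpenAI-compatible endpoint
        )
        client = Swarm(client=llm_client)  # Swarm routes its chat calls through this client

        agent = Agent(name="Assistant", model="mistral-large-latest", instructions="Be helpful.")
        response = client.run(agent=agent, messages=[{"role": "user", "content": "Hello"}])
        print(response.messages[-1]["content"])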

  • @aadeshabhang2657
    @aadeshabhang2657 3 days ago

    Can you make a guide for non-coders, from basic to advanced AI agents? The modules could be:
    what tools are needed,
    how to set up those tools,
    and how to create the agents.

    • @Idiot123009
      @Idiot123009 3 days ago

      What are you confused about?

  • @unclecode
    @unclecode 3 days ago +1

    Great content as usual. A question: I think you missed passing `transfer_to_editor_assistant` to the research agent, am I right?

    • @MervinPraison
      @MervinPraison  2 days ago

      You are correct. Thank you :) Probably I shouldn't have included that function.
      Updated code in the description.
      I need to test how reliable it is to use multiple functions and agents inside one agent.

    • @MervinPraison
      @MervinPraison  2 days ago

      @unclecode Based on my tests, it seems llama3.2 is not able to call multiple tools at the same time, but gpt-4o was.
      So to run multiple agents with small models, this is the optimal solution (code in the description).
      Technical: gpt-4o was able to handle transfer_to_editor_assistant and get_news_articles at the same time, but llama3.2 used only get_news_articles, even though both functions were provided.
      Based on my 3 tests.
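
      (For context, a sketch of the setup being compared, assuming get_news_articles and transfer_to_editor_assistant are defined as in the video's script; only the model string changes between the two runs:)

        from swarm import Agent

        research_agent = Agent(
            name="Research Assistant",
            model="llama3.2",  # with "gpt-4o", both tools were called; llama3.2 called only get_news_articles
            instructions="Fetch news articles on the topic, then hand off to the editor.",
            functions=[get_news_articles, transfer_to_editor_assistant],
        )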

  • @RossTang
    @RossTang 2 days ago

    it works great. thanks

  • @shiweijie9903
    @shiweijie9903 12 hours ago

    Since it does web search using the internet, it isn't 100% private, is it? The queries go over the public internet.

  • @InsightCrypto
    @InsightCrypto 3 days ago +4

    I ran the script as-is, with the export as well, but I'm getting a gpt-4o model error... it happens during the DuckDuckGo call.

    • @InsightCrypto
      @InsightCrypto 3 days ago

      I tried both local and Groq, same error.

    • @MervinPraison
      @MervinPraison  2 days ago +2

      Sorry for that. Here is the updated code. You need to mention the model name in the Agent function
      mer.vin/2024/10/openai-swarm-local/
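
      (Roughly the shape of that fix, a sketch assuming Ollama's default OpenAI-compatible endpoint; the agent name and instructions are placeholders rather than the blog's exact code:)

        from openai import OpenAI
        from swarm import Swarm, Agent

        # Point Swarm at the local Ollama server; the API key only needs to be a non-empty string.
        client = Swarm(client=OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"))

        agent = Agent(
            name="Assistant",
            model="llama3.2",  # Agent defaults to gpt-4o, which is what triggers the 404 against Ollama
            instructions="You are a helpful assistant.",
        )

        response = client.run(agent=agent, messages=[{"role": "user", "content": "Hello"}])
        print(response.messages[-1]["content"])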

  • @saurabhm9774
    @saurabhm9774 2 days ago

    Can I use Gemini or Anthropic with swarm?

  • @Idiot123009
    @Idiot123009 3 days ago +1

    Can we add an agent that will be used when the query doesn't require any function?

    • @WebWizard977
      @WebWizard977 3 days ago

      I don't think it's possible 🤔

  • @rohithperumandla4302
    @rohithperumandla4302 2 days ago

    How is it different from writing simple functions?

  • @kamalkamals
    @kamalkamals 2 days ago

    What's the difference from your old Ollama video?

  • @unveilingtheweird
    @unveilingtheweird 2 days ago

    Well, as long as it's OpenAI-compatible... I use Groq with Swarm all day.

  • @anjumulazim5596
    @anjumulazim5596 3 days ago +2

    I am using a conda virtual environment, and in VS Code the ollama command is recognized and I can run the model from that environment, but I get a "gpt-4o not found" error. Why? I already set up the environment variable and verified it.
    "openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `gpt-4o` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}"

    • @MervinPraison
      @MervinPraison  3 days ago

      Please try this mer.vin/2024/10/openai-swarm-local/

    • @proxima6522
      @proxima6522 3 days ago

      The code shown in the video and blog is not working. For this error you need to add model=os.environ["OPENAI_MODEL_NAME"] to fix it,
      but it still seems like the whole workflow does not work with Ollama, especially the agent transfer.

    • @proxima6522
      @proxima6522 3 days ago

      @@MervinPraison I think the transfer_to_editor_assistant function is not actually used, isn't it?
      I think it should be defined after the editor agent.

    • @PaulStoicaSebastian
      @PaulStoicaSebastian 2 days ago

      @@proxima6522 So this is a little trick, but you also have to add os.environ["OPENAI_API_KEY"] = 'ollama' to make it work if you use Windows as your main OS.

    • @MervinPraison
      @MervinPraison  2 days ago +1

      Sorry for the confusion. Here is the fixed code without that function: mer.vin/2024/10/openai-swarm-local/
      Also, I have added the model name in the code, passed to the Agent function.

  • @shazanmahmud2973
    @shazanmahmud2973 3 days ago

    thanks

  • @unveilingtheweird
    @unveilingtheweird 2 days ago

    Almost seems like OpenAI is trying to go backwards; we've been coding swarms and agent swarms for a hot minute already... at least I have been.

  • @HarshSharma-vm3nm
    @HarshSharma-vm3nm 2 days ago

    Would it be possible for you to share the code as well?

    • @MervinPraison
      @MervinPraison  2 days ago

      Yes, I have shared it in the description link.

  • @Kevinsmithns
    @Kevinsmithns 2 days ago

    My PowerShell doesn't recognize "export". How do I fix this?

    • @MervinPraison
      @MervinPraison  2 days ago

      Use “set” instead of export
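
      ("set" is the Command Prompt syntax; in PowerShell the usual form is $env:OPENAI_API_KEY = "fake-key". Another option is to set the values from Python before the Swarm client is created; OPENAI_MODEL_NAME is the variable referenced elsewhere in these comments, the other two are standard openai-python settings:)

        import os

        # Must run before the OpenAI/Swarm client is constructed.
        os.environ["OPENAI_API_KEY"] = "fake-key"                    # any non-empty value works for local Ollama
        os.environ["OPENAI_BASE_URL"] = "http://localhost:11434/v1"  # Ollama's OpenAI-compatible endpoint
        os.environ["OPENAI_MODEL_NAME"] = "llama3.2"                 # read by the script, not by openai itself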

    • @Kevinsmithns
      @Kevinsmithns 2 days ago

      @@MervinPraison Worked. And now what about "touch"? Same issue.

  • @ganeshmahajan1985
    @ganeshmahajan1985 2 days ago

    Error code: 404 - {'error': {'message': 'The model `llama3.2` does not exist or you do not have access to it.
    And it is going to OpenAI instead of my local Ollama.
    Kindly help.

    • @ganeshmahajan1985
      @ganeshmahajan1985 2 days ago +1

      I solved it by passing the OpenAI object as the "client" parameter to the Swarm class :)

    • @ganeshmahajan1985
      @ganeshmahajan1985 1 day ago

      Now I am trying to use "run_demo_loop" and it's giving the same error again, but I already passed the OpenAI object to Swarm. Don't know why it's always like this :p something or other always asks for money :d

  • @UygurErdem-l8m
    @UygurErdem-l8m 1 day ago

    It is asking for "OPENAI_API_KEY"

    • @MervinPraison
      @MervinPraison  1 day ago

      Export a fake key.
      I've provided that in my description, in the URL to the code.

  • @DeepMimd007
    @DeepMimd007 3 days ago +2

    Can't LangChain do the same with better tooling and integration?

    • @MervinPraison
      @MervinPraison  2 days ago

      Yes, it can. LangChain can be used by beginners.
      But advanced users want to know what's happening behind the scenes when they run an agent, and in that situation OpenAI Swarm is better.