How To Run Open Canvas Locally

  • Published: Dec 17, 2024

Comments • 4

  • @Truizify 13 hours ago +1

    What if you're running the LLM locally as well? Is that supported?

    • @LangChain 13 hours ago +1

      This is possible, but it requires forking the repo and updating the code to support calling a local model provider (e.g. Ollama). See this section on how to add support for more LLM providers: github.com/langchain-ai/open-canvas?tab=readme-ov-file#troubleshooting
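
      As a rough sketch of what such a fork might look like (not the repo's actual code: `getModelFromConfig` and the `"ollama/"` name prefix are illustrative), one could route certain model names to a locally running Ollama server via the `@langchain/ollama` package:

      ```ts
      // Hypothetical sketch of a fork's model-selection helper.
      // `getModelFromConfig` and the "ollama/" prefix are illustrative,
      // not names from the actual Open Canvas codebase.
      import { ChatOllama } from "@langchain/ollama";
      import { ChatOpenAI } from "@langchain/openai";
      import type { BaseChatModel } from "@langchain/core/language_models/chat_models";

      function getModelFromConfig(modelName: string): BaseChatModel {
        if (modelName.startsWith("ollama/")) {
          // Route "ollama/..." names to a local Ollama server.
          return new ChatOllama({
            model: modelName.slice("ollama/".length), // e.g. "llama3.1"
            baseUrl: "http://localhost:11434",        // Ollama's default endpoint
          });
        }
        // Fall back to an existing hosted provider for all other names.
        return new ChatOpenAI({ model: modelName });
      }
      ```

      The remaining work would be exposing the new model name wherever the app lists available models.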

    • @vaidphysics 8 hours ago

      @LangChain That seems like a very complicated way of doing something that should be as simple as specifying the LLM endpoint, which could be either local or remote.

  • @Texa8 8 hours ago

    Not a big fan of LangSmith. Too convoluted, and I'm not sure it adds any value over open-source offerings.