Groundcrew: build an LLM RAG on your code

  • Published: Jan 16, 2025

Comments • 6

  • @AlexBerg1 • 11 months ago

    Nice work! A common problem indeed!

  • @avinashnair5064 • 3 months ago

    Can we use this for open source, locally hosted Ollama models?

  • @prabhugururaj1988 • 7 months ago

    Does it support Ollama models running locally?

  • @FunwithBlender • 11 months ago

    Interesting, would love to help you guys out.