How to install and run an LLM (DeepSeek-r1) on your local machine without Internet | Ollama | ChatBoxAI

  • Published: Feb 1, 2025

Comments • 9

  • @naveenautomationlabs  3 hours ago  +2

    How to install and run an LLM (DeepSeek-r1) on your local machine without Internet.
    Model: deepseek-r1 [ollama.com/library/deepseek-r1]
    Model engine/platform: Ollama [github.com/ollama/ollama]
    AI client application: ChatBox [chatboxai.app/]
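
    A minimal sketch of talking to the model once the setup in the pinned comment is done. It assumes Ollama is installed, the model has been pulled (e.g. `ollama pull deepseek-r1`), and the Ollama server is listening on its default local port 11434; that same local endpoint is what a client such as ChatBox points at. Names and prompts below are illustrative, not from the video.

    ```python
    # Query a locally running Ollama server for the deepseek-r1 model.
    # Works fully offline once the model weights have been downloaded.
    import json
    import urllib.request

    OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

    def ask(prompt: str, model: str = "deepseek-r1") -> str:
        """Send a single prompt to the local model and return its full response text."""
        payload = json.dumps({
            "model": model,
            "prompt": prompt,
            "stream": False,  # return one JSON object instead of a token stream
        }).encode("utf-8")
        req = urllib.request.Request(
            OLLAMA_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        print(ask("Explain in one sentence what running an LLM locally means."))
    ```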

  • @kailashpathak9184  2 hours ago  +1

    Amazing video! Thanks Naveen Khunteta for the detailed video on how to set up and use the deepseek-r1 LLM.

  • @ganeshd.b692  3 hours ago  +1

    Right video at the right time. Thank you.

  • @praveenga883  2 hours ago  +1

    Thanks very much, Naveen, it's helpful.

  • @nirmalchakraborty8608  3 hours ago  +1

    Naveentam info, thanks Naveen

  • @karanatreya7314  1 hour ago

    @naveen what connection does it have with testing, and how can it help testers? Just an informational question, as I don't know what it is.

  • @Govindandmansi  3 hours ago  +1

    Nice video..

  • @gowthamk8531  3 hours ago  +2

    Hi