
Exploring Databricks's Open Source Dolly 2.0 Language Model (Fine-Tuned on 15K Human Instructions!)

  • Published: 12 Apr 2023
  • Hey everyone! In this video, I explore Databricks's new Dolly 2.0 model, an open-source pretrained language model fine-tuned on instruction-following data. Dolly 2.0 was trained on Databricks's new Dolly 15K dataset, which contains over 15,000 human-created instruction examples.
    The most exciting part of Databricks's announcement is the Dolly 15K dataset, the first large-scale instructive dataset available for commercial use. Previous datasets like Alpaca could not be used commercially due to OpenAI's policies.
    Dolly 2.0 is EleutherAI's pythia-12b model, fine-tuned on the Dolly 15K dataset. While not state-of-the-art, dolly-v2-12b provides an open-source starting point for exploring instruction-following language models.
    Thanks to Databricks for open-sourcing this model and dataset! I'm excited to see what the community builds with these tools.
    Link to Colab (Needs Premium): colab.research.google.com/dri...
    Link to Dataset: github.com/databrickslabs/dol...
    I also provide an Alpaca compatible dataset here: huggingface.co/datasets/c-s-a...
    Link to blog: www.databricks.com/blog/2023/...
    #dolly #databricks #llm #generativeai
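
The description links an Alpaca-compatible version of the dataset. A minimal sketch of what that conversion looks like, assuming Dolly 15K's `instruction`/`context`/`response` fields and Alpaca's `instruction`/`input`/`output` fields (field names per the respective dataset cards; `to_alpaca` and the sample record are illustrative, not from the video):

```python
def to_alpaca(record):
    """Map a Dolly 15K record onto the Alpaca schema.

    Dolly 15K uses `instruction` / `context` / `response`;
    Alpaca uses   `instruction` / `input`   / `output`.
    """
    return {
        "instruction": record["instruction"],
        "input": record.get("context", ""),  # Alpaca uses "" when no input
        "output": record["response"],
    }

# Illustrative Dolly 15K-style record (not an actual dataset entry)
dolly_record = {
    "instruction": "Summarize the passage.",
    "context": "Dolly 2.0 is a 12B-parameter instruction-tuned model.",
    "response": "Dolly 2.0 is an instruction-tuned 12B model.",
    "category": "summarization",  # dropped in the Alpaca schema
}

print(to_alpaca(dolly_record))
```

Running this over every record in the JSONL file yields an Alpaca-style dataset usable with existing Alpaca fine-tuning scripts.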

Comments • 17

  • @Gus4r4po
    @Gus4r4po A year ago +2

    Thank you for uploading this great video.
    I am starting to learn about this new world and it has been really helpful.
    Thanks again.

  • @defnlife1683
    @defnlife1683 A year ago +1

    The rate of products coming out is awesome!

  • @kennethlarsen3907
    @kennethlarsen3907 A year ago +2

    It takes forever to load... I can never get a response; it just sits there. Can you do a video on how to run this model quickly, and how to train it on our own data?

  • @faridbagheri9659
    @faridbagheri9659 11 months ago

    Hello, thanks for your video and the information provided. I'm wondering whether it's possible to use it for transformation; I need it for my project. If so, can you guide me a little on where I should start?

  • @kennethlarsen3907
    @kennethlarsen3907 A year ago +3

    Can we use it offline once we download it? And use it on our own data? It would be nice to know how to do all of that.

    • @chrisalexiuk
      @chrisalexiuk  A year ago +1

      Yes! Once you have it downloaded, you can use it offline!

    • @xgeoff
      @xgeoff A year ago +2

      This is what I wanted to know as well. And not one useful comment about how to do it.

    • @chrisalexiuk
      @chrisalexiuk  A year ago +1

      Once you've downloaded it, you need to run the model on your local machine. This will require a significant amount of compute.
      What roadblocks are you experiencing with running the model locally?

    • @kennethlarsen3907
      @kennethlarsen3907 A year ago +1

      @chrisalexiuk yeah, my GPU RAM just needs more juice

  • @judicalp
    @judicalp A year ago +2

    How long did it take to answer on Colab? On my MacBook it took more than an hour.

    • @chrisalexiuk
      @chrisalexiuk  A year ago +1

      The Premium Colab instance had inference times between 3 and 10 seconds.

    • @judicalp
      @judicalp A year ago +2

      That's awesome to hear, thanks - I will explore Colab.

  • @todddeshane4564
    @todddeshane4564 A year ago +1

    A doll among dalls 🎉

    • @chrisalexiuk
      @chrisalexiuk  A year ago

      It's like.... anything but Dolly/Dall-E, amirite?

    • @todddeshane4564
      @todddeshane4564 A year ago +1

      @chrisalexiuk what about brick doll. As in she's a brick house. Ducks.