OttoDev REVOLUTIONIZES Bolt.new and AI Models in 24 Hours!

  • Published: 23 Nov 2024

Comments • 36

  • @ChetanJariwala
    @ChetanJariwala 17 hours ago +3

    Thanks for putting up this useful video. Another fan of your calming voice 😊. I am not a developer, but I know enough to create basic web stuff. I would really appreciate it if you could make a video on how to actually take the code from Bolt and publish it as an app or website on your own custom domain or CMS. I haven't seen anyone make a video on this particular topic.

    • @vincibits
      @vincibits  9 hours ago +1

      Great idea! Will definitely do that!!

    • @youssefjbida7038
      @youssefjbida7038 7 hours ago

      Yes, that would be a great idea and a generous gift to us beginners. What we need is an A-to-Z tutorial, and no one has made one yet. Everybody talks about bolt.new and AI, but no one goes all the way to publishing the app on the Play Store or the App Store. Thanks in advance

  • @SSmartyimages
    @SSmartyimages 12 hours ago +1

    You do have a good voice for listening:)

  • @Ph1lH4ck
    @Ph1lH4ck 15 hours ago +1

    Let's build an ElevenLabs Vincibits voice model to spread the Cool all over YouTube 😎

    • @vincibits
      @vincibits  9 hours ago

      Haha yes!! And then the world would just not care lol

  • @saidabedi2836
    @saidabedi2836 22 hours ago +2

    Getting really hard to keep up with the constant updates in the AI world. Any advice on keeping up?

    • @vincibits
      @vincibits  22 hours ago +1

      I know, I hear you. Not trying to push anything on you, but I actually started a community group that tackles this exact feeling of being overwhelmed by AI. Let me know if you're interested in learning more about it.

  • @wolfgangdelaine5284
    @wolfgangdelaine5284 17 hours ago +1

    Thanks for your videos!
    And about the Ollama installation: I installed it with Docker, and Bolt too. Ollama alone takes three times less RAM than it does with Bolt. Do you know why?

    • @vincibits
      @vincibits  17 hours ago

      I'm not sure, but I'd suspect there are just a lot of moving parts in the system itself. Remember, you're not just running Ollama when you use OttoDev - it's an application that has many other dependencies.

  • @Ph1lH4ck
    @Ph1lH4ck 15 hours ago +1

    I'm a simple guy! When I see Vincibits, I click the like button 😉

    • @vincibits
      @vincibits  9 hours ago

      You’re too kind!!

  • @Ph1lH4ck
    @Ph1lH4ck 14 hours ago +1

    The computer 👉 🥵😫🤯 😂

  • @huntinggamer1725
    @huntinggamer1725 22 hours ago +2

    Love your channel, joining the community soon

    • @vincibits
      @vincibits  21 hours ago +1

      I appreciate that! Thank you so much for your support! It means a lot.

  • @MatheusSilva-qm3ph
    @MatheusSilva-qm3ph 12 hours ago

    Very good, bro!
    One question: what is the minimum PC configuration to run it locally?
    I didn't find that information.
    Thanks

    • @vincibits
      @vincibits  8 hours ago

      It all depends on the model, but at a minimum, 10 GB of RAM and a fairly modern PC would do it - maybe not the best experience, but it would do.
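
As a rough self-check against that ~10 GB guideline, something like the following works (a sketch; assumes a Linux machine with the standard `free` utility):

```shell
# Print whether this machine's total RAM meets the ~10 GB guideline.
# free -g reports sizes in GiB; column 2 of the "Mem:" row is the total.
free -g | awk '/^Mem:/ {print ($2 >= 10 ? "enough RAM" : "below 10 GB")}'
```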

  • @SpkSpkS
    @SpkSpkS 16 hours ago +1

    I have a fine-tuned LLM - how do I use it?

    • @vincibits
      @vincibits  8 hours ago

      You would use it like you would any LLM.
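
To make that concrete: if the fine-tuned model is a local GGUF file, one common route is registering it with Ollama so local tools can select it. A sketch; the file path and the model name "my-finetune" are placeholders, not anything from the video:

```shell
# Write a minimal Ollama Modelfile pointing at the fine-tuned weights.
# (The .gguf path and model name below are illustrative placeholders.)
cat > Modelfile <<'EOF'
FROM ./my-finetuned-model.gguf
PARAMETER temperature 0.7
EOF

# Then register and smoke-test it (commented out so this snippet
# runs even on a machine without Ollama installed):
# ollama create my-finetune -f Modelfile
# ollama run my-finetune "hello"
```

Once registered, the model shows up alongside the stock Ollama models and can be picked like any other.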

  • @JOJO-ck7cm
    @JOJO-ck7cm 17 hours ago +1

    Can someone build an interface on top of OttoDev to prevent all these problems?

    • @vincibits
      @vincibits  17 hours ago

      Good question. This is why a lot of people are contributing to the project on GitHub. It's very much a work-in-progress project.

  • @myWorldDiscover
    @myWorldDiscover 21 hours ago +1

    Good explanation, thanks 🎉.
    Can you please build a complex solution?

    • @vincibits
      @vincibits  21 hours ago

      Yes of course, what did you have in mind?

    • @myWorldDiscover
      @myWorldDiscover 17 hours ago

      @@vincibits Like an ERP system, or Discord, or a CRM app - I mean a really complex and complete web app, not just to-dos and calculators; those things are so simple.
      Please don't forget both backend and frontend. Thanks in advance

  • @Ph1lH4ck
    @Ph1lH4ck 15 hours ago +1

    By the way, I have installed Bolt with Docker and connected it to Claude, OpenAI and Llama to do some tests, and everything works fine. But the apps built locally are far less good and powerful than what the online bolt.new app produces, even with the Claude Sonnet API. So I'm a bit disappointed by the local solution.

    • @vincibits
      @vincibits  9 hours ago

      I have noticed somewhat similar symptoms as well… I think there's something else that Bolt.new, the original one, has left out of the open-source repo. We'll see.

    • @Ph1lH4ck
      @Ph1lH4ck 8 hours ago

      @@vincibits yes definitely

    • @Ph1lH4ck
      @Ph1lH4ck 8 hours ago

      @@vincibits I've tested Windsurf on my computer today and I like it better than Bolt because you learn the tech by using it. And it's working 🙂

    • @vincibits
      @vincibits  8 hours ago

      @@Ph1lH4ck Interesting! I will check that out as well. Thanks!

  • @learn_generative_ai
    @learn_generative_ai 22 hours ago +1

    Hello, I don't know why, but I can't pull my API key from .env.local; when I pass it explicitly at execution time, it works. What do you think the error is?

    • @vincibits
      @vincibits  21 hours ago

      First, you have to rename .env.local to just .env

    • @learn_generative_ai
      @learn_generative_ai 21 hours ago

      @vincibits As recommended in the repo's README, I changed it from .env.example to .env.local, but that didn't work. I also changed it to just .env, and it still doesn't work.

    • @PauloDichone
      @PauloDichone 21 hours ago

      @@learn_generative_ai Then I think you have something else that isn't right. Are you running the Docker application? It's hard to troubleshoot without knowing what you have set up.

    • @learn_generative_ai
      @learn_generative_ai 21 hours ago

      @PauloDichone Yes, it was running on Docker
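
For anyone hitting the same issue, the fix discussed in this thread can be sketched as follows. Note that when running under Docker, a host-side .env file is not picked up automatically; it has to be passed to the container explicitly. The variable name and the docker invocation are illustrative, not taken from the repo:

```shell
# Simulate the repo's template file, then apply the fix from the thread:
printf 'OPENAI_API_KEY=\n' > .env.example
cp .env.example .env                        # the app reads .env, not .env.local
printf 'OPENAI_API_KEY=sk-your-key\n' > .env

grep -c 'OPENAI_API_KEY' .env               # prints 1 when the key line is present

# Under Docker, hand the file to the container explicitly, e.g.:
# docker run --env-file .env -p 5173:5173 <your-image>
```

If the key only works when passed explicitly at run time, the usual culprits are a wrongly named env file or a container started without `--env-file`.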