The BEST GitHub Copilot ALTERNATIVE... (it's FREE forever)

  • Published: 28 Jan 2025

Comments • 24

  • @leon_forte
    @leon_forte 14 days ago +1

    Does anyone know of a self-hosted alternative to Copilot that is free with no restrictions?

    • @MarcoLenzo
      @MarcoLenzo  12 days ago +1

      Have you ever looked into lmstudio.ai/? It allows you to run LLMs locally, and with some crafting you can achieve what you want (see the sketch after this thread).

    • @leon_forte
      @leon_forte 7 days ago

      @MarcoLenzo thanks, I will check this out
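
    A minimal sketch of the kind of "crafting" mentioned above, assuming LM Studio is running its local OpenAI-compatible server (by default at http://localhost:1234/v1); the model id, prompt, and helper name below are placeholders for illustration, not a documented Copilot replacement:

      import requests  # third-party: pip install requests

      # Assumption: LM Studio's local server is enabled and listening on the
      # default address; change these to match your own setup.
      BASE_URL = "http://localhost:1234/v1"
      MODEL = "local-model"  # placeholder: use the model id shown in LM Studio

      def suggest_completion(code_snippet: str) -> str:
          """Ask the locally hosted model to continue a piece of code."""
          response = requests.post(
              f"{BASE_URL}/chat/completions",
              json={
                  "model": MODEL,
                  "messages": [
                      {"role": "system",
                       "content": "You are a code completion assistant. Continue the given code."},
                      {"role": "user", "content": code_snippet},
                  ],
                  "temperature": 0.2,
              },
              timeout=60,
          )
          response.raise_for_status()
          return response.json()["choices"][0]["message"]["content"]

      if __name__ == "__main__":
          print(suggest_completion("def fibonacci(n):"))

    Any editor or script that can talk to an OpenAI-compatible endpoint can be pointed at a local server like this, which is the usual way local models get wired into a coding workflow.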

  • @santiso878
    @santiso878 6 months ago +1

    I'll give it a try! Thanks a million!

    • @MarcoLenzo
      @MarcoLenzo  6 months ago

      Recently JetBrains announced some improvements to their assistant. I'll try it out again soon and update you.

  • @dishaswarajsoft2834
    @dishaswarajsoft2834 6 months ago +1

    It's really good. I installed it for Android Studio. It's great. Thank you.

    • @MarcoLenzo
      @MarcoLenzo  6 months ago

      Glad you liked it 😁

  • @FateflyYip
    @FateflyYip 4 months ago +2

    OMG, it is working... I can't believe it's free to use. Is there any limit on how much I can use it per day?

    • @MarcoLenzo
      @MarcoLenzo  4 months ago

      As far as I know, there are no limits when you use their base model. The limitations are on the use of other models like GPT-4, etc.

  • @antoineboxho136
    @antoineboxho136 8 months ago +1

    Thanks, this is so good!

  • @RDD87z
    @RDD87z 7 months ago +2

    Hi, as I understood it, did you manage to make Copilot and Codeium have context of your entire source? How can I set that up? It never suggests from the other classes I have.
    How do I do this for Copilot?

    • @MarcoLenzo
      @MarcoLenzo  7 months ago +1

      I know this feature (editing the context manually) is offered by Codeium, but I didn't see it in Copilot. I will double-check in the coming days and get back to you.

    • @RDD87z
      @RDD87z 7 months ago

      @MarcoLenzo Thank you so much. I'm waiting for this feature; I think it will be awesome. Maybe because it's powered by ChatGPT, it isn't powerful enough to hold all the context in a source. We will probably need to wait for a 1M-token context for that to happen.

  • @pablomasc1982
    @pablomasc1982 8 months ago +2

    Thanks!!

  • @tasfin660
    @tasfin660 6 days ago +1

    Codeium ain't that fast; it's slow. I have to pause for it to generate a suggestion, and I would rather type than pause after every line.

    • @MarcoLenzo
      @MarcoLenzo  6 days ago

      I filmed this a long time ago, but I never experienced any slowness.
      I'll double-check and update if need be.
      Thank you.

  • @0dan1
    @0dan1 1 month ago +1

    It's not free.

    • @MarcoLenzo
      @MarcoLenzo  1 month ago

      Are you sure? I just checked and there's still a free plan.