On-device Large Language Models with Keras and Android

  • Published: 4 Nov 2024

Comments • 20

  • @TensorFlow
    @TensorFlow  1 year ago +5

    Have a burning question? Leave it in the comments below for a chance to get it answered by the TensorFlow team. 👇👇🏻👇🏿👇🏽 👇🏾👇🏼

    • @ramkumarkoppu
      @ramkumarkoppu 1 year ago

      Can you provide examples for iOS and Raspberry Pi, please? And, if possible, could you also provide example usage of the new quantization functionality with KerasNLP and KerasCV?

    • @ramkumarkoppu
      @ramkumarkoppu 1 year ago

      This notebook does not run on a MacBook Pro; the required Python packages couldn't be installed on Apple Silicon.

    • @TensorFlow
      @TensorFlow  1 year ago

      @ramkumarkoppu Thank you for your feedback! We will for sure take this into consideration going forward.

  • @SashaBaych
    @SashaBaych 8 months ago +1

    Debugging TensorFlow's Colabs is just so much fun and totally not a waste of my life...

  • @octaviusp
    @octaviusp 1 year ago +4

    So, GPT-2 can run on Android devices, with some response delay, but of course GPT-2 isn't as good as GPT-3 or 4.
    1) How many years do you think we need before we can have GPT-3 on our Android phones?
    2) What tasks could be improved by having an agent like this on the phone?
    3) Could we get better next-word prediction in the on-screen keyboard?
    4) Could we collect all the chats where we are the sender, use them to feed the LLM, and then set up automatic responses for when we are away from our phone?
    5) An ultimate advanced-reasoning virtual assistant better than Google Assistant and Siri?
    6) Are there any security concerns about having an LLM like this on our phone? And if there are, what are the most recommended practices for handling an LLM on a phone securely?
    7) And finally, what other AI types will be available for our phones? I mean speech recognition, image generation, etc.

    • @TensorFlow
      @TensorFlow  1 year ago

      Thank you for your questions!
      We are making great progress toward running even more powerful LLMs purely on device. You can see Sundar's keynote here (ruclips.net/user/livecNfINi5CNbY?feature=share&t=715).
      You can also check out a demo of running a version of PaLM on Android here (codelabs.developers.google.com/kerasnlp-tflite#0).
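
      A rough sketch of the export flow that codelab walks through, assuming the KerasNLP GPT-2 preset discussed in the video; the codelab's own model, preprocessing, and generate wrapper may differ, and max_length and the output path are placeholder values:

      import tensorflow as tf
      import keras_nlp

      # Load a small pretrained causal LM from the KerasNLP presets.
      gpt2_lm = keras_nlp.models.GPT2CausalLM.from_preset("gpt2_base_en")

      # Wrap generation in a tf.function so it can be traced to a concrete
      # function with a fixed input signature for conversion.
      @tf.function(input_signature=[tf.TensorSpec([], tf.string)])
      def generate(prompt):
          return gpt2_lm.generate(prompt, max_length=64)  # example max_length

      # Convert to TFLite. Select TF ops are enabled because text generation
      # uses ops outside the built-in TFLite op set (this is also why the
      # Android app needs the tensorflow-lite-select-tf-ops dependency).
      converter = tf.lite.TFLiteConverter.from_concrete_functions(
          [generate.get_concrete_function()], gpt2_lm)
      converter.target_spec.supported_ops = [
          tf.lite.OpsSet.TFLITE_BUILTINS,
          tf.lite.OpsSet.SELECT_TF_OPS,
      ]
      tflite_model = converter.convert()

      # "gpt2_causal_lm.tflite" is a placeholder output file name.
      with open("gpt2_causal_lm.tflite", "wb") as f:
          f.write(tflite_model)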

  • @jomfawad9255
    @jomfawad9255 1 year ago

    Can TensorFlow Lite be trained directly on a microcontroller? Meaning, instead of training a TensorFlow model on a PC, converting it to TensorFlow Lite, and uploading it to a microcontroller to run it there, I want to train TensorFlow Lite directly on the microcontroller. Is that possible? Thank you.

  • @notwomy9173
    @notwomy9173 1 year ago

    Can you report the speed, latency, or memory footprint of this application running on Android?

  • @knl-ib8xo
    @knl-ib8xo 1 year ago

    I am wondering whether I can achieve on-device training, i.e., using local mobile data to fine-tune an LLM.

    • @TensorFlow
      @TensorFlow  1 year ago

      Currently this is not supported.

  • @alanood9500
    @alanood9500 1 year ago +1

    I have a question about the YAMNet TensorFlow Lite model (Android app). I want to use it with an audio clip as input, not a live recording.
    Can you help with that? Thank you for your help.

    • @TensorFlow
      @TensorFlow  1 year ago

      Sure! You can follow these tutorials, which go over the details of how to use the model on-device: goo.gle/440uaan
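
      A minimal sketch of feeding a pre-recorded clip to the YAMNet TFLite model in Python (the same flow maps onto the Android Interpreter API); it assumes a 16 kHz, 16-bit mono recording, and "yamnet.tflite" / "clip.wav" are placeholder file names:

      import numpy as np
      import tensorflow as tf
      from scipy.io import wavfile

      # YAMNet expects a 16 kHz mono float32 waveform in [-1, 1];
      # resample first if the clip uses a different sample rate.
      sample_rate, waveform = wavfile.read("clip.wav")
      waveform = waveform.astype(np.float32) / 32768.0  # 16-bit PCM -> [-1, 1]

      interpreter = tf.lite.Interpreter(model_path="yamnet.tflite")
      input_index = interpreter.get_input_details()[0]["index"]

      # The model takes a variable-length 1-D waveform, so resize the input
      # tensor to the clip length before allocating.
      interpreter.resize_tensor_input(input_index, [len(waveform)])
      interpreter.allocate_tensors()

      interpreter.set_tensor(input_index, waveform)
      interpreter.invoke()

      # The first output is assumed to be the per-frame class scores
      # (num_frames x 521 AudioSet classes); average over frames.
      scores = interpreter.get_tensor(interpreter.get_output_details()[0]["index"])
      print("Top class index:", scores.mean(axis=0).argmax())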

  • @kumardeepanshu8503
    @kumardeepanshu8503 4 months ago

    Can I use it in React Native?

  • @flutteraddict
    @flutteraddict 1 year ago

    I am trying to implement this using tflite_flutter, but I keep getting the error "Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency". I went ahead and added the dependency in my build.gradle file, but the issue still persists. Does the TensorFlow team by chance have any implementation of LLM models in Flutter? If yes, I'd love a link to the article/video, because I've been stuck on this for weeks.

  • @Canadianishere
    @Canadianishere 1 year ago

    Will this work on the web with TensorFlow.js instead of Android?

    • @TensorFlow
      @TensorFlow  1 year ago +1

      @OmarRabie1998 Yes, you can run it with TFJS since TFJS can run TFLite models.
      Reference: goo.gle/47nzQy5

  • @SashaBaych
    @SashaBaych 8 months ago

    I realize the tutorial is 9 months old, but you could at least update the codebase...

  • @professorop4558
    @professorop4558 9 days ago

    I don't get the point. If we are using a Flask API, then how is it on-device? 😢🥲 Can someone explain?