Have a burning question? Leave it in the comments below for a chance to get it answered by the TensorFlow team. 👇👇🏻👇🏿👇🏽 👇🏾👇🏼
Can you provide examples for iOS and Raspberry Pi, please? Also, if possible, could you provide example usage of the new quantization functionality with KerasNLP and KerasCV?
This notebook does not run on a MacBook Pro; the required Python packages couldn't be installed on Apple Silicon.
@ramkumarkoppu Thank you for your feedback! We will certainly take this into consideration going forward.
Debugging TensorFlow's Colabs is just so much fun and totally not a waste of my life...
So, GPT-2 can run on Android devices, with slightly delayed responses, but of course GPT-2 isn't as good as GPT-3 or 4.
1) How many years do you think we need before we can have GPT-3 on our Android phones?
2) What tasks could improve by having an agent like this on the phone?
3) Could we have better next-word prediction in our on-screen keyboards?
4) Could we collect all the chats where we are the sender, use them to feed the LLM, and thereby set up automatic responses when we are away from our phone?
5) An ultimate advanced-reasoning virtual assistant better than Google Assistant and Siri?
6) Are there any security concerns about having an LLM like this on our phone? And if so, what is the recommended advice for handling an LLM on our phone securely?
7) And finally, what other AI types will be available for our phones? I mean speech recognition, image generation, etc...
Thank you for your questions!
We are making great progress toward running even more powerful LLMs purely on device. You can see Sundar's keynote here (ruclips.net/user/livecNfINi5CNbY?feature=share&t=715).
You can also check out a demo of running a version of PaLM on Android here (codelabs.developers.google.com/kerasnlp-tflite#0).
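On the earlier quantization question: here is a minimal, hedged sketch of post-training dynamic-range quantization through the standard `tf.lite.TFLiteConverter` API. The tiny Dense model below is only a stand-in for a KerasNLP/KerasCV model exported the same way, and `quantize_keras_model` is a name made up for this example.

```python
# Sketch: post-training dynamic-range quantization of a Keras model.
# The tiny Dense model is a stand-in for a KerasNLP/KerasCV backbone.
import tensorflow as tf

def quantize_keras_model(model):
    """Convert a Keras model to a dynamic-range-quantized TFLite flatbuffer."""
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    # Optimize.DEFAULT enables dynamic-range quantization of the weights.
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    return converter.convert()

# Stand-in model; swap in a real KerasNLP/KerasCV model exported the same way.
demo = tf.keras.Sequential([
    tf.keras.Input(shape=(16,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2),
])
tflite_bytes = quantize_keras_model(demo)
```

The returned bytes can be written to a `.tflite` file and loaded on device as usual; full integer quantization would additionally need a representative dataset.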
Can TensorFlow Lite be trained directly on a microcontroller? Meaning, instead of training a TensorFlow model on a PC, converting it to TensorFlow Lite, and uploading it to the microcontroller to run there, I want to train directly on the microcontroller. Is that possible? Thank you.
Can you report the speed, latency, or memory usage of this application running on Android?
I am wondering whether I can achieve on-device training, i.e., use local mobile data to fine-tune the LLM.
Currently this is not supported.
I have a question about the YAMNet TensorFlow Lite model (Android app). I want to use it with an audio clip as input, not a live recording.
Can you help with that? Thank you for your help.
Sure! You can follow these tutorials that go over the details of how to use the model on-device: goo.gle/440uaan
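Not an official answer, but as a rough Python sketch: YAMNet takes a mono float32 waveform at 16 kHz, so a pre-recorded clip just needs to be decoded into that format before being handed to the TFLite interpreter. The helper names and file paths below are placeholders, and this assumes a 16-bit PCM mono 16 kHz WAV file.

```python
# Sketch: feed a pre-recorded WAV clip (not a microphone stream) to the
# YAMNet TFLite model. Assumes 16-bit PCM mono audio at 16 kHz.
import wave

import numpy as np

def load_wav_as_waveform(path):
    """Decode a 16-bit PCM mono WAV file into a float32 waveform in [-1, 1]."""
    with wave.open(path, "rb") as wav:
        assert wav.getnchannels() == 1, "YAMNet expects mono audio"
        assert wav.getframerate() == 16000, "YAMNet expects 16 kHz audio"
        pcm = wav.readframes(wav.getnframes())
    return np.frombuffer(pcm, dtype=np.int16).astype(np.float32) / 32768.0

def classify_clip(model_path, wav_path):
    """Run the YAMNet TFLite model on a clip and return the class scores."""
    import tensorflow as tf  # imported here so the WAV helper works without TF

    waveform = load_wav_as_waveform(wav_path)
    interpreter = tf.lite.Interpreter(model_path=model_path)
    input_index = interpreter.get_input_details()[0]["index"]
    # YAMNet's waveform input has a dynamic length; resize it to this clip.
    interpreter.resize_tensor_input(input_index, [len(waveform)])
    interpreter.allocate_tensors()
    interpreter.set_tensor(input_index, waveform)
    interpreter.invoke()
    return interpreter.get_tensor(interpreter.get_output_details()[0]["index"])
```

In an Android app the equivalent would be decoding the clip to PCM yourself and loading it into the classifier's input tensor instead of attaching an `AudioRecord` stream.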
Can I use it in React Native?
I am trying to implement this using tflite_flutter, but I keep getting the error "Make sure you apply/link the Flex delegate before inference. For Android, it can be resolved by adding the 'org.tensorflow:tensorflow-lite-select-tf-ops' dependency." I went ahead and added the dependency in my build.gradle file, but the issue still persists. Does the TensorFlow team by chance have any implementation of LLM models in Flutter? If yes, I'd love a link to the article/video, because I've been stuck on this for weeks.
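One thing worth double-checking (an assumption, since your build files aren't shown): in a Flutter project the dependency has to go into the app-level Gradle file, `android/app/build.gradle`, not the project-level one, e.g.:

```gradle
dependencies {
    // Select TF ops (Flex delegate) for TFLite models using full TF kernels.
    // '+' is a placeholder; pin the version that matches your TFLite runtime.
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:+'
}
```

A version mismatch between this AAR and the TFLite runtime bundled by the plugin can also reproduce the same error, so pinning both to the same release is worth trying.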
Will this work on the web with TensorFlow.js instead of Android?
@OmarRabie1998 Yes, you can run it with TFJS since TFJS can run TFLite models.
Reference: goo.gle/47nzQy5
I realize the tutorial is 9 months old, but you could at least update the codebase...
I don't get the point. If we are using a Flask API, then how is it on-device? 😢🥲 Can someone explain?