I just plain don't get it. Maybe I'm misunderstanding what fine-tuning means, or maybe I don't even need it for my use case... In the end, I have one folder on my desktop with a measly 1.4 GB of markdown files totaling over 3 million words of research, and I want Gemini 1.5 Pro to act as a mouthpiece for it. The quickest way to explain it: master-level needle-in-a-haystack retrieval. I want it to take all the files into macro context for each question and give me a higher-order perspective across the files that only an AI could possibly keep a handle on, comprehend? How on earth can I achieve this, please!? Thank you. 🙏
Use the pre-trained models available in Hugging Face Transformers and implement RAG over your own document database. I wouldn't use Gemini for this, because I'm not sure RAG can be implemented there, and Gemini isn't the best fit for training on your own data, so use other pre-trained NLP models.
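In case it helps, here is a minimal RAG sketch along those lines, using only a pre-trained embedding model from Hugging Face (sentence-transformers) plus NumPy. The folder name, chunk size, embedding model, and example question are all placeholder assumptions; the retrieved context at the end can be handed to Gemini 1.5 Pro or any other chat model.

```python
import numpy as np
from pathlib import Path
from sentence_transformers import SentenceTransformer

# 1. Load and chunk the markdown files (naive fixed-size chunks; heading-aware
#    splitting would usually work better for research notes).
chunks = []
for md_file in Path("research").rglob("*.md"):           # placeholder folder name
    text = md_file.read_text(encoding="utf-8", errors="ignore")
    chunks.extend(text[i:i + 1500] for i in range(0, len(text), 1500))

# 2. Embed every chunk once with a pre-trained Hugging Face model.
embedder = SentenceTransformer("all-MiniLM-L6-v2")        # small general-purpose embedder
doc_vecs = embedder.encode(chunks, convert_to_numpy=True, normalize_embeddings=True)

# 3. At question time, retrieve the top-k most similar chunks by cosine similarity.
def retrieve(question: str, k: int = 8) -> list[str]:
    q = embedder.encode([question], convert_to_numpy=True, normalize_embeddings=True)
    scores = doc_vecs @ q[0]                              # cosine similarity (vectors are normalized)
    return [chunks[i] for i in np.argsort(-scores)[:k]]

# 4. Build a grounded prompt for whichever LLM you prefer (Gemini 1.5 Pro included).
question = "What higher-order themes connect these files?"  # example question
context = "\n\n---\n\n".join(retrieve(question))
prompt = f"Using only the context below, answer the question.\n\n{context}\n\nQuestion: {question}"
```

For 1.4 GB of text you would likely swap the in-memory NumPy search for a proper vector store, but the overall flow stays the same: chunk, embed, retrieve, then prompt.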
As far as I know, it's not possible to use system instructions or structured output with fine-tuned Gemini models.
Is it possible to use a tuned model with function calling?
Yes
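For reference, a rough sketch of what that might look like with the google-generativeai Python SDK. The tuned-model resource name and the lookup_note tool are made-up placeholders, and whether a given tuned model actually accepts tools should be checked against the current docs.

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key

def lookup_note(topic: str) -> str:
    """Toy tool the model can call; replace with real logic."""
    return f"No stored note for {topic}."

# Placeholder tuned-model resource name; list yours with genai.list_tuned_models().
model = genai.GenerativeModel(
    model_name="tunedModels/my-research-model",
    tools=[lookup_note],  # plain Python callables become function declarations
)

chat = model.start_chat(enable_automatic_function_calling=True)
response = chat.send_message("Fetch the note on retrieval strategies and summarize it.")
print(response.text)
```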
Yeah, but it's time-consuming for a large dataset if you want responses based only on your dataset's info.