Thank you for explaining nicely and clearly. It's working for me. I am stunned to see your internet download speed of 729 MB/s at 5:30. Where is it? Is this speed publicly available in your country, or only in some secret military weapons lab? Mine is 1.2 MB/s.
lol, thanks
Hi Fahd, I'm using VS Code to connect to an EC2 instance. Can you show how to install Ollama and CodeGPT on the EC2 instance? Thanks in advance.
Similar videos are already on the channel, please search.
Hi Fahd, thanks for all the guides. Can you please do a step-by-step guide for using Microsoft GraphRAG with Ollama local LLMs? It should take input PDFs/docs/code files etc. and run completely offline.
Already did, please search the channel.
How much RAM is needed for the new model?
I'm trying this extension and it doesn't work. It doesn't recognise opened files; when I press Explain, nothing happens.
Hey, please tell me how to use this on my website as a chatbot. Is it possible?
No, silly, it's a code assistant to be run in VS Code... read the title.
Not to mention it runs locally on your computer. Unless you host your website on your own machine, you'd have to install Ollama and a chat interface on the web server that hosts your site, or use a paid online chat service.
There are other options to build a chatbot around it; please search the channel for related videos, thanks.
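For anyone wanting to wire a website chatbot to Ollama, the backend side is just an HTTP POST to Ollama's `/api/generate` endpoint on its default port 11434. Here is a minimal Python sketch, assuming Ollama is installed and serving on the same machine and that a model (here `llama3`, an assumed name) has already been pulled:

```python
# Minimal sketch of using a locally running Ollama server as a chatbot backend.
# Assumes: Ollama serving on localhost:11434 (its default), and a model named
# "llama3" already pulled -- swap in whatever model you actually have.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="llama3"):
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks Ollama for a single complete JSON response
    # instead of a stream of partial chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt, model="llama3"):
    """Send a prompt to the local Ollama server and return the response text."""
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Say hello in one sentence."))
```

Your website's backend would call something like `ask()` and return the text to the browser; the point of the reply above is that this server, not the visitor's machine, is where Ollama has to live.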
No one is talking about the "interactions" limit; they're ignoring it like it doesn't exist... these guides are useless... that "interactions" count is how many times you can chat with the LLM, EVEN IF IT'S RUN LOCALLY!!! That makes this absolutely garbage.
Thanks for the feedback.