This may be my favorite simple Ollama GUI

  • Published: 9 Feb 2025

Comments • 206

  • @technovangelist
    @technovangelist  9 months ago +23

    Thanks for watching. Here is an initial response from the author that you might find helpful:
    - branching off makes more sense when you have multiple AI messages, and definitely not for the bottom one (we might as well hide it for the bottom message)
    - we tried the audio transcription again and it seems to be working just fine. We use OpenAI and are wondering if the key was correct
    - RAG is something we are working on right now

  • @supercurioTube
    @supercurioTube 9 months ago +8

    I installed it immediately and really like it as well! Lots of good ideas and neat UI. Thanks for the demo and discovery.

  • @davidtindell950
    @davidtindell950 7 months ago +1

    Yes! The "split-chat" feature of MSTY is great for comparisons across models. Thank You Very Much!!!

  • @bdougie
    @bdougie 8 months ago +1

    Really appreciate the ollama content. Super helpful for catching up on the AI and LLM scene

  • @GavinElie
    @GavinElie 8 months ago +2

    MSTY looks like an exciting and useful tool. Eagerly awaiting its implementation of RAG!

  • @vulcan4d
    @vulcan4d 9 months ago +2

    The best front end would allow remote access, since most people only have GPUs in gaming rigs and you may want to host this elsewhere. And since it is offline, it should also be able to reference content from local documents. If you get both features, you've got gold.

    • @technovangelist
      @technovangelist  9 months ago

      well, the first is part of ollama itself. so any UI apart from this one, with RAG, and you are happy then?
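
      For context, a minimal sketch of that first feature, Ollama's own remote access via the OLLAMA_HOST setting (the hostname is a placeholder):

      # on the headless GPU machine: listen on all interfaces
      OLLAMA_HOST=0.0.0.0 ollama serve

      # on the laptop: point any Ollama client at that machine
      OLLAMA_HOST=http://gpu-box:11434 ollama run llama3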

  • @e-dev9401
    @e-dev9401 8 months ago +1

    I would appreciate a link in the description, but thank you for the video featuring this app; it seems very nice!

  • @jaapjob
    @jaapjob 9 months ago

    Looks like a great option. Thanks for looking into these tools and reviewing them in such detail.

  • @yuvrajdhepe2116
    @yuvrajdhepe2116 9 months ago +6

    I really like the water-sip pause at the end, and I don't quit the video early just so I can see it 😄

    • @tecnopadre
      @tecnopadre 9 months ago +1

      Who says it's water?

  • @shuntera
    @shuntera 9 months ago +1

    “Sliding Doors”? LOVE that movie!

    • @technovangelist
      @technovangelist  9 months ago +1

      all my videos include at least some irrelevant and potentially useless knowledge bouncing around in my head.

  • @Naarii14
    @Naarii14 7 months ago

    This worked perfectly for my needs, thank you for this!

  • @danialothman
    @danialothman 9 months ago +3

    Great video. I currently use AnythingLLM to interface with Ollama on my local network.

    • @Codegix
      @Codegix 7 months ago

      Do you find AnythingLLM is the best?

  • @jlgabrielv
    @jlgabrielv 8 months ago

    Thanks Matt for the video! Great introduction to Msty.

    • @technovangelist
      @technovangelist  8 months ago +1

      there are a lot of updates in the last few versions and I will be putting out another video when a few more things are added.

  • @AdmV0rl0n
    @AdmV0rl0n 9 months ago

    Thank you for covering this. Good UIs are going to be the thing that gathers pace and gets taken up.

    • @paul1979uk2000
      @paul1979uk2000 9 months ago

      Good UIs that are easy to install and set up will help a lot with adoption.
      Too many of the others are either too difficult to install or set up, don't work on some GPUs, and so on, which likely puts a lot of consumers off using them, so they end up using an online model because it just works.
      So I'm happy to see that effort is being made to make locally run models more accessible, because a lot more people will likely use them.

    • @technovangelist
      @technovangelist  9 months ago

      unfortunately there are a lot of really bad UIs. There is one that keeps getting suggested that is hard to find positive things to say about.

  • @HyperUpscale
    @HyperUpscale 9 months ago +1

    Hm... I thought Open WebUI was the best... because the last time I checked (a few months ago) it was the best.
    But now I think this one definitely deserves attention.
    Thank you, Matt!
    This Msty looks like a great alternative 🤗 - I will try it!

  • @madytyoo
    @madytyoo 9 months ago +1

    I really like your videos. Thank you for sharing your experience.

  • @new_artiko
    @new_artiko 9 months ago +4

    thanks for sharing!

  • @davidjameslees635
    @davidjameslees635 9 months ago +1

    Thank you, great video. I have installed it and it is working well for me. Would you do a video explaining how it can reference your own documents and the web, so a novice can follow? Many thanks, I enjoy your presentation, well done. David.

  • @Slimpickens45
    @Slimpickens45 9 months ago +2

    Nice! Didn't know about this one!

  • @spinout77
    @spinout77 8 days ago

    Thanks for the instructional video, but I missed a part where you discuss uploading documents. I really struggle with uploading PDFs and getting them analyzed with local LLMs. Is this possible or not? There is an upload button and I experimented with the size of the context window, but I always get error messages ("fetch failed"). Obviously, local models (7B or 8B) fall behind the cloud models; Claude Sonnet 3.5, for example, seems to be much more suitable for this kind of task...

  • @bigpickles
    @bigpickles 9 months ago +1

    Interesting. I've been using Open WebUI for prepping multishot prompts in main code flows, but I like the folders option here. Will give it a whirl.

  • @milque1854
    @milque1854 9 months ago +2

    Very cool, it'd be nice to see more UIs like this implement tavern cards and multi-user chats like SillyTavern, and more stuff like CrewAI agent cards the way Flowise has GUI LangChain modules.

  • @MyAmazingUsername
    @MyAmazingUsername 9 months ago +3

    I really, really love your presentation style. And I love Ollama. You were the person who taught me how to get started and I am forever grateful. By the way, I import GGUF via custom Modelfiles, and sometimes I have to tweak things like the parameters, and I haven't found a way to just update the existing imported weights' parameters via the Modelfile. Do you know if it's possible? Currently I delete and re-create the whole import for every change.

    • @technovangelist
      @technovangelist  9 months ago +1

      No need to delete. Just run create again

    • @MyAmazingUsername
      @MyAmazingUsername 9 months ago

      @@technovangelist Thank you so much. It was really confusing that there wasn't an "update" command and I never thought to try "create" on something that was already created, hehe. Now I see that it reuses the old blobs on disk when I do that. :) Thanks!

    • @MyAmazingUsername
      @MyAmazingUsername 9 months ago

      @@technovangelist Thank you so much for clearing that up! :)

    • @ARGOXARK
      @ARGOXARK 3 months ago

      Do you mean changing the modelfile of an existing ollama model, like this?
      # dump the current modelfile
      ollama show codegeex4:latest --modelfile > codegeex4_modelfile
      # edit codegeex4_modelfile with your changes, then rebuild
      ollama create my_new_model --file codegeex4_modelfile

  • @vipulsrivastav08
    @vipulsrivastav08 23 days ago

    I wonder if we can deploy this on some cloud (AWS, Azure, etc.). Thanks for the video, Matt Williams!
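
    For what it's worth, Msty itself is a desktop app, but the Ollama backend it talks to can live on a cloud VM; a minimal sketch using the official ollama/ollama Docker image (instance provisioning and networking are left out):

    # on the cloud VM: run the Ollama server in Docker, persisting models in a volume
    docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

    # pull and test a model inside the container
    docker exec -it ollama ollama run llama3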

  • @DhruvJoshiDJ
    @DhruvJoshiDJ 9 months ago +10

    Make a video on a UI with RAG functionality as well.

    • @technovangelist
      @technovangelist  9 months ago +1

      I have

    • @stanTrX
      @stanTrX 9 months ago

      With msty? @@technovangelist

    • @technovangelist
      @technovangelist  9 months ago +1

      msty doesn't do RAG yet. But I have done a video on a UI with RAG.

    • @drmetroyt
      @drmetroyt 9 months ago

      Use AnythingLLM

    • @YannochFPV
      @YannochFPV 9 months ago

      No RAG :(

  • @starmountpictures
    @starmountpictures 17 days ago

    Great video!! Is there a web version of MSTY??

  • @kevinfox9535
    @kevinfox9535 9 months ago +1

    You should have put in some sort of link to download it. I can't find it on the web.

  • @cgmiguel
    @cgmiguel 9 months ago

    Thanks for the video! Really nice app.

  • @LochanaMenikarachchi
    @LochanaMenikarachchi 7 months ago

    Msty seems to be using its own version of ollama under the hood. Is there any way to know what version of ollama it is using? Lack of URL and PPT file support in the RAG is the other deal breaker for me. Hope they will support them in upcoming versions.
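
    If the embedded service exposes the standard Ollama API, its version endpoint should answer; a minimal sketch, assuming the default port 11434 (Msty's embedded instance may listen on a different port):

    # ask a running Ollama-compatible endpoint for its version
    curl http://localhost:11434/api/version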

  • @sskohli79
    @sskohli79 7 months ago

    Hi Matt, thanks so much for your videos, very informative. When I converse with ollama, I see that there are certain things I need to repeat, like "answer with new lines after 2-3 lines or space them out", or "don't be too accommodating, give straight advice". Is there a way to save these configs somewhere?
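
    One standard way to persist instructions like that is a Modelfile with a SYSTEM prompt baked in; a minimal sketch (the model and file names are placeholders):

    # Modelfile
    FROM llama3
    SYSTEM "Answer concisely, insert line breaks every 2-3 lines, and give direct advice."

    # build the customized model once, then use it like any other
    ollama create my-direct-llama -f Modelfile
    ollama run my-direct-llama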

  • @marcc2689
    @marcc2689 9 months ago

    Great video. Thanks

  • @pagutierrezn
    @pagutierrezn 9 months ago

    I miss the multi-user support available in Open WebUI. I really appreciate that feature for making the models available to not-so-technical colleagues.

  • @maxilp4952
    @maxilp4952 9 months ago +2

    Thank you so much! I was just looking for something like this

  • @Pure_Science_and_Technology
    @Pure_Science_and_Technology 9 months ago

    Thanks for the intro to the Msty UI for Ollama. I'm using Open WebUI at work and it's great for handling multiple users. Does Msty offer the same kind of support for user sessions and data security? How does it manage each user's data?

  • @drp111
    @drp111 9 months ago +1

    Great. Thanks for the video!
    Is there already a canned web interface for ollama that lets me serve my model publicly over the internet, but without front-end options for the user to select different models, upload documents, modify system prompts, etc.? I'm looking for the most basic chat functionality possible, with everything set up in the admin backend. Like ChatGPT in the early days.

    • @technovangelist
      @technovangelist  9 months ago +1

      not sure what you are asking. if you want the more complicated thing you asked for first, then open webui seems to be your best bet. for the simple choice there are a few options out there

    • @drp111
      @drp111 9 months ago +1

      @@technovangelist Thank you for your reply. I'm pretty new to the topic. I spent the last three weeks gaining some basic knowledge of LLMs and how to configure/use them. So please forgive me for asking my beginner questions. Currently, I'm testing Open WebUI. From what I learned, even the regular non-admin user can configure the system prompt, advanced parameters, and other stuff in his user settings. I'm looking for an option to provide the model I created and tested with the Ollama CLI over the web, without any model-response-related configuration options for the user. It might be possible for someone with the necessary knowledge to modify Open WebUI accordingly, but I'm unfortunately not (yet) capable of doing so.

    • @-energy1433
      @-energy1433 7 months ago

      @@drp111 Is there any news on this? I also need the user chat interface, not the "near data science UI", to compare LLM models. Streamlit is an option, but maybe there are others as well today?

  • @winkler1b
    @winkler1b 8 months ago

    You can close a dialog with the ESC key. I had the exact same response... especially because the window border is so faint. Took me a while to realize I was in a modal.

    • @technovangelist
      @technovangelist  8 months ago

      I'll have to review it again to remember what I did

  • @abiolasamuel8092
    @abiolasamuel8092 9 months ago

    Does MSTY have a base URL, like WebUI, that acts like an API for other applications? I searched but couldn't find one.
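
    The standard Ollama API is the usual base URL in these setups; a minimal sketch of hitting it directly, assuming the default port 11434 (whether Msty's embedded instance uses the same port is an assumption):

    # list the models available at the service's base URL
    curl http://localhost:11434/api/tags

    # send a simple, non-streaming generation request
    curl http://localhost:11434/api/generate -d '{"model": "llama3", "prompt": "Hi", "stream": false}'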

  • @cristianosorio32
    @cristianosorio32 8 months ago

    Hi! Nice tool. Have you tried Danswer? I deployed it but I couldn't make it work with my local ollama, only with an OpenAI API key. As a web UI it has a clean interface and nice document organization for Q&A.

  • @gartopu
    @gartopu 9 months ago

    Thanks for the movie, I'll watch it. :)

  • @RickySupriyadi
    @RickySupriyadi 9 months ago

    With Open WebUI I'm stuck at connecting Docker with ollama; the web UI can't connect to ollama even though I can use ollama with my Obsidian Copilot, and with CLI ollama run...
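
    A common cause is the container resolving localhost to itself rather than to the host; a sketch of the usual workaround, assuming Open WebUI's documented OLLAMA_BASE_URL setting and Docker's host-gateway alias:

    docker run -d -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
      --name open-webui ghcr.io/open-webui/open-webui:main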

  • @siddharthkandwal6514
    @siddharthkandwal6514 7 months ago

    Where are the content and chats saved on Mac?

  • @oschwald9784
    @oschwald9784 12 days ago

    Can you use this for R1?

  • @tecnopadre
    @tecnopadre 9 months ago

    Thanks again Matt. I wouldn't call it simple though, at least for average people hahaha. Congrats on how you do it.

  • @matthewcuddy129
    @matthewcuddy129 5 months ago

    msty sounds great, but I need a tool that can be installed on my Linux LLM server (like OpenUI) or on the client workstation in my home network - any suggestions?

  • @robwin0072
    @robwin0072 6 months ago

    Hello Matt, you did not demonstrate document upload for analysis; is that capability available now?

  • @BillyBobDingledorf
    @BillyBobDingledorf 9 months ago

    Will it take an image or document as input?

  • @mohamedkeddache4202
    @mohamedkeddache4202 9 months ago +2

    What is the best open-source GUI I can use for my local RAG app?

    • @technovangelist
      @technovangelist  9 months ago +3

      there are a lot of options out there, but none of them are very good. at least not yet. this is still new stuff.

    • @utawmuddy5940
      @utawmuddy5940 9 months ago

      Did you find that the response time was a good bit slower in AnythingLLM vs. the terminal? Or is that normal for GUIs... still learning. I haven't tried LM Studio yet but might do that next, or probably just stick to the command prompt for now.

    • @psykedout
      @psykedout 9 months ago +2

      You may want to look into Flowise; it took me some tinkering, but I was able to set up local RAG with it and ollama.

    • @incrastic6437
      @incrastic6437 9 months ago

      AnythingLLM is another great option. That's the one I'm using right now. But, I'm always trying new things, so tomorrow, who knows?

    • @etherhealingvibes
      @etherhealingvibes 9 months ago

      Obsidian, coupled with the Copilot plugin, offers an easy setup and allows for swift interaction with documents.

  • @sammcj2000
    @sammcj2000 9 months ago

    I'd like to see the conversation branching and refinement concepts come to BoltAI, which I think is by far the best GUI client.

    • @technovangelist
      @technovangelist  9 months ago

      Bolt only supports Ollama through the OpenAI-compatible API, which is always going to be a bit behind, so it's potentially limited in what it can do.

  • @mountee
    @mountee 9 months ago +1

    Great video. Can I have multiple API LLM providers set up at the same time? Thanks.

    • @technovangelist
      @technovangelist  9 months ago +1

      Yes you can!

    • @mountee
      @mountee 9 months ago +1

      @@technovangelist Cool, many thanks for your hard work.

  • @HyperUpscale
    @HyperUpscale 9 months ago

    I found another example of an "Ollama GUI" (not an application with a backend):
    page-assist-a-web-ui-for-local-ai-models - a Chrome extension.
    This is what I call a GUI ;)

    • @technovangelist
      @technovangelist  9 months ago +2

      Just recorded the video about it. It's not as powerful as the last GUI I covered, msty; it attempts to do a bunch of things but not very well. But the things it gets right are great.

    • @n4ze3m
      @n4ze3m 9 months ago +1

      @@technovangelist Thank you for the honest opinion about Page Assist (I'm the creator of it) :)

  • @marhensa
    @marhensa 9 months ago

    Why don't I see any image button (to upload an image and ask the LLM about it)? I installed the Llama-3-Instruct model and still no image button.

    • @technovangelist
      @technovangelist  9 months ago

      I don't know why you don't see the image button, but you wouldn't use it with llama3 anyway. You would need to use images with llava models.

    • @marhensa
      @marhensa 9 months ago

      @@technovangelist Oh, maybe that's why. I'll download llava right away, and finish setting up my payment for the GPT-4 API. I thought it could only be activated if I added some online service that can take an image as input. But if llava can do it, I will not continue with the GPT-4 API. Thank you, I will report back here later.

    • @marhensa
      @marhensa 9 months ago

      ​@@technovangelist I can confirm that downloading llava makes that button appear. Thank you.

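      For anyone following along, a minimal sketch of image input with a llava model from the CLI (the image path is a placeholder):

      # pull a multimodal model, then reference an image file in the prompt
      ollama pull llava
      ollama run llava "What is in this image? ./photo.jpg"
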
  • @THE-AI_INSIDER
    @THE-AI_INSIDER 9 months ago

    @technovangelist What is the license of msty? Do you have a link to the license page?

  • @mirkoturco
    @mirkoturco 9 months ago

    Nice, but it seems to me that it does not use the context of previous chats when making API requests to Anthropic.

  • @ps3301
    @ps3301 9 months ago

    Possible to show us how to use a web UI or Streamlit with Open Interpreter?

  • @chrisBruner
    @chrisBruner 9 months ago +1

    msty doesn't seem to use models already loaded with ollama. It's also closed source, so are you sure it's using ollama at all?

    • @technovangelist
      @technovangelist  9 months ago

      It does use the models from ollama. Maybe you skipped that option. There is a place to change the model path in the app

    • @DrakeStardragon
      @DrakeStardragon 9 months ago

      What I have noticed is that the models used by "ollama run" seem to be downloaded to one location and the models for "ollama serve" to another, and they don't seem to know about each other's models. I have not had the chance to dig into what is going on there.

    • @juanjesusligero391
      @juanjesusligero391 9 months ago

      I almost missed that this software is closed source, thanks for pointing it out. I don't usually like installing closed source software (even if it's free), so I think I'll pass on this one. Anyway, it was a nice video.

    • @technovangelist
      @technovangelist  9 months ago

      There aren't two ways to run models like that. The ollama client uses the server; they are one.

  • @etherhealingvibes
    @etherhealingvibes 9 months ago

    Love it, the inference is fast. Might need text-to-speech.

  • @GeorgeJCapnias
    @GeorgeJCapnias 9 months ago

    Matt, sorry, but I think msty is NOT an Ollama client. Don't get me wrong, I am a big fan of your videos.
    The thing is that I am using Ollama through my WSL Ubuntu installation. The whole thing works great, as you can still use Ollama at a local address. I need a good UI too, and msty is a great UI.
    The problem is msty is not using the Ollama service, or Ollama's OpenAI-compatible REST service, or even Ollama's REST service. It just uses Ollama's models when Ollama is installed on the same machine.
    Being a client to a service is not the same as using a program's data (models)...
    George J.

    • @technovangelist
      @technovangelist  9 months ago +1

      Actually it is an ollama client. It’s just not using your instance of ollama. They have embedded ollama which is one of the ways ollama was originally intended to be used. If you use the ollama cli and point it at the msty service it will continue to work. It is 100% still ollama.

    • @technovangelist
      @technovangelist  9 months ago

      That said they are working on an update that will use your instance as well

  • @voiceoftreason1760
    @voiceoftreason1760 9 months ago

    I can't find the source code or git repo on the page. Is this a commercial app?

    • @technovangelist
      @technovangelist  9 months ago

      It doesn’t seem to be commercial yet but not open source I think.

  • @rodolfozacarias2900
    @rodolfozacarias2900 9 months ago

    Excellent video, as always. I'm following this series and I was curious: is there any ollama model that allows training with your own dataset, like Chat with RTX? Thanks, Matt!

    • @stickmanland
      @stickmanland 9 months ago +2

      Chat with RTX doesn't train the model, it just feeds it your data using RAG.

    • @rodolfozacarias2900
      @rodolfozacarias2900 9 months ago

      @stickmanland Thanks for answering. Is there any Ollama model that could do that?

  • @doriboaz
    @doriboaz 9 months ago

    Matt, thanks for the insight. Do you know if msty can be installed on WSL Ubuntu?

    • @technovangelist
      @technovangelist  9 months ago +1

      no idea, but both ollama and msty can be installed on Windows without WSL

  • @stanTrX
    @stanTrX 9 months ago +3

    I don't like Docker, it's very difficult to install, set up, and manage.

    • @petebytes5010
      @petebytes5010 8 months ago

      Docker Desktop makes it easier.

  • @NLPprompter
    @NLPprompter 9 months ago +1

    Enter the local multiverse LLM-style UI. Somehow this might be really useful for me.

  • @josuelapa2271
    @josuelapa2271 9 months ago

    I can't seem to find the source code. Is it closed source? And if yes, why? Makes me kind of doubt it...

    • @technovangelist
      @technovangelist  9 months ago +1

      My review was about whether it’s a great ai tool. Open source or closed isn’t really relevant to the discussion. Looks to be closed source.

  • @Michael-London
    @Michael-London 8 months ago

    I am confused. Does it actually use Ollama? I thought it had its own text service.

    • @Michael-London
      @Michael-London 8 months ago

      Website says:
      Do I need to have Ollama installed to use Msty?
      No, you don't need to have Ollama installed to use Msty. Msty is a standalone app that works on its own. However, you can use Ollama models in Msty if you have Ollama installed. So you don't need to download the same model twice.

    • @technovangelist
      @technovangelist  8 months ago

      Yes it uses ollama.

    • @technovangelist
      @technovangelist  8 months ago

      Correct. You don’t need to install it because it embeds ollama.

  • @kovukumel4917
    @kovukumel4917 8 months ago

    Seems closed source; a fat client is an interesting choice, but it is very polished. I like the web deployment of Open WebUI, especially because it can do authentication from a header so, for example, if you are using a Tailscale mesh network it can authenticate you based on your TS identity automatically. Anyway, these are clearly aimed at two different user groups.

  • @giuliogemino6407
    @giuliogemino6407 9 months ago

    Actually most ollama users are GNU+Linux users. You omit how to install and run, or how to associate the GUI with ollama packages, on a GNU+Linux OS. And also how to use the GUI as an Ollama web scraper to get the most up-to-date information...
    You also omit describing the weight of the MSTY package, its responsiveness, actual bugs to be aware of... and possible interactions with other programs.
    Perhaps next time start with how to install: is it quick, quicker, slow... compared with other UIs? Does it offer functionality not available elsewhere? Is it implemented better?

    • @technovangelist
      @technovangelist  9 months ago

      No, most users are not on Linux. Windows outnumbers Linux for ollama by about 3 to 1, then Mac, then Linux. Install is the easiest thing to do, not worth showing. And no, it doesn't offer new functionality. That was made very clear. It's simple.

  • @inout3394
    @inout3394 9 months ago

    Thx

  • @isalama
    @isalama 9 months ago

    Can we upload files and chat with it?

    • @technovangelist
      @technovangelist  9 months ago

      No. Just a simple chat client. Not RAG.

  • @autoboto
    @autoboto 9 months ago

    So far it seems to assume that on Windows all local models are on C: instead of another volume. Researching a workaround.
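
    If the model store is Ollama's, the usual knob is the OLLAMA_MODELS environment variable; a minimal sketch for Windows (the path is a placeholder, and whether Msty's embedded instance honors it is an assumption):

    # PowerShell: persist a different model store location, then restart the server
    setx OLLAMA_MODELS "D:\ollama\models"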

  • @Arsat74
    @Arsat74 4 months ago

    Are you still behind MSTY?
    ChatGPT has probably also found critical reports about the tool.

    • @technovangelist
      @technovangelist  4 months ago

      I didn’t create it but I like it. ChatGPT is definitely not an authority to trust about critical reports.

  • @Maisonier
    @Maisonier 9 months ago +1

    I use LM Studio with AnythingLLM.

    • @technovangelist
      @technovangelist  9 months ago

      LM Studio is a great tool to start working with models. A lot of folks run into walls with that pretty soon and migrate over to using Ollama instead. LM Studio has been around a bit longer than Ollama has.

    • @technovangelist
      @technovangelist  9 months ago

      OK, I finally tried using it... it's dog slow for everything. Why do you like it?

  • @antoniobruce4678
    @antoniobruce4678 9 months ago

    Is it open-source with permissive licensing?

    • @technovangelist
      @technovangelist  9 months ago

      I don’t think it is open source, at least the code is not easily accessible. That said I have been involved with many open source projects that weren’t on GitHub. No idea what the license is.

  • @alisaid3745
    @alisaid3745 2 months ago

    I could not run or download llama models, can anyone explain to me WHY???

    • @technovangelist
      @technovangelist  2 months ago

      Need more info. What did you try? What error do you get? Where are you doing it? Might be better to ask in the ollama discord

    • @alisaid3745
      @alisaid3745 2 months ago

      @@technovangelist Thanks for your reply:
      I attempted to download "llama3.2-vision" using "msty," but encountered the following error message: "Could not add model llama3.2-vision to your library. Please try again." Subsequently, I used the command ollama run llama3.2-vision in the terminal to download the model. The download was successful, and I confirmed its presence by running the ollama list command. However, when I opened "msty," the model appeared in the list, but after selecting it and attempting to chat, I received the following message: "llama runner process has terminated: exit status 0xc0000409"

    • @technovangelist
      @technovangelist  2 months ago

      Not sure if the ollama embedded in msty has been updated to support that model. I think you can also point msty to your own install of ollama. Try that?

    • @technovangelist
      @technovangelist  2 months ago

      Does that model work with ollama on its own on your machine? Do other models have that issue?

    • @alisaid3745
      @alisaid3745 2 months ago

      @@technovangelist Yes, it does work normally in cmd.

  • @HyperUpscale
    @HyperUpscale 9 months ago

    JUST a correction - this is not a "simple Ollama GUI"!
    This is a complete app that:
    - installs libraries
    - installs ollama
    - downloads models separately
    - runs on its own, regardless of whether you have one or multiple other servers running on your computer.
    This is not an Ollama UI, but an application that uses Ollama, with a UI 😅

    • @technovangelist
      @technovangelist  9 months ago +1

      This is an app that is very much in keeping with the goals of Ollama. It is a simple gui that uses Ollama. If you have downloaded models ahead of time, you can use those models in Msty, just like any other client ui that uses ollama. If you download the models from msty, you can use them in the cli, just like any other client ui that uses Ollama. As discussed in the video and in the comments, this will be updated soon to allow configuring to use your own ollama instance on your local machine or remotely. No corrections are needed.

    • @HyperUpscale
      @HyperUpscale 9 months ago

      @@technovangelist I agree with you.
      But it just doesn't match my knowledge:
      Front end = GUI or UI
      Backend = engine, workflows
      MYST = BackEnd (ollama) + FrontEnd (UI), not just a GUI.
      That's why it doesn't make sense to me to call MYST a GUI.

    • @technovangelist
      @technovangelist  9 months ago +2

      MYST was a game in the 90s that was the first 'killer app' for the CD-ROM. msty is the GUI for ollama we are talking about here. But you can call it whatever you like. It's a simple GUI that helps folks who need a simple GUI to use ollama.

    • @HyperUpscale
      @HyperUpscale 9 months ago

      @@technovangelist Alright

  • @OliNorwell
    @OliNorwell 9 months ago +11

    Looks nice, but many people run ollama on a headless server with a beefy graphics card, then access it from a laptop. So being able to enter an Ollama IP address is key, and surely a very easy thing to do.

    • @joeburkeson8946
      @joeburkeson8946 9 months ago +1

      Thanks, time well saved.

    • @technovangelist
      @technovangelist  9 months ago +5

      That exact question is covered in the video.

    • @TheAtassis
      @TheAtassis 9 months ago

      @technovangelist This makes the title of the video misleading. For now it has no relation to ollama.

    • @technovangelist
      @technovangelist  9 months ago +5

      What do you mean @TheAtassis? The title refers to this being a client for ollama. Because it’s a client for ollama. It couldn’t be a more accurate title.

    • @Larimuss
      @Larimuss 6 months ago

      Yup, 100% this is exactly what I'm trying to do lol. Don't want to work on my main comp and want my partner to be able to access it on her laptop.

  • @attaboyabhi
    @attaboyabhi 8 months ago

    Having RAG would be cool.

    • @technovangelist
      @technovangelist  8 months ago +1

      It's pretty nice now, but I am looking forward to some improvements in the next version or two.

  • @IanScrivener
    @IanScrivener 8 months ago

    We are up to Msty 0.9 and they have added some of your suggestions.
    Please do a Msty update video…

    • @technovangelist
      @technovangelist  8 месяцев назад

      I think I'm going to wait a couple more versions until they clean up some of the things that they've added in the last few versions. Specifically, being able to automatically update the RAG database when there's changes to the Obsidian Vault.

  • @martin22336
    @martin22336 9 months ago

    I can't find a single one for Windows, not a single one. It's stupid; why is that the case? I hate Docker and I hate the lack of a native app.

  • @Pregidth
    @Pregidth 9 months ago +1

    Only downside is that it is not open source, is it?

    • @technovangelist
      @technovangelist  9 months ago +1

      I don't know about a downside, but it's just another choice the developer has made. Every dev makes a number of choices they feel are right about a product.

    • @Pregidth
      @Pregidth 9 months ago +1

      @@technovangelist I am not a big expert, but if we are not able to see the code, we also don't know whether user requests might be sent across to the developer, which would then be a privacy issue and contradicts open source LLMs in my opinion.

    • @technovangelist
      @technovangelist  9 months ago

      You can't see any of the code in most applications from Microsoft or Apple or plenty of other big companies. Doesn't make them less trustworthy.

    • @sc0572
      @sc0572 9 months ago +1

      @technovangelist Yes, it does. In certain industries, it's a problem. In terms of AI, it creates a bigger problem, since at the moment the biggest holdup for some industries is where the data resides. To promote more adoption today we need more open source solutions that work. The small legal and medical practices I service can't use Copilot, OpenAI is scary, and VMware is too expensive.

    • @technovangelist
      @technovangelist  9 months ago

      OSS is a shield some orgs like to hide behind, sure. But being OSS doesn't automatically make something safer. How many open source projects have had vulnerabilities that go unnoticed because most people don't look at the code and just assume others do? And any security and compliance team can work with a team from a closed source project to understand the risks. Otherwise no closed source tools would be used, and that's just not the case.

  • @Racife
    @Racife 9 months ago

    Found Jan being mentioned on Reddit as an Open WebUI alternative - would love to hear your thoughts on it!

  • @NathanChambers
    @NathanChambers 9 months ago

    Why are all the UIs coming out browser-based? Browser-based UIs store data that Microsoft, Mozilla, and Google can steal. :/ I personally decided to just make my own personal UI that fits my needs using Python.

    • @technovangelist
      @technovangelist  9 months ago +1

      wait, you made that comment on a UI that is not browser based

    • @NathanChambers
      @NathanChambers 9 months ago +1

      @@technovangelist 1:30am here, I guess I was too zoned-out tired at the start to catch that it was an app. And the fact it looks so close to Open WebUI had me think it must be web. My bad, my bad.

  • @JJ.R-xs8rf
    @JJ.R-xs8rf 9 months ago

    I find LM Studio way easier to install and use.

    • @technovangelist
      @technovangelist  9 months ago

      It’s pretty common to start there and then move to ollama when you hit the wall.

    • @technovangelist
      @technovangelist  9 months ago +1

      I would love to know more about this. Everything about LM Studio is slow and hard. Why do folks like it? I have a video about it, but it's really hard to find anything positive to say.

    • @MyAmazingUsername
      @MyAmazingUsername 8 months ago +1

      @@technovangelist I don't think there's any deep reason. Just "it's 1 exe file for everything". Very common motivation among Windows users.

  • @JustinJohnson13
    @JustinJohnson13 9 months ago

    Have you looked at AnythingLLM?

    • @technovangelist
      @technovangelist  9 months ago

      Yes

    • @technovangelist
      @technovangelist  9 months ago +2

      But no video for it. I want to focus on videos I can say mostly good things about

    • @JustinJohnson13
      @JustinJohnson13 9 months ago

      Sounds like you should make one then. 😉 Love your videos, man. Keep 'em coming.

    • @technovangelist
      @technovangelist  9 months ago +1

      Ugh. I can... but shouldn't.

    • @JustinJohnson13
      @JustinJohnson13 9 months ago

      @@technovangelist no? Not a fan?

  • @12wsaqw
    @12wsaqw 9 months ago +2

    Hope it has DARK MODE!!!!

  • @3.cha9
    @3.cha9 9 months ago

    niyce

  • @Techonsapevole
    @Techonsapevole 9 months ago

    Nice, but I like Open WebUI more.

  • @thebosha90
    @thebosha90 9 months ago

    I wouldn't call this app "simple" ) imo, the simplest way to use ollama is an Alfred workflow or a Raycast extension, if one doesn't like the ollama CLI )

  • @moe3060
    @moe3060 9 months ago

    How much did they pay you? Clearly biased.

    • @technovangelist
      @technovangelist  9 months ago +4

      I don't think Msty is making any money. The amount of money I make from YouTube videos is less than what a high schooler makes at the local McDonald's in a couple of days. And I haven't taken any sponsorships for any video I have ever made. I made a review of the best tool available for ollama, and ollama is the best tool for running models. So you are saying anything positive online is paid for? Are you really that stupid, or just trying to rile people up?

    • @technovangelist
      @technovangelist  9 months ago +1

      That said if someone wanted to pay me I am open to it. But I would have to disclose that relationship when posting the video. You can see when a video has taken a sponsorship very clearly in the way the platform presents the video to you.

    • @ayrtonmaradona
      @ayrtonmaradona 9 months ago

      @@technovangelist Hey man, don't worry about it and continue doing your awesome work with ollama and YouTube videos. I have some suggestions of tools which I have used with ollama: Phidata, LobeHub, AnythingLLM, LM Studio, and Pinokio, but I have the same question for all those tools: are all of these 100% private and secure?

    • @technovangelist
      @technovangelist  9 months ago +1

      That’s not something I can guarantee as I don’t work for them.

  • @user-jk9zr3sc5h
    @user-jk9zr3sc5h 9 months ago

    I really need these UIs to allow us to adjust temp settings, max tokens, etc.

  • @printingbooks-d4e
    @printingbooks-d4e 7 months ago

    Do you know where I can get a list of all the 'parameters' '/set' will allow? I stumbled onto the num_thread parameter and I like it a lot, but... where is a list of all of them? Yes, I know you don't work for ollama. Thanks :)

    • @technovangelist
      @technovangelist  7 months ago +1

      type /set parameter and press enter. that will get a lot of them. I haven't seen a full list anywhere

    • @printingbooks-d4e
      @printingbooks-d4e 7 months ago

      @@technovangelist Ay, right on. Quality vids. Good eve from Frostburg, Maryland.

    • @printingbooks-d4e
      @printingbooks-d4e 7 months ago

      *The num_thread parameter. (ex: /set parameter num_thread 13)
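
      For reference, a sketch of how those settings can be persisted, using documented Ollama commands (the model names and values are illustrative):

      # inside an interactive `ollama run` session:
      /set parameter num_thread 13
      /set parameter temperature 0.7
      /set parameter num_ctx 4096
      # persist the current settings as a new model
      /save my-tuned-model

      # the same knobs can be baked into a Modelfile:
      FROM llama3
      PARAMETER num_thread 13
      PARAMETER temperature 0.7
      PARAMETER num_ctx 4096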