How To Install Fabric - Open-Source AI Framework That Can Automate Your Life

  • Published: 9 Jul 2024
  • Fabric is described as an open-source framework for augmenting humans using AI. It offers community-driven prompts that are tried and true, plus an easy command-line interface, with a UI coming soon.
    Join My Newsletter for Regular AI Updates 👇🏼
    www.matthewberman.com
    Need AI Consulting? ✅
    forwardfuture.ai/
    My Links 🔗
    👉🏻 Subscribe: / @matthew_berman
    👉🏻 Twitter: / matthewberman
    👉🏻 Discord: / discord
    👉🏻 Patreon: / matthewberman
    Rent a GPU (MassedCompute) 🚀
    bit.ly/matthew-berman-youtube
    USE CODE "MatthewBerman" for 50% discount
    Media/Sponsorship Inquiries 📈
    bit.ly/44TC45V
    Links:
    github.com/danielmiessler/fabric
    Chapters:
    0:00 - About Fabric
    1:06 - Installation
    3:43 - How to Use
  • Science

Comments • 213

  • @MattLuceen
    @MattLuceen 4 months ago +17

    Thank you for keeping me on the cutting edge. I have been trying to cobble something together with CrewAI, but this framework is intriguing and I'm installing it now!

    • @TheMrTriplep
      @TheMrTriplep 4 months ago +4

      Already has crewAI integration 🎉

  • @space_ghost2809
    @space_ghost2809 4 months ago +12

    This is one of the best and most practical/useful AI channels of all. Thanks again!

  • @dr.mikeybee
    @dr.mikeybee 4 months ago +9

    What's cool is that one can integrate these prompts into any project.

  • @user-bd8jb7ln5g
    @user-bd8jb7ln5g 4 months ago +8

    Interesting. I'm not sure what exactly it's doing, but what we really need is premade (tested and working) functions that work with LLMs to process data, such as:
    Retrieve information from the internet on topic x, then process, summarize, etc., across however many result pages.
    Process a PDF/doc: summarize, extract data for model fine-tuning, rewrite, etc.
    That would be an interim solution before we get truly capable, immediately accurate models.
    Not crazy about them inventing what seems like unnecessary nomenclature.

  • @Maisonier
    @Maisonier 2 months ago +7

    Could you make a video of this with Llama 3 on a local computer, please?

  • @MeinDeutschkurs
    @MeinDeutschkurs 4 months ago +1

    I’m excited! That’s really powerful. Have to check the original sources. Thx Matthew!

  • @chookady222
    @chookady222 4 months ago +27

    Imagine using the patterns in a vector DB (meaningful description and then the output instructions), but instead of using Fabric's prebuilt ones they let you build your own in their framework. You could ideally build business-ready "templates" to save time and money. I hope they expand on this framework; I'll be interested to see how they can integrate it into different environments and let the user customize it further!
    As always, thank you for keeping us in the loop on new tech out there! Much appreciated!

    • @barrycrowder
      @barrycrowder 4 months ago +3

      Yes, you can add your own patterns.
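      For instance, a rough sketch of adding one (assuming the cloned repo's patterns/ layout of one folder per pattern holding a system.md prompt file; the pattern name here is hypothetical):
      # from inside the cloned fabric repo
      mkdir -p patterns/my_monthly_sales_report
      # system.md holds the prompt text the pattern will run
      echo "You are an analyst. Summarize the monthly sales data you are given." > patterns/my_monthly_sales_report/system.md
      pbpaste | fabric --pattern my_monthly_sales_report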

    • @chookady222
      @chookady222 4 months ago +2

      @barrycrowder Ahhh, awesome, thank you for confirming! Definitely need to set aside some time and get to know this better; it ticks so many good, useful boxes.

    • @themax2go
      @themax2go 3 months ago

      @@chookady222 agreed, seems to have very high ROI(t)

    • @PsyoPhlux
      @PsyoPhlux 3 months ago +1

      This will be great for Ops/IT to build proprietary business-logic patterns for internal workflows, combined with looms, and have that be a single call: say, for the sales report, a function called "My Monthly Sales Report" that ties to all the personal patterns you need to pull, and it runs just by typing in that pattern name. It would be cool to have a "subscribe to pattern" feature and a dashboard page with whatever boxes of info you want for showing your outputs...
      This will be like a stenotype for pulling data with patterns and such.
      So massively complex prompts can be combined where the actual Serious Business Dashboard is a tiny single page that any non-tech worker can customize for their own use.

    • @chookady222
      @chookady222 3 months ago

      @samstave7079 Ideally, like you say, an IT or Ops team who can focus on mapping the workflows, usage patterns, etc. would be crucial to getting the accuracy right for whatever data you loom in. If you took your concept far enough, you could do away with the non-technical workers and possibly high-level IT workers. Pair this with a deployable application that acts as an agent, layered with an LLM of your choice, a connection to the loom database, code-execution methods such as Open Interpreter, and something a bit more versatile to run through browser tasks. You effectively have a pre-trained, defined set of scripted tasks that have been verified and designed around user workflows and business objectives. Build enough looms into the datasets and, as you say, have a single point of access, and you effectively create an agent system to execute human tasks, provided you build up a large enough dataset to cover the tasks of individual roles within a company.

  • @SixTimesNine
    @SixTimesNine 3 months ago

    Thanks again, Matthew. Another excellent vid. Concise, accurate, genuine, no fluff.

  • @meinbherpieg4723
    @meinbherpieg4723 4 months ago +2

    Matt, you are a machine. Thank you for what you do.

  • @johnbarros1
    @johnbarros1 4 months ago +2

    This is gas! ⛽️Thank you mister Berman!

  • @irafuchs
    @irafuchs 4 months ago +1

    Thank you! Very well done.

  • @mkznvv
    @mkznvv 3 months ago

    Awesome! Thanks for the video. It worked for me.

  • @xmfr
    @xmfr 3 months ago +2

    ./setup.sh doesn't work and asks me to use pipx install, but I don't know how to do that. Any clue?

  • @googleSux
    @googleSux 4 months ago +11

    Honestly pretty convoluted. Plus weird terminology like loom, fabric, etc. If one is prompt-challenged, one can just ask the LLM to create a good prompt!

  • @VaibhavPatil-rx7pc
    @VaibhavPatil-rx7pc 3 months ago

    Awesome information

  • @cpatocybersecurity
    @cpatocybersecurity 3 months ago +1

    Yes awesome project! FYI I’ve made a bunch of Pattern demos as Shorts

  • @FameMonsterD3
    @FameMonsterD3 3 months ago +5

    If you are on Windows 11 using WSL, pbpaste won't work (naturally); you need to install xsel and replace "pbpaste" (without quotes) with "xsel --clipboard --output" | your fabric pattern. This does the same thing on a Linux system or on Windows using WSL.
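    For example, a minimal sketch on WSL/Ubuntu (assuming the summarize pattern; xsel needs a working X/WSLg clipboard to read from):
    sudo apt install xsel
    # xsel --clipboard --output prints the current clipboard contents, standing in for pbpaste
    xsel --clipboard --output | fabric --pattern summarize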

    • @borntodoit8744
      @borntodoit8744 3 months ago

      Explain WSL.
      If you're going to try to be helpful, put the effort in to introduce whatever tools and references you're using.

    • @RafaGamesPT
      @RafaGamesPT 3 months ago +1

      Using PowerShell, Get-Clipboard seems to do the job of pbpaste.
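      For example, a hedged one-liner in PowerShell (assuming fabric is on the PATH in the same environment; the pattern is just an example):
      # Get-Clipboard prints the clipboard contents, standing in for pbpaste
      Get-Clipboard | fabric --pattern summarize --stream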

    • @FameMonsterD3
      @FameMonsterD3 3 months ago

      @RafaGamesPT that's another one too.

    • @theafricanquant501
      @theafricanquant501 3 months ago

      `win32yank.exe -o | fabric --pattern summarize` worked for me on my wsl2 installation (with Ubuntu distro)

  • @marshallodom1388
    @marshallodom1388 4 months ago +1

    Receiving your grade sounded as awesome as having to do extra yard work.
    I wonder if these prompts could be written as text on a BBS or SMS? Then, while holding down the Ctrl key, press the letter C; locate an AI interface you want to use it in, and again hold down Ctrl but press V this time. Voilà!

  • @randyh647
    @randyh647 3 months ago +3

    Would love to see instructions on how to use this with Ollama and/or other open-source offline tools.

  • @MagusArtStudios
    @MagusArtStudios 4 months ago +1

    I made something like this yesterday: a nice collapsing scroll menu with a bunch of sample questions that you can just click on.

  • @rohitdas490
    @rohitdas490 3 months ago

    Matthew's smirk when he says I am gonna revoke this key before I publish this video is insane.

  • @royalcanadianbearforce9841
    @royalcanadianbearforce9841 3 months ago +1

    I'm struggling so much to get this running with Ollama locally. I cannot seem to get fabric to check for local models. I'd love a video on this!

  • @delvorin1841
    @delvorin1841 25 days ago

    I would like to see a video covering the hardware to run AI locally. Maybe it could cover what a bare-minimum setup is, then an ideal setup, and finally a beast-mode setup. To clarify, by bare minimum I mean a self-hosted server (an older PC). I know you can do it with a Raspberry Pi, but seriously, I think that would be painfully slow.

  • @rbus
    @rbus 3 months ago

    Pro-tip: I always use venv to create customized Python environments for any AI project. Stable Diffusion front-ends actually use venv by default.
    Create a custom Python environment in your home directory (~/venv):
    python -m venv ~/venv
    Enable it for Mac/Linux users:
    . ~/venv/bin/activate
    Enable it for Windows (PowerShell) users:
    ~/venv/Scripts/activate.ps1
    (On Windows the folder is "Scripts" rather than "bin".)
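    With the venv active, the Fabric install itself can stay isolated too; a rough sketch (assuming the repo installs with plain pip, as the "pipx install ." comments elsewhere in this thread suggest):
    git clone https://github.com/danielmiessler/fabric
    cd fabric
    pip install .
    fabric --setup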

  • @lowkeylyesmith
    @lowkeylyesmith 3 months ago +1

    Hi, and have a nice day.
    My name is René and I am trying to realize a project that connects several open-source LLMs with an agent framework. Each LLM is supposed to solve a specific task. Unfortunately, I can't describe it as well as I would like to, but I'll give it a try.
    It is about receiving, unpacking, and analyzing a PST file. There should be a separately trained LLM for each area of the analysis. For example, an LLM that graphically displays the suspect's connections to others, one that is trained on different languages to search in all possible languages, ... so that's the whole thing briefly outlined.
    Since I am still relatively new to this area, I would be very happy to receive tips and tricks.
    Thank you and best regards from Austria

    • @borntodoit8744
      @borntodoit8744 3 months ago +1

      This might help you:
      there is Gorilla AI (a model-to-agent approach)
      to create an LLM management process that both searches for and locates the right LLM for a task,
      then also executes as an agent (making API calls [remote website W] to automate the original request).

  • @armisis
    @armisis 1 month ago

    Installing now.

  • @I.Carnivore
    @I.Carnivore 3 months ago

    The zsh error happened because of the Poetry setup. Either the script didn't force the newline or the file was manually edited; either way, one wasn't put in.

  • @UTJK.
    @UTJK. 3 months ago

    Can you make a recap video with a summary of the tech you propose and which one you suggest for each category?

  • @hermes537
    @hermes537 3 months ago +1

    I didn't understand it at all, but I LOVE it.

  • @themax2go
    @themax2go 3 months ago

    ok you'll get a 👍 for that one (for being an actual usage case; haven't tried it out yet though)

  • @stefang5639
    @stefang5639 4 months ago +1

    Cool project. Shouldn't be too hard to write a small frontend with a dropdown list of all prompts for this.

  • @wesleydunn169
    @wesleydunn169 1 month ago

    Unsolicited shortcut: for selecting all of a transcript, highlight the beginning and then use the "End" key, and it will highlight the rest of the transcript. Use the "Home" key after highlighting from the bottom.

  • @marcfruchtman9473
    @marcfruchtman9473 3 months ago

    Thanks for this video. Very interesting. Would like to see more use-case examples from the available prompts if you have time to put something together.

    • @cpatocybersecurity
      @cpatocybersecurity 3 months ago

      Agree super cool project. FYI - I have some shorts on different fabric patterns.

  • @LilLegyiths
    @LilLegyiths 1 month ago

    Is there a way to change the GPT model this tool is trying to access? Currently, it is looking for GPT-4 turbo preview, which results in an error. I have an OpenAI account with some money in it.

  • @THE-AI_INSIDER
    @THE-AI_INSIDER 3 months ago +3

    can you make a video of this with groq?

  • @bradstudio
    @bradstudio 4 months ago +3

    Can you show a video on how to feed a .epub book for analysis? Being able to summarize a book like that would be incredibly useful.

    • @rajachirravuri
      @rajachirravuri 3 months ago

      Convert epub to PDF and then send it for analysis. Lots of PDF readers

  • @UnchartedDiscoveries
    @UnchartedDiscoveries 4 months ago +4

    Are there any open-source or closed-source large action models? I'm not able to find any.

    • @XavierAlabart1979
      @XavierAlabart1979 3 months ago

      Also interested to know if any exist.

    • @thomasj0330
      @thomasj0330 3 months ago +1

      The only company to actually confirm work on a LAM is Rabbit, and theirs is going to be locked to their device, I assume. There's a rumor OpenAI is working on one; GPT-5 could be a LAM... Short answer is no. Still waiting on that.

    • @UnchartedDiscoveries
      @UnchartedDiscoveries 3 months ago

      Thank you @thomasj0330

  • @StringerBell
    @StringerBell 4 months ago +9

    Everything sounds so exciting until you see the installation process and the missing UI afterwards. Then the reaction is just "meh".

    • @CrypticConsole
      @CrypticConsole 4 months ago +3

      If you cannot use a terminal, that is on you.

    • @chvckn0rri5
      @chvckn0rri5 4 months ago

      *Can you imagine a better way to sharpen your CLI skills?*
      Iron sharpens iron, fren 🫡

    • @PazLeBon
      @PazLeBon 4 months ago

      lmao, same as most 'coders' stuff, clueless about the real public lol

    • @Budrew21
      @Budrew21 1 month ago

      @CrypticConsole Isn't the whole point of AI to do things for us? Using a terminal in the age of AI feels like riding a horse across a country when cars and planes exist.

    • @CrypticConsole
      @CrypticConsole 1 month ago

      @Budrew21 You give a command to the AI and it does some processing before returning an output. I would say that fits the idea of a read-eval-print loop very well. Since we already have an extremely advanced and extensible REPL interface, i.e. the terminal, it makes sense to use it.

  • @ShaunPrince
    @ShaunPrince 3 months ago

    Just FYI, pbcopy/pbpaste are macOS-exclusive, but still super useful.

  • @poldiderbus3330
    @poldiderbus3330 4 months ago +2

    Just a few seconds into the video: I think the solution to your screwed-up Python environment, or to not screwing it up in the first place, is to use a virtual Python environment for each project. For me, on Debian 12, that's almost the only way to avoid constantly reading that pip is about to break the system's apt packages.
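    A minimal per-project sketch on Debian/Ubuntu (python3-venv is the usual Debian package; the directory name .venv is just a convention):
    sudo apt install python3-venv
    python3 -m venv .venv
    . .venv/bin/activate
    # pip now installs into .venv instead of touching the system/apt packages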

  • @amernew3ful
    @amernew3ful 3 months ago

    Is this installation for Mac?
    Because I'm getting this error:
    The term 'git' is not recognized as the name of a cmdlet, function, script file, or operable program.

  • @theh1ve
    @theh1ve 3 months ago +1

    So, just to check: I could just lift these prompts and use them any way I like without the nightmare of installing Fabric? Perhaps it should just be a free prompt library?

  • @danshib
    @danshib 2 months ago

    What do you do if you use windows with no pbpaste?

  • @outtakontroll3334
    @outtakontroll3334 4 months ago +1

    This kind of AI is going to be killer stuff for everyday use; it needs to be packaged a little more simply though. Early days yet.

    • @VisionCapitalist
      @VisionCapitalist 1 month ago

      This is the essence of where they are at - there are 1,000 firms on this but room for 1,000 flowers to bloom

  • @NNokia-jz6jb
    @NNokia-jz6jb 4 months ago

    So this is some kind of prompt software that calls the GPT-4 API?

  • @ardile
    @ardile 2 months ago

    Hi @MatthewBerman, thanks for the video. It looks like Fabric is going to end up being a "Swiss Army knife" AI tool.
    I am having problems setting it up/installing it on Ubuntu on my Windows 11 PC. Everything worked well until I got to running: fabric --setup
    I get an error message that says "fabric: command not found". I have tried looking for a solution online, but found none.
    It would be greatly appreciated if you could spare a few minutes of your busy time to help resolve this.
    P.S. If anyone else has the solution to this problem, I welcome their guidance!

  • @michai333
    @michai333 3 months ago

    It looks promising but I’ll wait for a UI.

  • @Stewarts_in_love
    @Stewarts_in_love 4 months ago +2

    When i have enough fabric can i make a blanket?

  • @YoungSecurity
    @YoungSecurity 4 months ago

    Regarding the error about the missing "ts": remember, you can put the error message into an LLM and resolve it.

    • @DavidGuesswhat
      @DavidGuesswhat 4 months ago

      For me, only GPT-4 or similar LLMs are able to resolve those kinds of things, lmao.

  • @cosmicaug
    @cosmicaug 3 months ago

    How do we use this tool to continue the conversation?
    For example, we might have a directive to respond to a certain type of follow-up prompt. Maybe we have a directive to respond to the user with follow-up clarification questions, etc.. How would we do this from the CLI? Or do we need to use this in a different manner (assuming what I am asking about is possible).

    • @cosmicaug
      @cosmicaug 3 months ago

      So I guess what I am asking is whether it is possible to create and rejoin a session.

  • @ImAnnonymousUser
    @ImAnnonymousUser 3 months ago +2

    Can someone tell me which programming language I should learn first in order to do all this AI stuff? I also want to understand what I am doing and not just copy-paste everything. Thanks in advance.

  • @Antonio-Dev
    @Antonio-Dev 1 month ago

    This is great, congrats on the video, but I have some doubts about how to call it:
    first you write pbpaste | fabric --pattern extract_wisdom, and then you write pbpaste | analyze_claims --stream. What I mean is: when do you use the fabric command, when do you put pipes, etc.? Thanks

  • @-Evil-Genius-
    @-Evil-Genius- 4 months ago +4

    🎯 Key Takeaways for quick navigation:
    00:00 🛠️ *Installing Fabric and Understanding its Use Cases*
    - Fabric is an open-source project designed to solve everyday problems using AI.
    - Fabric serves as a library of tried and true prompts generated and reviewed by the community.
    - Use cases of Fabric include extracting interesting parts of videos, writing essays, summarizing academic papers, creating AI art prompts, and more.
    01:12 🔄 *Installing Poetry and Setting up Fabric*
    - Install Poetry by running the provided command in the terminal.
    - Run the setup script to initialize Fabric.
    - Restart the shell or open a new terminal tab to apply changes.
    03:13 🛠️ *Setting Up Fabric and Exploring Terminology*
    - Configure Fabric by providing the GPT-4 API key using the setup command (see the command sketch after this list).
    - Explore Fabric's terminology: mill, pattern, stitch, and loom.
    - Use the fabric CLI to list available patterns and understand their functionality.
    05:04 📚 *Extracting Wisdom from Content*
    - Use Fabric's pattern to extract wisdom from content like videos or articles.
    - The "extract wisdom" pattern retrieves key ideas, quotes, facts, references, and recommendations.
    - Analyze claims and receive an overall score based on the extracted wisdom.
    06:37 📊 *Analyzing Claims and Concluding*
    - Fabric can automatically analyze claims extracted from content and provide supporting evidence or refutations.
    - Each claim is scored, providing insight into its credibility.
    - Fabric offers a comprehensive solution for leveraging AI to enhance various tasks.
    Made with HARPA AI
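    In command form, the steps above look roughly like this (a sketch: the Poetry line is the standard installer, and the --list flag for showing patterns is assumed):
    curl -sSL https://install.python-poetry.org | python3 -
    git clone https://github.com/danielmiessler/fabric && cd fabric
    ./setup.sh
    fabric --setup    # paste your OpenAI API key when prompted
    fabric --list     # show the available patterns
    pbpaste | fabric --pattern extract_wisdom --stream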

  • @KEKW-lc4xi
    @KEKW-lc4xi 3 months ago

    Interesting. I just want an LLM that can load my entire Java Spring Boot project. Even when using the @workspace command with GitHub Copilot, it's not able to keep track of context. LLMs are getting a lot better, but I will be amazed when I can walk one through how I want my website to function and it gives me project-wide solutions like "update xyz file with this and that code".

  • @Dsuchong
    @Dsuchong 3 months ago

    Error while cloning the code:
    fetch-pack: unexpected disconnect while reading sideband packet
    fatal: early EOF
    I repeated the same method and it still gives me this error.
    Any help?

  • @justinjja2
    @justinjja2 4 months ago +1

    Elon says they are open sourcing grok this week, exciting.

  • @Al-Storm
    @Al-Storm 4 months ago +5

    Just seems like a prompt creator, no?

  • @emotionalmindedstate
    @emotionalmindedstate 27 days ago

    Can it automate local PC tasks I run every day?

  • @cristian15154
    @cristian15154 3 months ago

    "Wisdom": what does "extract all the wisdom" mean? Like a categorized summary?

  • @codelucky
    @codelucky 3 months ago

    Fabric is basically metaprompts in the backend.

  • @thunken
    @thunken 4 months ago

    I've had some thoughts about whether something like this would exist. I wonder how they eval prompts and if there's any mechanism that might dynamically determine prompt effectiveness.

    • @therainman7777
      @therainman7777 4 months ago

      DSPy is an open-source Python package that is largely made for this purpose. You should check out the docs.

  • @vuvannham1992
    @vuvannham1992 1 month ago

    I want to create a common workspace so I can review, comment, and approve articles before posting on social networking platforms. Does anyone have any suggestions for a gifclone for this?

  • @shahveer81
    @shahveer81 3 months ago

    Can Fabric be used without any API keys? Or are there any free models to try it out?

  • @Hhunted
    @Hhunted 4 months ago

    Thanks for this. Tried to install this before, but it failed. Still the same issue on Windows when trying to run ./setup.sh: running it from the terminal pops up another terminal that very quickly closes. Can't continue the steps after that since nothing is working.

    • @FameMonsterD3
      @FameMonsterD3 3 months ago

      You need to use xsel on Windows WSL; pbpaste is a Mac-exclusive command.

  • @TimMattison
    @TimMattison 3 months ago +1

    The majority of Python-related AI projects don't work on my system. I can't imagine I'm the only one. Even he had issues just getting Poetry to work. Why do Python devs put up with this?

  • @themax2go
    @themax2go 3 months ago

    next: how to use it to generate images, running all locally

  • @morespinach9832
    @morespinach9832 2 months ago

    On Mac, to install fabric, the "./setup.sh" did not work. But "pipx install ." worked.

    • @morespinach9832
      @morespinach9832 2 months ago

      After installation I keep getting this error:
      "Error: Error code: 404 - {'error': {'message': 'The model `gpt-4-turbo-preview` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
      Error code: 404 - {'error': {'message': 'The model `gpt-4-turbo-preview` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}"
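      A possible workaround sketch, using the -m/--model flag another comment below uses for llama3 (the model name here is only an example; it must be one your API key can access):
      pbpaste | fabric -m gpt-3.5-turbo --pattern summarize --stream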

  • @heitorborges3353
    @heitorborges3353 3 months ago

    Does it work with other models or only ChatGPT?

  • @ChrisAnders
    @ChrisAnders 3 months ago

    How can we Linux users do pbpaste? Can you recommend another way to get this functionality in a Linux terminal?

    • @ChrisAnders
      @ChrisAnders 3 months ago

      I figured out that I can use xsel to achieve the functionality of pbpaste! Now I am trying to figure out how to use the YouTube API so I can use the extract_wisdom pattern on YT videos.
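      For instance, two hedged one-liners (xsel comes from the distro repos; yt is the transcript helper another commenter shows further down, and the URL here is a placeholder):
      xsel --clipboard --output | fabric --pattern extract_wisdom --stream
      yt --transcript "https://www.youtube.com/watch?v=VIDEO_ID" | fabric --pattern extract_wisdom --stream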

  • @HarmonyHouse12
    @HarmonyHouse12 3 months ago

    How can we use Google Gemini API with this?

  • @rootor1
    @rootor1 4 months ago +1

    It's like a collection of Python scripts + prompts for a collection of specific use cases, but so far not very impressive. There are much more interesting things evolving out there, like people making easy user interfaces for LangChain. I would love it if all these people collaborated to improve and accelerate their "life automation" tools together, but we don't live in a perfect world.

  • @dominik1023
    @dominik1023 3 months ago

    ./setup.sh didn't work at all (not recognized as an external or internal command). Typing just 'setup.sh' without the './' brings up a window that opens and closes way too fast for me to see it; aside from that... nothing. pipx install after that does nothing, and typing 'fabric -h' is not recognized.

  • @HAL9000-B
    @HAL9000-B 4 months ago +2

    Hi Matthew, always thankful for your content, but could you please first give some showcase of the solution, so we can know whether it is worth spending time on this?

    • @matthew_berman
      @matthew_berman 4 months ago

      I've been thinking about this a lot lately. Usually I do install and then showcase. Maybe I need to switch the order?

    • @MrLeo000
      @MrLeo000 4 months ago

      @matthew_berman If not switching the order, at least give a sneak peek of the possible results so, like he said, we know whether it's worth spending time on it or just being informed about it. Otherwise, like I did, people will probably just skip to the showcase part.

    • @HAL9000-B
      @HAL9000-B 4 months ago

      @matthew_berman Human brain: Incentive > Motivation > Work. I always have Pavlov's dog in mind 😎

    • @HAL9000-B
      @HAL9000-B 4 months ago +1

      @matthew_berman Incentive > Motivation > Action.

  • @hoangnam6275
    @hoangnam6275 4 months ago +2

    Sorry guys for interrupting with something not related to this video, but I've seen some weird behavior when using Gemini Pro 1.0 and GPT-4 in ChatGPT.
    The case: I have an unreadable PDF which contains text, but no tool can extract the text inside it, as it is an image of a Word document converted to a PDF file. I tried screenshotting the file, then giving the screenshots to both Gemini and GPT-4 to OCR, and something weird happened:
    1. Gemini refuses to give me the extracted text (I only need the extracted text). I tried many times with the same result; it refuses, saying it cannot assist with this.
    2. GPT-4 gives me wrong OCR. It performed well on the first few pictures, but after that, one of two things happens:
    1. It gives me the wrong content for the extracted text; some parts seem to be hallucination.
    2. The text is totally wrong. The right text is "Because of the firm's working capital management practices are affected by...". Instead, it gave me "The firm's working capital management practices are influenced by...". Somehow, in this case, it starts to recognize something and gives me totally different text with the same meaning.
    Has anyone seen this before? How can we control the LLM's behavior in this case? It seems pretty weird: my instruction was to extract the text only, and it did very well at the start, but as the conversation grew, somehow the best model lost its attention.

    • @hoangnam6275
      @hoangnam6275 4 months ago

      Hopefully some professional in AI and LLM can explain this. Is it related to losing attention due to the long conversation?

    • @EricB1
      @EricB1 4 months ago +1

      Maybe make the temperature 0.1 or 0 for the API calls

    • @rootor1
      @rootor1 4 months ago +1

      Maybe the document contains some kind of hidden text to prevent you from doing what you are trying to do.

    • @kpkp42
      @kpkp42 4 months ago

      Image PDFs are the worst: huge and UNstructured

  • @RakibHasan-hs1me
    @RakibHasan-hs1me 4 months ago +1

    How much RAM is required?

    • @cpatocybersecurity
      @cpatocybersecurity 3 months ago

      Works on my Mac mini, so not an above-average amount.

  • @_gmiv_
    @_gmiv_ 3 months ago

    You should consider keeping all important info, like the terminal views, within the title-safe area of the screen. I can't follow your tutorial when you are using the CLI.

  • @YacoubSabatin
    @YacoubSabatin 3 months ago

    I cannot see any automation; these functions can be performed by a custom GPT. Did I miss something?

  • @kidinfinity50
    @kidinfinity50 1 month ago +1

    When I do fabric -h it doesn't work.

    • @hhljr
      @hhljr 1 month ago

      But fabric -help does

  • @Zeroduckies
    @Zeroduckies 1 month ago

    Screenshot the desktop every 3 seconds into a JPG and feed it into LLaVA. It has vision!!! 😂

  • @user-iz9sj1nn5q
    @user-iz9sj1nn5q 1 month ago

    Fabric vs OpenDevin vs Devika - What's better?

  • @PeterStJ
    @PeterStJ 3 months ago +2

    It is unclear how you analyzed a video from the command line, considering the prompt is designed to work with text...

  • @tgzsolt
    @tgzsolt 3 months ago

    Has anyone been able to integrate it with local Ollama?

  • @3238juan
    @3238juan 3 months ago

    Please make a video on the BitNet paper from Microsoft! 1-bit LLMs.

  • @nonenothingnull
    @nonenothingnull 3 months ago

    Why can't stuff just be written in C(++) again, with few dependencies and (C)Make files...

  • @Raviraj-do9mf
    @Raviraj-do9mf 3 months ago

    Why are we going in the reverse direction? Such tools increase complexity.

  • @genebeidl4011
    @genebeidl4011 3 months ago

    Windows holds 72% of the market share. Please do this for Windows :)

  • @g.s.3389
    @g.s.3389 1 month ago

    It is not clear from the author how to use it with a local AI like Llama. Did you figure it out?

    • @hhljr
      @hhljr 1 month ago

      yt --transcript "ruclips.net/video/wPEyyigh10g/видео.htmlsi=B1jC4Dx7bVQ49LVk" | fabric -m llama3:70b -p extract-insights --stream
      [example: this takes a YouTube URL, extracts the transcript, and feeds it to llama3:70b on my Mac Studio; change the LLM to match yours]

  • @EliSpizzichino
    @EliSpizzichino 4 months ago

    Fabric is right: over-simplification sums up pretty much your approach to AI.

  • @VesperanceRising
    @VesperanceRising 4 months ago +3

    Actually, to install fabric you need a needle!
    Admit it... that's a really good point.

    • @nickdisney3D
      @nickdisney3D 4 months ago +3

      Cut it out

    • @VesperanceRising
      @VesperanceRising 4 months ago +1

      Why you gotta take a stab at me??

    • @VesperanceRising
      @VesperanceRising 4 months ago +1

      Just poking fun lol
      Trying to hold this thread together

    • @nickdisney3D
      @nickdisney3D 4 months ago +3

      Zip it!

    • @VesperanceRising
      @VesperanceRising 4 months ago

      Fair enough, you reap what you sow I guess...
      Let's just put a pin in this till I can stitch something better together....

  • @Ian-fo9vh
    @Ian-fo9vh 3 months ago

    Don’t say you’ll revoke the key and let them suffer

  • @alejandroantonioespinozalo3497
    @alejandroantonioespinozalo3497 4 months ago

    Do you need an RTX 30- or 40-series card to run it locally?

    • @rootor1
      @rootor1 4 months ago

      Not "need", but the better the model you can run, and the faster it runs, the more the performance of the whole thing improves. A better LLM means a bigger one, and then you need more VRAM in your graphics card. Faster inference means more cores in your graphics card, so more parallel processes can run. Improvements appear every day, but I doubt there is anything "magic" around the corner that allows running big models locally and fast, not in less than a year. Right now the best you can get at the consumer level is an Nvidia 4090, or maybe two connected together. For the two-card setup you would need a special motherboard, power supply, and case; that would mean around $5K in hardware.

    • @DavidGuesswhat
      @DavidGuesswhat 4 months ago

      @rootor1 I can run up to 13GB with a GTX 1650.

    • @DavidGuesswhat
      @DavidGuesswhat 4 months ago

      @rootor1 Search for the channel Aisphere and the video on running LLMs offline on a GTX 1650.

    • @DavidGuesswhat
      @DavidGuesswhat 4 months ago

      ruclips.net/video/6_qrS5OAPXo/видео.htmlsi=rJodlE3MxCRTHxCe

    • @DavidGuesswhat
      @DavidGuesswhat 4 months ago

      IaSphere LLM offline run GTX 1650 @rootor1
      Search and be happy.

  • @joe_limon
    @joe_limon 4 months ago

    I have had to take a step back from these projects. So messy that even Pinokio won't install/run them, even after attempting to uninstall, deleting files and registry keys.

  • @ravipratapmishra7013
    @ravipratapmishra7013 4 months ago +1

    In my view you should ditch conda and live with pyenv+poetry.

  • @davidlee50
    @davidlee50 4 months ago

    Is there a feature to help handicapped people? Apps for PC?

    • @NNokia-jz6jb
      @NNokia-jz6jb 4 months ago

      What feature and what kind of handicap?

  • @yuri.caetano
    @yuri.caetano 4 months ago +4

    First. Fix me plz 😂

  • @AIPulse118
    @AIPulse118 4 months ago +1

    Why do you need this complex method to optimize prompts?
    With an 8k context window you can literally create a custom GPT for this without using knowledge files.

  • @ashishtater3363
    @ashishtater3363 4 months ago +2

    Does it support Gemini API??

    • @rootor1
      @rootor1 4 months ago +2

      Yes, it supports all the APIs compatible with the style OpenAI established (for example Mistral, Groq, or local inference frameworks like llama.cpp); nowadays that API format has become a de-facto standard.

    • @MattLuceen
      @MattLuceen 4 months ago +1

      Looking forward to an answer to this as I owe OpenAI $124 and they won’t let me use their stuff any more 😢
      I don’t know that Gemini implements the OpenAI API style enough to work. Also there is no place to set the API URL in fabric config. 🤔

  • @andrewcheshire244
    @andrewcheshire244 3 months ago

    Nerds unite!

  • @WELDE83
    @WELDE83 4 months ago

    Any opinions? ...They added Larry Summers to the board... ridiculous.

  • @saveli4
    @saveli4 4 months ago

    shocking