Deploy FastAPI & OpenAI ChatGPT on AWS EC2: A Comprehensive Step-by-Step Guide 🚀

  • Published: 5 Nov 2024

Comments • 12

  • @Harsh-z6k · 2 months ago +1

    One of the best, most to-the-point pieces of content. Keep it up, sir; it really is very helpful.

  • @AP-fu3bj · 4 months ago

    Very helpful - thank you

  • @sanket1165 · 8 months ago

    Hey Pradip, thank you so much for the wonderful tutorial; as a total beginner, I found it really helpful.

  • @mohanvishe2889 · 7 months ago

    Simple to understand👍

  • @humayounkhan7946 · 1 year ago

    Hi Pradip, thanks a lot! This is super useful stuff; thank you for taking the time to create it. By the way, not sure if you noticed, but you forgot to show how to transfer files using bash (though I think you shared the command). Aside from that, how do we terminate the EC2/FastAPI now that we've configured it to keep running even after we close the tab? (Do we just stop the instance itself, or is there a recommended way?) Thanks in advance!
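
    To answer the question above: stopping the instance from the EC2 console is fine, and it can also be done programmatically. Below is a minimal sketch, assuming boto3 is installed and AWS credentials are configured; the region and instance ID are placeholders. If you only want to stop the FastAPI app while keeping the instance running, SSH in and kill the uvicorn process instead.

    ```python
    # Minimal sketch: stop the EC2 instance running the FastAPI app.
    # Assumes boto3 is installed and AWS credentials are configured;
    # the region and instance ID below are placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")  # your instance's region
    ec2.stop_instances(InstanceIds=["i-0123456789abcdef0"])  # your instance ID
    ```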

  • @the-ghost-in-the-machine1108 · 1 year ago +1

    Cool, thanks

  • @Amino-ky4vs · 7 months ago

    Thanks a lot

  • @islamicinterestofficial · 1 year ago

    Thanks for the video. I deployed my LLM on a RunPod server and now have an endpoint where I pass in a question and it sends back the response. But I want to use this endpoint in LangChain. How can I do that?
    Basically, I want to do document question answering using my own deployed endpoint. I already know the rest, like how to use LangChain, document QA, etc.; I just don't know how to plug my own endpoint/API into LangChain. Please guide me on this if you have ever done it.

    • @FutureSmartAI · 1 year ago

      You can do that; you can use any Python library.

    • @islamicinterestofficial · 1 year ago

      @@FutureSmartAI You mean the "requests" library? That would work, but I want to use the endpoint in LangChain itself. You know how we import a Hugging Face model using the Hugging Face API, pass that LLM variable to LangChain, split the user's document into chunks, and then pass them to the chain so it can answer? I don't know how to pass my own endpoint to the chain like that.
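
    One common way to do what this thread is asking is to wrap the endpoint in a custom LangChain LLM class, which can then be passed to a chain just like a Hugging Face model. The sketch below is only illustrative: the endpoint URL and the JSON fields `question`/`answer` are placeholders for whatever the deployed API actually expects.

    ```python
    # Illustrative sketch: wrap a self-hosted inference endpoint (e.g. on
    # RunPod) as a custom LangChain LLM. The URL and the JSON field names
    # are placeholders; adjust them to your API's request/response schema.
    from typing import Any, List, Optional

    import requests
    from langchain.llms.base import LLM


    class EndpointLLM(LLM):
        """Forwards prompts to a self-hosted HTTP inference endpoint."""

        endpoint_url: str = "https://your-endpoint.example.com/generate"  # placeholder

        @property
        def _llm_type(self) -> str:
            return "custom-endpoint"

        def _call(self, prompt: str, stop: Optional[List[str]] = None, **kwargs: Any) -> str:
            # POST the prompt and return the generated text from the response.
            response = requests.post(self.endpoint_url, json={"question": prompt}, timeout=120)
            response.raise_for_status()
            return response.json()["answer"]  # assumed response field
    ```

    An instance of this class can then be dropped into a chain like any built-in LLM, e.g. load_qa_chain(EndpointLLM(), chain_type="stuff") for document QA over chunked text.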

  • @roktimashraful2085 · 1 year ago

    How many times did you say "you know" 😆😆