LangChain + OpenAI Function Calling - Create powerful chains with tools and functions

  • Published: 16 Oct 2024

Comments • 23

  • @saurabhjain507
    @saurabhjain507 8 months ago

    Hi Markus,
    Is function calling now deprecated? I see OpenAI has now introduced tools for the Assistants API. Will function calling still work, or should tools be adopted instead?

    • @codingcrashcourses8533
      @codingcrashcourses8533  8 months ago

      Hi Saurabh, hope you are doing well! I have heard that as well, but I did not find any information about it being deprecated. The Assistants API introduces "state" into the API with threads, so it works differently, and function calling can be used as a tool.
      Your timing is good, I will release a video about the Assistants API tomorrow :)
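      For reference, a minimal sketch of what that looks like on the API side: the same function schema is registered as a tool of type "function" on an Assistant, and threads carry the conversation state mentioned above. This assumes openai>=1.0; the get_pizza_info schema and the model name are only illustrative, not taken from the video.

      from openai import OpenAI

      client = OpenAI()

      # Illustrative function schema (assumed; the exact one from the video may differ).
      pizza_function = {
          "name": "get_pizza_info",
          "description": "Get the price of a pizza by name",
          "parameters": {
              "type": "object",
              "properties": {"pizza_name": {"type": "string"}},
              "required": ["pizza_name"],
          },
      }

      # Function calling is not gone: it is wrapped as a tool of type "function".
      assistant = client.beta.assistants.create(
          model="gpt-4-1106-preview",
          instructions="You answer questions about the pizza menu.",
          tools=[{"type": "function", "function": pizza_function}],
      )

      # The thread is the "state" referred to above.
      thread = client.beta.threads.create()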

    • @saurabhjain507
      @saurabhjain507 8 months ago

      @@codingcrashcourses8533 I have notifications turned on for your videos, so I'm looking forward to it :)

  • @hiranga
    @hiranga 1 year ago

    @codingcrashcourses8533 How can you use the Functions Agent with FastAPI streaming? I'm using some of the previous FastAPI tutorials that were working fine for streaming, but they no longer work with the Functions Agent. I'm not sure why.

  • @izaac__
    @izaac__ 1 year ago

    Great videos! Thank you! Would you be able to help with an example where the OpenAI functions agent uses a tool to access a vector DB, has conversation buffer memory, and is customised with a prompt template (e.g. "you are a useful X", etc.)? I've tried, but the LangChain docs are hard to parse 😂😅

    • @codingcrashcourses8533
      @codingcrashcourses8533  1 year ago

      Haha. I also have some use cases where I don't have the correct chain available out of the box. In that case I just use the normal LLMChain with a custom prompt and retrieve documents "by hand", for example the top_n results. History is then also just a list of HumanMessage and AIMessage objects to which I append new messages myself. Not the perfect solution, but for custom prompts this might be the way to go currently.

    • @izaac__
      @izaac__ 1 year ago

      OK great, will try that! Glad I'm not the only one, haha. Thank you for the quick response.

    • @codingcrashcourses8533
      @codingcrashcourses8533  1 year ago +1

      @@izaac__ I use the following custom class to create a final input for the OpenAIChat class, maybe it helps you :-).
      from typing import List

      from langchain.prompts import PromptTemplate
      from langchain.schema import AIMessage, BaseMessage, HumanMessage, SystemMessage


      class PromptBuilder:
          def __init__(self, system_message: str, human_message_template: PromptTemplate):
              self.system_message = system_message
              self.human_message_template = human_message_template

          def inject_into_prompt(self, template: PromptTemplate, **kwargs) -> str:
              return template.format(**kwargs)

          def build_final_prompt(
              self, messages: List[str], query: str, dokumente: str
          ) -> List[BaseMessage]:
              # Start with the system message, then replay the history,
              # alternating human and AI turns.
              final_prompt = [SystemMessage(content=self.system_message)]
              for index, message in enumerate(messages):
                  if index % 2 == 0:
                      final_prompt.append(HumanMessage(content=message))
                  else:
                      final_prompt.append(AIMessage(content=message))
              # Fill the human message template with the query and the retrieved documents.
              newest_message = self.inject_into_prompt(
                  template=self.human_message_template,
                  query=query,
                  dokumente=dokumente,
              )
              final_prompt.append(HumanMessage(content=newest_message))
              return final_prompt
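      For context, a small usage sketch of the class above (assumed, not part of the original comment); the template, history and ChatOpenAI call are only illustrative.

      from langchain.chat_models import ChatOpenAI

      human_template = PromptTemplate(
          input_variables=["query", "dokumente"],
          template="Answer the question using the documents.\nQuestion: {query}\nDocuments: {dokumente}",
      )
      builder = PromptBuilder(
          system_message="You are a helpful assistant.",
          human_message_template=human_template,
      )

      # History is a flat list of alternating human/AI messages, as described above.
      history = ["What is LangChain?", "LangChain is a framework for building LLM applications."]

      final_prompt = builder.build_final_prompt(
          messages=history,
          query="How do I add memory?",
          dokumente="<retrieved documents go here>",
      )

      chat = ChatOpenAI(temperature=0)
      answer = chat(final_prompt)  # returns an AIMessage
      print(answer.content)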

    • @izaac__
      @izaac__ 1 year ago

      @@codingcrashcourses8533 awesome, thanks again :)

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 1 year ago

    In your first example, given that the 'functions' list was about pizza, how was it able to return information about the capital of France?

    • @codingcrashcourses8533
      @codingcrashcourses8533  1 year ago +3

      That was not the function, that was the model. The actual output is ALWAYS returned by the LLM. The model will just decide whether to retrieve additional information from functions or not.
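      A minimal sketch of that behaviour (assuming openai>=1.0 and an illustrative pizza function schema, not the exact code from the video):

      from openai import OpenAI

      client = OpenAI()

      functions = [{
          "name": "get_pizza_info",
          "description": "Get the price of a pizza by name",
          "parameters": {
              "type": "object",
              "properties": {"pizza_name": {"type": "string"}},
              "required": ["pizza_name"],
          },
      }]

      response = client.chat.completions.create(
          model="gpt-3.5-turbo",
          messages=[{"role": "user", "content": "What is the capital of France?"}],
          functions=functions,
          function_call="auto",
      )

      message = response.choices[0].message
      print(message.function_call)  # None: the question is unrelated, so no function is called
      print(message.content)        # the model answers directly, e.g. "The capital of France is Paris."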

  • @RunForPeace-hk1cu
    @RunForPeace-hk1cu 1 year ago

    How are OpenAI functions different from LangChain's StructuredTool?

    • @codingcrashcourses8533
      @codingcrashcourses8533  1 year ago

      The StructuredTool agent uses a special ReAct prompt, while OpenAI functions provide the native functionality from OpenAI.
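      A minimal sketch of the two styles side by side (assuming an older langchain version with initialize_agent; the get_pizza_price tool is illustrative):

      from langchain.agents import AgentType, initialize_agent
      from langchain.chat_models import ChatOpenAI
      from langchain.tools import StructuredTool


      def get_pizza_price(pizza_name: str) -> str:
          """Return the price of a pizza by name."""
          return "10.99"


      tool = StructuredTool.from_function(get_pizza_price)
      llm = ChatOpenAI(temperature=0)

      # ReAct-style structured chat agent: tool use is driven by a special prompt.
      react_agent = initialize_agent(
          [tool], llm, agent=AgentType.STRUCTURED_CHAT_ZERO_SHOT_REACT_DESCRIPTION
      )

      # OpenAI functions agent: tool schemas are passed to OpenAI's native function calling.
      functions_agent = initialize_agent(
          [tool], llm, agent=AgentType.OPENAI_FUNCTIONS
      )

      functions_agent.run("How much does a Hawaii pizza cost?")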

  • @kasvith
    @kasvith 1 year ago +2

    Hmm, LangChain feels like a hack nowadays :/

    • @codingcrashcourses8533
      @codingcrashcourses8533  1 year ago

      Why like a hack?

    • @kasvith
      @kasvith 1 year ago

      @@codingcrashcourses8533 As you mentioned, using OpenAI functions feels hacky.

    • @cartolla
      @cartolla 10 months ago +1

      Try reading its documentation. You will be sure it is a hack :D

  • @registeel2000
    @registeel2000 1 year ago +1

    Please be careful about showing your API key; someone may take it and abuse it. I advise changing your key now that it has been made public.