Better Than GPT-4o with Mixture of Agents (MoA)!

  • Published: 27 Nov 2024
  • Science

Comments • 2

  • @artur50 • 5 months ago

    Nice job! Still, I have a question: did you have to pull all of these large open-source LLMs to have them locally available for your repo?

  • @davidmills9653 • 5 months ago

    I presume 4o was used as a benchmark? Would multiple 4o models give the same answer?