Meet Llama 3.1
- Published: Sep 6, 2024
- Download Llama 3.1: go. cos2xi
Starting today, open source is leading the way. Introducing Llama 3.1: our most capable models yet. We're releasing a collection of new Llama 3.1 models, including our long-awaited 405B. These models deliver improved reasoning capabilities, a larger 128K-token context window, and improved support for 8 languages, among other improvements. Llama 3.1 405B rivals leading closed-source models on state-of-the-art capabilities across a range of tasks in general knowledge, steerability, math, tool use, and multilingual translation.
#llama3 #opensource #ai
--
Subscribe: www.youtube.co...
Learn more about our work: ai.facebook.com
Follow us on Threads: threads.net/@a...
Follow us on Twitter: / aiatmeta
Follow us on Facebook: / aiatmeta
Connect with us on LinkedIn: / aiatmeta
Meta focuses on bringing the world together by advancing AI, powering meaningful and safe experiences, and conducting open research.
I’ve been using Meta AI for a week now, and I really think it needs long-term memory and the ability to remember individual users. At this point, IMO, the goal is to reach what we saw in the movie “Her”.
Thanks for the great model :)
NGL, but are you guys using AI with Unreal Engine for the people in the video? Cuz that's dope work right there
Does anyone know why Meta doesn't seem to work on multimodal LLMs?
cause lizard can only see 2D
I did the work and it is censored
When a review video has more views than the actual announcement video
Nice!
thanks :)
♨
🎉
🙌
How do I use ASR with Llama 3.1, please?
Hi AI
Hi
Nice!