- Videos: 88
- Views: 195,445
Neural Hacks with Vasanth
India
Joined 24 Mar 2020
Hi everyone. This is Vasanth. Welcome to Neural Hacks with Vasanth. I have 3+ years of experience in the data science field. I love working in the field of NLP. Please support the channel by liking the videos, sharing them with your communities, and subscribing to my RUclips channel.
For any queries the social links are given below
LLM Pretraining Course - Evaluation of LLM Pretraining with Perplexity and Attention Weights
Join this channel to get access to perks:
ruclips.net/channel/UCsJmKTSuye9EXqNJVwNfWagjoin
Important Links:
Github Repo: github.com/Vasanth51430/LLM_Mastery_In_30_Days
For further discussions, please join the following Telegram group
Telegram Group Link: t.me/nhv4949
You can also connect with me on the following socials
Gmail: vasanth51430@gmail.com
LinkedIn: www.linkedin.com/in/vasanthengineer4949/
Views: 36
Videos
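As a quick illustration of the metric the video above evaluates: perplexity is just the exponential of the mean token-level cross-entropy. A minimal sketch (the `perplexity` helper below is illustrative, not code from the course):

```python
import math

import torch
import torch.nn.functional as F

def perplexity(logits: torch.Tensor, targets: torch.Tensor) -> float:
    """Perplexity = exp(mean token-level cross-entropy).

    logits:  (batch, seq_len, vocab_size) raw model outputs
    targets: (batch, seq_len) gold token ids
    """
    loss = F.cross_entropy(logits.view(-1, logits.size(-1)), targets.view(-1))
    return math.exp(loss.item())

# Sanity check: uniform logits over a vocab of 10 give perplexity ~ 10.
logits = torch.zeros(1, 4, 10)
targets = torch.randint(0, 10, (1, 4))
print(round(perplexity(logits, targets)))  # → 10
```

Lower perplexity means the model assigns higher probability to the held-out tokens.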
LLM Pretraining - This ADVANCED Pytorch Training is ALL YOU NEED
125 views · 2 hours ago
LLM Pretraining - Creating a Simple Custom Pytorch Trainer for LLM Pretraining
106 views · 4 hours ago
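A hedged sketch of what a simple custom PyTorch trainer generally looks like; the `SimpleTrainer` class and its method names here are illustrative, not the repo's actual API:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

class SimpleTrainer:
    """Bare-bones training loop: forward pass, loss, backward pass, optimizer step."""

    def __init__(self, model: nn.Module, lr: float = 1e-3):
        self.model = model
        self.optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
        self.loss_fn = nn.CrossEntropyLoss()

    def train_epoch(self, loader: DataLoader) -> float:
        self.model.train()
        total = 0.0
        for inputs, targets in loader:
            self.optimizer.zero_grad()
            loss = self.loss_fn(self.model(inputs), targets)
            loss.backward()
            self.optimizer.step()
            total += loss.item()
        return total / len(loader)  # mean loss over the epoch

# Toy usage: train a linear classifier on random data for one epoch.
data = TensorDataset(torch.randn(32, 8), torch.randint(0, 4, (32,)))
trainer = SimpleTrainer(nn.Linear(8, 4))
avg_loss = trainer.train_epoch(DataLoader(data, batch_size=8))
print(avg_loss > 0)  # cross-entropy is positive here
```

A real pretraining trainer adds gradient clipping, LR scheduling, checkpointing, and logging on top of this skeleton.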
LLM Pretraining - Creating Custom Torch Dataset DataLoader
111 views · 7 hours ago
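For context, a minimal custom `torch.utils.data.Dataset` for next-token prediction might look like this (the `NextTokenDataset` name is mine, not necessarily the course's):

```python
import torch
from torch.utils.data import DataLoader, Dataset

class NextTokenDataset(Dataset):
    """Slices one long token-id sequence into (input, next-token target) windows."""

    def __init__(self, token_ids: list[int], block_size: int):
        self.ids = torch.tensor(token_ids, dtype=torch.long)
        self.block_size = block_size

    def __len__(self) -> int:
        return len(self.ids) - self.block_size

    def __getitem__(self, i: int):
        x = self.ids[i : i + self.block_size]          # input tokens
        y = self.ids[i + 1 : i + 1 + self.block_size]  # same tokens shifted by one
        return x, y

# Toy corpus of 100 token ids; DataLoader batches the windows.
ds = NextTokenDataset(list(range(100)), block_size=8)
x, y = ds[0]
print(len(ds), x.tolist()[:3], y.tolist()[:3])  # → 92 [0, 1, 2] [1, 2, 3]
batch_x, batch_y = next(iter(DataLoader(ds, batch_size=4)))
print(batch_x.shape)  # → torch.Size([4, 8])
```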
LLM Pretraining - Data Collection To Pretrain a Custom LLM
217 views · 9 hours ago
Master LLM Architecture from BASIC to ADVANCED in JUST 7 Hours!
1.2K views · 14 days ago
In week 1 I have shared all the information you need to master LLM architectures, so that by the end you will be able to understand, create, and pretrain any LLM architecture you want. Covered: 1. Tokenizers and how they work in LLMs 2. In-depth intuition for Transformers, with the math 3. Transformers from scratch, implemented in PyTorch 4. LLM Architectu...
LLM Pretraining Course: Build MiniGPT2 From SCRATCH
729 views · 21 days ago
In this video I show how to implement the GPT-2 research paper from scratch by coding a mini version of GPT-2.
LLM Pretraining Course: Build LLMs From Ground Up
2.1K views · 21 days ago
Welcome to the LLM Pretraining Workshop, where we'll take you on a journey from zero to hero in the world of artificial intelligence and deep learning! In this comprehensive guide, we'll dive into the realm of large language models, exploring the latest advancements in natural language processing and transformer architecture. You'll learn how to build an LLM model from scratch, leveraging data ...
Advanced LLM Architecture: How To Optimize Attention in Transformers(Simply Explained)
425 views · 28 days ago
Get ready to dive into the thrilling and captivating world of Transformers and discover the incredible power of attention! In this video, we will explore various forms of attention mechanisms that are essential in the field of machine learning. From Multi Head Attention to Sliding Window Attention, each type brings its own unique capabilities and advantages to the table, allowing us to understa...
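One variant the video covers, sliding-window attention, boils down to restricting the attention mask so each query only sees its recent past. A minimal sketch (illustrative, not the video's code):

```python
import torch

def sliding_window_causal_mask(seq_len: int, window: int) -> torch.Tensor:
    """Boolean mask: query i may attend to key j iff j <= i and i - j < window."""
    i = torch.arange(seq_len).unsqueeze(1)  # query positions, column vector
    j = torch.arange(seq_len).unsqueeze(0)  # key positions, row vector
    return (j <= i) & (i - j < window)

mask = sliding_window_causal_mask(5, window=2)
print(mask.int())
# Each row shows which past positions that query can see;
# e.g. the last row is [0, 0, 0, 1, 1]: only the two most recent tokens.
```

Applying this mask before the softmax turns full quadratic attention into a local, linear-in-sequence-length variant.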
Advanced LLM Architecture: Understanding Positional Encodings (the KEY to Transformers)
657 views · 1 month ago
Dive into the thrilling world of Positional Encoding in Transformers! This crucial concept delivers both positional and semantic information, and we're breaking down the top five methods: Sinusoidal (Absolute) Positional Encoding, Relative Position Embedding, RoPE, ALiBi, and a Mixture of Positional Encodings. Get ready for an in-depth exploration of each method, showcasing their pros, cons, an...
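The first method in that list, sinusoidal (absolute) positional encoding, can be sketched in a few lines; this is an illustration of the formula, not the video's code:

```python
import math

def sinusoidal_encoding(pos: int, d_model: int) -> list[float]:
    """PE[pos, 2i] = sin(pos / 10000^(2i/d_model)); PE[pos, 2i+1] = cos(same angle)."""
    pe = []
    for i in range(0, d_model, 2):
        angle = pos / (10000 ** (i / d_model))
        pe.append(math.sin(angle))
        pe.append(math.cos(angle))
    return pe[:d_model]  # trim in case d_model is odd

print(sinusoidal_encoding(0, 4))  # → [0.0, 1.0, 0.0, 1.0] (position 0)
```

Each position gets a unique pattern of sine/cosine values at geometrically spaced frequencies, which the model can use to recover relative offsets.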
I Built a Transformer from Scratch (PyTorch)
1.7K views · 1 month ago
🔍 In this video, we'll dive deep into the Transformer architecture and break down the math behind it step-by-step, all while keeping the implementation concise! 🚀 By the end of this video, you'll know how to write a Transformer in under 150 lines of code with an easy-to-follow, line-by-line explanation. 📌 What You'll Learn: What is the Transformer Architecture? A high-level overview of how the ...
The Math Behind Transformers (and Why They're So Powerful)
2.1K views · 1 month ago
Watch now and learn the mathematics behind the Transformer Architecture! 🔗 In this video, we're going to dive into the Transformer Architecture and explore its various components, including token embeddings, positional encoding, attention, and feed-forward blocks. You'll learn about the encoder-decoder architecture and how it's used in sequence-to-sequence tasks. We'll also cover the importance...
How Tokenizers Work (and Why They're So Damn Cool)
699 views · 1 month ago
🚀 Mastering LLMs in 30 Days: Day X - Tokenization Deep Dive 🧠 Join us on our journey to master Large Language Models! Today, we're diving deep into tokenization, with a special focus on Byte Pair Encoding (BPE). 🔍 In this comprehensive video, we cover: • What is tokenization and why it's crucial for NLP • Different types of tokenizers: - Character-level - Word-level - Subword tokenization (BPE,...
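The core of BPE, as the video describes, is repeatedly merging the most frequent adjacent symbol pair in the corpus. A toy sketch of one merge step (function names are mine, not the video's code):

```python
from collections import Counter

def most_frequent_pair(words: dict) -> tuple:
    """Count adjacent symbol pairs across the corpus, weighted by word frequency."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0]

def merge_pair(words: dict, pair: tuple) -> dict:
    """Replace every occurrence of `pair` with one merged symbol."""
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Toy corpus: ("p", "u") appears 5 + 12 = 17 times, the most frequent pair.
words = {("h", "u", "g"): 10, ("p", "u", "g"): 5, ("p", "u", "n"): 12}
pair = most_frequent_pair(words)
print(pair)  # → ('p', 'u')
print(list(merge_pair(words, pair)))  # → [('h', 'u', 'g'), ('pu', 'g'), ('pu', 'n')]
```

A full BPE trainer just repeats this step until the vocabulary reaches its target size, recording each merge in order.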
LLM Mastery in 30 Days: Day 1 - Introduction To Transformers | Transformer Types - Pros, Cons, Uses
933 views · 1 month ago
In this video, we dive deep into the groundbreaking Transformer architecture that has revolutionized Natural Language Processing (NLP) and beyond. Learn about: - The history and evolution of the Transformer architecture - Key components: self-attention, positional encoding, and parallel processing - Three main types of Transformers: encoder-only, decoder-only, and encoder-decoder - Popular mode...
🚀 Exciting Launch: Grow With Me Neuron - Level Up Your AI Learning With Me!
308 views · 1 month ago
🚀 Exciting News: Introducing "Grow With Me" Membership! 🚀 Join this channel to get access to perks: ruclips.net/channel/UCsJmKTSuye9EXqNJVwNfWagjoin We're leveling up our channel with a new membership tier: Grow With Me Neuron! Here's what you need to know about all our membership levels: 1️⃣ Priority Neuron: - Special badges - Priority responses to your comments 2️⃣ Community Neuron: - Everyth...
Mastering NLP Fundamentals: A 4-hour Hands-on Tutorial
2.1K views · 1 month ago
LLM Mastery in 30 Days: Day 0 Prerequisites Part3 - Translation with Seq2Seq & Attention(Bahdanau)
605 views · 1 month ago
LLM Mastery in 30 Days: Day 0 Prerequisites Part2 - DL For NLP - ANN, RNN, LSTM...
1.3K views · 1 month ago
LLM Mastery in 30 Days: Day 0 Prerequisites Part1- NLP Basics to ML For NLP
2.8K views · 1 month ago
LLM Mastery in 30 Days | Course Introduction
2.5K views · 2 months ago
Master Llama3: From Research Paper to Real World Application
621 views · 2 months ago
Master RAG in 5 Hrs | RAG Introduction, Advanced Data Preparation, Advanced RAG Methods, GraphRAG
6K views · 3 months ago
GraphRAG Free: Use Without Open AI API Key
3.2K views · 4 months ago
GraphRAG - The Most Advanced Futuristic RAG | Introduction, Setup, Working, Testing
4.3K views · 4 months ago
Master NLP in 12 Hours | Transformers, LLMs Pretraining, Finetuning, Deployment, RAG, Agents, Etc...
9K views · 4 months ago
JARVIS: Become a Real Life Iron Man
1.5K views · 5 months ago
Every AI Developer Should Know This
2.9K views · 5 months ago
25+ AI SaaS IDEAS That Can PAY You💰💸💸
567 views · 5 months ago
Creating PRODUCTION level END TO END RAG pipeline| Fast, Accurate, Scalable, Simple
7K views · 5 months ago
How RAG Works? End to End RAG From Scratch 😀
679 views · 6 months ago
Sir, please react to this comment if you read it. Sir, please explain in detail.
There is not a lot of detail in creating a trainer. I am not sure what you are asking for as "detailed"; please let me know and I will take measures accordingly.
Hey, can you make a video on how to pretrain a model, but make it long, at least a couple of hours? From your 3 videos I was not fully able to understand. I don't care whether you read the Colab or explain it properly; after seeing your 4 videos on LLM pretraining I have extracted enough information to get started. If you can make a long video, that would be really helpful for me and I can kick-start my dream.

This is for all those people who complain about his explanations: I understand that he is young and some of you viewers are students; you have the potential to change our nation. But you need to understand there are people in the corporate world whose explanations are so bad you can't even call them explanations, and they will kick you out saying you can't learn. Compared to those people, he is explaining. This is the real world; I have experience in 3 of the biggest MNCs, with valuations of 500 million+. You people are young, but make sure you never stop learning; one day you will understand.

Please make a video on pretraining that is at least a couple of hours long.
My aim is to teach students; that is why this is part of the course. A one- or two-hour video on pretraining wouldn't make a difference, because it takes weeks to pretrain a model. This is for educational purposes. Please check other channels' videos; since you are experienced, I am sure you will find better channels. Thank you.
Sir, please react to my feedback.
See sir, I have implemented your single-bit LLM: colab.research.google.com/drive/1VussXvsx-k--6TVG-NlbcAacyrRbXH1i. You did a great job on the single-bit LLM by providing a great Colab, but your explanation is just reading it out (too bad). Any student would expect: 1) intuition, i.e. explaining the paper in depth: how training happens, how forward and backward propagation work, and how it differs from a vanilla ANN; 2) the code part done in the video, i.e. you start writing the code in the video itself and tell the students why you are using a particular section or line of code, so that it would be helpful.
Please sir, don't read the Colab; instead, try to explain.
Yes, I have gone through the LLM-from-scratch week 1. Sir, your Colab notebook is really good, but your explanation is just reading the Colab; there is no proper explanation. We are students with zero knowledge, so you should explain every term (intuition: the why and the how); if not, it's a waste of time and effort, sir.
Sir, I also feel the same. I guess you also follow Striver's strategy of explanation. Please improve.
Amazing video, sir! When will you upload the code?
Brilliant. Would you please mention the hardware requirements? I have a Mac mini M4; is it OK to go with?
Hi Vasanth, this video really helped me refresh and deepen my understanding of RAG concepts. Great work, keep it up!
Outstanding ❤
Nice 👍
Amazing video sir !!
Amazing sir..❤❤❤
Love it!!!!!!!!!!!!
Outstanding 🎉❤
Great video Vasanth, as always.
Love it!!!!!!!!!!!!!
Very well documented. Please continue uploading this series.
Very good. Thanks
Top
Hey, I am not getting anything when I try to load the database with 'data.db'. Please help me with it, and can you also provide a detailed solution? Thank you.
👋
Sir, please continue the 30 Days LLM Course.
I am a bit unwell, so I am taking some time off; once I get better I will upload more consistently.
Man, continue the series.
Great.......... wow ❤❤❤❤
A lot of dependency errors.
Informative!
Superb content!
Wow!!❤
Love it!!!!!!!!!!!!!!
Super excited for this course!!
Amazing Video Sir
Love it!!!!!!!!!
So many things covered in detail; I'm overwhelmed by it all. This needs continual learning.
Great video!
😍🙏
Can you please tell us how we would have to tokenize, and what changes to make in the training process, when training a model on SQL queries or code generation from questions? Also, what are the methods to train a model faster? The model usually takes a very long time on a corpus of 20 lakh sentences for a translation model. I was using PEFT, and might also consider Unsloth to train once again.
Amazing Video Sir !
One of the best tutorials on LLMs, sir. Please update your GitHub repo, please sir 🙏🙏🙏
Sir, please update the GitHub repo 🙏🙏🙏🙏
Can you upload this presentation as well?
Can you make a detailed video on de-duplication?
Hello sir! Nice video. I was actually thinking you could make a video on language translation from scratch: first build a tokenizer ourselves, then train on openly available datasets (Open Corpus, etc.), from Hindi to English, without using any pre-trained model like Heizen hi-en etc. I was trying to implement it but it didn't turn out to be successful. If you could make a successful tutorial, we could gain a great understanding of working with these models.
Please check the Transformers implementation video already provided for this task.
Thank you for the very detailed explanation. Can you also make a video about LightRAG? It is a lighter version of this GraphRAG and has slight advantages.
Can you please share the GitHub data as well, to understand better?
🥰❤️