Neural Black Magic
  • Videos: 11
  • Views: 11,286
Byte Latent Transformer: Patches Scale Better Than Tokens Paper Explained Visually and Clearly
This video provides a straightforward, clear explanation of the newly published Meta paper "Byte Latent Transformer: Patches Scale Better Than Tokens".
You can read the paper here:
ai.meta.com/research/publications/byte-latent-transformer-patches-scale-better-than-tokens/
In this paper they introduce the Byte Latent Transformer (BLT), a new byte-level LLM architecture that, for the first time, matches tokenization-based LLM performance at scale with significant improvements in inference efficiency and robustness.
BLT encodes bytes into dynamically sized patches, which serve as the primary units of computation. Patches are segmented based on the entropy of the next byte, allo...
Views: 381
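The entropy-driven patching described above can be sketched in a few lines of Python. This is a toy illustration, not the paper's code: in BLT the next-byte distributions come from a small byte-level language model (here they are supplied directly), and the entropy threshold below is an arbitrary placeholder, not the paper's value.

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a next-byte distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def segment_patches(byte_seq, next_byte_probs, threshold=2.0):
    """Close the current patch whenever the next-byte entropy exceeds a threshold.

    byte_seq: sequence of byte values (ints 0..255).
    next_byte_probs: per position, a predicted distribution over the next byte
    (in BLT this comes from a small byte-level LM; here it is given directly).
    threshold: illustrative hyperparameter, not the paper's setting.
    """
    patches, current = [], []
    for b, probs in zip(byte_seq, next_byte_probs):
        current.append(b)
        if entropy(probs) > threshold:   # hard to predict -> patch boundary
            patches.append(bytes(current))
            current = []
    if current:
        patches.append(bytes(current))
    return patches

# Toy example: peaked (low-entropy) predictions inside a word,
# a uniform (high-entropy) prediction at each word boundary.
peaked = [0.97] + [0.03 / 255] * 255      # confident next-byte prediction
uniform = [1 / 256] * 256                 # maximally uncertain prediction
seq = list(b"hello world")
dists = [peaked] * 4 + [uniform] + [peaked] * 5 + [uniform]
print(segment_patches(seq, dists))        # → [b'hello', b' world']
```

The segmentation is content-dependent: predictable runs of bytes get folded into long patches, while uncertain regions (word starts, rare symbols) trigger new, shorter patches.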

Videos

Efficient Infinite Context Transformers with Infini-Attention (Paper Explained)
1.5K views · 9 months ago
❤️Support the channel❤️ Hi everyone. This video provides a comprehensive explanation of the recently presented paper "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention", introduced by Google researchers. Paper link: arxiv.org/abs/2404.07143 #LLM #nlp #LLMs #largelanguagemodels #transformers #Attention #deeplearning #naturallanguageprocessing
ConvMixer: Patches Are All You Need? (paper explained with implementation in PyTorch)
3.2K views · 11 months ago
❤️Support the channel❤️ In this video, the ConvMixer architecture proposed in the paper "Patches Are All You Need?" is explained, along with a simple yet comprehensive PyTorch implementation. Paper link: arxiv.org/abs/2201.09792 GitHub repository: github.com/Ardawanism/ConvMixer-Patches-Are-All-You-Need #deeplearning #machinelearning #convolutionalneuralnetwork #convolutionalneuralnetworks #...
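For reference, the ConvMixer forward pass is compact enough to sketch in full: a patch-embedding convolution followed by repeated blocks of depthwise convolution (spatial mixing, with a residual) and pointwise convolution (channel mixing). This follows the structure described in the paper, but the hyperparameters below are illustrative, not the paper's.

```python
import torch
import torch.nn as nn

class Residual(nn.Module):
    """Wraps a module and adds a skip connection around it."""
    def __init__(self, fn):
        super().__init__()
        self.fn = fn
    def forward(self, x):
        return self.fn(x) + x

def conv_mixer(dim, depth, kernel_size=9, patch_size=7, n_classes=10):
    """ConvMixer: patch embedding, then `depth` blocks of depthwise
    (spatial mixing) + pointwise (channel mixing) convolutions."""
    return nn.Sequential(
        nn.Conv2d(3, dim, kernel_size=patch_size, stride=patch_size),  # patch embedding
        nn.GELU(), nn.BatchNorm2d(dim),
        *[nn.Sequential(
            Residual(nn.Sequential(
                nn.Conv2d(dim, dim, kernel_size, groups=dim, padding="same"),
                nn.GELU(), nn.BatchNorm2d(dim))),
            nn.Conv2d(dim, dim, kernel_size=1),
            nn.GELU(), nn.BatchNorm2d(dim))
          for _ in range(depth)],
        nn.AdaptiveAvgPool2d((1, 1)), nn.Flatten(), nn.Linear(dim, n_classes))

model = conv_mixer(dim=64, depth=4)
out = model(torch.randn(2, 3, 32, 32))   # a CIFAR-10-sized batch
print(out.shape)                          # torch.Size([2, 10])
```

Note that `groups=dim` is what makes the large-kernel convolution depthwise (one filter per channel), which is the paper's separation of spatial and channel mixing.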
MLP-Mixer: An all MLP Architecture for Vision explained (with implementation in PyTorch)
1.4K views · 11 months ago
❤️Support the channel❤️ In this video, MLP-Mixer, an all-MLP architecture developed by the Google Brain team for computer vision tasks, is explained, along with a simple yet comprehensive PyTorch implementation. Paper link: arxiv.org/abs/2105.01601 GitHub repository: github.com/Ardawanism/MLP-Mixer-An-all-MLP-Architecture-for-Vision #deeplearning #machinelearning #computervision #...
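The paper's core idea, alternating a token-mixing MLP (across patches) with a channel-mixing MLP (across features), can be sketched as a single Mixer block roughly as follows; the dimensions are illustrative, not the paper's configuration.

```python
import torch
import torch.nn as nn

class MixerBlock(nn.Module):
    """One Mixer block: token-mixing MLP across the patch axis, then a
    channel-mixing MLP across the feature axis, each with a residual."""
    def __init__(self, n_patches, dim, token_hidden, channel_hidden):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.token_mlp = nn.Sequential(
            nn.Linear(n_patches, token_hidden), nn.GELU(),
            nn.Linear(token_hidden, n_patches))
        self.norm2 = nn.LayerNorm(dim)
        self.channel_mlp = nn.Sequential(
            nn.Linear(dim, channel_hidden), nn.GELU(),
            nn.Linear(channel_hidden, dim))

    def forward(self, x):                       # x: (batch, patches, dim)
        y = self.norm1(x).transpose(1, 2)       # (batch, dim, patches): mix across patches
        x = x + self.token_mlp(y).transpose(1, 2)
        return x + self.channel_mlp(self.norm2(x))

block = MixerBlock(n_patches=64, dim=128, token_hidden=32, channel_hidden=256)
out = block(torch.randn(2, 64, 128))
print(out.shape)   # torch.Size([2, 64, 128])
```

The full model stacks several such blocks on top of a patch-embedding layer and ends with global average pooling and a linear classifier head.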
AlphaGeometry Explained: Solving olympiad geometry without human demonstrations
553 views · 1 year ago
❤️Support the channel❤️ In this video, AlphaGeometry is explained clearly and informatively. Paper link: nature.com/articles/s41586-023-06747-5 Blog post by Google DeepMind: deepmind.google/discover/blog/alphageometry-an-olympiad-level-ai-system-for-geometry/ #deepmind #AlphaGeometry #llm
Dark Knowledge in Neural Networks - "Knowledge Distillation" Explanation and Implementation
3.2K views · 1 year ago
❤️Support the channel❤️ A clear and comprehensive explanation of Knowledge Distillation is presented in this video. In addition, the approach presented in the paper "Distilling the Knowledge in a Neural Network" is implemented from scratch using PyTorch. Read the "Distilling the Knowledge in a Neural Network" paper: arxiv.org/abs/1503.02531 #deeplearning #machinelearning #pytorch #neuralnetwork...
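The distillation objective from "Distilling the Knowledge in a Neural Network" (softened teacher targets at temperature T, blended with the usual hard-label loss) can be sketched roughly as follows; the values of T and alpha are typical illustrative choices, not prescribed by the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style distillation: KL between temperature-softened student and
    teacher distributions, plus cross-entropy on the hard labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean") * T * T      # T^2 keeps the gradient scale comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

loss = distillation_loss(torch.randn(8, 10), torch.randn(8, 10),
                         torch.randint(0, 10, (8,)))
print(loss.item())
```

The "dark knowledge" lives in the softened teacher distribution: at higher temperatures the teacher's relative probabilities over wrong classes carry information that one-hot labels discard.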
Attention Mechanism In a nutshell
577 views · 1 year ago
The Attention Mechanism has become a widely recognized concept in deep neural networks, extensively studied across diverse applications in the realm of Artificial Intelligence. The Attention Mechanism in deep learning allows models to dynamically concentrate on relevant parts of input data, enhancing their ability to understand context and relationships. In this video, I aim to explain the prin...
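As a minimal illustration of the idea, scaled dot-product attention (the form popularized by "Attention Is All You Need") can be written in a few lines of NumPy; the shapes below are arbitrary.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: output = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k))   # each query attends over all keys
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 queries of dimension 4
K = rng.normal(size=(5, 4))   # 5 keys of dimension 4
V = rng.normal(size=(5, 2))   # 5 values of dimension 2
out, w = attention(Q, K, V)
print(out.shape, w.sum(axis=-1))   # (3, 2), each row of weights sums to 1
```

Each output row is a weighted average of the value vectors, with the weights concentrating on the keys most similar to that query, which is exactly the "dynamic focus on relevant parts of the input" described above.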
MLP-Mixer An all MLP Architecture for Vision explained clearly and coded from scratch using PyTorch
306 views · 1 year ago
In this video, we review the MLP-Mixer architecture, an all-MLP architecture for vision developed by the Google Brain team, which achieves results competitive with CNNs (long the dominant architecture in the computer vision domain) and with transformer-based architectures. We do a quick review of the paper, mention the key points and ideas, and investigate the architecture in depth. if y...

Comments

  • @AshutoshKumar-cw8tw · 14 days ago

Hey, nice explanation... Thanks, but it would be more helpful if you could perform this task on better datasets like CIFAR-10.

  • @movienglish8038 · 25 days ago

    Nice job.

    • @NeuralBlackMagic · 24 days ago

      Glad you liked this video. Thank you very much for your kind support.

  • @baharhaghi9921 · 25 days ago

    Great video! Loved your insights on Meta's new paper and LLMs. Looking forward to more content like this

  • @Elmiravakili-z7n · 26 days ago

    Thank you for your insightful information 🙏🏻

    • @NeuralBlackMagic · 25 days ago

      Thanks for your kind support dear🙏☘ Glad you liked this video. Please subscribe to the channel and hit the bell icon to receive notifications when new videos are posted.

  • @puriaazad8180 · 26 days ago

    Insightful stuff👌

    • @NeuralBlackMagic · 26 days ago

      Thanks a lot for your warm support🤎🙏☘

  • @alihatamian9780 · 26 days ago

    Wonderful

    • @NeuralBlackMagic · 26 days ago

Thank you very much for your kind support🙏☘ Please subscribe to the channel and hit the bell icon to receive notifications when new videos are posted.

  • @amirzarabadipour652 · 27 days ago

    Interesting

    • @NeuralBlackMagic · 26 days ago

      Thank you very much for your kind support🤎🙏☘

  • @parvinemami1398 · 27 days ago

    Great

  • @Mimo-mv6fu · 27 days ago

    Perfect 🔥

    • @NeuralBlackMagic · 27 days ago

Thanks for your kind feedback. Glad you liked it.

    • @NeuralBlackMagic · 27 days ago

Please subscribe to the channel and hit the bell icon to receive notifications when new videos are posted 🙏☘.

  • @ardavanmodarres4720 · 27 days ago

    Thanks for sharing, more videos please🤩😍👊🔥

    • @NeuralBlackMagic · 27 days ago

Glad you liked this video, sure! We'll have lots of new cool videos.

  • @ardavanmodarres4720 · 27 days ago

    Wow🔥🔥 That was awesome🤎

  • @darshantank554 · 4 months ago

    Thanks for explanation 👍

    • @NeuralBlackMagic · 1 month ago

      Glad you liked it🤩😍 Please follow the channel and press the bell icon to receive notifications when new videos are posted.

  • @Lilina3456 · 5 months ago

    thank you so much, that was great

    • @NeuralBlackMagic · 1 month ago

      Glad you liked it🤩😍 Please follow the channel and press the bell icon to receive notifications when new videos are posted.

  • @sarataheri6611 · 8 months ago

    Thanks for sharing this helpful video! Keep going!💪

    • @NeuralBlackMagic · 6 months ago

      Thank you very much for your support my dear🙏☘

  • @puriaazad8180 · 9 months ago

    ⚡⚡

  • @puriaazad8180 · 9 months ago

    ⚡⚡

  • @meysamjahanfakhr7124 · 9 months ago

    Thanks for sharing this content

    • @NeuralBlackMagic · 9 months ago

      Thank you for your kind support🙏☘ Glad you liked it👊🔥

  • @SaraS-cq5yl · 9 months ago

Hey there, thanks for the video, it was a great paper, I'll go through it for more details. It was very related to the "Lost in the Middle" phenomenon and the use of RAG systems in LLM applications that I was recently reading about. It would be great if you shared more of this kind of content as you come across it in your research.

    • @NeuralBlackMagic · 9 months ago

      Thank you very much for your kind support, positive feedback, and great comment🙏☘ Glad you liked it👊🔥 Sure, I do my best to provide high quality educational videos about machine learning and deep learning. I'm not familiar with the "Lost in the middle" phenomenon. Could you please explain it a little bit more so we can find similarities with the "Infini-Attention"?

  • @amirzarabadipour652 · 9 months ago

    Keep going mate ✌️❤️

    • @NeuralBlackMagic · 9 months ago

      Glad you liked it👊🔥 Thanks for your support🙏☘

  • @maryamnavabi140 · 9 months ago

Awesome... keep going

    • @NeuralBlackMagic · 9 months ago

      Thank you very much for your kind support and positive feedback🙏☘ Glad you liked it👊🔥

  • @movienglish8038 · 9 months ago

    Wonderful and informative. Keep going

    • @NeuralBlackMagic · 9 months ago

      Glad you liked it 👊🔥 Thanks for your kind support 🙏🌹

  • @baharhaghi9921 · 9 months ago

The explanation was clear, I enjoyed it 👍

    • @NeuralBlackMagic · 9 months ago

      Thanks for your positive feedback🙏☘ Glad you liked it👊

  • @dagma3437 · 9 months ago

    Great content but please improve the audio volume and quality

    • @NeuralBlackMagic · 9 months ago

      Thanks a lot for your feedback🙏☘ I will definitely do that👊

    • @erikdahlen2588 · 9 months ago

      I agree 😊

    • @NeuralBlackMagic · 9 months ago

      @@erikdahlen2588 Thanks for your support and feedback🙏☘

    • @NeuralBlackMagic · 9 months ago

@@erikdahlen2588 Please subscribe and turn on notifications so you'll know when new videos are posted🔥👊

  • @rezalak583 · 9 months ago

    "Attention is all you need", it was awesome.

    • @NeuralBlackMagic · 9 months ago

      Glad you liked it☘🙏 Thanks a lot for your positive feedback🔥

  • @amirarsalanmodarres6509 · 9 months ago

This video was very informative ❤ thanks for sharing

    • @NeuralBlackMagic · 9 months ago

      Thanks for your feedback, glad you liked it☘🙏

  • @ardavanmodarres4720 · 9 months ago

    The "Infini-Attention" and "Infini-Transformer" seemed pretty cool :))

    • @NeuralBlackMagic · 9 months ago

      Yes, it was quite simple but interesting🤩

  • @ardavanmodarres4720 · 9 months ago

This video was extremely great and informative🔥 Thanks for sharing🙏

    • @NeuralBlackMagic · 9 months ago

      Thanks for your support🙏 Glad you liked it👊

  • @mrmangoheadthemango · 10 months ago

    You should make a video comparing the Vision Transformer and ConvMixer

    • @NeuralBlackMagic · 10 months ago

That's a very good idea, I liked that, thanks. What do you think about the "ConvMixer" architecture anyway? Share your ideas with me.

    • @NeuralBlackMagic · 10 months ago

Please take a look at my other video, the "MLP-Mixer", at your convenience. It's a beautiful architecture too. Please share your ideas about that with me too. Thanks.

  • @kuroyuki919 · 10 months ago

Good video. Have you thought about implementing the CapsNet architecture? It's also very good.

    • @NeuralBlackMagic · 10 months ago

That's a great and interesting idea, thanks for your suggestion. I'll definitely consider that as a topic for one of my future videos. What do you think about the "ConvMixer" architecture?

    • @NeuralBlackMagic · 10 months ago

Please take a look at my other video, the "MLP-Mixer", at your convenience. It's a beautiful architecture too. Please share your ideas about that with me too. Thanks.

  • @ardavanmodarres4720 · 10 months ago

This is literally the best explanation of the "Attention Mechanism" I've seen so far! Great job, thank you very much for sharing👌🙏🔥

  • @clement1181 · 11 months ago

    😓 *promo sm*

  • @buh357 · 11 months ago

Interesting paper, and thank you for the implementation as well. This architecture is basically EfficientNet with patch embedding. I am thinking that if this works, we should consider adding patch embedding to EfficientNet.

    • @NeuralBlackMagic · 11 months ago

Hi, glad you liked it and thanks for your support🙏☘ Yes, I agree with you, this architecture is very efficient, and adding patch embedding to EfficientNet seems very interesting. Please share the results with me if you do so. By the way, please subscribe to the channel and hit the bell icon to receive notifications when new videos are posted and stay updated🔥👊

  • @puriaazad8180 · 11 months ago

    👏⚡

    • @NeuralBlackMagic · 11 months ago

      Thanks a lot👊🔥 Please hit the bell icon to receive notifications when new videos are posted and stay updated✌

  • @meysamjahanfakhr7124 · 11 months ago

    I genuinely find it incredibly useful, and I'm pleasantly surprised by the simplicity of your explanation, it's refreshingly accessible.

    • @NeuralBlackMagic · 11 months ago

      Thanks a lot for your great feedback👊🔥 Please hit the bell icon to receive notifications when new videos are posted and stay updated✌

  • @meysamjahanfakhr7124 · 11 months ago

Thank you for sharing this content! I genuinely find it incredibly useful, and I'm quite amused by the simplicity of your language. It makes the information really accessible and easy to understand.

    • @NeuralBlackMagic · 11 months ago

      Glad you liked it and thanks for your support🙏☘ Please subscribe to the channel and hit the bell icon to receive notifications when new videos are posted and stay updated🔥👊

  • @meysamjahanfakhr7124 · 11 months ago

    Thanks for sharing this content, I found this incredibly insightful🌹🌹

    • @NeuralBlackMagic · 11 months ago

      Glad you liked it and thanks for your support🙏☘ Please subscribe to the channel and hit the bell icon to receive notifications when new videos are posted and stay updated🔥👊

  • @Sogandmarasi · 11 months ago

    Nice video

    • @NeuralBlackMagic · 11 months ago

      Thanks for your support☘🌻 Please follow the channel and hit the bell icon to receive notifications when new videos are posted💫

  • @amirarsalanmodarres6509 · 11 months ago

    Great as always 👏👏

  • @maryamnavabi140 · 11 months ago

    Great job 👏🏻

  • @Haniyahmadi · 11 months ago

    I really appreciate the time and effort you put into creating the implementation section. I hope you continue this way!

    • @NeuralBlackMagic · 11 months ago

      Glad you liked it🌻 Thanks for your support🙏 I try to do my best👊🔥

  • @Haniyahmadi · 11 months ago

    Such a great video with an excellent explanation. Thanks for sharing

    • @NeuralBlackMagic · 11 months ago

      Glad you liked it🌻 Thanks for your support🙏 Please hit the bell icon to receive notifications when new videos are posted✌

  • @artqazal1380 · 11 months ago

    👍👍

    • @NeuralBlackMagic · 11 months ago

      Glad you liked it🌻 Thanks for your support🙏 Please hit the bell icon to receive notifications when new videos are posted✌

  • @artqazal1380 · 11 months ago

    👍👍👌👌

    • @NeuralBlackMagic · 11 months ago

      Glad you liked it🌻 Thanks for your support🙏 Please hit the bell icon to receive notifications when new videos are posted✌

  • @maryamnavabi140 · 11 months ago

    Perfect video

    • @NeuralBlackMagic · 11 months ago

      Glad you liked it🌻 Thanks for your support🙏

  • @baharhaghi9921 · 11 months ago

    Great as always 🌱

  • @baharhaghi9921 · 11 months ago

    Thank you for your clear explanation🙏

    • @NeuralBlackMagic · 11 months ago

      Thanks for your feedback🌻 Glad you liked it👊🔥

  • @movienglish8038 · 11 months ago

Keep making such informative videos

    • @NeuralBlackMagic · 11 months ago

Sure =) Thanks for your support 👊🔥