Training Neural Networks On Ghost Memories: Better Than RAG Tuning

  • Published: 19 Jan 2025

Comments • 2

  • @ujjwalkumar-we7tl · 22 days ago · +1

    Very interesting! I wonder what would happen if we just deactivated 50% of the neurons during inference and measured how much the model's performance degrades.
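
    A minimal sketch of the ablation this comment proposes, assuming a PyTorch model: zero out a random 50% of hidden activations at inference time via forward hooks and compare outputs. The toy network and the output-difference metric here are illustrative stand-ins, not the video's method.

    ```python
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Hypothetical toy model; swap in the network under discussion.
    model = nn.Sequential(
        nn.Linear(16, 64), nn.ReLU(),
        nn.Linear(64, 64), nn.ReLU(),
        nn.Linear(64, 4),
    )
    model.eval()

    def deactivate_fraction(fraction: float):
        """Return a forward hook that zeroes a random `fraction` of units.

        Unlike nn.Dropout, this runs in eval mode and does not rescale
        the surviving units, matching a plain 'deactivate neurons' test.
        """
        def hook(module, inputs, output):
            mask = (torch.rand_like(output) >= fraction).float()
            return output * mask  # returned tensor replaces the output
        return hook

    x = torch.randn(8, 16)
    with torch.no_grad():
        baseline = model(x)

        # Attach the ablation hook to every hidden ReLU layer.
        handles = [m.register_forward_hook(deactivate_fraction(0.5))
                   for m in model if isinstance(m, nn.ReLU)]
        ablated = model(x)
        for h in handles:
            h.remove()

    # Crude degradation proxy; a real study would track task accuracy.
    print("mean |delta output|:", (baseline - ablated).abs().mean().item())
    ```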

  • @AbdoMohammed-jt5ye · 22 days ago · +2

    I don't understand anything, but it seems like wonderful work.