What BERT Can’t Do: The Transformer's Decoder [Lecture]
- Published: Oct 10, 2022
- Neural LM Overview: • Neural Language Models...
BERT Encoder: • Understanding BERT: Th...
This is a single lecture from a course. If you like the material
and want more context (e.g., the lectures that came before), check out
the whole course:
boydgraber.org/teaching/CMSC_...
(Including homeworks and reading.)
Music: / review-and-rest
Thank you for the video. Would you be able to share a link to Mark Riedl's paper showing the detailed architecture of transformers?
Sadly, no. He tweeted the image out, but he's not sure if he's allowed to make the underlying materials freely available. (I asked, because it's so awesome.)
You can always sign up for GATech's online education offerings! I'm sure it's higher quality than the stuff I give out for free on YouTube. I'm really jealous of their support for teachers offering courses online.
Is there a link to the "Mark Riedl(?)" transformer diagram? I can't find it in the description.
pbs.twimg.com/media/FZUiCbpXgAEd11j?format=jpg&name=large ... say no more ;)
Stop teleporting!!!
Thanks (honestly) for the feedback. I was trying out a new multicam setup and I agree that it didn't work out as well as I would have hoped.
I would have liked the video more if you didn't change location of diagrams every 15 seconds...
Hope you keep watching and continue giving feedback!
@Jordan Boyd-Graber It wasn't a bad idea, but you should probably readjust the time between switches so it's not distracting.