Simple and to the point, Brilliant instructor.
Thank you
Thanks for the patient explanation. I finally understand what it means.
I'm so glad to hear that!
Thanks a ton Professor, you gave me a good head-start into dependency parsing. I need it for dense captioning.
You're welcome! I'm glad it was helpful.
Well-explained dear professor! Thanks.
Very helpful and a high-quality explanation. Thanks!
Thank you, @Frankie!
Great explanation!
Thanks Dennis!
PERFECT TUTORIAL
Thank you.
How does it handle sentences we wouldn't consider standard, i.e. sentences generated by foreign language learners that contain anomalous features, errors, or 'odd' constructions?
Hi Tommy, thanks for reaching out! A supervised dependency parser will generate the most likely parse of an 'odd' sentence, given what it has observed in its training data; so, if the training data is not representative, then the parser may make mistakes (but hopefully not too many). If your goal is to increase the reliability of the parser for sentences generated by foreign language learners, then you may augment the parser's training data with examples that are representative of foreign language learners. If your goal is to detect (or correct) incorrect parses (e.g. prune incorrect arcs in the dependency graph), you may consider using a sequence-to-sequence model for this (see www.aaai.org/ocs/index.php/AAAI/AAAI18/paper/viewFile/17038/16064).
I hope that helps!
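To illustrate the first point, here is a minimal sketch (not part of the original reply) showing that a pretrained parser still returns its most likely tree even for an ungrammatical, learner-style sentence; the spaCy model and the example sentence are just assumptions for illustration.

```python
# Sketch: a pretrained dependency parser produces its most likely parse
# even when the input sentence is ungrammatical.
# Assumes spaCy and its small English model are installed:
#   pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Yesterday I go to school but teacher not there.")  # example learner-style sentence

for token in doc:
    # Every token is attached to a head with a dependency label,
    # whether or not the sentence is well-formed.
    print(f"{token.text:10s} --{token.dep_:>8s}--> {token.head.text}")
```

Whether those attachments are the ones you want depends on how similar such sentences are to the parser's training data, which is why augmenting the training data with learner-produced examples can help.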
How can I distinguish the head from the dependent? Sometimes it's easy to confuse the relationship. For example, why is nsubj headed by the verb, while 'case' is headed by the noun? What's the logic?
Would you please share a link to the slides? They are not readable in the video.
Hi Baran; I don't have a publicly available version of the slides. However, if you increase the resolution of the YouTube video, the text should be significantly clearer.
Hello, I'm having trouble solving this problem:
we have [012345]
Word 0 has a dependency on word 2.
Word 2 has dependencies on words 1, 5, and 3.
Word 5 has a dependency on word 4.
There are 11 transitions to reach the dependency tree above. I have tried a few times, but all my answers are wrong. Any help would be much appreciated. Thank you.
I have tried many times and got all of these wrong answers.
Wrong answers:
SH SH SH LA LA SH LA SH SH LA RA
SH SH LA SH LA SH SH LA RA RA
SH SH SH SH LA SH SH LA SH SH LA RA
SH SH SH LA LA SH LA SH SH LA RA
SH SH SH LA SH SH LA SH SH LA RA LA
SH SH SH LA SH LA SH SH LA RA RA
SH SH LA SH SH LA SH LA SH LA RA
SH SH LA SH RA SH SH LA LA RA
SH SH LA SH SH LA SH SH LA RA RA
SH SH SH LA SH LA SH SH LA RA RA
SH SH LA SH SH LA SH LA SH LA RA RA
SH SH SH LA LA SH LA SH SH LA LA RA
SH SH SH LA LA SH LA SH SH LA RA RA
SH SH SH LA LA SH LA SH LA LA RA RA
SH SH SH LA LA SH LA SH SH LA RA RA
SH SH LA SH SH LA SH SH SH RA RA
SH SH LA SH SH LA SH LA SH LA RA RA
SH SH LA SH SH LA SH LA SH LA RA
SH SH LA SH SH LA SH LA LA RA RA
SH SH LA SH SH LA LA SH SH SH RA
SH SH LA SH LA RA SH SH LA RA RA
Please help me!
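For anyone stuck on the same exercise, here is a minimal Python sketch (not from the lecture) that simulates a transition sequence and checks the arcs it builds against the target tree. It assumes the arc-standard conventions where LA/RA operate on the top two stack items, with LA making the top the head and RA making the second item the head, and it reads "word 0 has a dependency on word 2" as word 0 being the head of word 2; the definitions used in the course may differ, so adjust accordingly.

```python
# Minimal arc-standard simulator: apply a sequence of transitions and
# compare the resulting arcs against the target dependency tree.
# Assumed conventions (may differ from the lecture's definitions):
#   SH : push the next buffer word onto the stack
#   LA : add arc top -> second (top of stack is the head), remove the second item
#   RA : add arc second -> top (second item is the head), remove the top item

def run_transitions(words, transitions):
    stack, buffer, arcs = [], list(words), set()
    for t in transitions:
        if t == "SH":
            stack.append(buffer.pop(0))
        elif t == "LA":
            head, dep = stack[-1], stack[-2]
            arcs.add((head, dep))
            del stack[-2]
        elif t == "RA":
            head, dep = stack[-2], stack[-1]
            arcs.add((head, dep))
            stack.pop()
        else:
            raise ValueError(f"unknown transition: {t}")
    return arcs

# Target arcs, reading "word h has a dependency on word d" as h -> d (h is the head).
target = {(0, 2), (2, 1), (2, 3), (2, 5), (5, 4)}

candidate = "SH SH SH LA SH RA SH SH LA RA RA".split()
print(run_transitions(range(6), candidate) == target)  # True under these assumptions
```

Under these assumptions, 11 transitions are exactly 6 shifts plus one arc action per non-root word, and the candidate sequence in the example reproduces the five target arcs; double-check it against the transition definitions given in the lecture before submitting.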