Thank you Nikhil for your kind words. Despite my YouTube channel being underrated, the comments on my recent video have made me proud of my teaching. Your motivation and encouragement have inspired me to create more videos. Let's keep learning together!!
Hi, please add more videos continuing CKY: chart parsing, PCFG, dependency parsing, transition-based parsing. Thanks for your lectures, but we need more videos.
Apply probabilistic CKY parsing of a PCFG to the sentence “John eats pie with ice-cream” and update the table.

Production rule    Probability
S → NP VP          1.0
NP → N             0.4
NP → NP PP         0.6
VP → V NP          0.7
VP → VP PP         0.3
PP → P NP          1.0
N → "John"         0.2
N → "pie"          0.3
N → "ice-cream"    0.5
V → "eats"         1.0
P → "with"         1.0

Can you please explain this?
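For the question above, here is a minimal Python sketch of probabilistic (max-probability) CKY for this grammar. The unary rule NP → N is not strict CNF, so it is applied as an extra step right after the lexical rules; the function name `pcky` and the data layout are my own, not from the video.

```python
# Probabilistic CKY: chart[i][j] maps a nonterminal to the best
# probability of deriving words i..j of the input.
from collections import defaultdict

# Grammar from the question: binary rules, unary rule, and lexical rules.
binary = {
    ("NP", "VP"): [("S", 1.0)],
    ("NP", "PP"): [("NP", 0.6)],
    ("V", "NP"):  [("VP", 0.7)],
    ("VP", "PP"): [("VP", 0.3)],
    ("P", "NP"):  [("PP", 1.0)],
}
lexical = {
    "John": [("N", 0.2)], "pie": [("N", 0.3)], "ice-cream": [("N", 0.5)],
    "eats": [("V", 1.0)], "with": [("P", 1.0)],
}
unary = {"N": [("NP", 0.4)]}  # NP -> N (handled as a post-lexical step)

def pcky(words):
    n = len(words)
    chart = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n + 1)]
    # Fill the diagonal with lexical rules, then apply the unary rule.
    for i, w in enumerate(words):
        for sym, p in lexical[w]:
            chart[i][i + 1][sym] = p
            for parent, q in unary.get(sym, []):
                chart[i][i + 1][parent] = max(chart[i][i + 1][parent], q * p)
    # Combine adjacent spans, keeping the max probability per nonterminal.
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for b, pb in list(chart[i][k].items()):
                    for c, pc in list(chart[k][j].items()):
                        for a, q in binary.get((b, c), []):
                            chart[i][j][a] = max(chart[i][j][a], q * pb * pc)
    return chart

chart = pcky("John eats pie with ice-cream".split())
print(chart[0][5]["S"])  # best parse probability for the whole sentence (≈ 0.0008064)
```

Under this grammar the two parses of "eats pie with ice-cream" (VP attachment vs. NP attachment of the PP) get different probabilities, and the chart keeps only the larger one at each cell.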
This man earned a lot of respect literally 💌
Thank you so much sir. I’m trying to learn this concept from different sources but yours was clear.
Most underrated YouTube channel, it seems!!
I really want to thank you, Mr. Binod. This was made so easy to understand thanks to your genius.
Sir, your way of explanation is simply superb.
Thanks Binod, it's really simple to understand. When I go through your videos, you make the concept easy to understand and quick to grasp. Thanks once again.
Wonderful explanation, sir ❤. Thank you!
Sir, brilliant explanation. Thanks for teaching; I respect your efforts.
Very useful video! Clear explanation on the concept and easily understandable numerical.
Thanks a lot, sir, for explaining it clearly. You really helped me understand this concept better than any other lecturer has.
Thank you Binod for your videos. it helped for exams!
Glad to know this NLP (Natural Language Processing) tutorial video helped you. Thanks for your nice words!!
Fantastic explanation. Thank you Binod!
this is a super helpful explanation. Thanks for making this video..
Tomorrow is my exam; watching this video boosted my confidence. Easy to understand.
Thank you for explaining this clearly
Happy to hear, Nihal, that this NLP (Natural Language Processing) CKY video tutorial series helped you. Keep learning!! @binodsumanacademy
Thank you ! This was an amazing explanation
You're very welcome! Good to know this CKY video helped you.
Thank you, sir, you made this topic clear for me.
Thanks for your video, very helpful and understandable
Glad this NLP video was helpful for you! Keep Learning !!
explained so well thanks so much
Thank you Sir ! You make subject interesting.
Glad to hear, Nidhi, that this Natural Language Processing (NLP) tutorial series helped you. Keep learning, and thank you for your nice words!!
Thank you soooooo much, it was really helpful!
Good to know this NLP CKY algorithm YouTube video helped you. Thank you for the nice words. Keep learning!!
I could be wrong, but I calculated a different final probability: in chart[0,5] I get 2.304×10^-8.
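For anyone comparing numbers, here is one derivation under the grammar posted in the question above, multiplied out by hand. This is only a sanity check of my own arithmetic, not the video's worked table:

```python
# Derivation with the PP attached inside the object NP:
# S -> NP VP, NP -> N -> "John", VP -> V NP, NP -> NP PP, PP -> P NP
p_john = 0.4 * 0.2          # NP -> N, N -> "John"
p_pie  = 0.4 * 0.3          # NP -> N, N -> "pie"
p_ice  = 0.4 * 0.5          # NP -> N, N -> "ice-cream"
p_pp   = 1.0 * 1.0 * p_ice  # PP -> P NP, P -> "with"
p_np   = 0.6 * p_pie * p_pp # NP -> NP PP  ("pie with ice-cream")
p_vp   = 0.7 * 1.0 * p_np   # VP -> V NP, V -> "eats"
p_s    = 1.0 * p_john * p_vp  # S -> NP VP
print(p_s)                  # ≈ 0.0008064
```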
Thanks for the explanation though! Very much enjoyed it.
thank you sir, great explanation
This is really helpful, thank you ! :))
Glad this Natural Language Processing tutorial was useful for you, Keep Learning !!
What a lovely teacher, keep it up 🙃🌹
clear and concise!
Hello, so if the probabilistic CKY gives more than one parse, do you choose the more probable one?
Thanks, sir, this has been very helpful; I really appreciate it.
Good explanation; it also helps with implementing it in a program.
Big thanks 🙏.
Sir! Please explain the Chu-Liu-Edmonds algorithm.
Thank You, It is a nice video.
Thank You, Sir.
Thank you sir
Thanks Soo much👌
Why didn't NP and V combine in the [0,3] cell?
awesome
Will the set of rules be given in the question paper, or do we have to come up with them on our own?
Please complete the playlist sir
Please make a video on how you calculate so fast.
❤
How can precision be achieved with the probabilistic CKY algorithm?
Sir, you have made an error here: 0.3*0.4+0.02 = ??
Nice sir
Sir, which book are you referring to?
parse tree ??
By any chance, are you from Assam?
💯
Next video when?
What if the probability is not given?
The IIT professor failed to teach the CYK algorithm as effectively as you did.
Binod Suman, I think the calculation at the end is wrong.
Binod
Binod
Binod
ha ha Binod