![Varsha's engineering stuff](/img/default-banner.jpg)
- Videos: 201
- Views: 913,319
Varsha's engineering stuff
India
Joined 6 Dec 2011
Having been in the teaching profession for the last 17 years, I had never thought of creating YouTube videos. During the coronavirus pandemic, the idea of making videos for my students came to my mind, and just for luck I uploaded a video to RUclips.
I was surprised by the positive response from viewers, and I am thankful to them. Please give me suggestions for improvement.
My name is Dr. Varsha Patil; I hold a Ph.D. in Image Processing from Mumbai University.
I have expertise in Image Processing, Data Mining, Machine Learning, Natural Language Processing, and more.
If you really like my channel, subscribe and share the link with your friends.
Part 6: Introduction to NLP, Why NLP is hard, Textual Humor, Sarcasm, Idioms, Neologisms, Tokenization
Textual Humor
Sarcasm
Tricky Entity Names
Idioms
Neologisms
Segmentation Issues
New Senses of a Word or Phrase
Non-standard use of English [e.g., Informal, Short forms]
Multiway Interpretation (Confusing Meanings)
Language Imprecision and Vagueness
Extreme examples of Lexical Ambiguity
Views: 24
Videos
Part 5: Introduction to NLP, Language, Grammar and Knowledge in NLP
3 views · 19 hours ago
Part 4: History of NLP, First Era, Second Era, Third Era, Fourth Era, Introduction to NLP
2 views · 19 hours ago
Early NLP History (1940-1950), Second Era (1957-1970), Third Era (1970-1993), Fourth Era (1993 to date)
NLP Introduction Part 3: Generic NLP System, Parser, Semantic, Pragmatic & Discourse, Reasoner
72 views · 21 days ago
Generic NLP System: Input Processor, NLP Processor block, Output Processor; Parser, Semantic Processor, Pragmatic & Discourse Processor, Reasoner, Action & Response Generator
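A minimal sketch of how such a pipeline might be wired together in Python. The stage names mirror the blocks listed above, but every function body is an illustrative placeholder, not the implementation from the video.

```python
# Illustrative generic NLP system pipeline (stage names from the video
# description; the processing inside each stage is a toy placeholder).

def input_processor(text: str) -> list:
    """Tokenize raw input text (simplified whitespace split)."""
    return text.lower().split()

def parser(tokens: list) -> dict:
    """Produce a (toy) syntactic analysis of the token sequence."""
    return {"tokens": tokens, "structure": "flat"}  # real parsers build trees

def semantic_processor(parse: dict) -> dict:
    """Attach a (toy) meaning representation to the parse."""
    parse["meaning"] = " ".join(parse["tokens"])
    return parse

def pragmatic_discourse_processor(semantics: dict, context: list) -> dict:
    """Interpret meaning in context (here: just record prior utterances)."""
    semantics["context"] = list(context)
    return semantics

def reasoner(interpretation: dict) -> str:
    """Decide on an action from the interpretation (placeholder rule)."""
    return "greet" if "hello" in interpretation["tokens"] else "unknown"

def output_processor(action: str) -> str:
    """Action & response generator: map the chosen action to a reply."""
    return {"greet": "Hello! How can I help?"}.get(action, "Sorry, I did not understand.")

if __name__ == "__main__":
    context = []
    parse = parser(input_processor("Hello there"))
    interpretation = pragmatic_discourse_processor(semantic_processor(parse), context)
    print(output_processor(reasoner(interpretation)))  # -> Hello! How can I help?
```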
NLP Introduction Part I: Definition, Natural Language Generation (NLG) & Understanding (NLU), Need, Goals
68 views · a month ago
Semantic Analysis Part 3: Relations among lexemes & their Senses, NLP, Homonymy, Polysemy, Synonymy
42 views · a month ago
Part 2: Semantic Analysis, NLP, Semantic Relations: Relations among lexemes & their Senses: Synonymy, Antonymy, Gradation, Homonymy, Polysemy, Hypernymy/Hyponymy, Meronymy/Holonymy, Entailment & Troponymy
Part 1: Semantic Analysis, NLP, Computational, Distributional, Formal Semantics, Lexicon & Lexeme
53 views · a month ago
Semantic Analysis, Part 1: NLP, Computational, Distributional, Formal Semantics, Lexicon & Lexeme
Part 7: Earley Parser, Top Down Parser, NLP, Predict, Scan, Complete, Chart, Table, CFG Rule
62 views · a month ago
Parser Part 6: Predictive Parser, Top Down, NLP, First, Follow, Stack, look ahead, Predictive Table
63 views · a month ago
Parser Part 6: Predictive Parser, Top Down, NLP, First, Follow, Stack, Look-ahead, Predictive Table. Important note: In the second example, there are minor mistakes in the last two rows; by mistake, the Verb rule was entered under the Determiner section, and the same happened for the Auxiliary Verb rule. Also, when applying a rule, we have to check for ambiguity.
Part 5: PCFG parser, Bottom Up Parser, NLP, CFG, Probability, CYK Algorithm, Parse Trees Exercises
162 views · a month ago
Part 5: PCFG Parser, Bottom Up Parser, NLP, CFG, Probability, CYK Algorithm, Parse Trees Exercises. PCFG (Probabilistic Context Free Grammar) Parser.
Part 4: Bottom Up Parser, Shift Reduce Parser, Stack, Shift, Reduce, Ambiguity, Backtracking exercises
39 views · a month ago
Parser Part 3: Bottom Up, COCKE-YOUNGER-KASAMI (CYK or CKY Parser), NLP, CNF, CFG, Tree, Dynamic
89 views · a month ago
Parser Bottom Up, COCKE-YOUNGER-KASAMI (CYK or CKY Parser), NLP, CNF, CFG, Parse Tree, Dynamic programming
Part 2: NLP Parsers, Modelling Constituency, CFG, Chomsky Normal Form (CNF), Top down & Bottom Up
93 views · a month ago
Modelling Constituency, Context Free Grammar (CFG), Chomsky Normal Form (CNF), Top Down & Bottom Up Approaches
Part 1: Parsers in NLP, Parsers' Role, Words & Word Groups (Constituency), Types of Parsers, Ambiguity
113 views · a month ago
Good Turing Discounting, Smoothing, C*, P*GT, Backoff, Interpolation, Laplace, MLE, NLP
133 views · a month ago
Part 6: Image Processing Introduction, Connectivity, Adjacency, Euclidean, City Block, Chess Board
23 views · a month ago
Part 5: Image Processing Introduction, IMAGE FILE FORMATS, TIFF, BMP, JPEG, Features, Adv, Disadv
14 views · a month ago
Part 4: Image Processing, Sampling, Quantization, Spatial Resolution, Gray Level, Tonal Resolution
92 views · 2 months ago
Part 3: Components or Elements of an Image Processing System
64 views · 2 months ago
Part 2: Fundamental Steps in Image Processing, Acquisition, Filtering, Segmentation, Morphology
96 views · 2 months ago
Part 1: Introduction to Image Processing, Definition of a Digital Image, Applications
54 views · 2 months ago
Part 6: Image Transform, Fast Hadamard Transform, Butterfly approach, Complexity N*log2(N-1)
342 views · 2 months ago
Part 5: Image Transform, Discrete Cosine Transform, DCT, Basis images, Cosine, Real, Low, High freq
147 views · 2 months ago
Part 4: Image Transform, Fourier Transform, Exercise on 1D, 2D signal, Forward & Backward Transform
141 views · 2 months ago
Part 3: Image Transform: Walsh Transform, Increasing sign change, Exercises on 1D, 2D, WHT
95 views · 2 months ago
Part 2: Image Transform, Walsh Hadamard Transform, 1D Signal, 2D Signal, Recursive Generation
168 views · 2 months ago
Part 1: Image Transform, Introduction, Unitary Transform, Orthogonal Transform, Kronecker Product
499 views · 2 months ago
Part 7: Image Compression, Lossy, Vector Quantization, R,L, Code book, Code vectors, Image Vectors
210 views · 2 months ago
Part 6: Image Compression, Improved Gray scale Quantization, False contouring, MSB, LSB
300 views · 2 months ago
Part 5: Image Compression, Lossless, Arithmetic Encoding, and Lossless Predictive Coding (DPCM)
150 views · 2 months ago
👏👏
Hi Vinay
😶‍🌫️
Thanks a lot 🙏
You're most welcome
90 minutes of content covered in 12 minutes. Thanks!
So sweet of you dear. Thank you so much for such nice compliments.
Hello, I found your video very helpful for my research. Can you give me some recommendations for determining relevant documents to be used as references when calculating the effectiveness and recall of my proposed method? Is it possible to create relevant documents without experts? If yes, what method can I use? Thanks.
Glad it was helpful! To create relevance judgments without experts, you can use methods like crowdsourcing, automated relevance feedback, pseudo-relevance feedback, and content-based filtering.
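A minimal sketch of the pseudo-relevance feedback idea mentioned in that reply, assuming a simple TF-IDF retrieval setup with scikit-learn. The corpus, query, cutoff k, and the "method under test" set are all illustrative placeholders: the top-k retrieved documents are simply treated as relevant so recall can be estimated without expert judgments.

```python
# Pseudo-relevance feedback sketch: treat the top-k retrieved documents as
# "relevant" so effectiveness/recall can be estimated without expert labels.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [  # toy document collection
    "image processing with wavelet transforms",
    "machine learning for natural language processing",
    "deep learning approaches to image segmentation",
    "cooking recipes for the weekend",
]
query = "image processing and segmentation"
k = 2  # how many top-ranked documents to assume relevant

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(corpus)
query_vector = vectorizer.transform([query])

# Rank documents by cosine similarity to the query.
scores = cosine_similarity(query_vector, doc_vectors).ravel()
ranking = scores.argsort()[::-1]

# Pseudo-relevance assumption: the top-k ranked documents form the relevant set.
pseudo_relevant = set(int(i) for i in ranking[:k])
print("Assumed relevant documents:", [corpus[i] for i in pseudo_relevant])

# Recall of some other method's output against the pseudo-relevant set
# (the retrieved set below is a hypothetical result of the method under test).
retrieved_by_my_method = {0, 3}
recall = len(retrieved_by_my_method & pseudo_relevant) / len(pseudo_relevant)
print(f"Estimated recall: {recall:.2f}")
```

Note that this only approximates true relevance; crowdsourcing gives human (if non-expert) judgments, while pseudo-relevance feedback trades accuracy for zero annotation cost.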
Hi, where can I get your slides?
Share the PPT please
Ma'am, can you post "Features and Unification: Structures, Unification of Structures, Features and Structures in Grammar, Implementing Unification, Parsing with Unification Constraints, Probabilistic CFG, Probabilistic Lexicalized CFG, Dependency Grammar"?
For which university's syllabus?
It was very helpful and easy to understand, thank you!
You are welcome!
Easy explanation, thank you!
You are welcome
Ma'am, which routing protocol does it come under?
Dear, it is density-based clustering, used to cluster similar items.
Thank you
Thanks dear
Need more explanation
It is meant only to show that the drawbacks of CYK can be overcome. The next part is not in our university syllabus.
At 12:24, why are you not applying the formula used at 5:26, i.e., the chain rule?
Best explanation ever. I have become a fan of ma'am's explanations.
Thank you so much, dear, for such wonderful compliments. Do subscribe to my channel and share it with your friends.
Very helpful for Sem 7 students!
Thank you dear
Thank you
Thanks for comment
👌 Clear explanation of the decision tree.
Thanks for the compliment. Please share it in your friend groups.
Perfect 👍
Thank you! Cheers!
Very clear contents
Keep watching
super video aunty
Thank you so much
You're most welcome
Try to speak louder; your voice is very low. Also try to make things easier to understand with more step-by-step examples.
ok
@varshasengineeringstuff4621 But your notes are really helpful, ma'am.
Thank you
You're welcome
Anyone from SPIT college?
Thank you so much
You're most welcome
Very good explanation
Thanks for liking
Great explanation
Glad you liked it
Thank you ma'am ❤
Most welcome 😊
You're great!
Thank you
Overall a good explanation, but try to use a marker on screen to show exactly what you are reading and teaching.
sure
Thanks a lot, ma'am! You were my only hope.
My pleasure 😊
How can we decide that we have to consider 8 features?
It's not compulsory to take eight features. In an HMM, we consider only the current and previous states. Depending on the language, we can choose features such that the POS-tagging prediction will be accurate; see the sketch below.
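As a hedged illustration of what such features might look like, here is a toy feature extractor for a word in context. The specific features (suffixes, capitalization, neighboring words) are common choices for feature-based taggers, not the exact set from the video, and the right set depends on the language.

```python
# Toy feature extractor for POS tagging (illustrative feature set, not the
# exact features from the video; choose features to suit the language).
def word_features(sentence, i):
    word = sentence[i]
    return {
        "word": word.lower(),
        "suffix2": word[-2:],                 # short suffixes often signal POS
        "suffix3": word[-3:],
        "is_capitalized": word[0].isupper(),  # useful for proper nouns
        "is_digit": word.isdigit(),
        "prev_word": sentence[i - 1].lower() if i > 0 else "<s>",
        "next_word": sentence[i + 1].lower() if i < len(sentence) - 1 else "</s>",
    }

print(word_features(["She", "reads", "books"], 1))
```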
Perplexity is not in the syllabus.
Good
Thanks
At 8:43, why is T1 removed when it contains {I1, I2}, which is a frequent itemset?
How do we calculate the initial probabilities of the HMM if I have the vocabulary set and the tag set? For example, if I am given a set of sentences in normalized form along with their respective POS tags?
You can calculate the initial probabilities from the dataset, e.g., by counting how often each tag appears at the start of a sentence.
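A minimal sketch of that estimation, assuming a toy tagged corpus: the initial probability of a tag is the fraction of sentences that start with it (the maximum-likelihood estimate).

```python
# Estimate HMM initial probabilities from a POS-tagged corpus (toy data):
# P(tag at sentence start) = count(sentences starting with tag) / count(sentences).
from collections import Counter

tagged_sentences = [  # each sentence is a list of (word, tag) pairs
    [("the", "DET"), ("dog", "NOUN"), ("barks", "VERB")],
    [("she", "PRON"), ("reads", "VERB"), ("books", "NOUN")],
    [("the", "DET"), ("cat", "NOUN"), ("sleeps", "VERB")],
]

start_counts = Counter(sentence[0][1] for sentence in tagged_sentences)
total = len(tagged_sentences)
initial_probs = {tag: count / total for tag, count in start_counts.items()}
print(initial_probs)  # -> {'DET': 0.667, 'PRON': 0.333} (approximately)
```

Transition and emission probabilities can be estimated the same way, by counting tag-to-tag and tag-to-word pairs over the whole corpus.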
Could you please upload a video on word sense disambiguation in modern information retrieval, or share a link on this topic? Thank you.
OK. I know WSD in NLP.
Why did we take N as 3 and 4 in perplexity? (How was N changed for the same sentence?)
This helps a ton!
Thanks for the nice reply. Please share my video links with your friends.
Could have explained more clearly; she was going fast.
It may be because I had already taught this to my students in class. I will definitely improve.
Great explanation!
Thanks!
Ma'am, please continue the NLP playlist.
Give me a list of topics.
Nice explanation
Is it possible to share the PPT?
No.