Varsha's engineering stuff
  • 201 videos
  • 913,319 views
Part 6: Introduction to NLP, Why NLP is hard, Textual Humor, Sarcasm, Idioms, Neologisms, Tokenization
Textual Humor
Sarcasm
Tricky Entity Names
Idioms
Neologisms
Segmentation Issues
New Senses of a word
Non-standard use of English [e.g., informal, short forms]
New Words or Phrases
Multiway Interpretation (Confusing Meanings)
Language Imprecision and Vagueness; Extreme Examples of Lexical Ambiguity
Views: 24

Videos

Part 5: Introduction to NLP, Language, Grammar and Knowledge in NLP
3 views, 19 hours ago
Part 5: Introduction to NLP, Language, Grammar and Knowledge in NLP
Part 4: History of NLP, First Era, Second Era, Third Era, Fourth Era, Introduction to NLP
2 views, 19 hours ago
Early NLP History (1940-1950), Second Era (1957-1970), Third Era (1970-1993), Fourth Era (1993 to date)
NLP Introduction Part 3: Generic NLP System, Parser, Semantic, Pragmatic & Discourse, Reasoner
72 views, 21 days ago
Generic NLP System: Input Processor, NLP Processor block, Output Processor; Parser, Semantic Processor, Pragmatic & Discourse Processor, Reasoner, Action & Response Generator
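The staged pipeline named in that video (input processor, parser, semantic, pragmatic & discourse, reasoner, response generator) can be sketched as a chain of functions. This is an illustrative toy, not the video's actual code; every function body here is a placeholder standing in for the real stage.

```python
# Toy sketch of the generic NLP system pipeline: each stage is a placeholder.

def input_processor(text):
    # Normalize and tokenize the raw input.
    return text.lower().split()

def parser(tokens):
    # Stand-in for syntactic parsing: tag each token with a dummy label.
    return [(tok, "WORD") for tok in tokens]

def semantic_processor(parse):
    # Stand-in for semantic analysis: collect the word forms as "meaning".
    return {"content": [tok for tok, _ in parse]}

def pragmatic_discourse_processor(meaning, context):
    # Resolve the meaning against a (toy) discourse context.
    meaning["context"] = context
    return meaning

def reasoner(meaning):
    # Decide on an action from the interpreted input.
    return "greet" if "hello" in meaning["content"] else "unknown"

def response_generator(action):
    # Produce the system's output for the chosen action.
    return {"greet": "Hello! How can I help?", "unknown": "Sorry?"}[action]

def generic_nlp_system(text, context=None):
    # Run the stages in the order the video's block diagram lists them.
    tokens = input_processor(text)
    parse = parser(tokens)
    meaning = semantic_processor(parse)
    meaning = pragmatic_discourse_processor(meaning, context or {})
    action = reasoner(meaning)
    return response_generator(action)

print(generic_nlp_system("Hello there"))  # -> Hello! How can I help?
```

The point of the sketch is only the data flow: each stage consumes the previous stage's output, so any block (e.g. the parser) can be swapped for a real implementation without touching the others.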
NLP Introduction Part I: Definition, Natural Language Generation (NLG) & Understanding (NLU), Need, Goals
68 views, a month ago
NLP Introduction Part I: Definition, Natural Language Generation (NLG) & Understanding (NLU), Need, Goals
Semantic Analysis Part 3: Relations among lexemes & their Senses, NLP, Homonymy, Polysemy, Synonymy
42 views, a month ago
Part 2: Semantic Analysis, NLP, Semantic Relations: Relations among lexemes & their Senses; Synonymy, Antonymy, Gradation, Homonymy, Polysemy, Hypernymy/Hyponymy, Meronymy/Holonymy, Entailment & Troponymy
Part 1: Semantic Analysis, NLP, Computational, Distributional, Formal Semantics, Lexicon & Lexeme
53 views, a month ago
Semantic Analysis, Part 1:NLP, Computational, Distributional, Formal Semantics, Lexicon & Lexeme
Part 7: Earley Parser, Top Down Parser, NLP, Predict, Scan, Complete, Chart, Table, CFG Rule
62 views, a month ago
Part 7: Earley Parser, Top Down Parser, NLP, Predict, Scan, Complete, Chart, Table, CFG Rule
Parser Part 6: Predictive Parser, Top Down, NLP, First, Follow, Stack, look ahead, Predictive Table
63 views, a month ago
Parser Part 6: Predictive Parser, Top Down, NLP, First, Follow, Stack, look ahead, Predictive Table. Imp Note: In the second example, there are minor mistakes in the last and second-last rows. By mistake, the Verb rule got typed under the Determiner section, and the same happened for the Auxiliary Verb. Also, when applying a rule, we have to check for ambiguity.
Part 5: PCFG parser, Bottom Up Parser, NLP, CFG, Probability, CYK Algorithm, Parse Trees Exercises
162 views, a month ago
Part 5: PCFG parser, Bottom Up Parser, NLP, CFG, Probability, CYK Algorithm, Parse Trees Exercises. PCFG (Probabilistic Context Free Grammar) Parser
Part 4: Bottom Up Parser, Shift Reduce Parser, Stack, Shift, Reduce, Ambiguity, Backtracking exercis
39 views, a month ago
Part 4: Bottom Up Parser, Shift Reduce Parser, Stack, Shift, Reduce, Ambiguity, Backtracking exercises
Parser Part 3: Bottom Up, Cocke-Younger-Kasami (CYK or CKY Parser), NLP, CNF, CFG, Tree, Dynamic
89 views, a month ago
Parser: Bottom Up, Cocke-Younger-Kasami (CYK or CKY Parser), NLP, CNF, CFG, Parse Tree, Dynamic Programming
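The CYK algorithm that video covers fills a dynamic-programming chart bottom-up over a grammar in Chomsky Normal Form. A minimal recognizer sketch, with a toy grammar and lexicon chosen for illustration (they are not the video's examples):

```python
from itertools import product

# Toy CNF grammar: each rule maps a pair of child nonterminals
# to the set of parents that can produce them.
grammar = {
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
}
# Toy lexicon: word -> preterminal categories.
lexicon = {"the": {"Det"}, "dog": {"N"}, "cat": {"N"}, "saw": {"V"}}

def cyk(words):
    """Return True if the start symbol S derives the word sequence."""
    n = len(words)
    # table[i][j] holds the nonterminals deriving words[i : i + j + 1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):                # length-1 spans from lexicon
        table[i][0] = set(lexicon.get(w, set()))
    for span in range(2, n + 1):                 # increasing span length
        for i in range(n - span + 1):            # span start position
            for k in range(1, span):             # split point inside the span
                left = table[i][k - 1]
                right = table[i + k][span - k - 1]
                for b, c in product(left, right):
                    table[i][span - 1] |= grammar.get((b, c), set())
    return "S" in table[0][n - 1]

print(cyk("the dog saw the cat".split()))  # True
```

Because every cell only combines smaller spans already in the chart, the whole run is O(n^3 · |G|); attaching back-pointers to each cell entry would turn this recognizer into a parser that recovers the tree.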
Part 2: NLP Parsers, Modelling Constituency, CFG, Chomsky Normal Form (CNF), Top down & Bottom Up
93 views, a month ago
Modelling Constituency, Context Free Grammar (CFG), Chomsky Normal Form (CNF), Top Down & Bottom Up Approach
Part 1: Parsers in NLP, Parsers' Role, Words & Word Groups (Constituency), Types of Parsers, Ambiguity
113 views, a month ago
Part 1: Parsers in NLP, Parsers' Role, Words & Word Groups (Constituency), Types of Parsers, Ambiguity
Good Turing Discounting, Smoothing, C*, P*GT, Backoff, Interpolation, Laplace, MLE, NLP
133 views, a month ago
Good Turing Discounting, Smoothing, C*, P*GT, Backoff, Interpolation, Laplace, MLE, NLP
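Good-Turing discounting, the topic of that video, replaces a raw count c with the adjusted count c* = (c + 1) · N_{c+1} / N_c, where N_c is the number of distinct items seen exactly c times, and reserves N_1 / N of the probability mass for unseen events. A tiny worked sketch on an assumed toy corpus (the numbers are illustrative, not from the video):

```python
from collections import Counter

# Toy corpus: a x4, b x3, c x2, d x1  (10 observations total)
tokens = "a a a a b b b c c d".split()
counts = Counter(tokens)          # per-item counts
N = len(tokens)                   # total observations = 10
Nc = Counter(counts.values())     # count-of-counts: N_1=N_2=N_3=N_4=1

def c_star(c):
    # Good-Turing adjusted count: c* = (c + 1) * N_{c+1} / N_c.
    # Only defined for counts c that were actually observed (N_c > 0).
    return (c + 1) * Nc[c + 1] / Nc[c]

def p_gt(c):
    # Good-Turing probability P*_GT of an item seen c times.
    return c_star(c) / N

p_unseen = Nc[1] / N              # mass reserved for unseen events

print(c_star(1))   # 2 * N_2 / N_1 = 2.0
print(p_gt(1))     # 2.0 / 10 = 0.2
print(p_unseen)    # 1 / 10 = 0.1
```

In practice N_{c+1} is zero for the largest counts, so real implementations smooth the N_c curve (or fall back to the raw count above a threshold) before applying the formula, and combine the result with backoff or interpolation as the video's title suggests.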
Part 6: Image Processing Introduction, Connectivity, Adjacency, Euclidean, City Block, Chess Board
23 views, a month ago
Part 6: Image Processing Introduction, Connectivity, Adjacency, Euclidean, City Block, Chess Board
Part 5: Image Processing Introduction, IMAGE FILE FORMATS, TIFF, BMP, JPEG, Features, Adv, Disadv
14 views, a month ago
Part 5: Image Processing Introduction, IMAGE FILE FORMATS, TIFF, BMP, JPEG, Features, Adv, Disadv
Part 4: Image Processing, Sampling, Quantization, Spatial Resolution, Gray Level, Tonal Resolution
92 views, 2 months ago
Part 4: Image Processing, Sampling, Quantization, Spatial Resolution, Gray Level, Tonal Resolution
Part 3: Components or Elements of an Image Processing System
64 views, 2 months ago
Part 3: Components or Elements of an Image Processing System
Part 2: Fundamental Steps in Image Processing, Acquisition, Filtering, Segmentation, Morphology
96 views, 2 months ago
Part 2: Fundamental Steps in Image Processing, Acquisition, Filtering, Segmentation, Morphology
Part 1: Introduction to Image Processing, Definition of a Digital Image, Applications
54 views, 2 months ago
Part 1: Introduction to Image Processing, Definition of a Digital Image, Applications
Part 6: Image Transform, Fast Hadamard Transform, Butterfly approach, Complexity N*log2(N-1)
342 views, 2 months ago
Part 6: Image Transform, Fast Hadamard Transform, Butterfly approach, Complexity N*log2(N-1)
Part 5: Image Transform, Discrete Cosine Transform, DCT, Basis images, Cosine, Real, Low, High freq
147 views, 2 months ago
Part 5: Image Transform, Discrete Cosine Transform, DCT, Basis images, Cosine, Real, Low, High freq
Part 4: Image Transform, Fourier Transform, Exercise on 1D, 2D signal, Forward & Backward Transform
141 views, 2 months ago
Part 4: Image Transform, Fourier Transform, Exercise on 1D, 2D signal, Forward & Backward Transform
Part 3: Image Transform: Walsh Transform, Increasing sign change, Exercises on 1D, 2D, WHT
95 views, 2 months ago
Part 3: Image Transform: Walsh Transform, Increasing sign change, Exercises on 1D, 2D, WHT
Part 2: Image Transform, Walsh Hadamard Transform, 1D Signal, 2D Signal, Recursive Generation
168 views, 2 months ago
Part 2: Image Transform, Walsh Hadamard Transform, 1D Signal, 2D Signal, Recursive Generation
Part 1: Image Transform, Introduction, Unitary Transform, Orthogonal Transform, Kronecker Product
499 views, 2 months ago
Part 1: Image Transform, Introduction, Unitary Transform, Orthogonal Transform, Kronecker Product
Part 7: Image Compression, Lossy, Vector Quantization, R,L, Code book, Code vectors, Image Vectors
210 views, 2 months ago
Part 7: Image Compression, Lossy, Vector Quantization, R,L, Code book, Code vectors, Image Vectors
Part 6: Image Compression, Improved Gray scale Quantization, False contouring, MSB, LSB
300 views, 2 months ago
Part 6: Image Compression, Improved Gray scale Quantization, False contouring, MSB, LSB
Part 5: Image Compression, Lossless, Arithmetic Encoding, and Lossless Predictive Coding (DPCM)
150 views, 2 months ago
Part 5: Image Compression, Lossless, Arithmetic Encoding, and Lossless Predictive Coding (DPCM)

Comments

  • @foxleaderneon4920
@foxleaderneon4920 3 days ago

    👏👏

  • @rithvikshetty8052
@rithvikshetty8052 7 days ago

    Hi Vinay

  • @Sai_Rajzz
@Sai_Rajzz 9 days ago

    Thanks a lot 🙏

  • @Sabik47
@Sabik47 14 days ago

90 minutes of content covered in 12 minutes. Thanks!

  • @anantasmapan6131
@anantasmapan6131 16 days ago

Hello, I found your video very helpful for my research. Can you give me some recommendations for determining relevant documents to be used as references for calculating the effectiveness and recall of my suggested method? Is it possible to create relevant documents without experts? If yes, what method can I use? Thanks.

    • @varshasengineeringstuff4621
@varshasengineeringstuff4621 16 days ago

Glad it was helpful! To create relevant documents without experts, you can use methods like crowdsourcing, automated relevance feedback, pseudo-relevance feedback, and content-based filtering.

  • @danianiazi8229
@danianiazi8229 17 days ago

Hi, where can I get your slides?

  • @binary_110
@binary_110 22 days ago

    Share the PPT please

  • @rukmanikhandhan
@rukmanikhandhan 27 days ago

Mam, can you post "Features and Unification - Structures - Unification of Structures - Features and Structures in Grammar - Implementing Unification - Parsing with Unification Constraints - Probabilistic CFG - Probabilistic Lexicalized CFG - Dependency Grammar"?

  • @Noor_Sayed_98
@Noor_Sayed_98 a month ago

It was very helpful & easy to understand, thank you!

  • @gyansujan1464
@gyansujan1464 a month ago

    Easy explanation thank you

  • @deebigadevi
@deebigadevi a month ago

Mam, which routing protocol does it come under?

  • @makkenamaryelizbeth3286
@makkenamaryelizbeth3286 a month ago

    Thank you

  • @achyutcreations961
@achyutcreations961 a month ago

    Need more explanation

    • @varshasengineeringstuff4621
@varshasengineeringstuff4621 a month ago

It is made only to show that the drawbacks of CYK can be overcome. The next part is not in our university syllabus.

  • @Ikaachandubattula
@Ikaachandubattula a month ago

At 12:24, why are you not applying the formula used at 5:26, i.e., the chain rule?

  • @mahakal1541
@mahakal1541 a month ago

Best ever explanation. I've become a fan of ma'am's explanations.

    • @varshasengineeringstuff4621
@varshasengineeringstuff4621 a month ago

Thank you so much, dear, for such wonderful compliments. Do subscribe to my channel and share it with your friends.

  • @madhavannaikar1874
@madhavannaikar1874 a month ago

Very helpful for sem 7 students!

  • @praveenkumarg-vz4pi
@praveenkumarg-vz4pi a month ago

    Thank you

  • @jenithaagnesj4956
@jenithaagnesj4956 a month ago

    👌Clear explanation for Decision tree

  • @jenithaagnesj4956
@jenithaagnesj4956 a month ago

    Perfect 👍

  • @jenithaagnesj4956
@jenithaagnesj4956 a month ago

    Very clear contents

  • @nishanthnishanth8185
@nishanthnishanth8185 2 months ago

    super video aunty

  • @azverndias913
@azverndias913 2 months ago

    Thank you so much

  • @riboy2133
@riboy2133 2 months ago

Try to speak louder; your voice is very low. Try to explain with more examples, step by step.

  • @user-qp9so1by1j
@user-qp9so1by1j 2 months ago

    Thank you

  • @YashBurad-cq9rj
@YashBurad-cq9rj 2 months ago

    Any one from SPIT college?

  • @aalindshukla3894
@aalindshukla3894 2 months ago

    Thank you so much

  • @gyansujan1464
@gyansujan1464 2 months ago

    Very good explanation

  • @gyansujan1464
@gyansujan1464 3 months ago

    Great explanation

  • @jyotirmayeejena8518
@jyotirmayeejena8518 3 months ago

    Thank you ma'am ❤

  • @shivamkumar-du1jo
@shivamkumar-du1jo 3 months ago

You're great!

  • @muhammadilyas-bo6nr
@muhammadilyas-bo6nr 3 months ago

Overall a good explanation, but try to use a marker on screen to clarify what exactly you are reading and teaching to students.

  • @chiragsingh8323
@chiragsingh8323 3 months ago

Thanks a lot, mam! You were my only hope.

  • @aayup4
@aayup4 3 months ago

How can we decide that we have to consider 8 features?

    • @varshasengineeringstuff4621
@varshasengineeringstuff4621 3 months ago

It's not compulsory to take eight features. In HMM we take only the current and previous case. Depending on the language, we can decide the features such that the prediction for POS tagging will be accurate.

  • @saiei
@saiei 4 months ago

Perplexity is not in the syllabus.

  • @sarthaksharma5929
@sarthaksharma5929 4 months ago

    Good

  • @TunerXMusic4u
@TunerXMusic4u 4 months ago

At 8:43, why is T1 removed, as it contains I1, I2, which is a frequent itemset?

  • @lakithpusarla1040
@lakithpusarla1040 4 months ago

How do we calculate the initial probabilities of the HMM if I have the vocabulary set and tag set? For example, if I am given a set of sentences in their normalised form and their respective POS tags?

  • @nithyabp1347
@nithyabp1347 4 months ago

Could you please upload a video on word sense disambiguation in modern information retrieval, or share any link on this topic? Thank you.

  • @amansethi1305
@amansethi1305 4 months ago

Why did we take N as 3 and 4 in perplexity? (How was N changed for the same sentence?)

  • @sigmacryo4200
@sigmacryo4200 4 months ago

    This helps a ton!

  • @_beingdeeksha
@_beingdeeksha 4 months ago

She could explain more neatly; she was going fast.

  • @VaidehiMahyavanshi-um3io
@VaidehiMahyavanshi-um3io 5 months ago

    Great explanation!

  • @FaizKhan2k3
@FaizKhan2k3 5 months ago

Mam, please continue the NLP playlist.

  • @arkblod6517
@arkblod6517 5 months ago

    Nice explanation

  • @MrManoj6667
@MrManoj6667 5 months ago

Is it possible to share the PPT?