Fake News Detection Project with BERT Fine-tuning | Deep Learning for NLP | Project#11

  • Published: 5 Nov 2024

Comments • 37

  • @MarutiBhakth7147
    @MarutiBhakth7147 1 year ago +1

    Thank you so much sir... I literally fell in love with the BERT model. You really have great teaching skills; only an awesome teacher can teach ML in such an interesting, easy, and detailed way.... ❤🥰

    • @skillcate
      @skillcate  1 year ago

      Thanks for the lovely feedback Rakesh ❤️

  • @AfeezeeHQ
    @AfeezeeHQ 9 months ago +1

    Thank you for this video. It's very enlightening. However, I have a question. This is fake news detection based solely on linguistic cues. The research I'm working on goes beyond predicting from linguistic cues: source credibility analysis and contextual analysis are involved too. How can I develop a fake news prediction model that will effectively detect whether news is fake or real based on an extensive dataset such as the Twitter Truth Seeker dataset?

  • @MuhammadAbdullah-gx2ou
    @MuhammadAbdullah-gx2ou 10 months ago

    Thanks for this wonderful project. Please clarify one point about transfer learning: have we used it here? If yes, where and how? Thanks.

  • @jadhavpruthviraj5788
    @jadhavpruthviraj5788 2 years ago +1

    Man, what a video, loved it... The way you explain, your patience, is outstanding. Keep on dropping videos.

  • @hooman3563
    @hooman3563 2 years ago +3

    Thanks for the video! How can we save the fine-tuned model so that we can run it next time without going through fine-tuning the model again?

    • @skillcate
      @skillcate  2 years ago +3

      Hey Hooman 😊, I'm sure you are a dog lover!!
      Within the Fake News Detection code file, under the 'Model Training' section, the # Train & Predict code script has this line:
      torch.save(model.state_dict(), 'c2_new_model_weights.pt')
      With this line of code, the model weights get stored in your working directory. For loading these saved weights, you may use (also shown in the tutorial at 15:24):
      # load weights of best model
      path = 'c2_new_model_weights.pt'
      model.load_state_dict(torch.load(path))
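For reference, the save-and-load pattern described in this reply can be sketched end to end as below. The classifier class here is a hypothetical stand-in for the tutorial's BERT-based model; only the weights filename is taken from the reply, and the key point is that the same architecture must be rebuilt before loading the state dict:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the tutorial's BERT-based classifier;
# any nn.Module with the same architecture on both sides works.
class TinyClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(768, 2)  # 768 = BERT hidden size, 2 = real/fake

    def forward(self, x):
        return self.fc(x)

model = TinyClassifier()

# Save only the learned parameters (the state dict), not the whole object
torch.save(model.state_dict(), 'c2_new_model_weights.pt')

# Later, in a fresh session: rebuild the same architecture, then load weights
model2 = TinyClassifier()
model2.load_state_dict(torch.load('c2_new_model_weights.pt'))
model2.eval()  # switch to inference mode before predicting
```

Saving the state dict rather than the whole model object keeps the file portable across code refactors, which is why it is PyTorch's recommended pattern.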

  • @mryoso25
    @mryoso25 1 year ago

    Very insightful. Thank you so much for this! Hope we can have an NLP deep learning project for text classification.

  • @sadik3611
    @sadik3611 1 year ago +1

    Hi sir, your videos are just awesome. I used this code for sentiment analysis to detect spam email.

    • @skillcate
      @skillcate  1 year ago

      Great 👍 Glad that you find it useful.

  • @hemalshah1410
    @hemalshah1410 1 year ago

    Thank you for sharing such great content with great explanation. 🙌 Keep posting!

    • @skillcate
      @skillcate  1 year ago

      Glad that you find it useful!

  • @edilgin
    @edilgin 1 year ago

    I followed the notebook and trained my model for 30 epochs, but in every epoch the training loss was 0.001 and the test loss was 0.003. Any reason why that might be the case?

  • @mayanklohani19
    @mayanklohani19 1 year ago

    What would change if we want to use the text instead of the title?

  • @LucyOConnor-u6g
    @LucyOConnor-u6g 1 year ago

    I tried running this notebook and hit a bunch of errors on the sklearn imports. I also tried running it with an older version of sklearn and hit errors. Any help? I am fairly new to coding.

  • @venkatdatta.g9105
    @venkatdatta.g9105 7 months ago

    Can you let me know how to save the model?

  • @danielihenacho
    @danielihenacho 2 years ago +1

    Is it possible to use Hugging Face transformers for a project similar to this?

    • @skillcate
      @skillcate  2 years ago +1

      Hey Daniel, how are you, mate!
      Yes, sure you can. Not just similar: you can actually use it for "almost" any problem you may have, be it NLP / CV / Audio. Here's their documentation: huggingface.co/docs/transformers/index
      For similar problems, you may do Sentiment Analysis, Movie/Book Recommendation, Text Summarization, Topic Modeling, etc. within NLP.
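As a quick illustration of the Hugging Face library mentioned in this reply, a sentiment-analysis pipeline can be spun up in a few lines. This is a sketch, not the tutorial's code; the checkpoint named here is the library's common default for this task and is downloaded on first use:

```python
from transformers import pipeline

# Load a pretrained text-classification pipeline; no fine-tuning needed
clf = pipeline("sentiment-analysis",
               model="distilbert-base-uncased-finetuned-sst-2-english")

result = clf("This tutorial was genuinely helpful.")[0]
print(result["label"], round(result["score"], 3))
```

The same `pipeline` entry point covers other tasks ("summarization", "question-answering", and so on) by swapping the task string and checkpoint.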

    • @danielihenacho
      @danielihenacho 1 year ago

      @@skillcate Thanks for the information. May I also ask if a TensorFlow version of this episode is available?

  • @Siriiiiii
    @Siriiiiii 9 months ago

    Also do a project on fake reviews.

  • @harissaeed5811
    @harissaeed5811 2 years ago +1

    Hi sir, I need your help. I have only 10 days left for my FYP submission. Can you please help me? I don't know how to implement BERT on a Roman Urdu dataset; otherwise my FYP will be rejected. Can you please make a video on it?

    • @skillcate
      @skillcate  2 years ago

      Let's keep in touch on this.

  • @YagnasreeMusalreddygari
    @YagnasreeMusalreddygari 7 months ago

    Sir, I am unable to see the datasets from the link you provided; it shows "no results found".

  • @maleeshamendis5924
    @maleeshamendis5924 10 months ago

    Hi sir, I used the same code and ran it. But with epochs = 2, the code takes a long time to run and doesn't even finish. I even tried increasing the number of epochs. No progress. How can I resolve it? :(

    • @skillcate
      @skillcate  10 months ago

      Where are you running the code? It is likely that you are training on a CPU machine, where training takes much longer than on a GPU.
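A quick way to confirm which device PyTorch is training on, and to move work to the GPU when Colab provides one. This is a minimal sketch; the commented lines show where a model and input batches would be moved (the names `model` and `batch` are illustrative):

```python
import torch

# Prefer the GPU when one is available; otherwise fall back to CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Training on: {device}")

# model = model.to(device)                             # move model parameters
# batch = {k: v.to(device) for k, v in batch.items()}  # move each input batch
```

In Colab, a GPU runtime is enabled under Runtime → Change runtime type; without it, `device` will report `cpu`.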

  • @basmahhyder5695
    @basmahhyder5695 2 years ago +1

    I implemented this code for fake reviews detection and got an accuracy of 57%, which was quite unexpected.. :(

    • @skillcate
      @skillcate  1 year ago

      Dear Basmah, how many epochs did you train your model for? I think you might have done 2 epochs. If so, do 2-4 more epochs for your results to get better.

    • @basmahhyder5695
      @basmahhyder5695 1 year ago

      @@skillcate Yes, I changed that, and with epochs = 10 I got an accuracy of 71%. That's the maximum accuracy I could get.

  • @rashdakhanzada8058
    @rashdakhanzada8058 1 year ago

    How do we work with a Roman Urdu dataset using BERT?

    • @skillcate
      @skillcate  1 year ago +1

      Dear Rashda, we'll soon be doing a video on this problem statement. Allow us a week's time.

    • @rashdakhanzada8058
      @rashdakhanzada8058 1 year ago

      @SKILLCATE Sure, that would be great. Another thing I want to know: Colab offers a 12-hour runtime per session. Does this mean that after 12 hours we cannot run the same notebook for free?

    • @skillcate
      @skillcate  1 year ago +1

      @@rashdakhanzada8058 The 12-hour runtime per session in Colab means you can use the free version of Colab for up to 12 consecutive hours. Once the time is up, you can start a new session and continue your work. It's like driving a rental car for a specific duration and then getting a new car if you need more time.

    • @rashdakhanzada8058
      @rashdakhanzada8058 1 year ago

      @SKILLCATE So now I have to create a new notebook, insert the same code into it, and can work on it for 12 more hours? And I have to do this again and again...

    • @skillcate
      @skillcate  1 year ago +1

      @@rashdakhanzada8058 Not really. The notebook remains as is. Think of it like this: suppose you're training a very heavy deep learning model whose training runs for, say, 24 hours. With a free Colab account, you have a runtime limit of 12 hours, so you won't be able to train such a heavy model.
      But for beginner-level data science work, you will face absolutely no problem, as most of your training would take minutes to a couple of hours. 🤗