Tweet Scraping using Twint and Sentiment Analysis using HF Transformers

  • Published: 23 Dec 2024

Comments • 28

  • @HC-gn2pv · 3 years ago +1

    Very clear explanation, thank you. You are awesome!

  • @lordchupreme5306 · 3 years ago +1

    Thank you for the video. It would have been nice if you did some animated visualizations using the data.

  • @adamirsyad3723 · 2 years ago

    Hello, what if we want to search for a term containing the ampersand (&) symbol, such as H&M? What would be the command? Thank you
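
    One way this might work (a sketch, not a confirmed twint feature): twint passes c.Search through as a raw Twitter search query, so quoting the term should keep the ampersand as part of the phrase:

    ```python
    import twint

    c = twint.Config()
    c.Search = '"H&M"'   # quoted so the "&" stays part of the phrase (assumption; verify against results)
    c.Limit = 20
    twint.run.Search(c)
    ```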

  • @rajeevmayekar1775 · 1 year ago

    I'm getting a "Could not find the Guest token in HTML" error

  • @d3v487 · 3 years ago

    Sir, please make a video on how to fine-tune and train our own model for a domain-specific dataset.

    • @RitheshSreenivasan · 3 years ago

      Sure, will try to make a video

    • @rafiulislam4838 · 3 years ago

      @RitheshSreenivasan That would be amazing if you could. Great content.

  • @medSad333 · 3 years ago

    The function column_names() doesn't work

    • @RitheshSreenivasan · 3 years ago

      Please have a look at the documentation for the latest version

    • @medSad333 · 3 years ago

      @RitheshSreenivasan Thanks

    • @medSad333 · 3 years ago +1

      @RitheshSreenivasan In your GitHub code, the parameter "likes_count" should be replaced by "nlikes" (as in the video); that was the problem.
      I also needed to download TensorFlow. Thanks for everything!
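
      To avoid guessing column names like this, one option is to print the columns twint actually returns. A minimal sketch, assuming a run with c.Pandas = True (twint stores results in the module-level DataFrame twint.storage.panda.Tweets_df):

      ```python
      import twint

      c = twint.Config()
      c.Search = "lockdown"
      c.Limit = 20
      c.Pandas = True
      twint.run.Search(c)

      # List the real column names (e.g. "nlikes", "nreplies", "nretweets")
      df = twint.storage.panda.Tweets_df
      print(df.columns.tolist())
      ```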

  • @faizaahmad314 · 1 year ago

    I want to scrape emotion data in Roman Urdu (e.g. happy, sad, angry, fear).
    How can I do this?
    Please reply fast, it's urgent.

    • @RitheshSreenivasan · 1 year ago +1

      Response from ChatGPT:
      To scrape emotions data in Roman Urdu from Twitter, you can use the Tweepy library, which provides an easy-to-use interface for interacting with the Twitter API. Here's example Python code that demonstrates how to scrape emotions data from Twitter using Tweepy:
      ```python
      import tweepy

      # Twitter API credentials
      consumer_key = "YOUR_CONSUMER_KEY"
      consumer_secret = "YOUR_CONSUMER_SECRET"
      access_token = "YOUR_ACCESS_TOKEN"
      access_token_secret = "YOUR_ACCESS_TOKEN_SECRET"

      # Authenticate with the Twitter API
      auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
      auth.set_access_token(access_token, access_token_secret)
      api = tweepy.API(auth)

      # Keywords to search for emotions in Roman Urdu
      keywords = ["happy", "sad", "angry", "fear"]

      # Scrape emotions data from Twitter
      emotions_data = []
      for keyword in keywords:
          tweets = tweepy.Cursor(api.search, q=keyword, lang="ur").items(100)  # Adjust the number of items as per your requirements
          for tweet in tweets:
              emotion = {
                  "keyword": keyword,
                  "text": tweet.text,
                  "created_at": tweet.created_at,
              }
              emotions_data.append(emotion)

      # Print the scraped emotions data
      for emotion in emotions_data:
          print(emotion)
      ```
      Before running the code, make sure you have installed the Tweepy library (`pip install tweepy`) and have obtained your Twitter API credentials by creating a Twitter Developer account and setting up a new application. Replace the `"YOUR_CONSUMER_KEY"`, `"YOUR_CONSUMER_SECRET"`, `"YOUR_ACCESS_TOKEN"`, and `"YOUR_ACCESS_TOKEN_SECRET"` placeholders with your actual API credentials.
      In the code above, the `keywords` list contains the emotions you want to search for in Roman Urdu. The code will retrieve 100 tweets per keyword (you can adjust this number) that match the specified emotions and language ("ur" for Urdu).
      The scraped data includes the keyword, the text of the tweet, and the timestamp of when the tweet was created. You can modify or extend the code to extract additional information from the tweets if needed.
      Please note that web scraping Twitter is subject to Twitter's API usage limits and terms of service. Make sure to comply with the guidelines and use the data responsibly.
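
      To connect this back to the video's second half, here is a minimal sketch of running a Hugging Face sentiment pipeline over the scraped texts. The default pipeline model is English-only; for Roman Urdu a multilingual or Urdu-specific checkpoint would likely be needed (an assumption to verify):

      ```python
      from transformers import pipeline

      # Default sentiment model (English); swap in a multilingual checkpoint
      # for Roman Urdu text (assumption: a suitable model exists on the Hub).
      classifier = pipeline("sentiment-analysis")

      # emotions_data comes from the scraping snippet above
      texts = [e["text"] for e in emotions_data]
      for text, result in zip(texts, classifier(texts)):
          print(result["label"], round(result["score"], 3), text[:80])
      ```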

    • @faizaahmad314 · 1 year ago

      @RitheshSreenivasan Thanks for your response

  • @AafaqFazal · 3 years ago

    Hello bro, I want to ask some questions

  • @HBG-SHIVANI · 3 years ago

    How can we install transformers with PyTorch in Anaconda?

    • @RitheshSreenivasan · 3 years ago

      Create a conda virtual environment, then install transformers and PyTorch inside it.

    • @RitheshSreenivasan · 3 years ago

      conda create --name pytorchvenv python=3.6
      conda activate pytorchvenv
      pip install transformers
      pip install torch
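
      A quick sanity check after installing (assumes the default sentiment model can be downloaded on first run):

      ```python
      import torch
      from transformers import pipeline

      # Confirm PyTorch imports and a transformers pipeline actually runs
      print(torch.__version__)
      print(pipeline("sentiment-analysis")("I love this video!"))
      ```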

  • @maitreyeeganguly3283 · 3 years ago

    Sir, I tried to run this part:
    import twint
    c = twint.Config()
    c.Search = "lockdown in India"
    # Custom output format
    c.Format = "Username: {username} | Tweet: {tweet}"
    c.Limit = 100
    c.Since = '2021-03-30'
    c.Pandas = True
    a = twint.run.Search(c)
    After that, the error shows like this:
    CRITICAL:root:twint.run:Twint:Feed:noDataExpecting value: line 1 column 1 (char 0)
    sleeping for 1.0 secs
    CRITICAL:root:twint.run:Twint:Feed:noDataExpecting value: line 1 column 1 (char 0)
    sleeping for 8.0 secs
    CRITICAL:root:twint.run:Twint:Feed:noDataExpecting value: line 1 column 1 (char 0)
    sleeping for 27.0 secs
    Can you please help me with that?

    • @RitheshSreenivasan · 3 years ago +1

      Please refer to the twint documentation. I have only explored twint to the extent required for this video

    • @yunchaohou2011 · 3 years ago

      Did you solve it?

    • @yunchaohou2011 · 3 years ago

      Uninstall and get the latest version
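
      The reinstall usually suggested for this error is the GitHub master branch rather than the PyPI release (the install command below is from the twint README):
      pip uninstall twint -y
      pip install --upgrade git+https://github.com/twintproject/twint.git@origin/master#egg=twint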

    • @rajeevmayekar1775 · 1 year ago

      Same issue... did you solve that error? Any help appreciated