How To Fix Japanese Keyword Hack On WordPress - After Fixing WordPress Hack

  • Published: 23 Aug 2024

Comments • 53

  • @jasonetaylor
    @jasonetaylor 6 months ago +2

    This was helpful. My spam links were not under the area you showed; they were all under the first tab in Pages: "Alternate page with proper canonical tag".

    • @WebSensePro
      @WebSensePro 6 months ago

      Thanks for mentioning that in the comments.

    • @jasonetaylor
      @jasonetaylor 6 months ago

      How long before these bad links fall off? It's been a few weeks and I still see them show up. @WebSensePro

  • @user-yy8hf9pd1z
    @user-yy8hf9pd1z 1 year ago +6

    How do I download 36k indexed URLs to put in a sitemap? Google only allows 1k. Any ideas?

    • @alihaelectronics
      @alihaelectronics 1 year ago +2

      I'm also having the same problem, bro. Do you have any idea?

    • @WebSensePro
      @WebSensePro 1 year ago +2

      You mean Google only exports 1k URLs?

    • @user-yx7um8tm5s
      @user-yx7um8tm5s 10 months ago

      My site has 253k URLs 😢😢 How do I solve this issue?
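
A note on the sitemap limits discussed above: a plain-text sitemap (like the spam.txt file shown in the video) is just one URL per line, and Google accepts at most 50,000 URLs per sitemap file, so a larger URL list has to be split across several files. A minimal Python sketch of the splitting step (the file names and URLs are made up for illustration):

```python
def chunk_urls(urls, size=50_000):
    # Split a flat URL list into sitemap-sized chunks;
    # Google's documented limit is 50,000 URLs per sitemap file.
    return [urls[i:i + size] for i in range(0, len(urls), size)]

# Hypothetical list of 120,000 spam URLs -> spam-1.txt, spam-2.txt, spam-3.txt
urls = [f"https://example.com/spam-page-{n}.html" for n in range(120_000)]
for n, chunk in enumerate(chunk_urls(urls), start=1):
    with open(f"spam-{n}.txt", "w") as f:
        f.write("\n".join(chunk) + "\n")
```

Each resulting file can then be uploaded to the site root and submitted as its own sitemap in Search Console.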

  • @ael2719
    @ael2719 1 year ago +4

    Hey! Good job. Please go a little deeper by showing how to edit the robots.txt file or how to add the spam.txt file. One more basic question: how do we remove the malicious code from the hacked website, and how do we prevent it, etc.?

    • @WebSensePro
      @WebSensePro 1 year ago +2

      Sure, I need more likes and shares for that.

  • @amaitaintercultura1126
    @amaitaintercultura1126 15 days ago +1

    Thank you from ROME! For me too, the URL list was under indexed pages... Currently I have another problem (probably caused by the malware): if I click ADD PLUGIN from the dashboard, the page shows up as a non-existent page :_(

  • @seemakharote313
    @seemakharote313 1 month ago +1

    I tried the whole process you explained in this video, but there are still Japanese keywords crawled on my site.

    • @WebSensePro
      @WebSensePro 1 month ago

      It takes a few weeks to resolve the issue.

  • @atharva.d8685
    @atharva.d8685 2 months ago +1

    Hi, I created the spam.txt file, but when I submit the sitemap from Google Search Console, it always shows an error in the sitemap status: "Sitemap could not be read". Can someone help me here?

    • @WebSensePro
      @WebSensePro 1 month ago

      Give it 24-48 hours and try again.

  • @bellojoseph6838
    @bellojoseph6838 11 months ago +1

    After getting the new crawl report from Google, will it be removed automatically, or do I have to do it manually?

    • @WebSensePro
      @WebSensePro 11 months ago

      It should happen automatically, but make sure to remove the malware from the website first.

  • @caesarforever
    @caesarforever 8 months ago +1

    When I submit the spam.txt file, does Google automatically detect that the contained URLs need to be removed? Is it because the file is called "spam.txt"? Or do I need to consider something else so the URLs get removed rather than added?

    • @WebSensePro
      @WebSensePro 8 months ago

      You just need to tell Google about the URLs, that's it.

    • @cvrlik
      @cvrlik 7 months ago

      Same question. How does Google know this is spam? Is the name of the file enough? @WebSensePro

  • @henokr.1016
    @henokr.1016 1 month ago

    Thanks. Submitting those spam URLs as a sitemap? Will Google understand it is spam? Won't it confuse it with a legit sitemap? Also, deleting the unusual posts from the Database -> Posts table will also help. Thanks.

    • @WebSensePro
      @WebSensePro 1 month ago

      Please contact us via websensepro.com/contact-us for paid support.

  • @Mackota_
    @Mackota_ 10 months ago +1

    Hi! Very good information. My only problem with this issue is that not all of the injected pages end with .html.
    Is there another way to tell robots.txt which pages are not supposed to be indexed? Thank you in advance!

    • @WebSensePro
      @WebSensePro 9 months ago

      In that case you will have to add the links manually to your robots.txt.
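
To illustrate the reply above: when the injected pages don't share a common pattern like the .html extension, each spam path or path prefix can be disallowed individually. The extra paths below are hypothetical examples, not rules from the video:

```text
User-agent: *
Disallow: /*.html$
Disallow: /cheap-watches/
Disallow: /jp-sale-page
```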

  • @gutierrezmenezes1788
    @gutierrezmenezes1788 11 months ago +1

    Hi. In my case, the malware keeps injecting more URLs with each update. Should I delete it first? How do I do that?

    • @WebSensePro
      @WebSensePro 11 months ago

      Did you remove the malware first?

    • @WebSensePro
      @WebSensePro 11 months ago

      ruclips.net/video/BrFUD7eQjGg/видео.html Watch this to learn how to remove the malware.

  • @adityapathak300
    @adityapathak300 2 months ago

    For cleaning the website, I installed Wordfence and it removed the unwanted code. How can I make sure it removed all the unwanted files?

    • @WebSensePro
      @WebSensePro 2 months ago

      Please contact us via websensepro.com/contact-us for paid support.

  • @adeelkhanblogger
    @adeelkhanblogger 3 months ago

    I am confused. At the end of the video you said these spammy pages were created by some code; where can we find the code added by the hack, and how do we remove it?

    • @WebSensePro
      @WebSensePro 3 months ago

      Watch the malware removal video.

  • @rohnyjones4201
    @rohnyjones4201 2 months ago

    I downloaded the URL sheet, but it shows only 1000. My website has 8.5M pages. How do I download them all?

    • @WebSensePro
      @WebSensePro 2 months ago

      Please contact us via websensepro.com/contact-us for paid support.

  • @gutierrezmenezes1788
    @gutierrezmenezes1788 11 months ago +1

    Does the text in the robots.txt file have to be exactly as it appears on your screen?

    • @WebSensePro
      @WebSensePro 11 months ago

      No, it does not have to be the same.

  • @ahhgea
    @ahhgea 11 months ago +1

    You can hardly read the text in your video. Please upload a higher-quality version, or include what you're supposed to put in the robots.txt file in the description.

    • @WebSensePro
      @WebSensePro 11 months ago

      The video is 4K; here is the robots.txt file content:
      User-agent: *
      Disallow: /*.html$
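
For reference, Google treats "*" in a robots.txt rule as a wildcard matching any characters and a trailing "$" as an end-of-URL anchor, so the rule above blocks every path ending in .html. A small Python sketch of that matching logic (an illustration, not Google's actual parser):

```python
import re

def robots_rule_matches(rule: str, path: str) -> bool:
    # Convert a Google-style robots.txt rule ('*' = any characters,
    # trailing '$' = end of URL) into a regular expression.
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(pattern, path) is not None

print(robots_rule_matches("/*.html$", "/spam-keyword-page.html"))  # True: blocked
print(robots_rule_matches("/*.html$", "/about-us/"))               # False: allowed
```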

  • @leaxse
    @leaxse 8 months ago +1

    When I download the sheet, it doesn't show a table. What do I do?

    • @WebSensePro
      @WebSensePro 8 months ago

      Try again with a different email.

  • @abojadooo
    @abojadooo 7 months ago +1

    Thanks a lot for the video, it's very helpful. But first, how do we remove the hack before this fix?

    • @WebSensePro
      @WebSensePro 6 months ago

      ruclips.net/video/BrFUD7eQjGg/видео.html Check this video for malware removal.

  • @visitthevenues1417
    @visitthevenues1417 1 year ago +1

    How did you load into Google Search?

  • @gardeningandforestry
    @gardeningandforestry 9 months ago

    Informative video 👍🏼

    • @WebSensePro
      @WebSensePro 9 months ago

      Thanks for the appreciation. If this video/channel helped in any way, please support it by subscribing and hitting the Like button.

  • @hassankalule
    @hassankalule 1 year ago +1

    What if the pages are already indexed?

    • @WebSensePro
      @WebSensePro 1 year ago

      Clean pages, or pages with malware?

    • @bellojoseph6838
      @bellojoseph6838 11 months ago

      @WebSensePro Pages with malware.