How to Fix Page with Redirect Issues - Google Search Console

  • Published: 21 Aug 2024
  • UPDATE: www.rankya.com... "Page with redirect" reports under Page Indexing in Google Search Console flag issues with URLs that redirect. This video lesson shows several ways to approach troubleshooting the "Page with redirect" status, because the issue can be caused by various factors (website setup, incorrect URL structure, www vs non-www linking, or even the web server itself).
    "Page with redirect" in Search Console means that Googlebot has encountered a non-canonical URL that redirects the Google user-agent to another web page. To learn more, visit the Search Console Help section:
    support.google...
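
    To see what Googlebot runs into, you can trace a URL's redirect chain yourself. The sketch below is not from the video; it is a minimal standard-library Python helper, and "https://example.com/sampleURL" is a placeholder for one of the URLs reported under "Page with redirect".

```python
import http.client
from urllib.parse import urljoin, urlsplit


def redirect_chain(url, max_hops=10):
    """Follow HTTP redirects hop by hop and return every URL visited."""
    chain = [url]
    for _ in range(max_hops):
        parts = urlsplit(url)
        conn_cls = (
            http.client.HTTPSConnection
            if parts.scheme == "https"
            else http.client.HTTPConnection
        )
        conn = conn_cls(parts.netloc, timeout=10)
        path = parts.path or "/"
        if parts.query:
            path += "?" + parts.query
        conn.request("HEAD", path)           # headers are enough to spot a redirect
        response = conn.getresponse()
        location = response.getheader("Location")
        conn.close()
        if response.status in (301, 302, 303, 307, 308) and location:
            url = urljoin(url, location)     # Location may be relative
            chain.append(url)
        else:
            break                            # final destination reached
    return chain


if __name__ == "__main__":
    # Placeholder URL: swap in a URL from your own "Page with redirect" report.
    for hop in redirect_chain("https://example.com/sampleURL"):
        print(hop)
```

    If the last hop differs from the first (for example http vs https, or www vs non-www linking), that hop is the redirect Search Console is reporting.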

Comments • 27

  • @MeganBennettAUS • 7 months ago +1

    Thank you RankYa. Excellent overview of some typical problems, and thank you for also giving solutions on how to fix them. Much appreciated.

    • @RankYa • 7 months ago

      Thanks for the comment, Megan. Please do keep in mind that RankYa.com.au is based in Melbourne, and I may be able to provide expert solutions to fix ANY/ALL Search Console issues for your own valued clients.

  • @abbasafrica • 6 months ago

    Best explanation I've come across so far.
    So, since there's no issue when looking at "All submitted pages", can I leave everything as it is?

    • @RankYa • 6 months ago +1

      :-) And you are correct; simply focus on content or other things on your website.

  • @chrismanson2286 • 10 months ago +1

    Thank you for the detail, but you didn't mention that word "file". I love it. I am going to the blog now. Thanks again.

  • @sanliySEO • 3 months ago

    Thank you for the detailed explanation; I found some really good technical sources here. Really supportive.
    Also, I have a question: should we hit the 'Validate Fix' button after fixing any issues?
    As far as I can tell from auditing the redirects, there are no issues, so I hit the 'Validate Fix' button, and a few days later I was notified that validation had failed.
    But I didn't find any issues with my redirects, and I'm not sure why Google says validation failed.
    Maybe I'm wrong here. What does Google actually expect us to fix in this case? Does Google need us to remove the redirect so that the original (redirecting) URL gets indexed, or is there some technical issue with the way we have redirected? How can I receive a successful validation in this case? Is it possible to receive a successful fix after a validation when the actual redirect is still in place?
    I think in this section Google lists the URLs that cannot be indexed due to the redirects, and Google is still crawling these URLs, maybe from other sources or internal links (not from my XML sitemap, for sure). In this case, should we still hit the 'Validate Fix' button? I would love to find a way to receive the successful fix notification after a validation while the redirects are still correctly in place, if that's possible.
    I appreciate your response.
    Thanks

  • @kahnelsonk5471 • 27 days ago

    Thank you RankYa, I'm new here and having a problem with redirect issues on my website, and I can't sort it out.

    • @RankYa • 27 days ago

      You can check out more insights for fixing "Page with redirect" issues on the www.rankya.com website (www.rankya.com/?s=page+indexing), or you may consider the affordable services for fixing Search Console issues on my main website, www.rankya.com.

  • @unkonxeeus • 7 months ago +1

    Hello, thank you. The "All submitted pages" menu is greyed out, so I cannot get to the submitted pages. It is not clickable.

    • @RankYa • 7 months ago

      Does your website in *Google Search Console* have an XML Sitemap submitted?

    • @user-sn9pe5by7u • 5 months ago

      I have this problem too

    • @user-sn9pe5by7u • 5 months ago

      Yes, my website has an XML sitemap submitted.

  • @HonestLady • 8 months ago

    So what do I do when my redirect problem is just an 's'? When the site was new it started with http:, and later I added https:.
    Due to this Google shows it as canonical. I did live testing; it says the URL is available to Google … how do I fix this?
    Thank you

    • @RankYa • 8 months ago

      I don't think your setup has any problems; if you recently changed the website to https:, Google will pick that up. All you need to do is ensure the following (a quick self-check sketch follows below):
      1. the canonical URLs use the https version, then
      2. the website (at server level) redirects http > https,
      3. internal links across the entire website are also updated to use https,
      4. and finally, the XML Sitemap contains the https version of each URL.
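
      A quick self-check for steps 1 and 2 above, as a sketch rather than RankYa's own tooling (standard-library Python only; "example.com" is a placeholder for your domain): it confirms that the plain-http homepage ends up on https and shows which canonical URL the served page declares.

```python
import re
import urllib.request


def final_url_and_html(url):
    """urlopen follows redirects, so geturl() is the address the request ends on."""
    with urllib.request.urlopen(url, timeout=10) as response:
        return response.geturl(), response.read().decode("utf-8", errors="replace")


def canonical_href(html):
    """Very rough extraction of <link rel="canonical" href="...">, enough for a spot check."""
    match = re.search(
        r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
        html,
        re.IGNORECASE,
    )
    return match.group(1) if match else None


if __name__ == "__main__":
    # Placeholder domain: swap in your own before running.
    landed_on, html = final_url_and_html("http://example.com/")
    print("http:// version lands on:", landed_on)            # step 2: should start with https://
    print("declared canonical URL :", canonical_href(html))  # step 1: should also be https://
```

      If the request does not land on https://, or the canonical tag still points at http://, those are the first two things to correct.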

  • @oyewoletoosin8169 • 3 months ago +1

    I wasted about 20 minutes watching your video but couldn't find where you fixed anything. Isn't that ridiculous? You didn't fix anything, unfortunately.

  • @InsightThoughtSystems • 4 months ago +1

    This is completely impossible to follow.

    • @RankYa • 4 months ago

      Noted, I totally understand. Accordingly, I've created new content covering all page indexing issues here:
      www.rankya.com/google-search-console/page-indexing/how-to-fix-page-indexing-errors/
      "Page with redirect" simply means:
      example.com/sampleURL is known to Google, but when Google tries to crawl sampleURL, the website redirects Google to another URL, for example example.com/anotherURL.
      Fixing it can be basic or complex; it depends on the website setup and the content management system being used.
      At a basic level, do this:
      Ensure that the XML Sitemap you've submitted to Search Console contains the correct URLs you want Google to index (a small checking sketch follows below). And do not waste time looking at *All known pages*; instead, focus only on "All submitted pages".
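
      As an illustration of that last point (a sketch, not an official Google tool; the sitemap address below is a placeholder), this snippet reads a submitted sitemap, requests each listed URL, and flags any entry whose final address differs from the one listed, which is exactly what Search Console reports as "Page with redirect". The sitemap should list the final addresses instead.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder: replace with the sitemap you actually submitted to Search Console.
SITEMAP_URL = "https://example.com/sitemap.xml"
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def sitemap_locations(sitemap_url):
    """Return every <loc> URL listed in a standard XML sitemap."""
    with urllib.request.urlopen(sitemap_url, timeout=10) as response:
        root = ET.fromstring(response.read())
    return [loc.text.strip() for loc in root.findall(".//sm:loc", SITEMAP_NS)]


def flag_redirecting_urls(urls):
    """Print any submitted URL whose request ends up on a different address."""
    for url in urls:
        with urllib.request.urlopen(url, timeout=10) as response:
            final = response.geturl()   # urlopen follows redirects for us
        if final != url:
            print(f"Page with redirect: {url} -> {final}")


if __name__ == "__main__":
    flag_redirecting_urls(sitemap_locations(SITEMAP_URL))
```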

  • @midnightsoulband9320 • 5 months ago

    I don't have a robots.txt option, am I being dumb?

    • @RankYa • 5 months ago

      You could easily create a robots.txt file to block Google from crawling certain parts of a website (doing so will also remove some unimportant Search Console errors). How do you create one? Simply use the Microsoft Notepad program and name the file robots.txt; a sample of what it might contain follows below.
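
      As a hedged example, here is roughly what such a file might contain, written from a small Python one-off instead of Notepad. The blocked paths (/wp-admin/ and /search/) are placeholders; only block sections you genuinely do not want crawled, and upload the resulting file to the root of the site, e.g. example.com/robots.txt.

```python
# Illustrative rules only, not a recommendation for every site.
# "User-agent: *" applies to Googlebot as well as other crawlers.
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /search/

Sitemap: https://example.com/sitemap.xml
"""

with open("robots.txt", "w", encoding="utf-8") as handle:
    handle.write(ROBOTS_TXT)
```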