Resolving rogue robots directives

  • Published: 14 Aug 2024
  • In this episode of SEO Fairy Tales, Martin Splitt and Jason Stevens, a Senior Performance Media Manager at Google, share how Jason’s team audited a site with bad search snippets and discovered that a robots.txt file was preventing the site from being crawled. Learn the steps you can take to investigate changes in website traffic or snippets using Search Console and Google’s suite of tools. (A minimal robots.txt check is sketched after the chapter list below.)
    Chapters
    0:00 - Intro
    0:27 - Bad snippets
    1:18 - Researching the root cause
    2:11 - Understanding the updated directive
    3:30 - It isn't always that simple
    4:17 - What to look for?
    5:36 - Other things to look out for
    6:55 - Reports and tools
    8:56 - Did it work?
    9:43 - Setting up for sustainable success
    11:05 - robots.txt - friend or foe?
    14:01 - Wrap up
    Watch more episodes of SEO Fairy Tales → goo.gle/SEOFai...
    Subscribe to Google Search Central Channel → goo.gle/Search...
    #TechnicalSEO #Snippets #Crawling
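
    A minimal sketch, not from the episode, of checking whether a URL is disallowed for Googlebot by a site's robots.txt, using Python's standard-library urllib.robotparser; the site and page URLs below are placeholders.

        from urllib.robotparser import RobotFileParser

        robots_url = "https://example.com/robots.txt"  # placeholder site
        page_url = "https://example.com/landing-page"  # placeholder URL to test

        parser = RobotFileParser()
        parser.set_url(robots_url)
        parser.read()  # fetch and parse the live robots.txt

        # can_fetch() returns False when a Disallow rule matches this URL
        # for the given user agent
        print(parser.can_fetch("Googlebot", page_url))

    If this prints False, a directive in robots.txt is blocking that URL for Googlebot, which is the kind of issue the episode walks through diagnosing with Search Console.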

Comments • 4

  • @GoogleSearchCentral
@GoogleSearchCentral  1 year ago +2

    Subscribe to the Google Search Central Channel → goo.gle/Webmasters

  • @sanjaysanwal3393
@sanjaysanwal3393 8 months ago

    I have a very awkward issue: Google is crawling a certain type of URL on our site, which is expected, but when I inspect those URLs in GSC, Google says 'blocked by robots'.
    What could be the reason here?

  • @simonmcox
@simonmcox 1 year ago

    You are having too much fun! Love it! Good case too!

  • @FacundoTroitero
@FacundoTroitero 1 year ago

    "interesting"