Wow, thank you so much, Jack. I was hacked recently and experienced a serious drop in traffic. Your tips are helping me slowly climb back up. The best part is that these minute tweaks actually work.
Hello and thank you for your comment.
Glad that helped. We hope you have a great experience with the plugin.
Please feel free to reach out to us again in case you need any other assistance.
In-depth video explaining everything related to robots.txt! THANKS Mate🙌
Hello, and thanks for your comment. Glad that helped. Feel free to get in touch if you have questions.
Love the way you explained and it's really straightforward
Hello, and thanks for your comment. Glad you liked our video. Feel free to get in touch if you have questions.
Jack: May I suggest that you make a new video series about how to optimize rank math schema (I have already seen other existing videos which are hard to follow) in bulk for elementor pro ecommerce pages and sites with many pages? I love your videos and your clear, systematic concise explanations.
Hello, Thank you for your comment.
We have taken note of your topic suggestion.
Thank you so much for the feedback and your valuable suggestions.
We really appreciate them.
It is a very useful video for me. Thank you. 👍
Hello, and thanks for your comment. Glad you liked the video. Feel free to get in touch if you have questions.
Thanks, it was a very informative video
Hello, and thanks for your comment. Glad that helped. Feel free to get in touch if you have questions.
I appreciate the information
Hello, and thanks for your comment. Glad that helped. Feel free to get in touch if you have questions.
But I could not understand if I use the wildcard (*) for User-agent, then it includes all bots, and why do I need to include another User-agent, msnbot, in the case as the video shows? I understand it's optional, but is there any use in adding another User-agent if I use a wildcard at the top?
Hello, thank you for your comment.
If you’re referring to the example shown at 3:35, the first rule (User-agent: * / Disallow: /) prevents all search engine crawlers from crawling your website.
The next rule (User-agent: Googlebot / Disallow:) specifically addresses Googlebot and declares that no directory is disallowed for it, so Googlebot can crawl the entire website. Crawlers follow the most specific group that matches their user agent, which is why the Googlebot group overrides the wildcard group.
We don’t recommend blocking all search engine crawlers unless it is a test or staging site.
You can also refer to more examples here: rankmath.com/kb/how-to-edit-robots-txt-with-rank-math/
Hope that helps, and please do not hesitate to let us know if you need our assistance with anything else.
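To make that precedence concrete, here is a minimal sketch using Python's standard urllib.robotparser (example.com is just a stand-in domain; note this parser approximates, but does not perfectly replicate, Google's matching behavior):

```python
from urllib import robotparser

# The example robots.txt from the video: block every crawler,
# but give Googlebot its own group with an empty Disallow
# (an empty Disallow means "nothing is disallowed").
ROBOTS_TXT = """\
User-agent: *
Disallow: /

User-agent: Googlebot
Disallow:
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot matches its own group, so the wildcard block is ignored.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
# Any other crawler falls back to the wildcard group and is blocked.
print(parser.can_fetch("SomeOtherBot", "https://example.com/blog/post"))  # False
```

The same two checks can be run against any robots.txt you are testing before deploying it.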
Hi, what if the robots.tx is not writable?
Hello, thank you for your comment,
You can check out our tutorial on the topic: rankmath.com/kb/cant-edit-robots-txt/
Hope that helps and please do not hesitate to let us know if you need our assistance with anything else.
Why do we need to disallow the files from crawlers? What's the disadvantage if they crawl anything and why should it be done??
Hello, and thank you for your comment.
The files disallowed in this tutorial are meant to be downloaded only after users fill out a form, typically a subscription form. If search engines crawled and indexed those files, they would become downloadable directly from the search results, and users would no longer need to fill out the form.
To answer your second question: if search engines crawled everything on the site, including internal resources, they would put unnecessary load on the website's server. Crawlers also allocate a limited crawl budget to each website, and we want that budget spent on the site's most valuable pages rather than on thin content or pages that aren't allowed to be indexed.
Hope it helps!
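To illustrate the kind of rule described above (the /downloads/ path is a hypothetical example, not one from the video), a gated-file setup might block the directory that holds the files:

```
User-agent: *
Disallow: /downloads/
```

Keep in mind that Disallow only stops crawling; a file that is already indexed or linked from elsewhere can still appear in results, so serving the files with an X-Robots-Tag: noindex HTTP header is a more robust way to keep them out of search.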
Thanks! This is helpful. Can you also make a video on 301, 302, 303, 304 errors? I struggle to understand the basics of these and how to fix them with Rankmath
Hello and thank you for your comment.
We do have a video discussing 301 & 302 redirects here: ruclips.net/video/yxmbcES_5CA/видео.html
For the 303 and 304 status codes (note that 304 Not Modified is a caching response rather than a true redirect), you can refer to this article: www.contentkingapp.com/academy/http-status-codes/
Hope that helps.
Thank you.
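As a quick reference, these are 3xx status codes rather than errors, and Python's standard http module doubles as a cheat sheet for what each one means:

```python
from http import HTTPStatus

# The four 3xx codes mentioned above, with their standard reason phrases:
# 301 Moved Permanently, 302 Found, 303 See Other, 304 Not Modified.
for code in (301, 302, 303, 304):
    status = HTTPStatus(code)
    print(code, status.phrase)

# 301/302/303 tell the client to request a different URL;
# 304 is not a redirect at all -- it means the cached copy is still fresh.
```

For SEO purposes, 301 (permanent) and 302 (temporary) are the two you will typically configure in Rank Math's Redirections module.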
@RankMath Thank you man! Appreciate it
Awesome video , easy to understand
thanks ❤❤❤
Hello, and thanks for your comment. You are always welcome. Feel free to get in touch if you have questions.
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php is causing a google error that robots.txt is blocking wp-admin/admin-ajax.php. How to fix this?
Hello,
Can you submit a support ticket here: rankmath.com/support/ so that our team can look into this closely?
Looking forward to helping you.
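Without seeing the full robots.txt it is hard to say for certain, but one common cause is rule precedence. This rough sketch with Python's standard urllib.robotparser shows the intent of the Allow line (the Allow rule is listed first because Python's parser honors the first matching rule, whereas Google applies the most specific matching rule regardless of order):

```python
from urllib import robotparser

# Hypothetical robots.txt: the Allow line carves admin-ajax.php
# out of the otherwise-blocked /wp-admin/ directory.
ROBOTS_TXT = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# admin-ajax.php is reachable despite the directory-wide Disallow.
print(parser.can_fetch("Googlebot", "https://example.com/wp-admin/admin-ajax.php"))  # True
# Everything else under /wp-admin/ stays blocked.
print(parser.can_fetch("Googlebot", "https://example.com/wp-admin/options.php"))     # False
```

If the Allow line is already in place and the error persists, the report may simply be stale; Search Console's robots.txt report can confirm which rule Google is actually matching.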
@RankMath I submitted the ticket. Thank you.
Thank you...
Hello, and thanks for your comment. Glad that helped. Feel free to get in touch if you have questions.
thanks
Hello, and thanks for your comment. You are always welcome. Feel free to get in touch if you have questions.
Nice
Hello and thank you for your comment.
Glad that helped. We hope you have a great experience with the plugin.
Please feel free to reach out to us again in case you need any other assistance.
Thanks, my Feeds page have eaten me too deep
Hello, and thanks for your comment. You are always welcome. Feel free to get in touch if you have questions.
Ok thanks
To owner of
Google has started validating your fix of Page indexing issues on your site. Specifically, we are checking for ‘Discovered - currently not indexed’, which currently affects 248 pages.
Validation can take a few days; we will send you a message when the process is complete. You can monitor the progress of the test by following the link below.
Hello, thank you for your comment.
Sure, let us know how it goes. If you need assistance, please feel free to open a support ticket so that our team can assist you in resolving this issue. Our support staff is available 24/7 to help you. rankmath.com/support/
Looking forward to helping you.
I have no idea what you mean
Hello, thank you for your comment,
Do you have issues with robots.txt? Can you submit a support ticket here: rankmath.com/support/ so that our team can look into this closely?
Looking forward to helping you.
4:33
Hello, and thanks for your comment. Feel free to get in touch if you have questions.
There is no " Edit robots.txt " option right now. It's really annoying to say I'm not satisfied with your plugin. I must deactivate the plugin. Poor Service and documentation also. shame!
Hello, thank you for your comment,
Can you please ensure you're using Rank Math in the Advanced Mode? Here is the tutorial: rankmath.com/kb/advanced-mode/
In the meantime, you can submit a support ticket here: rankmath.com/support/ so that our team can look into this closely.
Looking forward to helping you.