Mar 31 2024 03:51 AM
Hi, I would like to understand why my robots.txt test keeps failing.
I have emailed tech support, who told me to review my sitemap XML file; however, I'm running 3 blogs with identical sitemap and robots.txt structures, and this problem occurs on only 1 of them.
I have run robots testing on Bing Webmaster Tools, and the results stated that no crawling is allowed. Some of my pages are already indexed by Bing as well.
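For reference, here is a quick local check of what a robots.txt allows, using Python's standard `urllib.robotparser`. The rules and URLs below are placeholders, not my actual site's file (note that `robotparser` applies rules in file order, which can differ slightly from how Bing or Google resolve conflicts):

```python
# Local sanity check of robots.txt rules with Python's standard library.
# The rules and URLs here are illustrative placeholders.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Bingbot falls under "User-agent: *" here, so normal pages should be allowed.
print(rp.can_fetch("bingbot", "https://example.com/post/hello-world"))  # True
print(rp.can_fetch("bingbot", "https://example.com/admin/login"))       # False
```

If a check like this says everything is allowed but the webmaster tool still reports "no crawling allowed", that would suggest the tool is fetching a different robots.txt than expected (e.g. a redirect or a staging copy).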
Could someone walk me through, step by step, what I could do to remedy this issue?
Thanks in advance!
Apr 04 2024 10:04 AM
Thank you for your reply, @ctomczyk . Will definitely try that.
One additional detail, though: my blog is updated several times a week, and the newer posts are getting indexed. The older ones are not, however; I even checked with URL inspection, where the status stated "Discovered but not crawled".
Even after I fetched the sitemap or submitted the URLs, the discovery date never changed, nor did the status move on to the crawling stage.
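One thing I did verify is that the older posts are actually listed in the sitemap. A sketch of that check with Python's standard `xml.etree` (the XML below is a stand-in for my real sitemap.xml):

```python
# Verify that old posts appear in the sitemap, with their <lastmod> dates.
# The sitemap XML below is an illustrative stand-in, not my real file.
import xml.etree.ElementTree as ET

sitemap_xml = """\
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/post/new</loc><lastmod>2024-04-01</lastmod></url>
  <url><loc>https://example.com/post/old</loc><lastmod>2022-06-15</lastmod></url>
</urlset>
"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
for url in root.findall("sm:url", ns):
    loc = url.findtext("sm:loc", namespaces=ns)
    lastmod = url.findtext("sm:lastmod", namespaces=ns)
    print(loc, lastmod)
```

In my case the old URLs do show up with sensible `<lastmod>` values, so the sitemap itself seems fine.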
Do you happen to have any suggestions for this?