Google stopped crawling my site, saying robots.txt is unreachable, but when I open it in my browser it loads fine. I even moved the file, and still nothing changed. This has been going on for five days. What should I do?
I ran Fetch as Google on robots.txt and other pages many times, but it keeps showing "Unreachable robots.txt". The errors started on January 16–17. The server for this site is in China, and everything was normal before January 16. We have another site on the same IP that is fine now, but that one is a Chinese-language site, and I was told that Google moved its Chinese crawling servers back to the USA on January 17. Could that be the problem, i.e. Googlebot crawling from the USA can't reach my site? Before that date my site was crawled by Googlebot from Beijing.
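For context on why an "unreachable robots.txt" stops crawling entirely: Google's documented robots.txt handling treats a server error or network failure on robots.txt very differently from a missing file. Here is a rough sketch of that behavior as a Python function; the status-code groupings follow Google's published rules, but the function itself is only an illustration, not anything Google actually runs.

```python
from typing import Optional

def googlebot_robots_behavior(status: Optional[int]) -> str:
    """Sketch of how Googlebot reacts to a robots.txt fetch.

    `status` is the HTTP status code, or None if the fetch failed
    entirely (DNS error, timeout, connection reset by a firewall).
    """
    if status is None:
        # Network-level failure: Google treats robots.txt as
        # "unreachable" and pauses crawling the whole site --
        # exactly the symptom described in this thread.
        return "site not crawled"
    if 200 <= status < 300:
        return "robots.txt parsed, crawling per its rules"
    if 400 <= status < 500:
        # A missing robots.txt (404) is fine: crawl everything.
        return "no restrictions, full crawl"
    if 500 <= status < 600:
        # Server error: treated like unreachable.
        return "site not crawled"
    return "redirect or other handling applies"
```

So the file loading fine in your browser does not rule out the problem: if a firewall drops connections from Googlebot's (now US-based) IP ranges, Googlebot sees a network failure even though you see a 200.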
Last edited by sirius; 02-20-2014 at 02:40 PM.
Reason: Site linking not permitted.
Perhaps you got a penalty from Google? If not, check and tidy up your site's internal linking, since it affects how quickly Google discovers and crawls your pages. Google is reluctant to crawl a messy, incoherent website, and when it does, it crawls it rarely and slowly.
There could be many reasons for your issue. Check them one by one and confirm that everything on your side is in order for Google to crawl your website. Here is a list of possible causes:
1. No fresh content has been added to your pages for a long period
2. Google might have penalized your website
3. The crawl rate has been reduced in Webmaster Tools
4. Your web hosting provider may have a firewall enabled on the server that is blocking Googlebot from reaching your website
5. The website is unresponsive or too slow
Also check when the other website was last cached by Google; that will give you some information about whether the shared IP is part of the problem.
Check the source code of your home page, and of the pages Google stopped crawling, for this tag: (<meta name="robots" content="noindex,follow" />). If that tag is not present, resubmitting your site with Fetch as Google in Webmaster Tools should help get it back into Google.
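If you have many pages to check, the meta-tag inspection above can be automated. Below is a minimal sketch using only Python's standard-library HTML parser; the class and function names are made up for this example.

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of every <meta name="robots"> tag in a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "meta" and attr.get("name", "").lower() == "robots":
            self.directives.append(attr.get("content", "").lower())

def is_noindexed(html: str) -> bool:
    """True if the page carries a noindex directive in a robots meta tag."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in d for d in finder.directives)
```

Run is_noindexed() over each saved page source; any page where it returns True is telling Google not to index it, regardless of robots.txt.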