March 24, 2015: Some webmasters hoping to get their sites in shape by the 4/21/2015 “mobilegeddon” date have reported that Google’s Mobile-Friendly Test Page is giving their sites “fails” when in fact their sites have been built to be mobile-friendly, for example, by using responsive themes and otherwise abiding by mobile best practices.
These false “fail” results appear to stem from the fact that these webmasters have blocked Googlebot from crawling CSS and JavaScript resources in their robots.txt file. When this condition exists, the tool reports it in a small notice that reads: “This page may appear not mobile-friendly because the robots.txt file may block Googlebot from loading some of the page’s resources.”
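A robots.txt file that triggers this notice often looks something like the hypothetical WordPress-style example below, which blocks the directories where theme stylesheets and scripts typically live (the paths are illustrative, not from any particular site):

```
User-agent: *
Disallow: /wp-content/themes/
Disallow: /wp-includes/
```

Because Googlebot honors these rules, it can fetch the page’s HTML but not the CSS and JavaScript needed to render it, so the tester cannot verify that the layout is actually mobile-friendly.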
Mobilegeddon (AKA Mobile-friendly) update: April 21, 2015.
As Google’s Matt Cutts explains below in a 2012 “public service announcement,” once upon a time it might have been good practice to block bots from crawling CSS and JavaScript, but there’s no need to do this today.
To guard against any false test results, make sure you do not block Googlebot in your robots.txt file. Detecting a bot-blocking robots.txt file is easy. Google’s Webmaster Tools contains a handy robots.txt Tester in its “Crawl” panel that will let you examine your robots.txt file and see if there are any such restrictions within it.
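You can also check a robots.txt file programmatically with Python’s standard-library `urllib.robotparser`, which answers the same question the Tester does: may a given user agent fetch a given URL? The robots.txt rules and URLs below are hypothetical examples, not taken from any real site:

```python
import urllib.robotparser

# Hypothetical robots.txt that blocks a theme directory (illustrative only)
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-content/themes/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot cannot fetch the stylesheet, so the page may test as not mobile-friendly
print(parser.can_fetch("Googlebot", "https://example.com/wp-content/themes/style.css"))  # False

# The page HTML itself is still crawlable
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

Note that `urllib.robotparser` implements the basic robots exclusion rules; Google additionally supports extensions such as wildcards, so treat this as a first-pass check rather than a definitive verdict.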
You’re still going to have to locate robots.txt on your website if you need to modify it, but it’s not hard to find using your site’s hosting control panel or via FTP. WordPress users running the Yoast SEO plugin can get to robots.txt via that plugin’s “Edit Files” sub-module.
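Once you’ve found the file, one common fix is to explicitly allow CSS and JavaScript for Googlebot. A minimal sketch, relying on the wildcard and `$` end-of-URL patterns that Google’s robots.txt implementation supports (other crawlers may not honor them):

```
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```

Alternatively, simply removing the `Disallow` rules that cover your CSS and JavaScript directories accomplishes the same thing for all crawlers.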