You Should Not Block Googlebot from Crawling JavaScript and CSS Files
In a short video on YouTube, Matt Cutts, head of Google’s Webspam Team, asked SEO folks not to block Googlebot from crawling JavaScript and CSS files. The Google guy is well known among the community of SEO professionals and webmasters and has been answering their questions through short videos for some time.
However, this time he released this short video as a “public service announcement”. At the beginning he says, “I just have a short public service announcement rather than a question today”. He asked webmasters not to exclude these files from the Google crawler: “….just take that out of the robots.txt.”
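As an illustration (not taken from the video), a robots.txt with the kind of rule Cutts is asking webmasters to remove might look like this; the directory names /js/ and /css/ are assumed for the example:

```
# Hypothetical robots.txt that blocks script and style files --
# the pattern Matt Cutts advises removing
User-agent: Googlebot
Disallow: /js/
Disallow: /css/
```

Deleting the two Disallow lines (or leaving the Disallow value empty) lets Googlebot fetch those files along with the rest of the page.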
In the video, he explains why you should not block Googlebot from crawling JavaScript and CSS files. He says, “Let us crawl the JavaScript, let us crawl the CSS, and get a better idea of what’s going on on the page.” He adds that the bandwidth cost of serving JavaScript files to Google is insignificant.
Given Google’s secretive nature, every suggestion Matt offers for making search more effective gives webmasters an opportunity to learn something new.