Google finally updated its technical Webmaster Guidelines on October 27th, 2014, with a focus on websites that use CSS and JavaScript. Under the new guidelines, Google advises webmasters to allow Googlebot to access the JavaScript, CSS, and image files on their web pages for optimal rendering and indexing.
Webmasters who disregard this guideline and disallow the bot from crawling the JavaScript or CSS files in their website's robots.txt can expect a negative effect on their rankings, because blocking these files directly affects how Google's algorithms render and index the web content.
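As an illustration, the snippet below sketches a robots.txt that grants Googlebot access to script, stylesheet, and image files. The /assets/ directories are hypothetical placeholders; substitute the paths your own site actually uses.

```
# Hypothetical robots.txt sketch -- adjust the paths to your site's layout.

# Blocking assets like this prevents Googlebot from rendering pages fully:
# User-agent: *
# Disallow: /assets/js/
# Disallow: /assets/css/

# Allowing access instead lets Googlebot render the page as visitors see it:
User-agent: *
Allow: /assets/js/
Allow: /assets/css/
Allow: /assets/images/
```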
Tips To Ensure Optimal Indexing
- Google's rendering engine does not support every technology that different web pages may use. To ensure optimal indexing, it is essential that the web design follows the principles of progressive enhancement, which makes it easier for Google's systems to identify good content and basic functionality (a brief sketch of this approach follows the list).
- Webmasters are advised to follow the best techniques for page performance optimization. Pages that render quickly are easier for users to access and more efficient for Google's systems to index. Make sure the server can handle the additional load of serving JavaScript and CSS files to Googlebot.
- Serving CSS and JavaScript can be optimized further by merging the separate CSS and JavaScript files, minifying the merged files, and configuring the web server to serve them compressed (see the server configuration sketch below).
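The sketch below illustrates the progressive-enhancement idea under simple assumptions: the page content, URLs, and element IDs are made up, and the script is only one possible way of layering behaviour on top of plain HTML. The key point is that the essential content stays visible even if the script never runs.

```html
<!-- The core content is plain HTML, so Googlebot and visitors without -->
<!-- JavaScript still see it; the script below only enhances the page. -->
<article id="product">
  <h1>Blue Widget</h1>
  <p>Durable widget, available in three sizes.</p>
  <a id="reviews-link" href="/products/blue-widget/reviews">Read customer reviews</a>
</article>

<script>
  // Enhancement layer: load reviews in place when JavaScript is available,
  // falling back to the ordinary link when it is not.
  document.getElementById('reviews-link').addEventListener('click', function (e) {
    e.preventDefault();
    var xhr = new XMLHttpRequest();
    xhr.open('GET', this.href);           // same URL the plain link points to
    xhr.onload = function () {
      var section = document.createElement('section');
      section.innerHTML = xhr.responseText;
      document.getElementById('product').appendChild(section);
    };
    xhr.send();
  });
</script>
```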
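For the compression step, the following is a rough sketch of an nginx configuration, assuming the site runs nginx and keeps its merged, minified files under a hypothetical /assets/ path; Apache and other servers offer equivalent settings.

```nginx
# Hypothetical snippet for the http{} block of nginx.conf.
# Adjust the /assets/ path to wherever the merged, minified files live.
gzip            on;                       # serve CSS/JS compressed
gzip_comp_level 5;
gzip_min_length 1024;                     # skip very small files
gzip_types      text/css application/javascript image/svg+xml;

server {
    listen 80;

    location /assets/ {
        expires    30d;                   # let browsers cache the merged files
        add_header Cache-Control "public";
    }
}
```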
Now that Google has launched rendering-based indexing, it has also updated the Fetch and Render feature in Webmaster Tools, which allows webmasters to see how its systems render their web pages. This feature makes it easy to identify indexing issues such as improper robots.txt restrictions, redirects that Googlebot cannot follow, and more.