This article explains the benefits of crawling your site's JavaScript and how Botify manages JavaScript crawls.
Overview
Crawling page HTML provides valuable insights. However, if your website relies heavily on JavaScript to load page elements, you need to crawl that JavaScript to get a realistic view of how search engines experience your site. Crawling JavaScript allows search engines to better understand and index dynamic content. Some benefits include:
Improved indexing: JavaScript-powered content, such as dynamically generated text and links, can be accurately indexed, leading to better search engine visibility.
Enhanced user experience: Crawling JavaScript enables search engines to render web pages accurately, ensuring users receive search results that reflect the most up-to-date content and functionality.
Deeper content discovery: JavaScript crawling enables search engines to access content that may be hidden behind interactive elements or loaded dynamically, allowing for a more comprehensive understanding of a website's content.
Better SEO performance: By crawling JavaScript, websites can potentially achieve higher rankings in search engine results pages (SERPs) by resolving issues only visible with JavaScript, leading to increased organic traffic and better overall SEO performance.
Compatibility with modern web technologies: With the increasing prevalence of JavaScript frameworks and single-page applications (SPAs), crawling JavaScript is essential for search engines to navigate and index content on modern websites effectively.
If Botify doesn't crawl your site's JavaScript, you'll miss important insights into JavaScript resources, load times, and overall performance that can impact your crawl budget.
Viewing Your Pages Without JavaScript
Disable JavaScript in your browser for a quick view of how your pages load without it. Any significant gaps in the page are content that appears only after scripts execute; this is exactly what an HTML-only crawl misses.
You can temporarily disable JavaScript by adjusting your browser's settings or using a browser extension like Quick JavaScript Switcher. For example, to disable JavaScript in Chrome settings:
Navigate to Settings > Privacy and security > Site settings > Content and select JavaScript.
In the "Customized behaviors" section, click Add and enter your website's address.
Compare screenshots of the same page with JavaScript disabled and with JavaScript enabled to see the difference.
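Beyond toggling the browser setting, you can approximate what an HTML-only crawl sees by stripping `<script>` elements from a saved page source. The sketch below is illustrative only (a rough regex-based approach, not Botify's rendering pipeline); content that those scripts would have injected simply never appears:

```python
import re

def without_scripts(html: str) -> str:
    """Remove <script>...</script> blocks from raw HTML.

    A rough approximation of an HTML-only view: the dynamic content
    those scripts would have generated is never rendered.
    """
    return re.sub(
        r"<script\b[^>]*>.*?</script>",
        "",
        html,
        flags=re.DOTALL | re.IGNORECASE,
    )

page = (
    '<html><body><h1>Static title</h1>'
    '<script>document.write("Dynamic text")</script>'
    '</body></html>'
)
print(without_scripts(page))
# <html><body><h1>Static title</h1></body></html>
```

The "Dynamic text" that a JavaScript-enabled crawl would capture is absent from the output, mirroring the gaps you see when JavaScript is disabled in the browser.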
How Botify Crawls in JavaScript
When SiteCrawler crawls your site in JavaScript, it requests every resource needed to load the pages hosted on your site. Since each page may need multiple resources to generate its content, you will likely notice increased resource requests on your server. For example, if 20 resources are needed to generate the content on one page, your server will receive roughly 20 times as many requests per second as the pages-per-second crawl rate defined in your project settings.
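The arithmetic above can be sketched as follows (the function name and example rates are illustrative, not part of Botify's configuration):

```python
def requests_per_second(pages_per_second: int, resources_per_page: int) -> int:
    """Estimate total server requests per second during a JavaScript crawl.

    Each crawled page triggers one request per resource it needs,
    so the totals multiply.
    """
    return pages_per_second * resources_per_page

# The article's example: 20 resources per page.
# At a crawl rate of 5 pages/second, the server sees ~100 requests/second.
print(requests_per_second(5, 20))  # 100
```

This is why the crawl rate you set in project settings understates the raw request volume a JavaScript crawl generates.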
The benefits of an accurate, JavaScript-rendered assessment of your site outweigh this increase in requests. The increase typically does not impact server load significantly, especially when using mitigation strategies such as the following:
Request crawling from a static IP address and allow our IP addresses.
A strong indicator that you need to crawl from a static IP address is a pattern of 403 or 429 errors, which means your server is blocking Botify's crawler for making too many requests. To test this solution in your project, send a request to our Support team through the in-app messenger.
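One quick way to spot this pattern is to count 403 and 429 responses in your access logs. The sketch below assumes a simplified, hypothetical log layout where the status code is the second whitespace-separated field; adjust the parsing for your server's actual log format:

```python
from collections import Counter

def blocked_request_counts(log_lines):
    """Count HTTP 403 and 429 responses in simplified access-log lines.

    Assumes the status code is the second whitespace-separated field
    (a hypothetical layout; adapt the index to your log format).
    """
    statuses = Counter(
        line.split()[1] for line in log_lines if line.strip()
    )
    return {code: statuses.get(code, 0) for code in ("403", "429")}

sample = [
    "/products 200",
    "/products/42 403",
    "/search 429",
    "/search 429",
]
print(blocked_request_counts(sample))  # {'403': 1, '429': 2}
```

A steady stream of non-zero counts during crawl windows suggests your server is rate-limiting the crawler, and allowlisting a static IP address is worth testing.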
Finding JavaScript Performance Metrics
When JavaScript is enabled in your project, JavaScript-related performance is available in the following reports:
SiteCrawler's Performance report, including JavaScript load time and JavaScript crawl resource warnings.
URL Details JavaScript tab.
You can also filter reports using the metrics in the JavaScript Crawl folder.
Enabling JavaScript Crawls
To enable JavaScript crawls for your project, send a request to your Account Manager.