List of Alerts in AlertPanel

📘 This article provides the list of alerts used in AlertPanel. AlertPanel is part of Botify's Intelligence Suite and is available with a Botify Pro or Enterprise plan.

Overview

Alerts in AlertPanel are based on one of the following sources:

Log Files

Log file alerts are based on the logs you provide to Botify. Your log files are evaluated daily against the following alerts:

  • Increase in Crawls to 3xx Status Codes: We are seeing an increase in crawls from search engines that go to pages with a 3xx HTTP status code. While redirects (i.e., 3xx status codes) play an important role and are sometimes necessary, ideally we would not see much crawl budget dedicated to these pages. We recommend you review this increase to better understand whether internal linking is to blame or whether other factors are responsible. A minimal sketch of how such an increase might be spotted in raw logs follows this list.

  • Increase in Crawls to 4xx Status Codes: We are seeing an increase in crawls from search engines that go to pages with a 4xx HTTP status code. While 4xx status codes play an important role and are at times necessary, ideally we would not see much crawl budget dedicated to these pages. We recommend that you review this increase to better understand whether internal linking is to blame or whether other factors are responsible.

  • Increase in Crawls to 5xx Status Codes: We are seeing an increase in crawls from search engines that go to pages with a 5xx HTTP status code. 5xx status codes, also known as server errors, indicate that your servers are possibly overloaded or having issues. This can make it hard for search engines to request and crawl content, and users could also receive a 5xx status code. We recommend reviewing this to better understand whether the servers are having persistent issues, as that could call for a variety of fixes and improvements.

  • Increase in Crawls to New Pages: We are seeing an increase in crawls from search engines to pages that had not been crawled before. These new pages are being crawled by search engines for the first time.

  • Organic Visits Decrease: Organic traffic (users visiting your site from search) has significantly decreased compared to usual patterns over the last 40 days, according to your log files. There could be numerous reasons why traffic has decreased, but this issue should be reviewed.

  • Time to Discover or Refresh Your Pages Has Increased: We are seeing that the number of days it takes for search bots to discover or refresh your most important pages is increasing. We focus on the most active 80% of pages on your site, based on visits, and it is taking longer for search bots to find or recrawl this content. This could lead to longer discoverability times for your content or stale content in search results.
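
The log-file alerts above all follow the same broad pattern: count the crawls search engine bots make to each category of page each day, then compare against a recent baseline and flag unusual jumps. The sketch below is a rough illustration of that idea only, not Botify's actual detection logic; the sample log lines, the bot list, and the 1.5x threshold are all assumptions made for the example.

```python
from collections import Counter
import re

# Hypothetical sample lines in Apache/Nginx "combined" log format.
SAMPLE_LOGS = """\
66.249.66.1 - - [12/May/2024:10:00:01 +0000] "GET /old-page HTTP/1.1" 301 0 "-" "Googlebot/2.1"
66.249.66.1 - - [12/May/2024:10:00:02 +0000] "GET /moved HTTP/1.1" 301 0 "-" "Googlebot/2.1"
66.249.66.1 - - [12/May/2024:10:00:03 +0000] "GET /missing HTTP/1.1" 404 0 "-" "Googlebot/2.1"
40.77.167.1 - - [12/May/2024:10:00:04 +0000] "GET /gone HTTP/1.1" 404 0 "-" "bingbot/2.0"
40.77.167.1 - - [12/May/2024:10:00:05 +0000] "GET /product/42 HTTP/1.1" 200 0 "-" "bingbot/2.0"
"""

STATUS_RE = re.compile(r'" (?P<status>\d{3}) ')   # status code right after the quoted request
SEARCH_BOTS = ("Googlebot", "bingbot")            # assumption: only these user agents count

def status_class_counts(log_lines):
    """Count search-engine crawls per status class (2xx, 3xx, 4xx, 5xx)."""
    counts = Counter()
    for line in log_lines:
        if not any(bot in line for bot in SEARCH_BOTS):
            continue
        match = STATUS_RE.search(line)
        if match:
            counts[match.group("status")[0] + "xx"] += 1
    return counts

def flag_spikes(today, baseline, threshold=1.5):
    """Flag status classes crawled noticeably more today than in the baseline."""
    return {
        cls: (baseline.get(cls, 0), count)
        for cls, count in today.items()
        if count > threshold * max(baseline.get(cls, 0), 1)
    }

if __name__ == "__main__":
    today = status_class_counts(SAMPLE_LOGS.splitlines())
    baseline = Counter({"2xx": 3})                 # hypothetical trailing average for this site
    print(flag_spikes(today, baseline))            # {'3xx': (0, 2), '4xx': (0, 2)}
```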

AlertPanel Crawls

This type of alert is based on the strategic pages identified in the custom list of URLs in your AlertPanel settings. These are intended to be a smaller set of "strategic" pages you want to watch closely. These pages are crawled daily and evaluated against the following alerts:

  • Decrease in Hreflang Tags: We have detected a decline in outgoing hreflang tags on your website. Hreflang tags play a pivotal role in helping search engines understand how your various international pages relate to one another, and without them you could be sending incorrect signals to search engines. We recommend you review the decline and confirm that the affected pages have all necessary hreflang tags.

  • Incorrect Hreflang Tags: We have detected a rise in pages that have hreflang tags that target incorrect region and language values. This can confuse search engines as to how international pages relate to one another, and we recommend you review these pages and their hreflang tags.

  • Increase in 3xx Status Codes Pages: We are seeing an increase in pages answering with a 3xx HTTP status code according to your daily AlertPanel crawl. While redirects (3xx status codes) play an important role and are sometimes necessary, ideally we would not see the URLs in your AlertPanel crawl returning a 3xx status code. We recommend that you review these pages to better understand whether the 3xx redirect was intentionally implemented and why.

  • Increase in 4xx Status Codes Pages: We are seeing an increase in pages answering with a 4xx HTTP status code according to your daily AlertPanel crawl. While 4xx status codes play an important role and are sometimes necessary, ideally we would not see the URLs in your AlertPanel crawl returning a 4xx status code. We recommend you review these pages to better understand whether the 4xx error was intentionally implemented and why.

  • Increase in 5xx Status Codes Pages: We are seeing an increase in pages answering with a 5xx HTTP status code according to your daily AlertPanel crawl. 5xx status codes, also known as server errors, indicate that your servers are possibly overloaded or having issues. This can make it hard for search engines to request and crawl content, and users could also receive a 5xx status code. We recommend you review these 5xx status code URLs to better understand whether the servers are having persistent issues, as that could call for various fixes and improvements.

  • Increase in Canonical Tags: We have detected a rise in the number of pages containing a canonical tag that points to a different page. Canonical tags are a strong signal to search engines, and you should review these pages to ensure the canonical tag is needed and that it points to the correct page.

  • Increase in Noindex Tags: We have detected an unusual rise in noindex tags being added to pages. A noindex tag is a very clear signal to search engines to remove the URL from search results, so we recommend that you review these pages to avoid losing traffic (a quick way to spot-check these tags on a single page is sketched after this list).

  • Increase in Thin Content Pages (page content): The number of pages with thin content has increased. We consider content thin when a page has 100 words or fewer of unique content. This increase is flagged because content is an important signal of the intent and purpose of a page.

  • Duplicate H1 Tag: We have detected an increase in duplicate H1 tags on URLs in the same zone (protocol, language, and domain are the same). The H1 tag is a ranking factor and an important way to describe a page and its purpose. URLs ideally should have a unique and descriptive H1 tag, and we recommend you review these pages.

  • Duplicate Page Titles: We have detected an increase in duplicate page titles on URLs in the same zone (protocol, language, and domain are the same). The page title is a ranking factor, appears as the blue clickable link in organic results, and is a very important way to define a page. URLs should have a unique page title, and we recommend you review these pages.

  • Duplicate Meta Descriptions: We have detected an increase in duplicate meta descriptions on URLs in the same zone (protocol, language, and domain are the same). While meta descriptions are not a ranking factor, they often appear in search results and can help encourage individuals to click through on your listing in organic results. URLs ideally should have a unique meta description that describes the page, and we recommend you review these URLs and their meta descriptions.

  • Missing H1 Tag: We have detected an increase in URLs missing an H1 tag. The H1 tag is a ranking factor and an important way to describe a page and its purpose. URLs ideally should have a unique and descriptive H1 tag, and we recommend you review these pages.

  • Missing Page Titles: We have detected an increase in URLs missing a page title. The page title is a ranking factor, appears as the blue clickable link in organic results, and is a very important way to define a page. URLs should have a unique page title, and we recommend you review these pages.

  • Missing Meta Descriptions: We have detected an increase in URLs missing a meta description. While meta descriptions are not a ranking factor, they often appear in search results and can help encourage individuals to click through on your listing in organic results. URLs ideally should have a unique meta description that describes the page, and we recommend you review these pages.

  • Increase in Pages Missing Structured Data: Botify has detected an increase in pages without structured data. Structured data is an important way to pass details and information about a page to search engines.

  • Slow Page Speed Pages: The number of pages that take longer than 2 seconds to load has increased. Site speed is an important aspect for users and search engine bots like Bingbot and Googlebot. For users, a slow website can be frustrating and cause them to leave, while for bots, a slower website can impact their ability to discover and crawl your site.

  • Increase in Slow Rendering Time Pages: Botify has detected an increase in the number of pages that take longer than 4 seconds (4,000 ms) to render. When pages take longer to load and render, search engines may crawl fewer pages, and search engines such as Google have noted that site speed is a ranking factor.
    *Important to note: This alert is only triggered if you have a JavaScript crawl.

  • Increase in JavaScript Resources Blocked by robots.txt File: Botify saw an increase in pages that use resources currently blocked by your website's robots.txt file. This is important because these resources are also blocked for search engines like Bing and Google, which can prevent them from properly rendering these pages.
    *Important to note: This alert is only triggered if you have a JavaScript crawl.

  • Large JavaScript Resources: There is an increase in pages that load large JavaScript resources (> 5 MB). As their size increases, it takes search engines more time to process them, meaning some might not be rendered.

  • Insufficient Crawl: We detected a crawl with 50 or fewer URLs, which may highlight an issue on your website. Since a crawl of this size is not representative, other alerts based on the total number of URLs crawled will not be triggered until this issue is fixed.

  • Peak or Decrease in URLs Crawled: We detected an unexpected drop or peak in the volume of URLs crawled, based on your website's usual pattern. This may highlight an issue on your website and can cause false-positive alerts, since the ratio of URLs impacted has been altered.
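
Several of the on-page alerts above (noindex, canonical, hreflang, page titles, meta descriptions, H1s, and thin content) come down to inspecting a page's HTML for particular tags. As a rough, hypothetical illustration of the signals these alerts look at (this is not Botify's crawler; the sample HTML is invented and the sketch assumes the beautifulsoup4 package is installed), a manual spot check might look like this:

```python
from bs4 import BeautifulSoup  # assumption: beautifulsoup4 is installed

# Hypothetical page illustrating the tags the on-page alerts inspect.
SAMPLE_HTML = """
<html lang="en">
  <head>
    <title>Red running shoes</title>
    <meta name="robots" content="noindex, follow">
    <link rel="canonical" href="https://www.example.com/shoes/red">
    <link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr/chaussures/rouge">
  </head>
  <body><h1>Red running shoes</h1><p>A short sample product description.</p></body>
</html>
"""

def inspect_page(html):
    """Collect the on-page signals that AlertPanel-style alerts look at."""
    soup = BeautifulSoup(html, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    description = soup.find("meta", attrs={"name": "description"})
    links = soup.find_all("link")
    return {
        "title": soup.title.get_text(strip=True) if soup.title else None,
        "meta_description": description.get("content") if description else None,
        "h1": [h.get_text(strip=True) for h in soup.find_all("h1")],
        "noindex": bool(robots and "noindex" in robots.get("content", "").lower()),
        "canonical": [l.get("href") for l in links if "canonical" in (l.get("rel") or [])],
        "hreflang": [(l.get("hreflang"), l.get("href")) for l in links if l.get("hreflang")],
        "word_count": len(soup.get_text(" ", strip=True).split()),
    }

if __name__ == "__main__":
    report = inspect_page(SAMPLE_HTML)
    for signal, value in report.items():
        print(f"{signal}: {value}")
    # This sample page would trip the noindex, missing meta description,
    # and thin content (fewer than 100 words of content) alerts.
```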

Robots.txt

Botify first discovers the robots.txt files associated with your account, as defined in AlertPanel Settings. You can add to or edit the list of robots.txt files as necessary. AlertPanel evaluates your robots.txt files daily for unexpected changes based on the following alerts:

  • Robots.txt is missing: We could not find or access one of the robots.txt files you have added to your AlertPanel Settings. Without a functional robots.txt file, search engines could crawl and index areas of your site that you do not want crawled. We recommend you review this as soon as possible.

  • Robots.txt has changed: We have detected that some directives in the robots.txt file have changed. Robots.txt directives are strong signals to search engines on how to crawl your website and can drastically impact site traffic. Make sure that all changes have been made purposefully; a minimal sketch of such a day-over-day comparison follows this list.
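
Both robots.txt alerts come down to fetching the file each day and comparing it with the previous day's copy. The sketch below is an illustration only, not how Botify implements the check; the two snapshots are hypothetical, and in practice the "yesterday" copy would come from storage and the "today" copy from a live fetch.

```python
import difflib

# Hypothetical copies of the same robots.txt captured on two consecutive days.
YESTERDAY = """\
User-agent: *
Disallow: /checkout/
Sitemap: https://www.example.com/sitemap.xml
"""

TODAY = """\
User-agent: *
Disallow: /checkout/
Disallow: /          # <- a sweeping new directive worth alerting on
Sitemap: https://www.example.com/sitemap.xml
"""

def robots_txt_changes(previous, current):
    """Return a unified diff of the two robots.txt snapshots (empty if unchanged)."""
    return list(difflib.unified_diff(
        previous.splitlines(), current.splitlines(),
        fromfile="robots.txt (yesterday)", tofile="robots.txt (today)", lineterm=""
    ))

if __name__ == "__main__":
    changes = robots_txt_changes(YESTERDAY, TODAY)
    if not changes:
        print("No directive changes detected.")
    else:
        print("\n".join(changes))   # the new "Disallow: /" line shows up as an addition
```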

Custom Alerts

You can create custom alerts in SiteCrawler for any metric, based on the regular crawl. Custom alerts are not available for RealKeywords or other sources at this time.

