# SiteCheckerBotCrawler

SiteCheckerBotCrawler is the web crawler for the SEO tool SiteChecker.pro. It is a technical audit bot that is activated only when a user initiates a scan of a website. Its purpose is to analyze a site for technical SEO issues, performance metrics, and content quality, giving the site owner actionable insights for improvement. Its presence in your logs means someone is actively analyzing your site's SEO health.

## What is SiteCheckerBotCrawler?

SiteCheckerBotCrawler is the web crawler for the website auditing and SEO analysis tool SiteChecker.pro. It is a specialized bot designed to analyze websites for technical SEO issues, performance, and content quality. The bot identifies itself in server logs with the user-agent string `SiteCheckerBotCrawler`. It works by systematically visiting the pages of a target site to collect data on factors such as HTML structure, page speed, and other elements that affect search engine rankings.

## Why is SiteCheckerBotCrawler crawling my site?

SiteCheckerBotCrawler is crawling your website because someone has used the SiteChecker.pro tool to analyze it. This could be you, a member of your team, or a third party such as a competitor or marketing consultant. Unlike a search engine's crawler, it does not crawl continuously; it performs targeted crawls only when a user initiates an analysis on the SiteChecker.pro platform. The scope of the crawl is determined by the settings the user selects for the audit.

## What is the purpose of SiteCheckerBotCrawler?

The purpose of SiteCheckerBotCrawler is to gather the technical and SEO-related data that powers the SiteChecker.pro analysis tools. These tools help website owners and marketers identify and fix issues that could be affecting their search rankings or user experience. The bot collects information on technical SEO factors, content quality, and broken links. The platform then processes this data into reports with actionable recommendations for improving the website's performance. For the person who initiates the scan, the service provides valuable diagnostic information.

## How do I block SiteCheckerBotCrawler?

To prevent users of the SiteChecker.pro tool from analyzing your website, add a disallow rule to your `robots.txt` file. This is the standard method for managing crawler access. To block this bot, add the following lines to your `robots.txt` file:

```
User-agent: SiteCheckerBotCrawler
Disallow: /
```
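
Once the rule is in place, you may want to confirm that your published `robots.txt` actually disallows this bot. The sketch below uses Python's standard-library `urllib.robotparser`; `https://example.com` is a placeholder for your own domain.

```python
# Minimal sketch: check whether the live robots.txt disallows
# SiteCheckerBotCrawler. "example.com" is a placeholder domain.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # fetches and parses the published robots.txt

allowed = parser.can_fetch("SiteCheckerBotCrawler", "https://example.com/")
print("SiteCheckerBotCrawler may crawl /:", allowed)  # False once the disallow rule is live
```

Note that this confirms only what your `robots.txt` declares, not whether a given crawler obeys it. You can also check for past visits by searching your server access logs for the `SiteCheckerBotCrawler` user-agent string.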