A PHP class for detecting bots, crawlers, and spiders via user agent and HTTP headers.
CrawlerDetect is a PHP library that identifies bots, crawlers, and spiders by analyzing the user agent string and HTTP headers. It helps web applications filter non-human traffic to improve analytics accuracy and manage server resources more efficiently. The library detects thousands of known bots and is regularly updated with new patterns.
Web developers and administrators working with PHP applications who need to distinguish between human visitors and automated traffic for analytics, security, or resource optimization. It is particularly useful for those using frameworks like Laravel, Symfony, or YII2.
Developers choose CrawlerDetect for its extensive, up-to-date database of bot patterns and simple API, which requires minimal configuration. Its availability in multiple languages and frameworks through ports and packages makes it a versatile choice for cross-platform projects.
Detects thousands of known bots using patterns stored in Fixtures/Crawlers.php, which is regularly updated through community contributions.
Offers a simple API with methods such as isCrawler() and getMatches(), requiring minimal configuration.
Provides dedicated packages for Laravel, Symfony, and YII2, making it seamless to integrate into popular PHP frameworks.
Available in ES6, Python, JVM, .NET, Ruby, and Go via ports, allowing bot detection across diverse tech stacks, as listed in the README.
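The public API is small, and under the hood detection boils down to matching the user agent against a large case-insensitive regex built from the pattern list. A minimal, self-contained sketch of that technique (the pattern list here is abridged and illustrative, not the library's real Fixtures/Crawlers.php data, and `isCrawler` below is a stand-in for the library's method of the same name):

```php
<?php
// Abridged, hypothetical pattern list; the real library ships thousands
// of patterns in Fixtures/Crawlers.php.
$patterns = ['Googlebot', 'bingbot', 'Slurp', 'DuckDuckBot', 'spider'];

// Combine all patterns into one case-insensitive alternation, as a
// sketch of the library's approach.
$regex = '/(' . implode('|', $patterns) . ')/i';

// Stand-in for CrawlerDetect's isCrawler(): true if any pattern matches.
function isCrawler(string $userAgent, string $regex): bool
{
    return (bool) preg_match($regex, $userAgent);
}

var_dump(isCrawler('Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)', $regex)); // bool(true)
var_dump(isCrawler('Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0 Safari/537.36', $regex));     // bool(false)
```

In a real application you would install the library via Composer and call its isCrawler() and getMatches() methods rather than rolling your own pattern list.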
Relies solely on user agent and HTTP headers, which can be easily faked by sophisticated bots, reducing reliability in high-security contexts.
Regular-expression matching on every request adds overhead that can slow high-traffic applications; the README does not discuss this, but it is an inherent trade-off of pattern-based detection.
Bot pattern updates depend on contributor pull requests (per the contributing section) and may not keep pace with rapidly evolving bots.
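A common mitigation for the per-request regex cost is to memoize the verdict per distinct user-agent string, so the full pattern match runs only once per unique agent. A minimal sketch of that idea (the caching wrapper is my own illustration, not part of the library; in production the static array would typically be APCu or Redis):

```php
<?php
// Hypothetical caching wrapper: run the expensive pattern match once
// per distinct user agent and memoize the boolean verdict.
function isCrawlerCached(string $userAgent, callable $detect): bool
{
    static $cache = [];
    if (!array_key_exists($userAgent, $cache)) {
        $cache[$userAgent] = $detect($userAgent);
    }
    return $cache[$userAgent];
}

// Stand-in detector for the sketch; a call to CrawlerDetect's
// isCrawler() would go here instead.
$detect = fn (string $ua): bool => (bool) preg_match('/bot|crawl|spider/i', $ua);

var_dump(isCrawlerCached('Googlebot/2.1', $detect)); // regex runs: bool(true)
var_dump(isCrawlerCached('Googlebot/2.1', $detect)); // cached:     bool(true)
```

Since most traffic repeats a small set of user-agent strings, this keeps the regex cost roughly constant per unique agent rather than per request.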