Show HN: CrawlerCheck – A tool to check if a site is blocking crawlers

2 points by bogozi | 1 comment | 7/8/2025, 10:01:08 AM | crawlercheck.com
Hi HN,

I'm a long-time developer and SEO consultant. Over the years, I've seen clients suffer from a simple, costly mistake: accidentally blocking Googlebot or other important crawlers with a misplaced rule in robots.txt or a noindex tag.

Manually checking the robots.txt file, then the page's meta tags, then the X-Robots-Tag HTTP header is a tedious process. I wanted a tool that would do it all in one shot and give me a clear answer.
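For reference, a block can hide in any of these three places (illustrative examples):

    In robots.txt:        User-agent: *
                          Disallow: /
    In the HTML head:     <meta name="robots" content="noindex">
    In the HTTP headers:  X-Robots-Tag: noindex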

So, I built CrawlerCheck. You give it a URL, and it checks all three sources of crawler directives and tells you whether the page is crawlable and indexable.

The backend is written in Go, and the frontend is a lightweight Svelte app. The goal was to make it as fast and reliable as possible.
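To give a sense of the idea, here's a simplified Go sketch of the three checks. It's illustrative only: the function names and the crude string matching are simplifications for this post, not the production code, which has to handle per-agent groups, path patterns, and multiple directive values.

    // Minimal sketch of the three crawler-directive checks for a URL.
    package main

    import (
        "fmt"
        "io"
        "net/http"
        "net/url"
        "strings"
    )

    // blockedByRobotsTxt is a rough check: it only looks for
    // "Disallow: /" under "User-agent: *". A real checker parses
    // per-agent groups and path patterns.
    func blockedByRobotsTxt(pageURL string) bool {
        u, err := url.Parse(pageURL)
        if err != nil {
            return false
        }
        resp, err := http.Get(u.Scheme + "://" + u.Host + "/robots.txt")
        if err != nil || resp.StatusCode != http.StatusOK {
            return false
        }
        defer resp.Body.Close()
        body, _ := io.ReadAll(resp.Body)
        txt := strings.ToLower(string(body))
        return strings.Contains(txt, "user-agent: *") &&
            strings.Contains(txt, "disallow: /")
    }

    // blockedByPage fetches the page once and checks both the
    // X-Robots-Tag header and a noindex robots meta tag in the HTML.
    func blockedByPage(pageURL string) (headerBlock, metaBlock bool) {
        resp, err := http.Get(pageURL)
        if err != nil {
            return false, false
        }
        defer resp.Body.Close()
        if strings.Contains(strings.ToLower(resp.Header.Get("X-Robots-Tag")), "noindex") {
            headerBlock = true
        }
        body, _ := io.ReadAll(resp.Body)
        html := strings.ToLower(string(body))
        if strings.Contains(html, `name="robots"`) && strings.Contains(html, "noindex") {
            metaBlock = true
        }
        return headerBlock, metaBlock
    }

    func main() {
        target := "https://example.com/"
        fmt.Println("robots.txt block:", blockedByRobotsTxt(target))
        h, m := blockedByPage(target)
        fmt.Println("X-Robots-Tag block:", h, "| meta robots block:", m)
    }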

It's a brand new project, and I'd love to get some honest feedback from the HN community. Thanks for taking a look.

Comments (1)

8organicbits · 12h ago
Looks pretty helpful, thanks for building this.

Minor suggestion. Consider sorting the checks by status, or adding a summary at the top. I needed to scroll to find if anything was blocked.

I don't know enough about the SEO space, but would an llms.txt check also help?