Robots.txt Checker

The Robots.txt Checker Bookmarklet is an essential tool for SEO auditing and crawl control verification.

With one click, this bookmarklet fetches the site's robots.txt file, displays its rules, and checks whether the current page's path is disallowed.

Perfect for quick SEO audits: no more manual robots.txt inspection, just click and check!

javascript:!function(){const t=new URL(window.location.href),e=`${t.origin}/robots.txt`;fetch(e).then((t=>t.text())).then((e=>{const n=e.split("\n").filter((t=>t.trim()&&!t.startsWith("#"))),o=n.filter((t=>t.toLowerCase().startsWith("disallow:"))),i=t.pathname,r=o.some((t=>{const e=t.replace(/^disallow:\s*/i,"").trim();return e&&i.startsWith(e)})),s=document.createElement("div");s.style.cssText="\n                position: fixed;\n                top: 20px;\n                right: 20px;\n                width: 80%;\n                max-width: 800px;\n                max-height: 80vh;\n                background: white;\n                padding: 20px;\n                border-radius: 8px;\n                box-shadow: 0 2px 10px rgba(0,0,0,0.1);\n                z-index: 10000;\n                overflow-y: auto;\n            ";const a=document.createElement("div");a.innerHTML=`\n                <h2 style="margin-top: 0;">Robots.txt Analysis</h2>\n                <div style="margin-bottom: 20px;">\n                    <button onclick="this.parentElement.parentElement.parentElement.remove()" style="float: right;">Close</button>\n                </div>\n                <div style="margin-bottom: 20px; padding: 10px; border: 1px solid ${r?"#ff4444":"#4CAF50"}; border-radius: 4px;">\n                    <h3 style="margin: 0 0 10px 0;">Current Page Status</h3>\n                    <p style="margin: 0; color: ${r?"#ff4444":"#4CAF50"}">\n                        ${r?"This page is disallowed in robots.txt":"This page is allowed in robots.txt"}\n                    </p>\n                </div>\n                <div style="margin-bottom: 20px;">\n                    <h3 style="margin: 0 0 10px 0;">Robots.txt Rules</h3>\n                    <pre style="margin: 0; white-space: pre-wrap;">${e.replace(/</g,"&lt;")}</pre>\n                </div>\n            `,s.appendChild(a),document.body.appendChild(s)})).catch((t=>{alert("Error fetching robots.txt: "+t.message)}))}();
Drag me to the bookmarks bar: 👉🏻 Robots.txt Checker
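
If you would rather read or adapt the logic before installing it, here is a rough, un-minified sketch of what the bookmarklet does. For brevity it logs the result to the console instead of rendering the on-page overlay, and like the bookmarklet it uses a simple path-prefix match against every Disallow line, ignoring User-agent groups, Allow rules, and wildcards, so treat it as a quick check rather than a full Robots Exclusion Protocol parser.

(async function () {
  // Build the robots.txt URL for the current origin (scheme + host + port).
  const page = new URL(window.location.href);
  const robotsUrl = `${page.origin}/robots.txt`;

  try {
    const response = await fetch(robotsUrl);
    const text = await response.text();

    // Keep non-empty, non-comment lines.
    const lines = text.split("\n").filter((l) => l.trim() && !l.startsWith("#"));

    // Collect every Disallow rule, regardless of which User-agent group it belongs to.
    const disallowRules = lines
      .filter((l) => l.toLowerCase().startsWith("disallow:"))
      .map((l) => l.replace(/^disallow:\s*/i, "").trim())
      .filter((rule) => rule.length > 0); // an empty Disallow means "allow everything"

    // Simple prefix match: the page is flagged if its path starts with any rule.
    const isDisallowed = disallowRules.some((rule) => page.pathname.startsWith(rule));

    console.log(`robots.txt for ${page.origin}:`);
    console.log(text);
    console.log(
      isDisallowed
        ? `${page.pathname} is disallowed in robots.txt`
        : `${page.pathname} is allowed in robots.txt`
    );
  } catch (err) {
    alert("Error fetching robots.txt: " + err.message);
  }
})();

The bookmarklet itself performs the same check, but presents the verdict and the full robots.txt rules in a fixed overlay in the top-right corner of the page.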

This tool streamlines your SEO auditing by:

- Fetching the current site's robots.txt and displaying its rules
- Checking whether the current page's path matches any Disallow rule
- Flagging the result directly on the page, so crawl control issues are visible at a glance

Why Use the Robots.txt Checker?

This tool makes robots.txt verification effortless for SEO professionals, web developers, and anyone who needs to keep crawl control in order. Click the bookmarklet on any page to instantly see whether that page is disallowed in robots.txt, helping you spot crawl control issues quickly and confirm that your content is accessible to search engines.
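
As a hypothetical example, suppose a site's robots.txt contains these rules:

User-agent: *
Disallow: /admin/
Disallow: /tmp

Clicking the bookmarklet on /admin/settings or /tmp-files would flag the page as disallowed, since both paths begin with one of the Disallow prefixes, while a page at /blog/post-1 would be reported as allowed.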