
    User-Agent

    A User-Agent is an HTTP request header that identifies the client making the request (browser, app, or bot). robots.txt rules are grouped by user-agent.

    Definition

    The User-Agent header identifies the requesting client (browser, app, or crawler). In SEO, robots.txt applies Allow/Disallow rules per user-agent group, and log analysis uses the header to understand bot behavior and errors.
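
    Rule grouping is easy to test with Python's standard-library urllib.robotparser. A minimal sketch; the crawler name, paths, and rules below are illustrative, not real site policy:

        import urllib.robotparser

        # Two user-agent groups: a specific group for Googlebot and a catch-all (*).
        parser = urllib.robotparser.RobotFileParser()
        parser.parse([
            "User-agent: Googlebot",
            "Disallow: /private/",
            "",
            "User-agent: *",
            "Disallow: /",
        ])

        # Googlebot matches its own group, so only /private/ is blocked for it.
        print(parser.can_fetch("Googlebot", "https://example.com/page"))       # True
        print(parser.can_fetch("Googlebot", "https://example.com/private/x"))  # False

        # Any other crawler falls through to the * group, which blocks everything.
        print(parser.can_fetch("SomeOtherBot", "https://example.com/page"))    # False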

    Why it matters

    • robots.txt directives are applied per user-agent group
    • Server logs, filtered by user-agent, reveal crawler behavior and errors
    • The header answers “who is crawling what” on your site

    How to implement

    • Use robots.txt user-agent groups to give different crawlers different rules (as sketched under Definition above)
    • Don’t treat self-reported user-agents as security controls; verify important crawlers independently (see the verification sketch after this list)
    • Analyze logs to spot repeated crawling and error URLs (a log-analysis sketch follows below)
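
    Because the header is self-reported, any client can claim to be a major crawler. A minimal sketch of the double DNS lookup commonly used to verify Googlebot (reverse-resolve the IP, check the domain, forward-resolve back); error handling is simplified:

        import socket

        def is_verified_googlebot(ip: str) -> bool:
            """Double DNS lookup: reverse-resolve the IP, check the hostname's
            domain, then forward-resolve the hostname back to the same IP."""
            try:
                hostname = socket.gethostbyaddr(ip)[0]
            except OSError:
                return False
            # The PTR hostname must sit under a Google crawler domain.
            if not hostname.endswith((".googlebot.com", ".google.com")):
                return False
            try:
                # Forward confirmation: the hostname must resolve back to the IP.
                return ip in socket.gethostbyname_ex(hostname)[2]
            except OSError:
                return False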
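
    To answer “who is crawling what,” group log entries by user-agent and flag error responses. A minimal sketch assuming the Apache/Nginx Combined Log Format; the file name and regex are assumptions about your setup:

        import re
        from collections import Counter

        # Combined Log Format: ip - - [time] "METHOD path PROTO" status size "referer" "user-agent"
        LOG_LINE = re.compile(
            r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
            r'"\S+ (?P<path>\S+) [^"]*" '
            r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
        )

        hits = Counter()
        errors = Counter()
        with open("access.log") as log:  # hypothetical file name
            for line in log:
                m = LOG_LINE.match(line)
                if not m:
                    continue
                hits[m["agent"]] += 1
                # 4xx/5xx responses: track which URLs each agent is failing on.
                if m["status"].startswith(("4", "5")):
                    errors[(m["agent"], m["path"])] += 1

        print(hits.most_common(5))    # busiest user-agents by request volume
        print(errors.most_common(5))  # (user-agent, URL) pairs hitting errors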
