
    andymsun/audit-website

    Website Audit Skill

    Audit websites for SEO, technical, content, performance, and security issues using the squirrelscan CLI.

    squirrelscan provides a CLI tool, squirrel, available for macOS, Windows, and Linux. It performs extensive website auditing by emulating a browser and a search crawler, analyzing the website's structure and content against more than 140 rules.

    It produces a list of issues along with suggestions on how to fix them.

    Links

    • The squirrelscan website is at https://squirrelscan.com
    • Documentation (including rule references) is at https://docs.squirrelscan.com

    You can look up the docs for any rule with this template:

    https://docs.squirrelscan.com/rules/{rule_category}/{rule_id}

    Example:

    https://docs.squirrelscan.com/rules/links/external-links
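    As a sketch, filling in the template is plain string substitution; the category and id below are example values taken from the example above:

```shell
# Build a rule documentation URL from a category and a rule id.
# rule_category and rule_id here are example values, not a fixed list.
rule_category="links"
rule_id="external-links"
echo "https://docs.squirrelscan.com/rules/${rule_category}/${rule_id}"
```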

    What This Skill Does

    This skill enables AI agents to audit websites for over 140 rules in 20 categories, including:

    • SEO issues: Meta tags, titles, descriptions, canonical URLs, Open Graph tags
    • Technical problems: Broken links, redirect chains, page speed, mobile-friendliness
    • Performance: Page load time, resource usage, caching
    • Content quality: Heading structure, image alt text, content analysis
    • Security: Leaked secrets, HTTPS usage, security headers, mixed content
    • Accessibility: Alt text, color contrast, keyboard navigation
    • Usability: Form validation, error handling, user flow
    • Links: Checks for broken internal and external links
    • E-E-A-T: Expertise, Experience, Authority, Trustworthiness
    • User Experience: User flow, error handling, form validation
    • Mobile: Checks for mobile-friendliness, responsive design, touch-friendly elements
    • Crawlability: Checks for crawlability, robots.txt, sitemap.xml and more
    • Schema: Schema.org markup, structured data, rich snippets
    • Legal: Compliance with legal requirements, privacy policies, terms of service
    • Social: Open Graph, Twitter Cards, and validation of schemas, snippets, etc.
    • URL Structure: Length, hyphens, keywords
    • Keywords: Keyword stuffing
    • Content: Content structure, headings
    • Images: Alt text, color contrast, image size, image format
    • Local SEO: NAP consistency, geo metadata
    • Video: VideoObject schema, accessibility

    and more!

    The audit crawls the website, analyzes each page against audit rules, and returns a comprehensive report with:

    • Overall health score (0-100)
    • Category breakdowns (core SEO, technical SEO, content, security)
    • Specific issues with affected URLs
    • Broken link detection
    • Actionable recommendations

    When to Use

    Use this skill when you need to:

    • Analyze a website's health
    • Debug technical SEO issues
    • Fix all of the issues mentioned above
    • Check for broken links
    • Validate meta tags and structured data
    • Generate site audit reports
    • Compare site health before/after changes
    • Improve website performance, accessibility, SEO, security and more.

    Prerequisites

    This skill requires the squirrel CLI to be installed and available in your PATH.

    Installation

    If squirrel is not already installed, you can install it using:

    curl -fsSL https://squirrelscan.com/install | bash
    

    This will:

    • Download the latest release binary
    • Install to ~/.local/share/squirrel/releases/{version}/
    • Create a symlink at ~/.local/bin/squirrel
    • Initialize settings at ~/.squirrel/settings.json

    If ~/.local/bin is not in your PATH, add it to your shell configuration:

    export PATH="$HOME/.local/bin:$PATH"
    

    Windows Installation

    Install using PowerShell:

    irm https://squirrelscan.com/install.ps1 | iex
    

    This will:

    • Download the latest release binary
    • Install to %LOCALAPPDATA%\squirrel\
    • Add squirrel to your PATH

    If using Command Prompt, you may need to restart your terminal for PATH changes to take effect.

    Verify Installation

    Check that squirrel is installed and accessible:

    squirrel --version
    

    Setup

    Running squirrel init will set up a squirrel.toml configuration file in the current directory.

    Each project should have a squirrel project name for the database. By default this is the name of the website you audit, but you can set it yourself so that all audits for a project go into one database.

    You do this either on init with:

    squirrel init --project-name my-project
    

    or config:

    squirrel config set project.name my-project
    

    If there is no squirrel.toml in the directory you're running from, CREATE ONE with squirrel init and specify a project name with the -n parameter (infer the name from context).

    The project name identifies the project in the database and is used to generate the database name.

    The project data is stored in ~/.squirrel/projects/

    Usage

    Intro

    There are three processes you can run, and their results are all cached in the local project database:

    • crawl - run, refresh, or continue a crawl
    • analyze - analyze the crawl results
    • report - generate a report in the desired format (llm, text, console, html, etc.)

    The audit command is a wrapper around these three processes and runs them sequentially:

    squirrel audit https://example.com --format llm
    

    YOU SHOULD always prefer the llm format option - it was made for you and provides an exhaustive and compact output format.

    If the user doesn't provide a website to audit, infer the candidates from the local directory and from environment variables (e.g. linked Vercel projects, references in memory or in the code).

    If the directory you're running from provides a way to run or restart a local dev server, run the audit against that.

    If you discover more than one candidate website, prompt the user to choose which one to audit.

    If there is no website to discover - either local or on the web - ask the user which URL they would like to audit.

    You should PREFER to audit live websites - only there do we get a TRUE representation of the website and of performance or rendering issues.

    If you have both local and live websites to audit, prompt the user to choose, and SUGGEST they choose live.

    You can apply fixes from an audit on the live site against the local code.

    When planning, scope tasks so they can run concurrently as sub-agents to speed up fixes.

    When implementing fixes, take advantage of sub-agents to speed up implementation.

    When you finish, run typechecking and formatting against generated code if available in the environment (ruff for Python; biome and tsc for TypeScript; etc.).
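    The post-fix checks can be sketched as a small helper that maps available tools to the commands to run; the tool names and commands below are illustrative assumptions, not part of the skill itself:

```shell
# Hypothetical helper: given the tools available in the environment,
# print the check commands to run after applying fixes.
pick_checks() {
  for tool in "$@"; do
    case "$tool" in
      ruff)  echo "ruff check ."  ;;  # Python lint
      tsc)   echo "tsc --noEmit"  ;;  # TypeScript type check
      biome) echo "biome check ." ;;  # JS/TS format and lint
    esac
  done
}

# Detect availability first, e.g.: command -v ruff >/dev/null 2>&1
pick_checks ruff tsc
```

In a real run you would execute the printed commands rather than just echoing them.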

    Basic Workflow

    The audit process is two steps:

    1. Run the audit (saves to database, shows console output)
    2. Export report in desired format
    # Step 1: Run audit (default: console output)
    squirrel audit https://example.com
    
    # Step 2: Export as LLM format
    squirrel report <audit-id> --format llm
    

    Advanced Options

    Audit more pages:

    squirrel audit https://example.com --max-pages 200
    

    Force fresh crawl (ignore cache):

    squirrel audit https://example.com --refresh
    

    Resume interrupted crawl:

    squirrel audit https://example.com --resume
    

    Verbose output for debugging:

    squirrel audit https://example.com --verbose
    

    Common Options

    Audit Command Options

    | Option | Alias | Description | Default |
    | --- | --- | --- | --- |
    | --format <fmt> | -f <fmt> | Output format: console, text, json, html, markdown, llm | console |
    | --max-pages <n> | -m <n> | Maximum pages to crawl (max 500) | 500 |
    | --refresh | -r | Ignore cache, fetch all pages fresh | false |
    | --resume | - | Resume interrupted crawl | false |
    | --verbose | -v | Verbose output | false |
    | --debug | - | Debug logging | false |

    Report Command Options

    | Option | Alias | Description |
    | --- | --- | --- |
    | --format <fmt> | -f <fmt> | Output format: console, text, json, html, markdown, xml, llm |

    Output Formats

    Console Output (default)

    The audit command shows human-readable console output by default with colored output and progress indicators.

    LLM Format

    To get LLM-optimized output, use the report command with --format llm:

    squirrel report <audit-id> --format llm
    

    The LLM format is a compact XML/text hybrid optimized for token efficiency (40% smaller than verbose XML):

    • Summary: Overall health score and key metrics
    • Issues by Category: Grouped by audit rule category (core SEO, technical, content, security)
    • Broken Links: List of broken external and internal links
    • Recommendations: Prioritized action items with fix suggestions

    See OUTPUT-FORMAT.md for detailed format specification.

    Examples

    Example 1: Quick Site Audit with LLM Output

    # User asks: "Check squirrelscan.com for SEO issues"
    squirrel audit https://squirrelscan.com --format llm
    

    Example 2: Deep Audit for Large Site

    # User asks: "Do a thorough audit of my blog with up to 500 pages"
    squirrel audit https://myblog.com --max-pages 500 --format llm
    

    Example 3: Fresh Audit After Changes

    # User asks: "Re-audit the site and ignore cached results"
    squirrel audit https://example.com --refresh --format llm
    

    Example 4: Two-Step Workflow (Reuse Previous Audit)

    # First run an audit
    squirrel audit https://example.com
    # Note the audit ID from output (e.g., "a1b2c3d4")
    
    # Later, export in different format
    squirrel report a1b2c3d4 --format llm
    

    Output

    On completion, give the user a summary of all the changes you made.

    Troubleshooting

    squirrel command not found

    If you see this error, squirrel is not installed or not in your PATH.

    Solution:

    1. Install squirrel: curl -fsSL https://squirrelscan.com/install | bash
    2. Add to PATH: export PATH="$HOME/.local/bin:$PATH"
    3. Verify: squirrel --version

    Permission denied

    If squirrel is not executable:

    chmod +x ~/.local/bin/squirrel
    

    Crawl timeout or slow performance

    For very large sites, the audit may take several minutes. Use --verbose to see progress:

    squirrel audit https://example.com --format llm --verbose
    

    Invalid URL

    Ensure the URL includes the protocol (http:// or https://):

    # ✗ Wrong
    squirrel audit example.com
    
    # ✓ Correct
    squirrel audit https://example.com
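    A pre-flight check along these lines can normalize a bare domain before calling squirrel; defaulting to https:// is an assumption, adjust as needed:

```shell
# Hypothetical pre-flight normalization: prepend https:// when no protocol is given.
normalize_url() {
  case "$1" in
    http://*|https://*) echo "$1" ;;
    *) echo "https://$1" ;;
  esac
}

normalize_url example.com          # -> https://example.com
normalize_url https://example.com  # -> https://example.com
```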
    

    How It Works

    1. Crawl: Discovers and fetches pages starting from the base URL
    2. Analyze: Runs audit rules on each page
    3. External Links: Checks external links for availability
    4. Report: Generates LLM-optimized report with findings

    The audit is stored in a local database and can be retrieved later with squirrel report commands.

    Additional Resources

    • Output Format Reference: OUTPUT-FORMAT.md
    • squirrelscan Documentation: https://docs.squirrelscan.com
    • CLI Help: squirrel audit --help