How can we help?
Guides, feature explanations, and answers to common questions — everything you need to get the most out of CrawlHunt.
Getting Started
3 articles
Creating your CrawlHunt account takes less than 60 seconds. You can sign up with an email and password, or use Google OAuth for one-click access — no separate password needed.
Step 1 — Visit the sign-up page
Go to crawlhunt.com/signup. You will see two options:
- Email & Password — enter your full name, a valid email address, and a secure password (minimum 8 characters; a mix of letters, numbers, and symbols is recommended).
- Continue with Google — click the Google button to create an account using your existing Google account. CrawlHunt only requests your name and email address.
Step 2 — Verify your email address
After submitting the form, CrawlHunt sends a verification email to the address you used. Open the email and click the Verify my email button. The link is valid for 24 hours.
Step 3 — Log in and reach your Dashboard
Once verified, you are redirected to the login page. Sign in with your credentials (or click Continue with Google) and you will land on the CrawlHunt Dashboard, ready to add your first project.
A Project in CrawlHunt represents one website you want to audit. You can create as many projects as your plan allows and configure each one independently.
Step 1 — Open the new project modal
From the Dashboard, click the + Add New Project button in the top-right corner of the page.
Step 2 — Enter your website URL
In the Website URL field, paste the full root URL of the site you want to audit — for example https://example.com. Always include the scheme (https://). CrawlHunt will recursively discover and follow links starting from this URL.
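To make "recursively discover and follow links" concrete, here is a minimal breadth-first crawl sketch in Python using only the standard library. It illustrates the concept, not CrawlHunt's actual crawler; the same-host rule, timeout, and page limit are assumptions.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlsplit
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(root, page_limit=50):
    """Breadth-first discovery of internal links, starting from the root URL."""
    host = urlsplit(root).netloc
    seen, queue = {root}, deque([root])
    while queue and len(seen) < page_limit:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable pages are skipped rather than aborting the crawl
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            # Stay on the same host and never revisit a discovered URL
            if urlsplit(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return seen
```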
Step 3 — Configure crawl settings
- Crawl Depth / Page Limit — toggle All pages to scan every reachable URL, or enter a numeric limit (1–1,000 pages). For large sites on lower-tier plans, start with a 50–100 page limit to stay within your monthly quota.
- Crawl Policy — controls which links CrawlHunt will follow:
• Crawl everything — follows all discovered internal links.
• Follow robots.txt — respects your site's exclusion rules (see the sketch after this list).
• Follow sitemap.xml — only crawls URLs listed in your XML sitemap.
- Crawl Interval — schedule automatic re-crawls to keep audit data fresh: once, daily, every 2 days, every 4 days, weekly, or monthly.
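The Follow robots.txt policy corresponds to the standard robots exclusion check. A minimal sketch with Python's built-in urllib.robotparser (the CrawlHuntBot user-agent string is hypothetical):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetch and parse the site's exclusion rules

# Under this policy, a crawler only follows URLs the rules allow
if rp.can_fetch("CrawlHuntBot", "https://example.com/private/page"):
    print("allowed to crawl")
else:
    print("skipped per robots.txt")
```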
Step 4 — Choose when to start
- Run now — the crawl queues immediately after you click Create Project.
- Schedule for later — pick a specific date and time to start the first crawl.
Step 5 — Monitor live progress
Click Create Project. If Run now was selected, CrawlHunt immediately begins crawling. The Dashboard row shows a live progress bar and updating page count. The status badge changes from Crawling… to Completed when the crawl finishes.
Once a crawl completes, click View on any project row to open the full Audit Results page — your central command centre for SEO health data.
The Overview tab — your site's health at a glance
The first tab you see is Overview, which summarises the most important metrics:
- SEO Score — a composite score from 0–100 calculated across all crawled pages and issue categories. Aim for 80+.
- Total Pages Crawled — number of unique URLs discovered and analysed. Click this card to jump to the Crawled Pages tab.
- Broken Links — pages or resources returning 4xx/5xx status codes. These hurt both UX and rankings.
- HTTPS Coverage — percentage of pages served securely over HTTPS. Any gap is a ranking signal issue.
- Mobile-Friendly — pages passing basic mobile viewport and responsive design checks.
- Avg Response Time — mean time to first byte (TTFB) across all crawled pages in milliseconds.
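Avg Response Time (TTFB) can be approximated yourself: time how long the server takes to return the status line and headers. A rough standard-library sketch (it includes connection setup, so treat the result as an estimate):

```python
import time
from http.client import HTTPSConnection

def approx_ttfb_ms(host, path="/"):
    conn = HTTPSConnection(host, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    conn.getresponse()  # returns once the status line and headers arrive
    elapsed = (time.perf_counter() - start) * 1000
    conn.close()
    return elapsed

print(f"{approx_ttfb_ms('example.com'):.0f} ms")
```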
Score breakdown chart
Below the summary cards is a Score Breakdown chart showing how each SEO category (Meta Tags, Content, Performance, Links, Security, etc.) contributes to the overall score. Categories in red or amber need the most attention.
Navigating the five result tabs
- Issues — every SEO problem found, filterable by category and severity.
- Crawled Pages — every page with its individual score and HTTP status.
- Resources — CSS, JS, images, fonts, and other assets with their status and size.
- Page Insights — Core Web Vitals and PageSpeed scores for any individual URL.
- Google Search Console — live GSC data (requires Google account connection).
Dashboard
3 articles
The Dashboard is your home screen in CrawlHunt. It lists every project you've created in a sortable table so you can see the health, activity, and status of all your websites at a glance.
Column reference
- Project — the project name (defaults to the domain) and its full URL. Click the name to open the Audit Results page directly.
- Crawl Policy — the policy used during the last crawl: Crawl everything, Follow robots.txt, or Follow sitemap.xml.
- Last Update — date and time the most recent crawl completed. Hover for the exact timestamp.
- Pages Crawled — e.g. "47 / 100" means 47 pages were scanned against a configured limit of 100. If All pages is enabled it shows the total count discovered.
- Resources Crawled — number of CSS files, JS files, images, fonts, and other assets discovered across all crawled pages.
- Site Health — overall SEO health score (0–100) from the latest audit, colour-coded: green (80+), amber (50–79), red (below 50).
- Errors — count of Critical SEO issues requiring immediate action.
- Warnings — count of advisory issues that should be addressed but are less urgent.
- Crawlability — percentage of pages that could be successfully accessed and crawled.
- HTTPS — percentage of pages served over a secure HTTPS connection.
- Internal Linking — a health indicator for your internal link structure based on orphan pages and link-depth analysis.
Every project row has an Actions column on the right-hand side. The available buttons change depending on whether a crawl is currently running.
When the project is idle
- View — opens the full Audit Results page for the most recently completed crawl. This button is always visible (even while a new crawl is in progress — it shows the previous results until the new one finishes).
- Run Again — triggers a fresh crawl immediately using the same configuration. New results overwrite the current audit data once the crawl completes. Use this after making SEO fixes to verify your improvements.
- Delete — permanently removes the project and all its historical crawl data. This action cannot be undone. A confirmation prompt appears before deletion proceeds.
When a crawl is actively running
- View — still available; displays the previous crawl results while the new one is in progress.
- Stop — replaces the Run Again button while crawling. Click it to pause the active crawl mid-way. Partial data collected up to that point is saved and displayed.
Every CrawlHunt plan includes a monthly page crawl allowance. The Crawl Limit Warning indicator only appears on the Dashboard when your usage is approaching or has reached that limit — it is not visible during normal usage.
When the indicator appears
- Approaching the limit — a yellow Crawl Limit Warning badge appears in the Dashboard header when you are close to exhausting your monthly quota. Hover over it to see exactly how many URLs remain, plus your used vs. total count.
- Limit reached — the badge remains visible and new crawl requests are paused until the next billing cycle resets your counter.
How the quota works
- Every page scanned during a crawl counts as one unit against your monthly quota, regardless of whether that page has changed since the last crawl.
- Resources (CSS, JS, images, fonts) do not count against your page quota.
- The counter resets at the start of each billing period — on the same calendar day of the month you first subscribed.
What happens when you reach the limit
- Any crawl already in progress will complete for pages already queued.
- New crawl requests (manual or scheduled) are held in a queue and released automatically when the next billing cycle resets your counter.
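To make the accounting concrete, here is an illustrative sketch of the rules above: one unit per scanned page, resources free, and new requests queued once the limit is hit. The class and method names are hypothetical, not CrawlHunt internals.

```python
from dataclasses import dataclass, field

@dataclass
class CrawlQuota:
    monthly_limit: int
    pages_used: int = 0
    queued_crawls: list = field(default_factory=list)

    def record_pages(self, count: int) -> None:
        # Every scanned page costs one unit; resources cost nothing.
        self.pages_used += count

    def request_crawl(self, project: str) -> str:
        if self.pages_used >= self.monthly_limit:
            self.queued_crawls.append(project)  # held until the cycle resets
            return "queued"
        return "started"

    def reset_cycle(self) -> list:
        # On the billing day the counter resets and queued crawls are released.
        self.pages_used = 0
        released, self.queued_crawls = self.queued_crawls, []
        return released
```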
Audit Results
4 articles
The Issues tab is the most action-oriented section of the audit. It aggregates every SEO problem found across all crawled pages into a single filterable list so you can prioritise and fix them efficiently.
Issue severity levels
- Critical — issues that directly harm your search ranking or prevent Google from indexing your pages (e.g. missing title tags, broken canonical URLs, noindex on important pages).
- Warning — issues that are sub-optimal and should be improved but are not immediately damaging (e.g. title tags that are too long, images without alt text).
- Info — informational notes about your site's configuration that may require review but are not inherently problematic.
Using the filter bar
- Category — filter by issue type: Meta Tags, Content, Links, Performance, Mobile, Security, Structured Data, etc.
- Severity — show only Critical, Warning, or Info issues.
- Keyword / URL search — type any text to instantly narrow the list to matching issues or pages.
Fixing an issue step by step
- Click any issue row to expand its full details — you'll see the affected URL, the current problematic value, and a plain-English explanation of why it matters for SEO.
- Click the AI Fix button to generate an AI-powered suggestion tailored to that specific page and issue type.
- Review the suggestion — the original value and the suggested replacement appear side by side.
- Accept the fix (marks the issue as resolved) or dismiss it and write your own correction.
Applying fixes — platform connection required
The AI Fix, Manual Fix, and other fix action buttons are always visible and clickable. However, when you click a fix button, CrawlHunt checks whether you have a connected platform that matches the project's domain:
- No platform connected — a Platform Connection Required modal appears with the message: "You need to connect a Shopify or WordPress platform to use this action." Click Connect Platform to go to the Integrations page and link your account.
- Platform connected but domain doesn't match — the same modal appears. The connected platform must belong to the same domain as the project you're auditing. For example, if you connected a Shopify store at mystore.myshopify.com, fix buttons will only work on a project crawling that exact domain (a matching sketch follows this list).
- Matching platform connected — the fix is generated and applied without interruption.
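Conceptually, the check is a hostname comparison between the project URL and the connected platform. A minimal sketch (exact matching rules, such as www or subdomain handling, are assumptions):

```python
from urllib.parse import urlsplit

def hostname(url: str) -> str:
    # urlsplit needs a scheme to locate the host part
    if "//" not in url:
        url = "https://" + url
    return urlsplit(url).hostname or ""

def platform_matches(project_url: str, platform_domain: str) -> bool:
    return hostname(project_url) == hostname(platform_domain)

platform_matches("https://mystore.myshopify.com/products", "mystore.myshopify.com")  # True
platform_matches("https://example.com", "mystore.myshopify.com")                     # False
```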
The Crawled Pages tab lists every URL discovered and scanned during the crawl. Use it to understand how Google sees each individual page and quickly spot pages with redirect chains, poor SEO scores, or high issue counts.
Page table columns
- URL — the full page address. Click the external link icon next to it to open the page in a new browser tab.
- Code — the raw HTTP status code returned by the server (e.g. 200, 301, 404).
- Status — a human-readable label for the response code: OK, Moved Permanently, Not Found, etc. 200 statuses appear in green; redirects in amber; errors in red.
- Redirect URL — if the page returned a 3xx redirect, the destination URL is shown here. Empty for non-redirect pages.
- Content Type — the MIME type of the response, e.g. text/html. Non-HTML resources (PDFs, images) that were discovered as links are included here.
- Images — number of <img> elements found on the page.
- H1s — count of <h1> tags on the page. More than one H1 is flagged as a Warning.
- Response — server response time (TTFB) in seconds for this specific page.
- SEO — the individual SEO score (0–100) for this page. Scores below 50 appear in red, 50–79 in amber, 80+ in green.
- Inlinks — number of other crawled pages that link to this page internally.
- Outlinks — number of links going out from this page to other pages (internal and external).
- Issues — total SEO issues found on this page alone (sum of Critical + Warning + Info).
Toolbar actions
- Search pages… — filter the table in real time by typing any part of a URL.
- Filter icon — apply advanced column filters (e.g. show only pages with a 404 status).
- Export as CSV — download the full pages table as a spreadsheet.
- Export as PDF — export a formatted PDF report of the crawled pages.
- Generate llms.txt file — generates an llms.txt file listing all crawled URLs, useful for AI/LLM indexing purposes (a generation sketch follows this list).
- Run Again — triggers a fresh crawl immediately using the same project settings.
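llms.txt is a plain-text convention for pointing AI crawlers at your content. As a rough illustration of what such an export contains, a minimal generator might look like this; the exact header format CrawlHunt emits is an assumption:

```python
def write_llms_txt(site_name: str, urls: list[str], path: str = "llms.txt") -> None:
    """Write a simple llms.txt listing every crawled URL."""
    lines = [f"# {site_name}", "", "## Pages", ""]
    lines += [f"- {url}" for url in sorted(set(urls))]
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")

write_llms_txt("Example Site", [
    "https://example.com/",
    "https://example.com/about",
])
```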
Page detail panel
Click any row to expand a Page Details panel below the table. The panel is split into seven tabs:
- URL Details — a summary card view showing Status Code, Content Type, Response Time, SEO Score, Indexability, H1 Count, Images, Inlinks, Outlinks, Issues, Errors, Warnings, and whether the page was JS-rendered.
- Inlinks — all internal pages that link to this URL.
- Outlinks — all links found on this page pointing to other URLs.
- SEO — a full list of SEO issues detected on this page, with a count badge on the tab.
- Redirect — the full redirect chain for this URL if any redirects were followed.
- Canonical — the canonical URL declared on this page and whether it matches the crawled URL. A count badge highlights canonical mismatches.
- Schema — any structured data (JSON-LD, Microdata) detected on the page.
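JSON-LD detection of the kind the Schema tab performs boils down to finding script blocks of type application/ld+json and parsing their contents. A standard-library sketch:

```python
import json
from html.parser import HTMLParser

class JSONLDExtractor(HTMLParser):
    """Collects the contents of <script type="application/ld+json"> blocks."""
    def __init__(self):
        super().__init__()
        self.in_block = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self.in_block = True
            self.blocks.append("")

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_block = False

    def handle_data(self, data):
        if self.in_block:
            self.blocks[-1] += data

def extract_schema(html: str) -> list:
    parser = JSONLDExtractor()
    parser.feed(html)
    out = []
    for block in parser.blocks:
        try:
            out.append(json.loads(block))
        except json.JSONDecodeError:
            pass  # malformed JSON-LD is skipped
    return out
```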
The Resources tab lists every external and internal asset loaded by your pages — images, stylesheets, JavaScript files, fonts, videos, and more. Analysing resources helps you find performance bottlenecks and broken asset references that hurt both user experience and SEO.
Asset type filters
- Images — PNG, JPEG, WebP, SVG, GIF. Look for large unoptimised files that slow page load times.
- CSS — stylesheets. Render-blocking CSS can delay page rendering and hurt Core Web Vitals.
- JavaScript — scripts. Large or parser-blocking JS is one of the most common performance problems.
- Fonts — custom web fonts. Excessive font files add significant load time.
- Other — videos, iframes, documents, and miscellaneous assets.
What to prioritise
- Broken assets (404 status) — missing images or scripts that cause blank content, layout breaks, or JavaScript errors on your pages.
- Images over 200 KB — compress large images or convert them to WebP format for faster load times.
- Third-party scripts — scripts loaded from external domains (analytics, ads, chat widgets) that may add latency outside your control.
- Mixed content — HTTP resources loaded on HTTPS pages trigger browser security warnings and may be silently blocked, causing missing images or broken functionality.
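Mixed content is mechanical to detect: any resource referenced over plain http:// from a page served over https://. A minimal sketch:

```python
def find_mixed_content(page_url: str, resource_urls: list[str]) -> list[str]:
    """Return resources that would trigger mixed-content warnings on this page."""
    if not page_url.startswith("https://"):
        return []  # only HTTPS pages can have mixed content
    return [r for r in resource_urls if r.startswith("http://")]

find_mixed_content(
    "https://example.com/",
    ["https://example.com/app.js", "http://example.com/logo.png"],
)  # -> ['http://example.com/logo.png']
```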
The Page Insights tab runs a full PageSpeed Insights analysis powered by Google's Lighthouse engine for any URL in your crawl, giving you lab-measured performance data alongside real-user field data in one place.
How to run a Page Insights analysis
- Open the Page Insights tab within any Audit Results page.
- Select a URL from the dropdown (all crawled pages are listed).
- Choose Mobile or Desktop analysis mode.
- Click Analyse Page. Results typically appear within 10–30 seconds.
Core Web Vitals explained
Google uses Core Web Vitals as a direct search ranking signal. Here is what each metric measures and what to aim for:
- LCP — Largest Contentful Paint — how quickly the largest visible element (hero image, main heading) renders. Target: under 2.5 seconds.
- INP — Interaction to Next Paint — how fast the page responds to user interactions like clicks and taps. Target: under 200 ms.
- CLS — Cumulative Layout Shift — measures unexpected layout shifts (e.g. a late-loading image pushing text down). Target: under 0.1.
- FCP — First Contentful Paint — when the first text or image appears on screen. Target: under 1.8 seconds.
- TTFB — Time to First Byte — server response latency. Target: under 800 ms.
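These targets come from Google's published thresholds, which also define a "poor" boundary for each metric. A small classifier using those public cut-offs (seconds for LCP and FCP, milliseconds for INP and TTFB, unitless for CLS):

```python
# (good_max, poor_min) per Google's published Core Web Vitals thresholds
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless
    "FCP": (1.8, 3.0),    # seconds
    "TTFB": (800, 1800),  # milliseconds
}

def rate(metric: str, value: float) -> str:
    good_max, poor_min = THRESHOLDS[metric]
    if value <= good_max:
        return "Good"
    if value > poor_min:
        return "Poor"
    return "Needs Improvement"

rate("LCP", 2.1)   # -> 'Good'
rate("INP", 350)   # -> 'Needs Improvement'
```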
AI Fixes
3 articles
AI Fixes are CrawlHunt's most powerful feature. They use large language models to analyse your page's content, context, and existing SEO metadata, then generate tailored, optimised replacements for common SEO problems — in seconds.
What AI Fixes can generate
- Meta titles — compelling, keyword-rich titles within the ideal 50–60 character range.
- Meta descriptions — descriptive, conversion-oriented summaries of 120–160 characters (a quick length checker follows this list).
- Image alt text — descriptive alt attributes that are both accessible and SEO-friendly.
- Open Graph tags — optimised social sharing titles and descriptions.
- Heading suggestions — improved H1 structures aligned to the page topic.
- Canonical URL recommendations — correct canonical tag values for duplicate content issues.
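The character ranges quoted above are easy to sanity-check before accepting a suggestion. A quick validator using those ranges:

```python
def check_meta(title: str, description: str) -> list[str]:
    """Flag metadata that falls outside the ideal character ranges."""
    problems = []
    if not 50 <= len(title) <= 60:
        problems.append(f"title is {len(title)} chars (ideal: 50-60)")
    if not 120 <= len(description) <= 160:
        problems.append(f"description is {len(description)} chars (ideal: 120-160)")
    return problems

check_meta("Short", "Too brief.")
# -> ['title is 5 chars (ideal: 50-60)', 'description is 10 chars (ideal: 120-160)']
```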
How the review process works
CrawlHunt will never update your website automatically. Every AI Fix requires human review:
- CrawlHunt generates the suggestion and displays it alongside the original value in the Issues panel.
- Review the suggestion — you can edit the text directly in the field if you want to tweak it.
- Click Accept to mark the fix as approved, or Dismiss to skip it.
- Accepted fixes can then be pushed to your CMS (WordPress or Shopify) with one click, or exported for manual implementation.
Bulk AI Fixes lets you generate AI suggestions for all open issues simultaneously instead of clicking AI Fix on each row individually — the fastest way to process a freshly crawled site with many issues.
How to run a Bulk AI Fix
- Open the Issues tab for any project.
- Optionally apply filters to scope the batch (e.g. only Critical issues, or only Meta Tags category issues).
- Click the Bulk AI Fixes button at the top of the issues list.
- A confirmation modal shows how many fixes will be generated and how many AI credits will be consumed. Confirm to proceed.
- A progress bar tracks the batch processing in real time. You can navigate away and return — fixes continue generating in the background.
Reviewing bulk-generated fixes
Once processing completes, all generated suggestions appear inline in the issues list. You have three options:
- Accept All — approves every suggestion at once. Best used after spot-checking a sample of the output quality.
- Reject All — dismisses all suggestions (e.g. if quality was unsatisfactory).
- Cherry-pick — review each fix individually and accept or dismiss row by row.
Once you've accepted AI fixes, you can push them directly to your CMS using CrawlHunt's platform integrations — no manual copy-pasting or developer help required.
Requirements
- A connected WordPress or Shopify integration (see the Integrations section for setup steps).
- At least one accepted AI fix for a page belonging to the connected site.
Pushing a fix to your CMS
- Accept the AI fix for the issue (or use Bulk Accept).
- In the resolved fix row, click the Push to WordPress (or Push to Shopify) button that appears once a fix is accepted.
- CrawlHunt calls the platform's API and updates the relevant post, product, or page meta fields automatically.
- A green confirmation toast appears confirming the push was successful. The fix row is marked Published.
What gets updated
- WordPress — SEO title, meta description (via Yoast SEO or Rank Math if installed, otherwise native WordPress fields), and image alt text via the WordPress REST API (a sketch follows this list).
- Shopify — product SEO title, meta description, and image alt text via the Shopify Admin API.
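For readers curious what the WordPress case involves under the hood, here is a hedged sketch against WordPress's core REST API, using the requests library and an Application Password. The site URL, post and media IDs, and credentials are placeholders, and CrawlHunt's plugin may use different endpoints; Yoast and Rank Math meta fields in particular are exposed by those plugins rather than by core.

```python
import requests

WP = "https://yoursite.com/wp-json/wp/v2"
AUTH = ("admin", "xxxx xxxx xxxx xxxx")  # WordPress Application Password (placeholder)

# Update a post's title via the core REST API
requests.post(f"{WP}/posts/123", auth=AUTH,
              json={"title": "New SEO title"}).raise_for_status()

# Update a media item's alt text via the core REST API
requests.post(f"{WP}/media/456", auth=AUTH,
              json={"alt_text": "Red trail shoe on gravel"}).raise_for_status()
```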
Google Search Console
2 articles
CrawlHunt can display your Google Search Console performance data — clicks, impressions, indexing status, Core Web Vitals field data, and more — directly inside the audit results page, so you never have to switch tabs.
Step 1 — Open the GSC section
In the left sidebar, click Google Search Console. If you haven't connected yet, you'll see a prompt to link your Google account.
Step 2 — Authorise via Google OAuth
- Click Connect Google Account. You are redirected to Google's OAuth consent screen.
- Select the Google account that has access to your Search Console properties.
- Review the requested permissions — CrawlHunt only requests read-only access to your GSC data.
- Click Allow. You are redirected back to CrawlHunt with a green confirmation message.
Step 3 — Select your GSC property
After connecting, CrawlHunt fetches all verified Search Console properties linked to your Google account. Select the property that matches the project you want to analyse. If your site is verified as both https://example.com and https://www.example.com, choose the one that matches your canonical domain.
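For reference, read-only access of this kind maps to the Search Console API. A minimal query with google-api-python-client might look roughly like this; credential setup is omitted, and the property URL and dates are placeholders:

```python
from googleapiclient.discovery import build

# `creds` is an authorised google.oauth2 Credentials object carrying the
# https://www.googleapis.com/auth/webmasters.readonly scope (setup omitted)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://example.com/",
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-28",
        "dimensions": ["query", "page"],
        "rowLimit": 25,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"], row["clicks"], row["impressions"])
```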
Once connected, CrawlHunt surfaces seven distinct Google Search Console views directly inside your audit results. Here is what each contains and when to use it:
Available views
- Overview — headline metrics: total clicks, total impressions, average click-through rate (CTR), and average search position over the last 28 days. Great for a quick pulse-check on organic performance.
- Performance — detailed breakdown by query (keyword) and landing page. Filter by country, device, and date range. Use this to identify your best-performing pages and keywords with untapped potential.
- Keyword Rankings — track keyword positions over time. Spot keywords trending up or down and identify page-2 keywords (positions 11–20) that are prime candidates for content improvement to push onto page 1.
- Pages Indexing — shows which pages are indexed by Google, which are excluded, and the specific reason for each exclusion (e.g. noindex tag, 404 Not Found, Crawled — currently not indexed). Fixing indexing gaps is often a quick win for organic traffic.
- Sitemaps — lists all XML sitemaps submitted to Google Search Console, their last fetch time, and the ratio of submitted vs. indexed URLs. A low indexed ratio may indicate content quality or duplicate content issues.
- Core Web Vitals — field data collected from real Chrome users visiting your site, split by mobile and desktop and segmented into Good / Needs Improvement / Poor. Unlike Page Insights (which is lab data), this reflects real-world performance against Google's thresholds.
- Security & Manual Actions — any active Google penalties (manual actions) applied to your site and security issues detected by Google (e.g. hacked content, malware, deceptive pages). A clean result here is critical for maintaining search visibility.
Integrations
2 articles
The WordPress integration lets CrawlHunt push accepted AI fixes — meta titles, meta descriptions, image alt text, and more — directly to your WordPress posts and pages. The setup starts inside CrawlHunt and finishes with a plugin installation on your WordPress site.
Step 1 — Add the integration in CrawlHunt
- In the CrawlHunt web app, navigate to the Integrations tab in the sidebar.
- Click the Add Integration button.
- In the modal that appears, select WordPress.
- Enter your WordPress site URL (e.g. https://yoursite.com) and confirm.
Step 2 — Download and install the CrawlHunt plugin
- After entering your URL, CrawlHunt redirects you to the CrawlHunt WordPress plugin page.
- Download the plugin .zip file from that page.
- In your WordPress admin panel, go to Plugins → Add New → Upload Plugin.
- Choose the downloaded .zip file and click Install Now, then Activate.
What the integration can update
- Post and page SEO titles and meta descriptions (via Yoast SEO or Rank Math if installed, otherwise native WordPress SEO fields)
- Image alt text for media library items
- Open Graph title and description tags
The Shopify integration connects your online store to CrawlHunt, enabling one-click publishing of AI-generated SEO fixes to your product and collection pages. The entire setup is initiated from inside the CrawlHunt web app.
Step 1 — Add the integration in CrawlHunt
- In the CrawlHunt web app, navigate to the Integrations tab in the sidebar.
- Click the Add Integration button.
- In the modal that appears, select Shopify.
- Enter your Shopify store name — the part of your .myshopify.com URL before .myshopify.com, without the https:// prefix. For example, for https://quickstart-bac0fc57.myshopify.com enter quickstart-bac0fc57.
Step 2 — Install CrawlHunt on your Shopify store
- After entering your store URL, CrawlHunt redirects you directly to your Shopify store's app install screen.
- Review the permissions requested by CrawlHunt and click Install to complete the installation.
- Once installed, you are redirected back to CrawlHunt with the integration confirmed as active.
What happens after connecting
- Accepted AI fixes can be pushed to product SEO titles, meta descriptions, and image alt text directly from CrawlHunt without touching the Shopify admin.
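As background, Shopify stores product SEO titles and descriptions as metafields in the global namespace (title_tag and description_tag). Here is a hedged sketch of a push via the REST Admin API; the shop domain, API version, token, and product ID are placeholders, and CrawlHunt's app may work differently:

```python
import requests

SHOP = "quickstart-bac0fc57.myshopify.com"
TOKEN = "shpat_..."          # Admin API access token (placeholder)
PRODUCT_ID = 1234567890      # placeholder

resp = requests.post(
    f"https://{SHOP}/admin/api/2024-01/products/{PRODUCT_ID}/metafields.json",
    headers={"X-Shopify-Access-Token": TOKEN},
    json={"metafield": {
        "namespace": "global",
        "key": "title_tag",  # the product's SEO title
        "type": "single_line_text_field",
        "value": "Handmade Ceramic Mug | Example Store",
    }},
)
resp.raise_for_status()
```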
Billing & Plans
2 articles
CrawlHunt offers plans for solo developers, growing businesses, and large agencies. All plans support unlimited projects — the limits are on monthly page crawls and AI fix credits.
Plan overview
- Free — a limited monthly page crawl quota, no AI fixes, manual data export only. Ideal for evaluating CrawlHunt on a small personal or portfolio site.
- Starter — increased crawl quota, AI fixes enabled with a monthly credit allowance, integrations available. Best for small business owners managing 1–5 websites.
- Pro — substantially higher crawl and AI credit limits, priority crawl queue, advanced GSC views (Keyword Rankings, Page Indexing, Core Web Vitals), and scheduled automatic crawls. Recommended for SEO professionals and marketing teams.
- Agency — the highest limits across all categories, white-label PDF audit reports, team member seats so multiple users can access the same account, and dedicated priority support. Designed for agencies managing 50+ client sites.
For up-to-date pricing and a full feature comparison table, visit crawlhunt.com/#pricing.
You can upgrade, downgrade, or cancel your CrawlHunt subscription at any time from the Billing & Subscription page in the sidebar.
How to upgrade your plan
- In the sidebar, click Billing & Subscription.
- Your current plan, next renewal date, and usage metrics are displayed at the top of the page.
- Click Change Plan and select the plan you want to switch to.
- Review the charge summary — if upgrading mid-cycle you pay only a prorated amount for the remaining days in your billing period (see the worked example after these steps).
- Confirm your payment method (or add a new card) and click Confirm Upgrade.
- Your new plan limits take effect immediately.
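Proration is simple arithmetic: the price difference scaled by the fraction of the billing cycle remaining. An illustrative calculation (exact rounding and Stripe's proration behaviour may differ):

```python
def prorated_charge(old_price: float, new_price: float,
                    days_left: int, days_in_cycle: int) -> float:
    """Charge only the price difference for the remaining days of the cycle."""
    return round((new_price - old_price) * days_left / days_in_cycle, 2)

# Upgrading from a $29 to a $79 plan with 12 of 30 days left in the cycle
prorated_charge(29, 79, 12, 30)  # -> 20.0
```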
Managing your payment method
Click Update Payment Method to add or replace a credit or debit card. CrawlHunt uses Stripe's hosted payment form — your full card number is never stored on CrawlHunt's servers.
Viewing and downloading invoices
The invoice table below your plan card lists every past charge with its date, amount, and payment status. Click Download PDF on any row to save a receipt for your records or accounting software.
Cancelling your subscription
Click Cancel Subscription at the bottom of the Billing page. Your access and data remain fully intact until the end of the current paid period. Account data is retained for 30 days after expiry before permanent deletion.
Settings
2 articles
Go to Settings → Account from the sidebar to manage your personal profile details.
Updating your display name & profile picture
- Click your current profile picture to open the image upload dialog.
- Select a new image (JPG or PNG, max 2 MB). Your avatar appears in the dashboard header and in any exported audit reports.
- Update your Display Name in the text field.
- Click Save Changes.
Changing your email address
- Enter your new email address in the Email field.
- Click Save Changes.
- A verification email is sent to the new address. Click the confirmation link to activate the change. Your old email remains active until the new one is verified.
Changing your password
- Scroll to the Change Password section.
- Enter your current password to confirm your identity.
- Enter your new password, then re-enter it in the confirmation field.
- Click Update Password. You will not be logged out of current sessions.
Two-factor authentication (2FA) adds a second verification step to your login. Even if someone obtains your password, they cannot sign in without also having your authenticator app. We strongly recommend enabling 2FA on all CrawlHunt accounts.
Step 1 — Install an authenticator app
If you don't already have one, install a TOTP authenticator app on your phone:
- Google Authenticator (iOS / Android) — simple and widely supported.
- Authy (iOS / Android / Desktop) — supports encrypted cloud backup of your 2FA codes.
- Microsoft Authenticator (iOS / Android)
- 1Password or Bitwarden — if you already use one of these password managers, they have built-in TOTP support.
Step 2 — Enable 2FA in CrawlHunt
- Go to Settings → Security.
- Click Enable Two-Factor Authentication.
- A QR code is displayed on screen alongside a manual text key (for apps that don't support QR scanning).
Step 3 — Scan the QR code and verify
- Open your authenticator app and tap + Add account — then choose Scan a QR code.
- Point your camera at the QR code on screen. The app will add a CrawlHunt entry and begin generating 6-digit codes that refresh every 30 seconds.
- Enter the current 6-digit code from your app into the verification field in CrawlHunt.
- Click Verify and Enable. 2FA is now active on your account.
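For the curious: the QR code simply encodes a shared secret, and the 6-digit codes are standard TOTP (RFC 6238) values derived from that secret and the current time. You can reproduce the mechanism with the pyotp library; the secret below is a throwaway example:

```python
import pyotp

secret = pyotp.random_base32()   # what the QR code / manual key encodes
totp = pyotp.TOTP(secret)        # 6 digits, 30-second period by default

print(totp.now())                # the current 6-digit code
print(totp.verify(totp.now()))   # True — this is what "Verify and Enable" checks
```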
Backup codes
After enabling 2FA, CrawlHunt generates a set of one-time backup codes. Download and store these in a safe place — your password manager, or printed in a secure location. If you ever lose access to your authenticator app, you can use a backup code to sign in and then disable 2FA.
Can't find what you're looking for? Open a support ticket and our team will help.