
Getting started
Tasks (How do I...?)
- find and display my site's broken links
- locate a broken link
- limit my crawl using blacklisting / whitelisting
- export an XML sitemap (Plus and Pro)
- use canonical href to exclude duplicates from my XML sitemap (Plus and Pro)
- find missing meta tags (Pro)
- find duplicate content (same content, different URL) (Pro)
- check the spelling of a website (Pro)
Reference
- Limitations
- Settings
- Blacklists / Whitelists (Do not check / Only follow / Do not follow)
- Number of threads
- Timeout and Delay
- Check for broken images / Check linked js and css files
- Ignore external
- Don't follow 'nofollow'
- Check links on error pages
- Treat subdomains of root domain as internal
- Ignore querystrings
- Pages have no file extension
- Archive pages while crawling
- Preferences
- Limiting Crawl
- User agent string / spoofing
- Generally be tolerant rather than strict
- Ignore ../ that travels above the domain
- Sitemap > XML / Template (Plus and Pro)
- Sitemap > Check for robots.txt and robots meta tag (Plus and Pro)
- SEO > Parameters (Pro)
- SEO > Options > Keyword analysis while scanning (Pro)
- The link and page inspectors
- Re-checking
- Checking local files
- Importing a list of links
- Sitemap
- Which pages are included?
- SEO and page analysis
- FAQs
- More information and download