Scrutiny

Version History

v7.4.4 released June 2017

v7.4.3 released June 2017

v7.4.2 released June 2017

v7.4.1 released June 2017

v7.4.0 released May 2017

v7.3.2 released May 2017

v7.3.1 released May 2017

v7.3.0 released April 2017

v7.2.1 released April 2017

v7.2.0 released April 2017

v7.1.6 released March 2017

v7.1.5 released March 2017

v7.1.4 released March 2017

A number of fixes around the sitemap functionality, exclusion of pages from the sitemap and canonical urls:

Further fixes to 'check links within pdfs' functionality:

Adds standard 'Help book' manual. Find this under the Help menu. This will be under continuous review and improvement.

Other small fixes

v7.1.3 released February 2017

v7.1.2 released February 2017

v7.1.1 released January 2017

v7.1.0

v6.8.21 released January 2017

v6.8.20

v6.8.18

v6.8.17

v6.8.16

v6.8.15

v6.8.14

v6.8.13

v6.8.12

v6.8.11

v6.8.10

v6.8.9

v6.8.8

v6.8.7

v6.8.6

v6.8.5

v6.8.4

v6.8.3

v6.8.2

v6.8.1

v6.8

v6.7

v6.6.5

v6.6.4

v6.6.3

v6.6.2

v6.6.1

v6.6

v6.5

v6.4

v6.3.1

v6.3

v6.2.2

v6.2.1

v6.2

v6.1.5

v6.1.4

v6.1.3

v6.1.2

v6.1.1

v6.1

v6.0.8

v6.0.7 (no longer beta)

v6.0.6 (no longer beta)

Improvements inherited from the v6 engine:

v6.0.5 (no longer beta)

Improvements inherited from the v6 engine:

v6.0.4 (beta)

v6.0.3 (beta)

v6.0.2 (beta)

v6.0.1 (beta)

v6.0 (beta)

v5.9.21 released Nov 2015

v5.9.20 released Nov 2015

v5.9.18 released Oct 2015

v5.9.17 released Oct 2015

v5.9.16 released Oct 2015

v5.9.15 released Oct 2015

v5.9.14 released Sep 2015

Improvements to page search:

v5.9.13 released Sep 2015

Improvements around xml sitemap

v5.9.12 released Sep 2015

v5.9.11 released July 2015

v5.9.10 released July 2015

v5.9.8 released July 2015

v5.9.7 released July 2015

v5.9.6 released June 2015

Some fixes and improvements to xml sitemap:

Other fixes / enhancements

(v5.9.5)

v5.9.3 released June 2015

v5.9.2 released June 2015

Some changes designed to help crawl very, very large websites:

Other small enhancements:

Fixes:

v5.9.1 released June 2015

v5.9 released June 2015

Many improvements related to sitemaps and in particular the .dot (graph) export (in readiness for SiteViz, a visualiser to display the sitemap):

Enhancements:

Fixes:

v5.8.8 released May 2015

v5.8.7 released May 2015

v5.8.6 released May 2015

v5.8.5 released May 2015

v5.8.4 released April 2015

v5.8.3 released April 2015

v5.8.2 released April 2015

v5.8 released Feb 2015

Improves spelling results / workflow:

v5.7 released Jan 2015

v5.6.4 released Jan 2015

v5.6.3 Release Candidate released Dec 2014

v5.6.2 Release Candidate released Dec 2014

v5.6.1 Release Candidate released Dec 2014

v5.6 Release Candidate released Dec 2014

v5.5.2 released Dec 2014

v5.5.1 released Nov 2014

v5.5 released Nov 2014

v5.4.8 released Nov 2014

New features and enhancements relating to authentication and crawl limits

v5.4.7 released Nov 2014

NB Some users experienced a crash while using this version. Please upgrade to 5.4.8

A number of enhancements relating to character encoding:

Other enhancements and fixes:

v5.4.6 released Oct 2014

A number of enhancements and fixes to spelling and grammar checking:

v5.4.5 released Oct 2014

v5.4.3 released Oct 2014

v5.4.2 released Oct 2014

v5.4.1 released Sep 2014

v5.4 released Sep 2014

v5.3.1 released Aug 2014

v5.3 released Aug 2014

v5.2.1 released Jul 2014

v5.2 released July 2014

v5.1.2 released June 2014

v5.1.1 released June 2014

Enhancements

Small fixes

v5.1 released June 2014

Improvements and refinements to the UI

New features and fixes

v5.0.10

v5.0.9

v5.0.8

v5.0.7

v5.0.6 (beta) Released May 2014

v5.0.5 (beta) Released May 2014

v5.0.4 (beta) Released May 2014

v5.0.3 (beta) Released April 2014

v5.0.2 (beta) Released April 2014

v5.0.1 (beta) Released April 2014

v5 (beta) Released April 2014

v4.5.5 Released June 2014

(as part of ongoing support for v4 alongside development of v5)

v4.5.4 Released February 2014

v4.5.3 Released February 2014

v4.5.2 Released December 2013

v4.5.1 Released September 2013

Adds 'soft 404' support:
- highlights suspected soft 404s (where status code is 2xx but the intended page hasn't been found)
- You can customise this list to find soft 404s within your own site or add terms found in external soft 404s
- You can switch the feature off (in Preferences) if you have a large site and want best performance and this isn't important to you
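
The heuristic described above can be sketched roughly as follows. This is an illustration only, not Scrutiny's actual code; the phrase list and function name are invented for the example:

```python
# Hypothetical sketch of soft-404 detection: a page returning a 2xx status
# but whose body contains a "not found" phrase is flagged as a soft 404.
SOFT_404_PHRASES = [
    "page not found",
    "couldn't find that page",
    "the page you requested does not exist",
]  # customisable, mirroring the user-editable list described above

def is_soft_404(status_code: int, body: str) -> bool:
    if not 200 <= status_code < 300:
        return False  # a real 4xx/5xx is reported as such, not as a soft 404
    lowered = body.lower()
    return any(phrase in lowered for phrase in SOFT_404_PHRASES)

print(is_soft_404(200, "<h1>Sorry, Page Not Found</h1>"))  # True
print(is_soft_404(404, "page not found")) # False - reported as a hard 404
```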

Adds automatic update check:
- New dialog gives information about new versions when available with single click to download

Small fixes and improvements:
- Adds 'visit' button beside url field in link inspector

Change of policy with demo mode
- Allows a limited number of trial scans rather than a period of time

v4.4.1 Released September 2013

Small fixes and improvements:
- Expandable views will only expand when crawl is paused or finished. Deferring the building of these views improves speed and efficiency
- Fixes bug preventing pages from being added to the sitemap if the canonical link is given as a relative url

v4.4 Released September 2013

Retina screen compatible
OSX Mavericks tested and supported

Main window's Toolbar redesigned in line with Apple's human interface guidelines and for retina screen compatibility
Adds toolbar controls (show / hide / customise) to main View menu

Minimum system requirement is now OSX 10.5. Users on 10.4 should not upgrade to v4.4; for compatibility with newer systems it uses features not available in 10.4
Small fixes and improvements:
- now indents data for expandable views when exported as csv, html

v4.3.1 Released August 2013

Small fixes and improvements:

- Fixes problem with og:description being displayed in the SEO table rather than the meta description (if they are different, depending on which comes first).
- Does not show pages in sitemap (& exported XML sitemap) if canonical link is present and points to a different page.
- Adds time and date to comments at top of XML sitemap

v4.3 Released July 2013

Adds more highlighting options to SEO table:
- option to highlight pages with many links. A preference lets you choose the threshold; the default is one link per 100 words (can be changed in Preferences>SEO). This number comes from Matt Cutts of Google: http://www.mattcutts.com/blog/how-many-links-per-page
- option to highlight too-short / too-long meta descriptions. This is important because the meta description is displayed on the search engine results page (SERP). By default, between 30 and 160 characters is considered ok; this can be changed in Preferences>SEO
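
The two highlighting rules above amount to simple arithmetic. A minimal sketch, using the stated defaults (the function names are illustrative, not Scrutiny's API):

```python
# Sketch of the two SEO highlighting checks described above.
def too_many_links(link_count: int, word_count: int,
                   links_per_100_words: float = 1.0) -> bool:
    # default threshold: one link per 100 words
    if word_count == 0:
        return link_count > 0
    return link_count / word_count * 100 > links_per_100_words

def meta_description_flag(description: str, lo: int = 30, hi: int = 160) -> str:
    # defaults: 30-160 characters is considered ok
    n = len(description)
    if n < lo:
        return "too short"
    if n > hi:
        return "too long"
    return "ok"

print(too_many_links(link_count=12, word_count=500))  # True: 2.4 links per 100 words
print(meta_description_flag("A" * 100))               # ok
```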

Adds link count as column to SEO table, sortable.
Also adds word count to SEO table, sortable, allowing the user to find pages with small or large amounts of content, and to compare the number of links with the number of words. (Some guides give a recommended number of links for your content, eg one link per 125 words.) Scrutiny doesn't currently display this ratio - the calculation must be made in a spreadsheet with exported data.
Also adds canonical url to SEO table (sortable) and takes this information into account when highlighting duplicates (ie two pages aren't marked as duplicates if one contains a canonical url)
Adds these three new fields (links, words, canonical url) to the Page inspector (double-click an item in the SEO table)

Adds checkboxes to switch columns on or off in SEO table (Preferences>SEO)
Adds context menu to SEO table which includes Copy URL (command-C from keyboard), Visit and Get Info (command-I from keyboard)
When the starting url is edited, user is asked whether they'd like to edit the url for the current website configuration or whether they're intending to create a new configuration

Fixes problem with response times getting inflated if validation is running
Fixes bug related to new 'by link' outline view causing a crash sometimes after switching to another site and starting a new crawl
Views with switchable columns (via Prefs) now remember how the user has resized and repositioned them
Now correctly resets column sorting on all views when starting new crawl
Fixes two small and unrelated bugs causing odd results if nofollow switched off and base href present but set to ""

v4.2.2 Released June 2013

Fixes problem with new link text column in By page view not always displaying accurate data where same link occurs multiple times on same page
Tweaks how information is displayed in new expandable By link view
Fixes crash on launch for some existing users
Filter button now works in flat view

v4.2 Released June 2013

improvements to interface:
- Changes the 'by link' view to an expandable view, occurrences can be seen by expanding view rather than as previously having to open the link inspector
- Link inspector still appears on double-click from link views and is improved
- Adds context menus to the 'by link' and 'by page' views and the 'appears on' table in the link inspector - a number of actions can be performed with a right-click (or control-click) including 'Copy URL' and where appropriate Visit, Highlight and Locate
- The new Copy URL action is available with a command-C and will copy the URL of the selected item
- A new Locate action lists how to click through from the starting url to find the link in question. It is available via context menus, the link inspector and cmd-shift-L
- Adds 'link text' column to 'by page' view
- Change to wording: 'on page' now 'appears on'
- Changes default for highlighting a link on the page - now looks like a highlighter pen rather than a box around it: the preference defaults change to 'background' rather than 'border', and the default colour to yellow rather than dark grey (existing users can select this option in prefs if they like)

Ignores and continues if 'bad SSL certificate' warning is encountered. But only for the website being tested. (anything else, ie external links, won't be followed anyway)
If image checking is switched on, now collects alt text and displays in 'link text' columns
Some options removed from Preferences>Views>By Link view (Status, URL, On Page) because these are needed for the new outline view to work properly
Exporting from 'by link' view is better than previously (it was putting all 'appears on' information in a single cell to reflect the view, which led to problems due to Excel's 256-character limit)
Export added to by Page view
Exports from expandable views reflect the state of the view, ie which rows are expanded or not

v4.1.3 (Released May 2013)

Improves authentication: allows you to input field names for websites which require login details to be sent by web form (eg Wordpress sites)
Remembers last-used filename and directory when saving sitemap xml file - details are remembered for each of your sites
Ignores and continues if 'bad SSL certificate' warning is encountered. But only for the website being tested. (anything else, ie external links, won't be followed anyway)
If a link just has a hash as the url, a hash character is displayed rather than the word 'hash' to avoid confusion
Improves speed of csv exports

v4.1 (Released April 2013)

Now able to search for duplicates (same page with different urls)
Checks whether links are 'nofollow', displays this information in the link tables (switchable as per other columns) and adds option to prefs to 'not follow nofollow links'
Also checks for robots meta tag and whether nofollow present. If so the new 'don't follow nofollow links' will also apply to links on that page
Adds selector allowing choice of highlighting in SEO table; missing SEO parameters (as before) , possible duplicates or pages marked as nofollow
Double-click in SEO table now opens a new inspector showing SEO information including a list of possible duplicates
Scrutiny will check for the nofollow attribute - which is an overhead - if either of the columns is showing (Preferences > Views), so that you can see which links are 'nofollow' even if you've chosen not to follow them. Hide both columns (the default global setting) if you don't need to know about this; the crawl then won't be slowed down
Fixes page analysis not working properly
Checking for blacklist or whitelist terms is now case-insensitive, as you would probably expect
If flagging blacklisted urls, the highlight colour used is now orange or the warning colour (previously red or the bad-link colour) - a blacklisted url isn't an error, so an error colour was inappropriate
No longer includes 404 pages in the sitemap
Fixes problem of apparent duplicates in sitemap and SEO tables caused by two different link urls redirecting to the same url
Fixes bug preventing total image weight being shown in SEO table
More context help buttons

v4.0.4 (Released February 2013)

Fixes problems creating black/whitelist rules on first run with no settings saved
Correctly sets window to edited (dirty spot in red button) when black/whitelist rules are changed, triggering prompt to save when switching settings

v4.0.3 (Released February 2013)

Small fixes

v4 (Release Candidate January 2013)

Major improvements to the engine and data storage meaning that even small sites will crawl more quickly and large sites will crawl very much more quickly without slowing down or losing responsiveness
When stop button is pressed, all open threads are abandoned, and then recreated if 'continue' is pressed. Gives a much better user experience.
Blacklist and whitelist boxes replaced by a more user-friendly table of rules (existing data will be preserved and presented in the new way)
Adds 'By page' links view. If 'bad links only' are showing, the view will show a list of pages requiring attention, expanding to show the bad links on that page.
Routines for 'by page' view re-written to avoid apparent hanging at the end of the crawl of a big site
Adds new settings to Preferences allowing limits to be set (default 200,000 links). This offers the option of limiting the crawl of a large site (perhaps better achieved using blacklist / whitelist rules) but is also a safety valve to prevent crashing through running out of resources when crawling very large sites
If starting crawl within a directory, crawl is limited to that directory, ie crawl will go down a directory structure but not up. This matches users' expectations. Previously, crawl extended to all pages in the same domain.
Fixes inefficiencies in full report generation which were giving the impression of 'hanging' if full report generated for medium or large sites
Fixes problem with robots.txt if more than one user-agent is specified. Now will only use an exclusion list for user-agent = all (*) or Google (ie Scrutiny will respect the file as if it were Googlebot)
Moves 'check links on custom error pages' to settings rather than global preferences, and moves the 'labels' preferences to the View rather than General tab of the preferences window
Adds Help contents to help menu - links to manual index page
Increases maximum number of threads from 30 to 40 (will improve crawling for some sites) with the default now 12 rather than 7. Extreme left (labelled 'fewer') is still a single thread
Updated application icon
Resets the 30-day trial if you've used the trial with a previous version. The price increases with v4, but existing licences will continue to work - a thank-you to those who bought in early

v3.2.2 Build 2 (Released January 2013)

Problems with certificate resolved, supplied as an installer package (10.4 users will need 3.2.2 Build 1)

v3.2.2 (Released December 2012)

Fixes a bug when crawling multiple sites sometimes preventing them all from being crawled properly
Fixes a bug preventing 'delay' setting from being saved properly depending on localisation settings eg if decimal separator is ','
Prevents some unnecessary save dialogues

v3.2 (Released November 2012)

Adds 'filter' button for selecting out internal links or images
Fixes problem where some servers don't like the 'referer' header field being sent with no value (ie for the starting url) and return 'bad request'
Corrects the Prefs window>Validation>suggested address for local instance of validator - adds slash at the end which seems to be necessary (clicking this address auto-completes the location field)
Fixes bugs with "Re-check bad links", one causing minor hang if used twice in a row.

v3.1 (Released October 2012)

Crawls multiple sites in one go - a plain-text list of links, opened using plain text mode, is taken to be a list of sites, ie each is followed and crawled
(note that the total number of links / pages crawled must still be within the capacity of your computer to hold all the data, eg perhaps 100,000 links or 10,000 pages)
Page analysis tool now shows uncompressed and compressed size of files where gzip is being used by the server. So webmasters can easily see the benefit of their servers' gzip service and the actual 'transferred' weight.
'plain text mode' button fixed - state wasn't being saved with settings

v3.0.4

fixes bug causing instability on 10.4

v3.0.3

Small fixes including: fixes some relative urls being formed incorrectly
fixes double-click in links 'by link' view opening wrong item if search box has been used.

v3.0.2

Small fixes including: fixes hash trimming and trailing slash trimming (problem with the latter was leading to many apparent redirects)
clear and re-start only enabled when crawl is paused or finished
Fixes context help 'i' button for 'timeout' and 'delay'.

v3.0.1 (App Store) and v3.0 build 2 (web dist)

Fixes spelling mistake in prefs ('occurrences')
Fixes page status not displaying
Fixes links to new manual
Fixes user-agent string defaulting
Updates change frequency - original values use rules, without having to push the button
Fixes progress indicator in App Store distribution
Fixes re-checking stacking statuses
At this point the App Store version has a hi-res document icon; the web version has a lo-res one

v3 (Released September 2012)

Able to schedule crawl via iCal with optional repeat. (Instructions added to manual.) CLI-minded people can use cron to do the same thing.
SEO keyword analysis searches content. This feature has to be switched on in preferences as it uses more of your computer's resources during the crawl
Adds sorting (by clicking on table headers) to all tables
Adds filtering and sorting to SEO view when a keyword or phrase is typed
Black / whitelists can apply to content as well as url (new checkbox below black / whitelist fields) -
Blacklisted urls can be flagged (option in preferences)
Many internal changes making crawl slightly quicker and significantly more memory-efficient, larger sites can be crawled in one go
All statuses are shown for redirected links rather than just the final one
SEO table url column displays final url for redirected urls rather than the redirected one
New options in prefs for:
- checking content for seo keywords (must be checked before crawling site)
- flagging blacklisted or whitelisted urls (remember that you can blacklist or whitelist keywords in the content now too)
password field in Advanced settings is now a secure text field - hides the password from view
Adds Clear and Re-start to File menu
Fixes follow whitelist box not being saved with settings in web distribution
Fixes total image weight calculation in main SEO table
Measures in place to limit problems caused by extraneous and invisible characters entering the url field with a copy and paste

v2.0.1 (Released July 2012)

rename menu item blocked if crawl is running, and switches to list view if not already showing. (Previously appeared not to work if icon view was showing)
fixes redirected urls (3xx) not being highlighted yellow
fixes update frequency not being carried through to the xml sitemap from the rules table
if a url is redirected, sitemap table shows url as redirected rather than original link url, and exported xml sitemap similarly gives redirected url rather than original link url
Manual and Help menu improved

v2: (Released July 2012)

Main new features

New page analysis tool
- load a page and its elements (images, .js files and .css files) noting the response time and load time for each element.
- It will give you a total, and you will be able to see where any problems lie
- can be used as a standalone tool or opened to analyse the currently-selected page. Document-based so that you can have more than one test open at the same time.

Keyword analysis
to count the occurrences of a word or phrase in url, title, meta description, meta keywords and main headings. Simply type the word into the search field above the list.
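
The counting described above can be sketched as a simple sum over the listed fields. This is illustrative only, not Scrutiny's implementation; the field names are assumptions:

```python
# Illustrative count of a keyword across the SEO fields listed above
# (url, title, meta description, meta keywords, headings).
def keyword_occurrences(keyword: str, fields: dict) -> int:
    k = keyword.lower()
    return sum(value.lower().count(k) for value in fields.values())

page = {
    "url": "http://example.com/blue-widgets",
    "title": "Blue widgets",
    "meta_description": "The best blue widgets",
    "meta_keywords": "widgets, blue",
    "headings": "Blue widgets for sale",
}
print(keyword_occurrences("blue", page))  # 5
```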

Greater control and better prioritisation in XML Sitemap
if you choose the 'Automatic' setting for priority, Scrutiny will mark your starting url as 1.0 and then calculate the others based on the number of clicks from the home page, using a logarithmic scale: one click from the home page = 0.5, two = 0.3, three = 0.2, with all other pages = 0.2
Further to this, you can set up some 'rules' to specify priority and update frequency for certain pages or sections of your site. You only need to enter a partial url. This way it is possible to specify a particular url or a section of the site, eg "/engineering/"
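
The automatic priority values quoted above amount to a small lookup by click depth, floored at 0.2 for deeper pages. A sketch based only on the quoted values (not Scrutiny's actual code):

```python
# Automatic sitemap priority by clicks from the home page, as quoted above:
# home page = 1.0, then 0.5 / 0.3 / 0.2, with everything deeper staying at 0.2.
PRIORITY_BY_DEPTH = {0: 1.0, 1: 0.5, 2: 0.3}

def auto_priority(clicks_from_home: int) -> float:
    return PRIORITY_BY_DEPTH.get(clicks_from_home, 0.2)

for depth in range(5):
    print(depth, auto_priority(depth))  # 1.0, 0.5, 0.3, 0.2, 0.2
```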
New export options including html sitemap and a full report
Can export a full report containing summaries and full lists for links, SEO and validation. You can save this in pdf format or html format. The links are 'clickable' in both formats.
The link views ('by link' and flat view) in v2 have a search box which searches the url, redirect url, status and link text, all case insensitive
User interface improvements:
- Starting url is visible from all tabs, along with the Go button, which changes to pause / continue as appropriate, replacing the old toolbar pause / continue button
- Adds new tab for the full report (prints from File>Print and the 'print' toolbar item)
- Adds alternating background to all views
- Tabs have a pull-down menu rather than separate buttons for the export options
- Closing the main window doesn't quit the app; the Window menu allows re-opening the main window or page analysis windows

If you are not a registered user, then the trial period is reset. So even if you've had 30 days' trial of v1, you can still try v2 for 30 days.

Other improvements and fixes

Splits the check for robots.txt and meta robots/noindex into two separate checkboxes as some users have wanted to use one but not the other
Links relative to the scheme, eg //domain.com (see http://www.ietf.org/rfc/rfc3986.txt section 4.2), were previously handled ok, but not if the page's base href was given in this format
Prevents switching to a new site while crawl is running, which was affecting the crawl in previous versions
Removes good colour from Preferences (to allow for stripey views)
Fixes a stability issue when the validator is crawling and there are a large number of unauthorised / redirected urls
Fixes 'new settings' not clearing 'last checked' status
Fixes a bug causing a bit of a hang if 'robots.txt' is set to be respected but crawling the site locally.

Version 1.6.3

released June 2012

Adds support for telephone links such as tel: and skype: (now recognised and skipped rather than reported as an error)
Fixes bug relating to crawling local sites introduced in 3.8.4
Fixes problem with crawling local sites if they are stored in the root Library folder
Fixes bug causing special characters such as ü, ö, ä in page title or link text being altered to u, o, a when exported. All exports (.dot, .csv, .tdl, .html) now export using utf-8 character encoding. Note that in line with web standards (RFC 1738) Integrity and Scrutiny don't support non-ascii characters in urls
Fixes bug causing 'Recheck broken links' to give strange results or appear to hang if validation is set to crawl all pages

Version 1.6.2

released May 2012

Traps and highlights a certain kind of recursion
A fix and improvements re secure (https:) links. The problem could cause hanging or crashes in certain circumstances
Fixes problem with thread counting, resulting in faster crawling

Version 1.6.1

released May 2012

Adds 'Response time' as a column to the SEO table
Fixes bug affecting checking of broken images where an image has src = "", and improves handling of empty quotes if that option is switched on
Fixes spurious text appearing in 'Link text' for links on images where the image's alt = '' (empty string)
Fixes bug preventing proper construction of urls where base href = "/"
Improves submission of username and password for sites requiring authentication
Fixes problem of crawl or 'recheck broken links' not always finishing properly
Fixes potential crash under certain circumstances (involving redirect, url having trailing slash and settings set to ignore trailing slashes)

Version 1.6

released April 2012

Adds check for robots.txt and noindex in the meta data (this feature is off by default, switched on in preferences). When crawling, all links are followed and checked regardless, but if a page is marked as 'noindex' in the robots meta tag or disallowed in the robots.txt file, it will not be included in the sitemap, SEO or validation checks. robots.txt must have a lowercase filename and be constructed as shown at http://www.robotstxt.org/robotstxt.html
Indicates progress via application icon in dock
Adds image count to SEO table, shows number and weight of images on page (only those linked from html, not those linked from css). For this feature to work, 'check for broken images' must be checked in settings. (I believe that Google takes load time into account while Bing does not.)
Adds totals for 'no description' and 'no title' to SEO tab
Default link check timeout shortened to 30s
Fixes bug preventing images from being found if 'src' doesn't follow 'img' in the html
Fixes bug causing broken images to spuriously appear in Sitemap and other tables
Fixes bug causing number of html validation errors to sometimes incorrectly show as 0
Two versions now maintained, one built for distribution via web (10.4 - 10.7 supported) and one certified and built for distribution via App Store (10.5 to 10.latest supported). The latter will have a .1 at the end of the version number in the About box, eg 1.6.0.1 is the App Store version.
Note that if you download and buy via the web it is not possible to upgrade via the App Store and vice versa
App Store version has Lion features such as full-screen mode
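
The robots.txt / noindex behaviour added in this release (follow and check every link, but exclude disallowed or noindex'd pages from the sitemap, SEO and validation checks) can be sketched with Python's standard-library robots.txt parser. The function name is illustrative, not Scrutiny's API:

```python
# Sketch of the exclusion rule described above: links are always followed,
# but a page disallowed by robots.txt or marked noindex in its robots meta
# tag is excluded from the sitemap, SEO and validation checks.
from urllib.robotparser import RobotFileParser

def include_in_reports(rp: RobotFileParser, url: str, meta_robots: str) -> bool:
    if "noindex" in meta_robots.lower():
        return False  # excluded by the page's robots meta tag
    return rp.can_fetch("*", url)  # excluded if disallowed by robots.txt

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /private/"])
print(include_in_reports(rp, "http://example.com/private/a.html", ""))        # False
print(include_in_reports(rp, "http://example.com/index.html", "index,follow"))  # True
```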

Version 1.5

released April 2012

Removes 'generating flat view' progress bar. This job is now done much more quickly and in the background
Adds columns to validator tab, number of errors and number of warnings
Adds 'Export as CSV' button to toolbar
Adds ability to export sitemap and validator results as csv, from menu, toolbar or button on relevant tab
Fixes comma or trailing comma in blacklist fields preventing proper crawl
Adds switch in preferences to trim leading or trailing spaces or mismatched quotes from a url
When crawling locally, fixes 'file is directory' being included in bad links
Some fixes to the 're-check bad links'. (Was causing crash sometimes since last release)
Highlighting link on page feature is switchable between highlighting and simply visiting page
Validator will list all pages but only check the starting page. Checking the whole list as before can be switched on in preferences, but note that the public validator will only check a certain number of pages in succession, even with the 1-second delay that they ask for; this is the reason for the change. The Integrity and Scrutiny FAQ pages give details of installing the w3c validator locally, which should allow full and rapid checking.
Fixes problem of throbber sometimes continuing to turn when crawl or re-check has finished

Version 1.4

released March 2012

Adds username and password fields to advanced settings window. If using authentication, Scrutiny will attempt to send these credentials if challenged by the server. If details are sent and then rejected by the server, a message to that effect will be sent to the Console.
Adds 'Ignore trailing slash' button to settings, can be set per site, set to 'yes' by default
Fixes a problem preventing crawling of pages if braces { } are present in the url
When crawling local files, directories are not reported as an error (as long as the directory exists)
Options for sitemap update frequency 'daily', 'weekly', 'monthly' etc altered to lowercase for compliance with the sitemap standard
'Customize' added to toolbar (although this has been dropped by Apple from Lion 10.7 onwards so will only appear in 10.4 -> 10.6)
Updated application icon and removal of non-standard buttons on the various tabs. Adds several buttons to the customisable toolbar

Version 1.3.4

released February 2012

New feature - option to ftp sitemap file to server after generating it. Server and authentication details are saved with the config for each site. Some related options added to Preferences
Sends referrer header field for every request (other than the starting url) - this fixes a very small number of odd bugs
'Open local file' is added to the File menu. Functionality to crawl a site locally or import a list of links did exist in previous versions and was documented, but wasn't very accessible as it relied on a drag and drop into the starting url field (which still works and is to be improved in a future version)
Fixes bug preventing links to w3c being checked properly
Fixes a small memory leak
Clears data from flat link view before starting a new crawl
Fixes bug preventing crawl from finishing properly if user tries to highlight link on page before link has been checked
Fixes bug preventing date stamp from being written properly every time
Improves re-check broken links - now correctly uses as many threads as are set in settings and fixes problem preventing it from finishing every time
Clarifies number of links checked (x of y)

Version 1.3.3

released end December 2011

SEO and Validation can be disabled in global prefs for better performance if not needed
Allows setting of delay and timeout for Validation (in global prefs)
Links to subdomains can be considered internal rather than external, ie peacockmedia.software and www.peacockmedia.software are considered the same site (not necessarily true, but what most people would expect) and therefore both are followed. Adds checkbox in global preferences to switch this option; default is on. With the option on, Integrity will discover more links (and potentially more bad links) on certain websites. The option needs to be switched off if you wish to deliberately limit your crawl to one subdomain
Fixes problem with 'Re-check broken links' button
Fixes problem with exporting links if 'bad links only' are showing
If unregistered, the registration window was nagging. This was unintentional and has been switched off; it should now only show on startup and after 3 days

Version 1.3.2

released November 2011

Exports .dot file (standard format used by graphing applications) which can be opened as a visualisation in third-party graphing apps. includes colour to indicate levels. Accessed via File>Export or a new toolbar button added by 'Customize toolbar...'
Allows crawling of some sites requiring authentication. Log in using Safari and check the box in advanced settings. Must be used with caution and with proper backups.
Adds advanced settings; authentication and custom header fields
Removes distance column from links tables. Shown in Sitemap table where it's more appropriate
Adds 'Getting started' to Help menu (online help to be improved shortly)
Fixes problems with 'Re-check broken links' and 'Re-check this link'
Fixes 'on page as title / url' preference
Fixes glitch with 'Inspect selected' button when flat view is showing

Version 1.3.1

released October 2011

Fixes bug preventing proper crawling of local files
Now handles UTF characters in meta keywords / description
Fixes bug preventing page title from showing if it contains UTF characters
Fixes 'Inspect selected' button on flat sortable link view
On pressing 'Go' for the second time, previous results are cleared immediately
File>New takes you back to the settings tab if not already in view

Version 1.3

released October 2011

Requires a licence key; the activation panel shows at startup with an option to continue and use the application.
Trial period set for 30 days

Version 1.2 (Beta)

released September 2011

Compatible with 10.4 / ppc upwards
Compromises Lion full-screen mode
Adds 'file size' column to SEO table
Fixes print button - fits the visible table to the page width and sets landscape orientation
Fixes problems with 'export csv' and 'export html' buttons
Fixes problem of user not being able to get main window open again if closed
Fixes bug causing base href not to be discovered which could lead to many improperly-constructed relative urls

Version 1.1 (Beta)

released September 2011

Fixes titles and descriptions not showing properly if carriage returns present between tags
Continue button greys properly on open and when crawl finishes
adds meta keywords to SEO table
adds url column to SEO table

Version 0.1 (Beta)

released August 2011

Uses tried and tested website crawling engine from Integrity
adds SEO parameters: meta description, title and headings
adds improvements to sitemap generation
adds html validity check with configurable url for validator (allowing for local instance)
adds full-screen mode and improved interface