Website SEO Audit Checklist

SEO Audit Checklist 2015 – If you lack experience, starting a new SEO project can be overwhelming and difficult to manage; it isn't always easy to see quickly and clearly the best approach to take.

Here is a brief list I use to make sure I have covered all the important areas of a website audit and can thoroughly plan the work that needs to be done.

First of all, it’s important to acknowledge that a website audit can be performed from multiple perspectives:

Table of Contents

  1. Site Architecture
  2. Technical/Server Issues
  3. HTML Use/Analysis
  4. Content Review
  5. Negative Practices
  6. Keywords
  7. Webmaster Tools
  8. Social Media

Site Architecture

Canonical URLs (Best Page Addresses)

– Access to pages on domain (www vs. non-www)
– Home Page linking consistency
– Capitalization/lower case (capitals in the domain name are OK; in folder and file names they are a potential problem)
– Print versions (print CSS rather than crawlable duplicate PDFs/docs)
– Canonical link elements – do they point to the correct URLs?
– Rel Prev/Next link elements for paginated pages?
– Internal Redirects (internal 301 redirects avoided)
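As a sketch, the domain-access and pagination checks above might be implemented like this (assuming an Apache server and a hypothetical example.com domain; adapt to your own stack):

```apache
# .htaccess: 301-redirect non-www requests to the www canonical host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

```html
<!-- In the <head> of a paginated category page (hypothetical URLs) -->
<link rel="canonical" href="http://www.example.com/widgets/?page=2">
<link rel="prev" href="http://www.example.com/widgets/?page=1">
<link rel="next" href="http://www.example.com/widgets/?page=3">
```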

Robots.txt File

– Correctly formatted
– Disallows everything it should (cart pages, email referral pages, login pages)
– Includes link to XML sitemap or XML Sitemap Index
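A minimal robots.txt along these lines (the paths and domain are hypothetical examples) would satisfy all three checks:

```
User-agent: *
Disallow: /cart/
Disallow: /login/
Disallow: /email-a-friend/

Sitemap: http://www.example.com/sitemap_index.xml
```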

Meta robots noindex/nofollow

– Used Appropriately
– Used on pages that a deep crawler might try to index (like form and search results pages)
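For instance, a search-results page that a deep crawler might reach could carry:

```html
<meta name="robots" content="noindex, follow">
```

The noindex value keeps the page out of the index, while follow lets crawlers continue through the links on it.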

Category/Site Structure (URLS and Information Architecture)

– Unique and User Friendly
– Use of appropriate category and sub-category link structures
– Customer-oriented rather than feature-oriented
– Provides tasks/options for different personas

Choosing File Names

– Uses hyphens as word separators
– Unique
– Avoids Keyword Stuffing
– If file names are to be changed, update on-site links and set up 301 redirects for external visitors

Custom Error Page

– Sends a proper 404 status code
– No soft 404s
– Helpful to visitor (navigation, directories, search)

HTML Sitemap

– Organized into user-friendly, user-oriented categories
– Provides links to most important pages
– Avoids using too many links
– Doesn’t include 404s or links that redirect internally

XML Sitemap

– Properly formatted (valid XML, proper encoding)
– Uses only canonicals
– No 404s and no internally redirected pages
– Submitted to Google Webmaster Tools and Bing Webmaster Tools
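A well-formed sitemap entry looks like this (hypothetical URL; note the UTF-8 encoding declaration and the sitemaps.org namespace):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/widgets/blue-widget/</loc>
    <lastmod>2015-06-01</lastmod>
  </url>
</urlset>
```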

Jump to Table of Contents

Technical/Server Issues

OS/Server/CMS/Catalog Considerations

Server status codes checked: 200s, 300s, 400s, 500s

Secure Server | HTTPS Protocol

– No error messages
– No https bleed-over to pages that aren’t supposed to be https
– No certificate authority errors

Search Friendly Links

– All links to be indexed are reachable through text-based links using “href” or “src” attributes.

Broken and Redirected Links

– Broken links identified, then removed or replaced
– All 301-redirected internal links replaced with direct links

External Links

– Checked for broken links and redirects and replaced where appropriate
– Pages linked to are checked for repurposed content

Duplicated Content

– Internally (see canonical section above)
– Mirrors identified and disallowed/noindexed as appropriate
– Substantially duplicated content on self-owned other sites removed/changed/blocked
– Substantially duplicated content on other sites removed (friendly email, AUP letter to host, DMCA)

JavaScript

– Can pages be navigated with javascript disabled? If not, are URLs for pages accessible in HTML code with “href” and “src”?
– If Ajax is necessary, is Google’s hashbang approach used?

Dynamic Pages

– Avoid session IDs in URLs
– Avoid excessive multiple data parameters in URLs
– Avoid excessive processor calls
– Avoid calls to multiple servers as much as possible
– Avoid keyword insertion pages (pages where the content is substantially the same except for keywords that are inserted into the pages).
– Keep boilerplate (disclaimers, copyright notices, other text that appears on most pages) that exists on templates light.
– Label page segments semantically well (the div class for those could be things such as header, footer, sidebar, advertisement, or whichever is most appropriate.)

Page Load Times

– Images sized to the right dimensions and compressed for file size?
– GZIP or Deflate used?
– Base64 encoding for images avoided?
– External CSS and JavaScript used and minified?
– Long browser caching dates?
– CDN in use where appropriate?
– Other Page Speed considerations
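Several of these checks map to a few lines of server configuration. A sketch for Apache (assuming mod_deflate and mod_expires are available; other servers have equivalents):

```apache
# Compress text resources before sending them
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Set long browser-caching dates for static assets
ExpiresActive On
ExpiresByType image/png "access plus 1 year"
ExpiresByType text/css  "access plus 1 month"
```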

Cookies

– Navigation of indexable pages possible without accepting them?

Jump to Table of Contents

HTML Use/Analysis

Deprecated HTML/HTML Validation

– If invalid, are errors the type that will harm SEO?

Cascading Style Sheets (CSS)

– If invalid, are errors the type that will harm SEO?

Title Elements

– Relevant to the content of the page and keyword-rich.
– Meaningful and able to stand on its own as a description of the page it titles.
– Persuasive and Engaging to those who see it out of context
– As unique as possible compared to other titles on the site
– If the name of the site appears in the title, it should be at the end of the title, and not at the beginning, unless it is the home page.
– No more than ten words or roughly 60-70 characters in length.
– Unique if possible compared to titles from other sites.

Meta Description Elements

– Descriptive of the content of the page
– Includes the main keyword phrase the page is optimized for
– Engaging and persuasive to viewers who see it out of context (search snippets or social shares)
– Around 25 words or 150 characters in length
– Well written sentences, using good punctuation
– One sentence preferable, but two alright if keywords are in the longer sentence
– Preferable to have keywords as close to the start as appropriate

Heading Elements

– Top level heading should describe the content of the page
– Lower level headings should effectively describe the content they head
– One top level heading preferable per page
– Headings should be used like headings in an outline, in proper order
– Main headings and subheadings can, and should, contain targeted keywords where possible and appropriate.
– A heading element should not be used for the page logo
– Headings for lists and sections in page navigation should use CSS to style them rather than heading elements.

Strong/Em Elements

– For bold text, use the “strong” HTML element.
– For italic text, use the “em” HTML element.
– Use strong and em to highlight the use of keywords and related words
– When bolding or italicizing other text on a page, use CSS to style how it looks
– Don’t overuse bold or italics – emphasizing too much means emphasizing nothing.

Image Optimization

– Use alt text for images on a page that are meaningful
– Use captions for images on a page that are meaningful
– A caption for an image should be contained within the same HTML element as the image (like a div)
– Select meaningful images that are related to the keywords being optimized for
– Use the chosen optimized keywords in the alt text and captions where appropriate
– Use file names that reflect those keywords where appropriate.
– Use hyphens to separate words in image file names.
– Use alt="" for images that aren’t meaningful, like decorations or bullet points
– Use alt text for logos that are descriptive of the business or organization
– Larger images with better resolution might be ranked a little better than smaller and lower resolution images.
– Alt text should not be a list of keywords, but can contain a keyword phrase.
– Alt text shouldn’t be more than 10 words or so.
– Avoid keyword stuffing alt text, captions, and image file names.
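Putting the image guidelines together, a meaningful image might be marked up like this (the file name, keywords, and class names are hypothetical):

```html
<div class="product-photo">
  <img src="blue-widget-side-view.jpg" alt="Blue widget shown from the side"
       width="400" height="300">
  <p class="caption">The blue widget, viewed from the side.</p>
</div>
```

The caption sits inside the same element (a div) as the image, the file name uses hyphens, and the alt text is a short descriptive phrase rather than a keyword list.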

Anchor Text

– Keywords should be used in anchor text
– If the keywords for a page being pointed to aren’t used, related terms should be
– Anchor text used in navigation should be descriptive of what is on the page linked to
– Anchor text should not use generic terms such as “click here.”
– Anchor text shouldn’t be longer than 10 words or so if possible
– Anchor text shouldn’t be stuffed with multiple keywords

Meta Data optimization

– Search engines do not use Dublin Core meta tags
– Search engines do not use the revisit meta tag
– A robots “index, follow” tag is unnecessary and redundant
– A NOODP value will keep Google and Bing from using Open Directory Project titles instead of the title element, if the site is even listed in DMOZ
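The NOODP directive mentioned above is a single meta tag:

```html
<meta name="robots" content="noodp">
```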

Jump to Table of Contents

Content Review

Amount of Text

– Having some minimum amount of text on a page (200 words?) gives search spiders something to index.

Spelling Errors

– Possible quality signal
– Important to credibility

Keyword Use in Copy

– Are keywords chosen for a page being used in page titles, meta descriptions, headings, and content?

Keyword Prominence/Visual Segmentation

– How well does the HTML code of a page show how it’s broken down into different blocks (heading, main content, sidebars, footers, etc.)?
– Are keywords used in the different sections, and especially in the main content area of pages?

Use of Related Words/Phrases

– Some words tend to co-occur on pages ranked highly for a certain query (or categories of results for queries), and it can help in the rankings for a page to use some of those phrases.

Penguin/Panda Analysis

Is there a loss in traffic that corresponds to one of the Panda or Penguin updates?

Resource: http://www.seomoz.org/google-algorithm-change

Jump to Table of Contents

Negative Practices

Hidden Text

– Is there text on pages in the same font color as the background?
– Is there text on pages hidden through an offset div?
– Is there a large amount of text on pages in small iframes or CSS scrolling overflows?
– Is there text whose font color happens to match the background color and might be mistaken for hidden text?

Cloaking

– Does the site use cloaking to show search engines one thing and visitors something else?

Meta Refresh

– Are meta refreshes used instead of redirects, and if so might they be used in a way which might deceive search engines?

JavaScript Redirection

– Is javascript redirection being used so that search engines see one thing, and visitors see something else?

Outward Links/Link Exchanges

– Is the site using link directory pages that promise being listed in exchange for a link?

Keywords

Keyword Research, Selection and Implementation

– Are relevant, competitive, appropriate and popular keywords being used on the pages of the site?
– Are those keywords being used effectively on those pages?

Keyword Focusing | Mid- to Long-Tail Key Phrases

– Do the main pages of the site focus upon more competitive keyword phrases?
– Do deeper pages with less PageRank focus upon long-tail phrases?

Webmaster Tools

Google Webmaster Tools/Errors Analysis*

– Has the site been verified with GWT?
– Has a choice of “www” setting been made? (Doesn’t have to be if domain access issues are addressed)
– Has a targeted country/location been selected? (Doesn’t have to be)
– Have any errors listed been checked upon?

Jump to Table of Contents

Social Media

Social Media Audit | Status

– Does the site integrate appropriate social sharing buttons?
– Do the pages of the site provide links to social profiles for the site?

On-Site Social Engagement

– Does the site provide ways to give feedback to the site owners?
– Does the site provide a way to leave comments?
– Is there user generated content on the site, such as reviews and ratings, and does it use rich snippets if so?
– Are there public user/member profile pages, and if so how rich are they in terms of features?
– Is there a forum on the site, and if so, some guidelines for its use?

Analytics

Have analytics been set up for the site?
– Code on every page

 

 

Quick overview


Check indexed pages  
  • Do a site: search.
  • How many pages are returned? (This can be way off so don’t put too much stock in this).
  • Is the homepage showing up as the first result?
  • If the homepage isn’t showing up as the first result, there could be issues, like a penalty or poor site architecture/internal linking, affecting the site. This may be less of a concern as Google’s John Mueller recently said that your homepage doesn’t need to be listed first.


Review the number of organic landing pages in Google Analytics

  • Does this match with the number of results in a site: search?
  • This is often the best view of how many of the pages in a search engine’s index the engine actually finds valuable.


Search for the brand and branded terms

  • Is the homepage showing up at the top, or are correct pages showing up?
  • If the proper pages aren’t showing up as the first result, there could be issues, like a penalty, in play.

Check Google’s cache for key pages
  • Is the content showing up?
  • Are navigation links present?
  • Are there links that aren’t visible on the site?
PRO Tip:
Don’t forget to check the text-only version of the cached page. Here is a bookmarklet to help you do that.


Do a mobile search for your brand and key landing pages

  • Does your listing have the “mobile friendly” label?
  • Are your landing pages mobile friendly?
  • If the answer is no to either of these, it may be costing you organic visits.

On-page optimization


Title tags are optimized
  • Title tags should be optimized and unique.
  • Your brand name should be included in your title tag to improve click-through rates.
  • Title tags are about 55-60 characters (512 pixels) to be fully displayed. You can test here or review title pixel widths in Screaming Frog.

Important pages have click-through rate optimized titles and meta descriptions
  • This will help improve your organic traffic independent of your rankings.
  • You can use SERP Turkey for this.
Check for pages missing page titles and meta descriptions

The on-page content includes the primary keyword phrase multiple times as well as variations and alternate keyword phrases

There is a significant amount of optimized, unique content on key pages

The primary keyword phrase is contained in the H1 tag
Images’ file names and alt text are optimized to include the primary keyword phrase associated with the page.

URLs are descriptive and optimized
  • While it is beneficial to include your keyword phrase in URLs, changing your URLs can negatively impact traffic when you do a 301. As such, I typically recommend optimizing URLs only when the current ones are really bad, or when you can avoid changing URLs that have existing external links.

Clean URLs
  • No excessive parameters or session IDs.
  • URLs exposed to search engines should be static.

Short URLs
  • 115 characters or shorter – this character limit isn’t set in stone, but shorter URLs are better for usability.

Content


Homepage content is optimized
  • Does the homepage have at least one paragraph?
  • There has to be enough content on the page to give search engines an understanding of what a page is about. Based on my experience, I typically recommend at least 150 words.

Landing pages are optimized
  • Do these pages have at least a few paragraphs of content? Is it enough to give search engines an understanding of what the page is about?
  • Is it template text or is it completely unique?

Site contains real and substantial content
  • Is there real content on the site or is the “content” simply a list of links?

Proper keyword targeting
  • Does the intent behind the keyword match the intent of the landing page?
  • Are there pages targeting head terms, mid-tail, and long-tail keywords?

Keyword cannibalization
  • Do a site: search in Google for important keyword phrases.
  • Check for duplicate content/page titles using the Moz Pro Crawl Test.

Content to help users convert exists and is easily accessible to users
  • In addition to search engine driven content, there should be content to help educate users about the product or service.

Content formatting
  • Is the content formatted well and easy to read quickly?
  • Are H tags used?
  • Are images used?
  • Is the text broken down into easy to read paragraphs?

Good headlines on blog posts
  • Good headlines go a long way. Make sure the headlines are well written and draw users in.

Amount of content versus ads
  • Since the implementation of Panda, the amount of ad-space on a page has become important to evaluate.
  • Make sure there is significant unique content above the fold.
  • If you have more ads than unique content, you are probably going to have a problem.

Duplicate content


There should be one URL for each piece of content
  • Do URLs include parameters or tracking code? This will result in multiple URLs for a piece of content.
  • Does the same content reside on completely different URLs? This is often due to products/content being replicated across different categories.
Pro Tip:
Exclude common parameters, such as those used to designate tracking code, in Google Webmaster Tools. Read more at Search Engine Land.

Do a search to check for duplicate content
  • Take a content snippet, put it in quotes and search for it.
  • Does the content show up elsewhere on the domain?
  • Has it been scraped? If the content has been scraped, you should file a content removal request with Google.

Sub-domain duplicate content
  • Does the same content exist on different sub-domains?

Check for a secure version of the site
  • Does the content exist on a secure version of the site?

Check other sites owned by the company
  • Is the content replicated on other domains owned by the company?

Check for “print” pages
  • If there are “printer friendly” versions of pages, they may be causing duplicate content.

Accessibility & Indexation

Check the robots.txt

  • Has the entire site, or important content been blocked? Is link equity being orphaned due to pages being blocked via the robots.txt?


Turn off JavaScript, cookies, and CSS

  • Use the Web Developer Toolbar
  • Is the content there?
  • Do the navigation links work?


Now change your user agent to Googlebot

  • Use the User Agent Add-on
  • Are they cloaking?
  • Does it look the same as before?
PRO Tip:
Use SEO Browser to do a quick spot check.


Check the SEOmoz PRO Campaign

  • Check for 4xx errors and 5xx errors.


XML sitemaps are listed in the robots.txt file


XML sitemaps are submitted to Google/Bing Webmaster Tools


Check pages for meta robots noindex tag

  • Are pages accidentally being tagged with the meta robots noindex command?
  • Are there pages that should have the noindex command applied?
  • You can check the site quickly via a crawl tool such as Moz or Screaming Frog
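If you'd rather spot-check a page by hand, a small script can flag the directive. Here is a sketch using only the Python standard library (the class and function names are my own illustration, not part of any crawl tool):

```python
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr = dict(attrs)
            if (attr.get("name") or "").lower() == "robots":
                self.directives.append((attr.get("content") or "").lower())

def is_noindexed(html: str) -> bool:
    """Return True if the page carries a meta robots noindex directive."""
    parser = MetaRobotsParser()
    parser.feed(html)
    return any("noindex" in content for content in parser.directives)

print(is_noindexed('<head><meta name="robots" content="noindex,follow"></head>'))  # True
print(is_noindexed('<head><title>Widgets</title></head>'))                         # False
```

Feed it the raw HTML of any page you want to verify; a crawl tool remains the right choice for checking a whole site.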


Do goal pages have the noindex command applied?

  • This is important to prevent direct organic visits from showing up as goals in analytics

Site architecture and internal linking


Number of links on a page
  • 100-200 is a good target, but not a rule.

Vertical linking structures are in place
  • Homepage links to category pages.
  • Category pages link to sub-category and product pages as appropriate.
  • Product pages link to relevant category pages.

Horizontal linking structures are in place
  • Category pages link to other relevant category pages.
  • Product pages link to other relevant product pages.

Links are in content
  • Does not utilize massive blocks of links stuck in the content to do internal linking.

Footer links
  • Does not use a block of footer links instead of proper navigation.
  • Does not link to landing pages with optimized anchors.

Good internal anchor text

Check for broken links
  • Link Checker and Xenu are good tools for this.

Technical issues


Proper use of 301s
  • Are 301s being used for all redirects?
  • If the root is being redirected to a landing page, are they using a 301 instead of a 302?
  • Use Live HTTP Headers Firefox plugin to check 301s.

“Bad” redirects are avoided
  • These include 302s, 307s, meta refresh, and JavaScript redirects as they pass little to no value.
  • These redirects can easily be identified with a tool like Screaming Frog.

Redirects point directly to the final URL and do not leverage redirect chains
  • Redirect chains significantly diminish the amount of link equity associated with the final URL.
  • Google has said that they will stop following a redirect chain after several redirects.

Use of JavaScript
  • Is content being served in JavaScript?
  • Are links being served in JavaScript? Is this to do PR sculpting or is it accidental?

Use of iFrames
  • Is content being pulled in via iFrames?

Use of Flash
  • Is the entire site done in Flash, or is Flash used sparingly in a way that doesn’t hinder crawling?

Check for errors in Google Webmaster Tools
  • Google WMT will give you a good list of technical problems that they are encountering on your site (such as: 4xx and 5xx errors, inaccessible pages in the XML sitemap, and soft 404s)

XML Sitemaps  
  • Are XML sitemaps in place?
  • Are XML sitemaps covering for poor site architecture?
  • Are XML sitemaps structured to show indexation problems?
  • Do the sitemaps follow proper XML protocols?

Canonical version of the site established through 301s

Canonical version of site is specified in Google Webmaster Tools

Rel canonical link tag is properly implemented across the site
  • Make sure it points to the correct page, and every page doesn’t point to the homepage.

Uses absolute URLs instead of relative URLs
  • Relative URLs can cause a lot of problems if you have a root domain with secure sections.

Site speed

Review page load time for key pages 

  • Is it slow enough to matter for users or search engines?

Make sure compression is enabled
  •  Gzip Test

Enable caching

Optimize your images for the web
  • Google’s guide to optimizing your images
Minify your CSS/JS/HTML


Use a good, fast host
  • Consider using a CDN for your images.

Mobile


Review the mobile experience
  • Is there a mobile site set up?
  • If there is, is it a mobile site, responsive design, or dynamic serving?

Make sure analytics are set up if separate mobile content exists

If dynamic serving is being used, make sure the Vary HTTP header is being used

  • This helps search engines understand that the content is different for mobile users.
  • Google on dynamic serving.

Review how the mobile experience matches up with the intent of mobile visitors
  • Do your mobile visitors have a different intent than desktop based visitors?

Ensure faulty mobile redirects do not exist
  • If your site redirects mobile visitors away from their intended URL (typically to the homepage), you’re likely going to run into issues impacting your mobile organic performance.

Ensure that the relationship between the mobile site and desktop site is established with proper markup
  • If a mobile site (m.) exists, does the desktop equivalent URL point to the mobile version with rel=”alternate”?
  • Does the mobile version canonical to the desktop version?
  • Official documentation.
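Under the separate m-dot setup described above, the paired annotations look like this (hypothetical URLs):

```html
<!-- On the desktop page, www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the mobile page, m.example.com/page -->
<link rel="canonical" href="http://www.example.com/page">
```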

International


Review international versions indicated in the URL
  • ex: site.com/uk/ or uk.site.com

Enable country based targeting in webmaster tools
  • If the site is targeted to one specific country, is this specified in webmaster tools?
  • If the site has international sections, are they targeted in webmaster tools?

Implement hreflang / rel alternate if relevant
  • Documentation
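For a site with English versions at /us/ and /uk/ (hypothetical URLs), each version would carry the full set of alternates, including itself:

```html
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us/">
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/">
<link rel="alternate" hreflang="x-default" href="http://www.example.com/">
```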

If there are multiple versions of a site in the same language (such as /us/ and /uk/, both in English), update the copy so that each version is unique

Make sure the currency reflects the country targeted

Ensure the URL structure is in the native language 
  • Try to avoid having all URLs in the default language

Analytics


Analytics tracking code is on every page
  • You can check this using the “custom” filter in a Screaming Frog Crawl or by looking for self referrals.
  • Are there pages that should be blocked?

There is only one instance of a GA property on a page
  • Having the same Google Analytics property loaded twice will create problems with pageview-related metrics, inflating pageviews and pages per visit and deflating the bounce rate.
  • It is OK to have multiple different GA properties listed; this won’t cause a problem.

Analytics is properly tracking and capturing internal searches

Demographics tracking is set up

Adwords and Adsense are properly linked if you are using these platforms
  • Instructions for linking AdWords
  • Instructions for linking AdSense

Internal IP addresses are excluded
  • Official documentation

UTM Campaign Parameters are used for other marketing efforts
  • Google URL Builder
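Tagging can also be scripted. Here is a sketch using only the Python standard library (the function name and parameters are my own illustration, not part of Google URL Builder):

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def tag_url(url: str, source: str, medium: str, campaign: str) -> str:
    """Append Google Analytics UTM campaign parameters to a landing-page URL,
    preserving any query parameters the URL already carries."""
    parts = urlparse(url)
    query = parse_qsl(parts.query)
    query += [("utm_source", source), ("utm_medium", medium), ("utm_campaign", campaign)]
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_url("https://example.com/offer", "newsletter", "email", "spring-sale"))
# https://example.com/offer?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale
```

This makes it easy to tag a whole spreadsheet of campaign links consistently instead of building each URL by hand.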

Meta refresh and JavaScript redirects are avoided
  • These can artificially lower bounce rates.

Event tracking is set up for key user interactions
  • Event Tracking Documentation

 

Top Search: