Parts Of An SEO Audit
This article was originally published at diamondandbranch.com
One of the activities I enjoy most about my job is the research we do to understand how a website is seen by search engines. We already know that it takes more than flipping a switch for users to come to your site. We also know that we can’t expect Google to find a website and automatically understand what the site is about and which searches it should show up for. We have to guide Google: show it what the site is about, tell it when and where to show the content, and convince it that we are worth showing in the top positions.
This is where SEO steps in. SEO has many technical and strategic activities that can either help you get to the top or bury your site all the way at the bottom of the results. To understand how a website appears in the eyes of Google (and other search engines too), we at Diamond + Branch perform technical audits of our clients’ websites.
This technical SEO audit helps us understand several important things: where you are today, what items need immediate attention, what’s currently working but may need some attention, and what’s working well that you should do more of. There are 5 parts to our technical SEO audit. These are:
• Accessibility
• Indexability
• On-Page Ranking Factors
• Off-Page Ranking Factors
• Other Ranking Factors
Accessibility
The first step is to start with a crawl report. This type of report gives us detailed information about all of the pages that are accessible to search engine crawlers. It is the closest we can get to a tool that behaves like Google’s own crawlers.
Can search engines find the pages on your website?
We want to show up in search engines’ results, right? The first step to get there is to allow search engine crawlers to navigate through the pages of your website. To do that, we need to give search engine crawlers instructions about which parts of the website they should crawl. We do this by creating a robots.txt file. We may also want to hide some pages from Google’s crawlers because they shouldn’t be publicly accessible; for those pages, we look at the HTML robots meta tag and how it’s configured.
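To make this concrete, here is a minimal sketch of what these two mechanisms look like. The domain and paths are placeholders for illustration, not taken from any real client site:

```
# robots.txt — lives at the root of the site, e.g. https://www.website.com/robots.txt
User-agent: *
Disallow: /admin/     # keep crawlers out of the admin area
Disallow: /cart/      # no search value in checkout pages
Allow: /

Sitemap: https://www.website.com/sitemap.xml
```

And for a single page that should be crawlable but kept out of search results, a robots meta tag goes in that page’s <head>:

```
<meta name="robots" content="noindex, follow">
```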
Are there any fires that we need to put out? The HTTP(S) status codes will show us the errors your pages may have and allow us to decide which need immediate attention. Understanding how users and search engines find their way around your website is very important, and we learn this by looking at the website architecture. The last part of the accessibility metrics is to get a score of your website’s performance. You definitely want a fast website, not only because Google prioritizes websites that load faster, but also because fast websites create a better user experience. So we run a report in Google’s PageSpeed Insights tool to learn how quickly the site loads on both mobile and desktop and see what suggestions it may have for you.
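The status-code check above can be sketched in a few lines of Python. This is a minimal illustration, not our actual audit tooling, and the triage buckets are our own labels:

```python
# A sketch of triaging HTTP status codes pulled from a crawl report.
from urllib import request, error

def page_status(url: str) -> int:
    """Return the HTTP status code for a URL (follows redirects)."""
    try:
        with request.urlopen(request.Request(url, method="HEAD")) as resp:
            return resp.status
    except error.HTTPError as exc:
        return exc.code

def triage(status: int) -> str:
    """Bucket a status code into an audit priority."""
    if 200 <= status < 300:
        return "ok"
    if 300 <= status < 400:
        return "check redirect"
    if status == 404:
        return "fix: broken page"
    return "fix: server error" if status >= 500 else "review"
```

In practice you would feed `triage(page_status(url))` the list of URLs from the crawl report and sort the "fix" buckets to the top of the to-do list.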
Indexability
Do search engines think it’s worthwhile to index/save your pages?
We want to make sure that search engines consider your pages important enough to remember, so that they can appear to people performing related searches. To do this, we have to clearly communicate your website’s structure, the types of files and pages you have, and the relationships between them. We do this by creating a sitemap – a file that provides all of this information. We then submit the location of this file to search engines so they know which pages to consider. We can gauge how willing search engines are to save your pages by comparing the number of pages in the sitemap against the number of pages they have actually indexed.
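A sitemap is just an XML file listing the URLs you want considered. A minimal example (the URLs and dates are placeholders for illustration) looks like this:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.website.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.website.com/products/soap</loc>
    <lastmod>2020-01-10</lastmod>
  </url>
</urlset>
```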
After we verify that your important pages have been indexed by search engines, we perform a brand search to make sure that you own the top results. Once we’re sure you rank for your brand name, we perform a generic search: we take the primary keywords you’re aiming to rank for and find where you rank. This also gives us an idea of who the top competitors for those keywords are.
On-Page Ranking Factors
What is your website saying about you?
After we verify how well your website can be accessed and how your pages are being remembered, we check the information and content you are using to compete for a good ranking in Google.
To do this, we start with the URL structure. We want to make sure that you are using “friendly” URLs, something that both humans and search engines can understand. For example, www.website.com/products/soap is much easier to understand than www.website.com/page=256?id=22-ab.
Then we review your content and keywords to make sure that you are using your primary keywords in your content. This allows both users and search engines to relate your content to their searches. We look at the content structure to make sure the content flows. We don’t want a 2,000+ word wall of text – that’s not good for users, so search engines will decide against it, too.
We make sure that you are not cannibalizing your keywords. This is when more than one page is trying to rank for the same keywords. We look for duplicate content in meta descriptions to make sure that you have a good description of your pages in search results.
It’s important to take advantage of internal linking whenever possible. Internal linking is simply adding links between related posts and pages. This creates a better experience, helping users find more of your content, and it also shows Google that we have more content to support what we’re talking about. Most likely there is related content all over your website, or a page that gives more detailed information about a topic mentioned on another page.
We check that you are describing the different pieces of your content or pages by using HTML markup. We prefer and recommend Schema markup, a series of tags that improve how search engines interpret and represent the content of your pages in search results. The last on-page elements we look for are Twitter Cards and Facebook’s Open Graph protocol. We want your brand to be represented equally everywhere. When a link from your website is shared on these social media channels, you can control which photos, videos, and other media are shown, as well as the page information shown on the post, by adding the markup code from these channels to your pages.
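These tags all live in the page’s <head>. A brief sketch of what they look like for a hypothetical product page (the product name, image path, and values are invented placeholders):

```
<!-- Schema.org structured data (JSON-LD) describing the page -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Lavender Soap",
  "description": "Handmade lavender soap."
}
</script>

<!-- Open Graph tags control how the page appears when shared on Facebook -->
<meta property="og:title" content="Lavender Soap">
<meta property="og:image" content="https://www.website.com/images/lavender-soap.jpg">

<!-- Twitter Card tags do the same for Twitter -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="Lavender Soap">
```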
Off-Page Ranking Factors
What are other websites saying about you?
Who doesn’t like to be popular? And although sometimes the most popular sites aren’t the most useful, they influence people and attract the most attention. To verify your popularity, we check a few different metrics. The simplest is to see whether your traffic has increased over time. We check Google Analytics to learn about traffic behavior year over year: has your traffic increased? Are you getting referral traffic? These are some of the questions that we answer here.
We also look at backlinks. These give us an idea of which other sites are recommending you to their readers and sending traffic your way. The most useful are followed links, which mean the linking website is vouching for you.
Social media channels are important too. We want to make sure that your brand is interconnected all over the internet – a website and different social media accounts. Allow users to find your content on other channels, and let them easily go from your website to the social media channels, and vice versa. The point here is that we are looking for healthy traffic flow – the kind of traffic flow that happens naturally when something is very popular.
What other tools are we using to reinforce the previous elements?
In this last step, we take a look at tools that can inform us of how well all of the previous elements are working and what score you are receiving as a website. First, we look at the Domain Authority that your website is receiving from Moz. This metric is a score developed by Moz that predicts how well a website will rank on a search engine result page. The other metric from Moz that we look at is Spam Score, which represents the percentage of similar websites that have been penalized by Google.
The last ranking score, which we started using recently, is the popularity of your website as given by Alexa Rank. This score reveals how a website is doing relative to all other sites, which gives us a very good idea of where you stand among your competitors.
After we pull all these metrics and provide a report on the status of each of them individually, we report the outcome of our analysis in a summary of observations and recommendations. It’s important to list what’s working and what you should continue doing, but also the items that need attention – which fires need to be put out immediately and which activities we are going to work on gradually over the coming months.
It sounds exhausting, doesn’t it? But it’s so worth it! This post only briefly covered the structure and the different parts of our SEO Audit. And as long as this post is, it doesn’t come close to the amount of information we get when we actually perform the audit. It is a time-consuming audit, yet it’s very important to pay attention to every detail so that we can provide the right recommendations and begin working on the activities that will benefit our clients the most.