Mariusz Interewicz Updated: 24 Mar 2023 17 min to read

Technical SEO Checklist – Factors Worth Analyzing

A fully functional website is the lifeblood of any business. But having a sleek website is not enough; it also has to rank well on search engine result pages (SERPs), and that's where technical SEO comes into play. With the rapid changes in the digital marketing industry, the only way to keep up is to keep evolving – stay on top of your game and keep doing what works.

Speaking of what works, you've got to review your technical SEO regularly. Basically, a technical SEO audit means looking at your website through the eyes of search engines like Google and Bing.

There are millions of websites in the world today, and each one of these sites is competing for the top spot in SERPs. After all, the top spot translates to better visibility and lots of eyeballs. If everything is done right, you can convert these eyeballs to loyal paying customers.

To make it to the top spot, you need to perform a technical SEO audit regularly. Google's algorithms change all the time, and your site changes too – new features, new content, ongoing tweaks. You have to assume from the very beginning that SEO is not a one-off job for your business; continuous improvement is crucial. To do it effectively, you need to know what to optimize – and that's what a technical SEO audit service is for.

What is on-site SEO?

All good marketers want to improve the visibility and search engine ranking of their websites. To do that, they have to optimize the elements of their website for search engines – which is exactly what on-site SEO means. On-site SEO is also known as on-page SEO.

This is basically a set of technical tasks. It involves making sure your web pages, meta tags, content, and overall site structure are optimized for your target keywords. Simply put, on-site SEO covers the things you do on your website to boost visibility, while off-site SEO covers the things you do outside the website to boost visibility and search engine ranking.

Now that you understand the basics of on-site SEO, here is what you need to know about a technical SEO audit.

 

PageSpeed Insights ASPER BROTHERS

Scoring 100 on PageSpeed Insights is the goal of many SEO teams. However, you have to remember the real goal of technical optimization: the best possible user experience. That’s why, in addition to metrics, it makes sense to study user behavior on the site.

 

What is Technical SEO?

Ranking high on SERPs is not rocket science, but it requires smart and diligent work. Previously, website owners could trick Google’s algorithm by merely stuffing keywords and backlinking from micro-sites. Well, those days are over. If you want to be at the top, you’ve got to refine your SEO strategy.

A technical SEO audit is pretty much using the eyes of Google to look through your website. It’s a digital health check that lets you analyze your website’s ranking ability. Think of it as a process of checking the technical aspect of your website’s SEO.

Before ranking happens, search engine bots crawl the internet to find websites and web pages, which are then checked against various ranking factors. Google's algorithm uses more than 200 ranking signals; Backlinko has covered the updated Google ranking factors in detail. Whichever industry you are in, your competitors are keeping up with the latest SEO changes — and that's why you should too.

To keep up with these changes, you've got to monitor the health of your website – and that's where a technical SEO audit comes into play. You can perform a technical SEO audit on your website using tools like Moz, SEMrush, Screaming Frog, Google Search Console, Ahrefs, etc. However, even good tools have their limitations. First of all, they often don't account for the specifics of your site and don't know why something has to stay and what can be dropped.

Additionally, tools often have limited access to your site and do not have all the data they need to see the big picture. That’s why the best SEO audit is one performed by a human.

Of course, a human uses many tools, too (we’ll write more about that later), but above all, an experienced engineer can consider the particular situation of a given site. A human will also directly answer your questions and concerns about particular issues.

As stated earlier, an SEO audit helps you understand your website's health. This way, you will know what to improve and can pinpoint the things holding you back. So here comes the next big question – why should you conduct an SEO audit regularly?

 

The SEO audits we provide are not only a way to better performance and results in Google. They are also an opportunity to improve user experience. Our clients say the recommended changes increase the effectiveness of website goals. We believe that SEO should be looked at holistically, as part of an overall business strategy. – Mike Jackowski, COO, ASPER BROTHERS

 

Why is it worth performing a Tech SEO Audit Regularly?

Google's search algorithm keeps changing, and you should change with it. If you want to dominate your industry, you should constantly be on the lookout for what's new in the digital space. To do that, you've got to keep performing technical SEO audits and adapting to the changes.

Regularly auditing your website for technical factors is important to ensure your digital asset stays aligned with Google's best practices. Here are the areas worth analyzing in a technical SEO audit.

 

Technical SEO Checklist

There are different types of SEO audits. Some focus more on the site’s content architecture, and others take into account internal and external linking. A technical SEO audit is different. It mainly helps you understand how Google’s robots see and index your site and how individual elements on the page affect crawl speed and user experience.

A good technical audit also includes assessing the technology used – languages, JS frameworks, and the amount of external resources loaded on the site. The more experienced the engineer performing the audit, and the stronger their technical background, the more specific the recommended changes will be.

 

Keep in mind that a good audit is not just an analysis of the situation but also a set of recommendations on what to change to improve performance and user experience.

 

Accessibility and visibility for search engines

Websites are designed for users, not just for search engines, and Google is moving towards rewarding good UX, favoring sites that care about their users.

That's why you should have your users in mind while designing a website – and those users include people who are not using a JavaScript-enabled browser.

To test your website's accessibility, preview it in a browser with JavaScript turned off. You can also test it in a text-only browser like Lynx.

JavaScript Disable Chrome Dev Tools

Disabling JavaScript allows you to see what content is generated on the page without using JS. Gif Source: https://stackoverflow.com/

A text-only browser helps you determine the accessibility of a site and spot content that may be invisible to crawlers, such as text embedded in images.
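If key content or navigation depends on JavaScript, it is also worth providing a fallback for visitors (and crawlers) that do not execute it. A minimal sketch – the gallery and link below are hypothetical.

  <!-- Shown only when JavaScript is disabled or unavailable -->
  <noscript>
    <p>This product gallery requires JavaScript. You can still browse every item on our
    <a href="/products/">product listing page</a>.</p>
  </noscript>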

 

Indexing Issues

Indexing is the process of organizing the information on a web page and storing it in the search engine's database. This helps search engines respond to user queries immediately. For example, Google uses the XML sitemap to pinpoint what's important on a website, even though pages that are not in your sitemap can still be indexed.

“The Google Search index contains hundreds of billions of webpages and is well over 100,000,000 gigabytes in size. It’s like the index in the back of a book — with an entry for every word seen on every webpage we index. When we index a webpage, we add it to the entries for all of the words it contains.”

Source – Google.com

It is essential to have an XML sitemap for your website so that search engine spiders can index all your major pages effectively. That said, you should focus on overall site quality, not just on the pages reported in sitemap.xml.
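For reference, a sitemap is just an XML file listing the URLs you want crawled. A minimal illustrative example (the URLs are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <!-- List only canonical, indexable URLs; lastmod is optional but useful -->
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://example.com/</loc>
      <lastmod>2023-03-24</lastmod>
    </url>
    <url>
      <loc>https://example.com/blog/technical-seo-checklist/</loc>
    </url>
  </urlset>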

 

Core Web Vitals

Core Web Vitals metrics

The need to meet Core Web Vitals standards is one of the most common reasons for initiating technical audits.

 

Core Web Vitals are now on everyone's lips. Google has started to include this set of parameters as a ranking factor, and it has given developers better insight into these metrics through changes in Search Console, which now offers "Page Experience" and "Core Web Vitals" reports.

But what are Core Web Vitals? Simply put, according to Google, they are 3 important metrics that allow you to measure how well your website is optimized for user experience. They are Largest Contentful Paint (LCP), First Input Delay (FID) and Cumulative Layout Shift (CLS). Let’s briefly review each of these.

  • Largest Contentful Paint (LCP) – the load time of the largest element in the first view of the page. It could be a large menu, a slogan in the header section, the largest photo, or a JavaScript-generated element such as animated text. This metric shows how quickly the user sees the main content on the first screen. An LCP below 2.5 seconds is marked as good.
  • First Input Delay (FID) – an indicator of how interactive the page is: the time between the user's action and the page's response, for example between clicking a menu item and a mega menu expanding. An FID below 100 ms is considered good.
  • Cumulative Layout Shift (CLS) – an indicator of the visual stability of a page. You have surely encountered a situation where, after entering a page, many of its elements change position, new sections appear, or ads and other elements load in between the existing content. That is what CLS measures. For Google to consider CLS good, it must be below 0.1.

A frequent question is how Core Web Vitals differ from PageSpeed results. The PageSpeed Insights score is a number from 0 to 100 that measures your site's overall performance and loading speed, and the tool also gives many tips on how to optimize it.

Core Web Vitals focus primarily on the impact of performance on user experience. They show lab data and real data from users, so we can realistically assess the impact of site optimization on the end-user experience. And after all, that’s ultimately what we optimize a site for – to make it more accessible and friendly on as many devices and for as many users as possible.
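If you want to collect that real-user data yourself, Google's open-source web-vitals library makes it straightforward. Below is a minimal sketch, assuming the library's v3 API and the unpkg CDN build; in practice you would send the values to your analytics endpoint instead of the console.

  <script type="module">
    // Measure Core Web Vitals in the visitor's browser (web-vitals v3 API assumed)
    import { onLCP, onFID, onCLS } from 'https://unpkg.com/web-vitals@3?module';

    // Each callback fires once its metric is ready to be reported
    onLCP(console.log);
    onFID(console.log);
    onCLS(console.log);
  </script>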

 

Page load time

It is important to know how fast your website or web page loads. If your website takes more than 5 seconds to load, you might be losing a large share of your audience. According to Portent, load times of 0–4 seconds are ideal for conversion.

The conversion rates of your website drop by 4.42% with each additional second of load time.

 

Canonical URLs

Canonical URLs signal to search engines which version of a page should be indexed, preventing duplicate content by consolidating indexing and linking signals into a single URL. This is especially important if you have content on your site under different URLs that could cannibalize each other. By setting a canonical URL, you show Google which URL you consider the most important for a given piece of content and keyword.
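The signal itself is a single line in the <head> of each duplicate or near-duplicate page. An illustrative example with made-up URLs: if /shoes/?color=red serves the same content as /shoes/, both pages can point at the preferred version.

  <!-- Placed in the <head>; consolidates signals onto the preferred URL -->
  <link rel="canonical" href="https://example.com/shoes/" />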

Tools like the Screaming Frog SEO Spider give a quick overview of the canonical implementation across your website and report common errors.

 

Loading external resources

You can have the best-optimized site, serving pure HTML without any JavaScript (although hardly any site today can do without JS), but plug in too many external resources and your site's performance will plummet.

Let me give you an example. Hotjar – you're familiar with it, right? It's a great tool, but it isn't free of cost for your site: it's external JavaScript that needs to be downloaded to the user's browser, affecting performance. Another example is a chatbot. This is also a great tool that helps you achieve better conversions, but to work it often needs to load JavaScript fetched from an external server.

Another popular example is ads. Ad providers try to keep their ad servers fast, but displaying an ad still requires downloading elements from outside your site every time.

What is the solution? Use external tools – be it SEO link building tools or SEO audit scripts – wisely: limit them, or enable them only when you actually use them. Sometimes it is also better to write something custom than to download a whole big JS library that will heavily load the page.
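For the third-party scripts you do keep, at least stop them from blocking rendering. A hedged sketch with a made-up widget URL – many vendors also document their own deferred or on-demand loading options, which are worth preferring.

  <!-- "defer" downloads the script in parallel and runs it only after the HTML is parsed;
       "async" is an alternative when execution order does not matter -->
  <script src="https://widget.example-chat.com/loader.js" defer></script>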

Waterfall view in GTmetrix

To identify the external resources being loaded and their influence on page load, it is worth using the waterfall view. You can get it in Chrome Dev Tools, and also, for example, on gtmetrix.com.

 

Optimizing image files

When it comes to optimizing images, several things matter. First of all, size. It makes no sense to load huge images that are scaled down to smaller dimensions; it's better to crop images to the needs of a particular section. It also helps to use the "srcset" attribute, which displays the right image for the device – one for a phone and another for a desktop.
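A responsive image sketch using srcset and sizes (the file names and breakpoints are illustrative):

  <!-- The browser downloads only the candidate that best fits the rendered width -->
  <img
    src="hero-1200.jpg"
    srcset="hero-480.jpg 480w, hero-800.jpg 800w, hero-1200.jpg 1200w"
    sizes="(max-width: 600px) 480px, 800px"
    width="1200" height="600"
    alt="Hero banner">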

The second thing is the compression ratio. It's best if you upload images that are already properly compressed. If that's not possible, you can use your CMS's built-in functions or implement a solution that compresses images on the server side.

Another thing is using a CDN. Cloudflare, for example, may be a good option if your website has a lot of graphic files. A CDN ensures that graphic resources are loaded from servers closer to the user's location.

 

Lazy Loading

Lazy loading is a very clever solution. Simply put, it defers loading elements until the user scrolls down the page. Elements (for example, photos) are loaded while the page is being used, which makes the initial page load much faster.

Lazy Loading can be used for many elements – images, whole sections, videos, illustrations, etc.

It can be implemented natively (many browsers already understand the lazy-loading attribute) or with dedicated libraries that support it, as in the sketch below. This is one of the fastest and most significant ways to optimize page load time.
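With native support, the implementation can be as small as one attribute (the sources below are placeholders). Explicit width and height also reserve space for the image, which prevents the layout shifts that hurt CLS.

  <!-- Media below the fold is fetched only when the user scrolls near it -->
  <img src="gallery-photo.jpg" loading="lazy" width="800" height="533" alt="Gallery photo">
  <iframe src="https://www.youtube.com/embed/VIDEO_ID" loading="lazy" title="Product demo"></iframe>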

Lazy Loading Native Chrome

Chrome has supported native lazy loading since version 76.

 

Redirect chains and loops

Sometimes a requested URL goes through multiple redirects before the destination URL is served. This is commonly referred to as a redirect chain – for example, http://example.com → https://example.com → https://www.example.com/ forces every visitor and crawler through two extra hops. With too many redirects, your pages may run into problems like delayed crawling, decreased page speed, and loss of link equity.

It is essential to audit your website to see whether any of your pages are served through a chain of redirects.

 

Too large JavaScript and CSS

As you probably know, CSS is responsible for the styling (appearance) of elements on your page, and JavaScript provides interactivity and functionality. This is, of course, a big simplification. Practically every website today uses these two types of files, and the rule is simple: the smaller these files are, the better for load times.

There are many ways to optimize CSS and JS. Compression is the easiest. But the most important thing is to make sure these files contain only what is really necessary. You can load different resources on each subpage – when, for example, a piece of functionality exists only on the main page, the JS file responsible for it does not need to be loaded on other subpages. Your frontend development team should account for this in the project structure.

Additionally, it is sometimes a good idea to load critical CSS inline within the page code rather than as a separate file. This is especially recommended for key page elements such as the menu or header section. In many cases, this lets the browser render your page faster.
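One commonly documented way to combine the two ideas – inline the critical rules, defer the rest – looks like the sketch below; the selector and file path are hypothetical.

  <head>
    <!-- Critical CSS for the header, inlined so the first paint does not wait for a network request -->
    <style>
      header { display: flex; align-items: center; min-height: 64px; }
    </style>
    <!-- The remaining styles load without blocking rendering -->
    <link rel="preload" href="/css/site.css" as="style" onload="this.onload=null;this.rel='stylesheet'">
    <noscript><link rel="stylesheet" href="/css/site.css"></noscript>
  </head>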

 

Uncached JavaScript and CSS files

It is vital to enable caching of JavaScript and CSS files. When caching is not enabled, the user's browser has to request the same assets again and again, making the site load slowly.
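Caching is switched on with HTTP response headers rather than in the page itself. A typical value for versioned (fingerprinted) JS and CSS files – adjust the lifetime to your deployment process – is:

  Cache-Control: public, max-age=31536000, immutable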

 

SSL and Security

Having an SSL certificate is crucial to support your SEO efforts. SSL maintains a secure connection between the web server and the browser.

SSL implementation is one of the ranking signals for Google, and not having one can put a question mark on your website’s authority.

Let's Encrypt is a clear winner when it comes to SSL usage statistics. As of May 2019, the authority supported over 169 million fully-qualified active domains. What's more, it's responsible for 98.072 million active certificates and over 51 million active registered domains.

source: hostingtribunal.com

 

Structured Data

Structured data is a set of markup implemented on a web page to provide accurate additional details about the page's content in a machine-readable way.

Structured data, or schema markup, is code based on the vocabulary provided by Schema.org. It helps Google and other search engines return information to users more accurately.
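Most structured data today is added as JSON-LD inside a script tag. An illustrative Article snippet based on this post – trim or extend the fields to match your own content and validate it with Google's Rich Results Test.

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Technical SEO Checklist – Factors Worth Analyzing",
    "author": { "@type": "Person", "name": "Mariusz Interewicz" },
    "dateModified": "2023-03-24"
  }
  </script>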

 

Redirects

Redirects send human visitors and search engine bots to a different URL from the one they requested originally.

The fewer redirects, the better because Google’s robot will have less work to do. Of course, some redirects are advisable (e.g. 301 redirects from a version with “www” to one without “www”). But the general rule is to do it wisely.

 

Mobile Friendliness

With Google’s “mobile-first” approach, it is now even more crucial to have responsive and mobile-friendly websites.

You need to perform a usability audit to understand how your website performs on particular devices. Google's Mobile-Friendly Test is a free tool that determines how mobile-friendly your website or an individual page is.
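Responsiveness starts with the viewport meta tag; without it, mobile browsers render the page at desktop width and scale it down.

  <!-- Tells mobile browsers to use the device width instead of a zoomed-out desktop layout -->
  <meta name="viewport" content="width=device-width, initial-scale=1">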

 

Robots.txt

The robots.txt file tells search engine crawlers which pages or files they may crawl and which they may not.

A website audit tool gives you a clear idea of how your pages are being crawled and how well your robots.txt file is implemented.
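A minimal robots.txt sketch – the paths are illustrative. Keep in mind that robots.txt only controls crawling; a page blocked here can still end up indexed, so use noindex where exclusion from the index is the real goal.

  # Applies to all crawlers
  User-agent: *
  Disallow: /admin/
  Allow: /

  Sitemap: https://example.com/sitemap.xml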

 

Technical SEO Audit Tools

SEO is such a big industry today that there are many useful tools on the market to support SEO auditing. However, specialized technical audit (especially in the JS area) is such a specific activity that tools should be treated only as a means to achieve the goal. All of the items listed below are supportive and help diagnose areas worthy of attention and deeper analysis. Personally, I find Chrome Dev Tools to be the most helpful of all these tools. In the right hands, Dev Tools will identify most technical issues with a website.

 

Chrome Dev Tools

Chrome Dev Tools is a potent tool built into the Google Chrome browser. I'm not going to describe all its features here, but I will list those worth using initially. For technical analysis, it's often a good idea to enable the option to view the page as Googlebot. Another useful feature is disabling JavaScript to see what content is generated without JS. Google does render JS, but in my opinion, it's best when key content elements also work without it. Additionally, you can simulate page load times on different devices and at different network speeds. Very useful.

Another useful feature is the analysis of loaded resources in the "Network" tab. This is where you can discover the largest elements loaded on the page, such as CSS and JS files or external resources that increase load time. The division into types, filtering, and sorting come in handy. The "Capture screenshots" function also lets you observe the progress of page rendering. It is worth pairing this with More Tools > Rendering; once enabled, you get access to highlighting Layout Shift Regions and analyzing all Core Web Vitals. In any case, I recommend familiarizing yourself with all the features available under "More tools" – people often overlook them, and they really are beneficial.

Dev Tools is something many web developers work with daily. It therefore makes sense to engage someone with development experience to do the performance and technical analysis; the audit will certainly go more smoothly for such a person.

Chrome Dev Tools

 

Page Speed Insights

PageSpeed Insights gives you an overall view of your website's speed. By simply entering your URL, you get an analysis of your page and recommendations for boosting its load speed.

Nowadays, page load speed is one of the factors that determine your ranking in the SERP. If your page takes too long to load, you will likely see a high bounce rate. And if visitors don't interact with your website before leaving, you will end up with poor conversion performance and a poor user experience.

One fascinating feature of PageSpeed Insights is its use of field data – data captured whenever real visitors land on your website. This data reflects the actual user experience, and the tool also offers actionable steps to help you optimize your website for better performance.

 

Lighthouse

Lighthouse helps improve your web page quality. It lets site owners audit areas like performance, SEO, accessibility, and progressive web apps.

To use Lighthouse, all you need to do is enter the URL of the page. The audit then runs and generates a comprehensive report on the web page.

From there, you get to see how to improve the web page.

Lighthouse report

Lighthouse is a great way to learn about a few key metrics that evaluate not only performance but also user experience.

 

Google Search Console

One way to get ahead in the digital marketing world is to keep a close eye on your performance in SERPs — and Google Search Console helps you do just that.

This technical SEO audit tool helps you diagnose common SEO issues. You can also use it to verify that Google's web crawlers can access your website and that new pages are indexed promptly.

You can configure the settings to receive alerts whenever there are indexing issues on your website.

What's more, Google Search Console allows marketers to track their keyword ranking positions on search engine result pages.

 

Screaming Frog

Screaming Frog is one of the giants in the industry, and the company’s SEO spider can rapidly crawl websites of any size. It also delivers SEO recommendations to the users.

It can pinpoint web pages with missing or poorly optimized metadata and broken links, diagnose and fix redirect chain issues, and uncover duplicate pages.

 

DeepCrawl

If you manage a team, then DeepCrawl is probably the best fit for you. It’s an enterprise-level technical SEO audit tool that delivers customized reporting.

With DeepCrawl, you can test XML sitemaps, find broken links, monitor page performance and speed, and ascertain content quality. One excellent feature is the historical data view. The feature lets you compare results and identify what needs to be fixed.

 

Technical SEO Team Structure

A technical SEO audit is worth doing after every major site update and after implementing new features, and it is certainly necessary when launching a new site. But how do you assemble the right team to run the audit?

The ideal team for a technical SEO audit service should include:

  • A manager familiar with the site (on the client's side) – ideally, the team should include a person responsible for the site's growth. Such a person should be able to describe recent changes and define expectations and goals for the audit. Deep knowledge of SEO is not required here, but it is better if the manager knows the subject. Contact with a decision-maker is important because sometimes compromises have to be worked out between functionality, appearance, and performance. It may also be the manager responsible for marketing or the web presence; then, for example, you can establish good practices around third-party tool scripts on the website, such as Hotjar.
  • A highly technical SEO specialist / web developer – the person for special tasks and the most substantive part of the audit. Ideally, this is a programmer or someone who knows HTML, CSS and JavaScript very well. Thanks to that, he or she will not only define problems but also indicate specific solutions and ways to address them. A popular example would be recommending the optimal technology for an SSR implementation (e.g. Nuxt.js or Next.js). The technical auditor will also be a partner for technical consultations with the development team that implements the recommended changes.

After the audit, it is certainly good to determine how the changes will be implemented on the website. Of course, it all depends on whether the site owners have their own developers or whether it is necessary to hire a dedicated team.

A technical SEO audit can have different levels of sophistication and different levels of detail. One thing is certain. In the changing world of search engine requirements, it makes sense to stay on top of your own site’s optimization and performance levels. It’s worth getting started. Even small changes to your site often pay huge dividends in improving your position in Google.

 

We specialize in optimizing the performance of websites and web applications. Learn more about our advanced technical SEO audit.
