7 ways to diagnose a drop in traffic


With the launch of Google’s new helpful content update, there is likely to be some volatility in rankings and traffic in the coming weeks. With that in mind, we thought it would be helpful to kick off our SEO newsletter with a few hacks we’ve collected for quickly diagnosing a drop in traffic.

We will cover 7 different ways you can understand why your traffic has gone down and show you how to monitor and mitigate traffic dips in the future.

Most of the time, organic traffic drops for one of these seven reasons:

  1. Redesign and rebranding
  2. Updating a website without SEO supervision
  3. Content updates
  4. Site architecture modifications
  5. Domain migrations
  6. Google algorithm updates
  7. Technical problems

As a starting point for investigating dips, it’s best to understand what has changed on your site. Here are a few hacks that may help you determine why your traffic has changed.

7 hacks for diagnosing traffic drops

  1. Use your GSC coverage report to spot trends
  2. Use the GSC coverage report to check for URL bloat
  3. Use the GSC Page Experience, Core Web Vitals, and Crawl Stats reports
  4. Compare traffic from Bing and Google
  5. Use Archive.org to find the changes
  6. Crawl the website
  7. Use automated tagging

Annotations in Google Analytics (GA) or release notes really help with understanding what has changed, but often they aren’t there, so we need to get creative.

1. Use your GSC coverage report to spot trends

A quick way to find out what’s going on is to go to the Google Search Console (GSC) and check the coverage reports.

[Image: GSC coverage report]

Take a look at the graphs on the right-hand side and note any patterns. Which charts are going up or down?

For example, in this report we can see a sharp increase in the number of noindexed pages. The next question to ask is: “Is this related to the decrease in traffic?” Maybe the site recently noindexed a batch of pages by accident.
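
If you’d rather quantify the trend than eyeball it, you can export the coverage data and diff it with a few lines of Python. A minimal sketch, assuming a hypothetical CSV export with “Date” and “Noindexed pages” columns (your actual export may name these differently):

    # Sketch: compare the first and last data points of an exported GSC
    # coverage chart to see which way the noindexed-page count is trending.
    # The file name and column names are assumptions; adjust them to match
    # whatever your export actually contains.
    import pandas as pd

    coverage = pd.read_csv("coverage_chart.csv", parse_dates=["Date"]).sort_values("Date")

    first, last = coverage.iloc[0], coverage.iloc[-1]
    change = int(last["Noindexed pages"]) - int(first["Noindexed pages"])
    print(f"Noindexed pages on {first['Date']:%Y-%m-%d}: {int(first['Noindexed pages'])}")
    print(f"Noindexed pages on {last['Date']:%Y-%m-%d}: {int(last['Noindexed pages'])}")
    print(f"Change over the period: {change:+}")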

2. Use your GSC coverage report to check for URL bloat

A Google Search Console coverage report can also surface problems like URL bloat. URL bloat occurs when a significant number of pages with redundant or low-quality content are added to a website, making it harder for your priority pages to rank highly.

[Image: GSC coverage report]

The chart above shows an example of a site that has published over 100,000 URLs in the past few months. This led to a sharp drop in impressions for queries the site previously ranked for.

So, we don’t have a definitive answer here, but it gives you an idea of what deserves further investigation, because we can see a relationship between the increase in noindexed URLs and the decrease in impressions.

Google may not have been indexing the recently added pages because they were redundant or thin. It is also possible that the site intentionally noindexed some pages and that this caused the drop.
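
One way to sanity-check that relationship is to line the two exports up by date and look at the correlation. Again a rough sketch; the file names (“coverage_chart.csv”, “performance.csv”) and column names (“Noindexed pages”, “Impressions”) are assumptions to adapt to your own exports:

    # Sketch: join daily noindexed-page counts with daily impressions and
    # check whether they move in opposite directions.
    import pandas as pd

    coverage = pd.read_csv("coverage_chart.csv", parse_dates=["Date"])
    performance = pd.read_csv("performance.csv", parse_dates=["Date"])

    merged = coverage.merge(performance, on="Date").sort_values("Date")
    corr = merged["Noindexed pages"].corr(merged["Impressions"])
    print(f"Correlation between noindexed pages and impressions: {corr:.2f}")
    # A strongly negative value supports (but does not prove) the theory that
    # the newly noindexed URLs are behind the drop in impressions.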

3. Use the GSC Page Experience, Core Web Vitals, and Crawl Stats reports

Significant changes in performance can affect your rankings, so these reports are worth checking out:

  • Core Web Vitals in Google Search Console
[Image: Core Web Vitals report]

The Core Web Vitals report shows the performance of your pages, based on real-world usage data.

  • Page experience in Google Search Console
[Image: Page Experience report]

The Page Experience report provides a summary of the user experience of your site visitors.

  • Crawl Stats in Google Search Console
[Image: Crawl Stats report]

The Crawl Stats report shows statistics about Google’s crawling history on your website.

Note the orange line in the Crawl Stats report – this is the average response time. For clarity, the average response time refers to the average time it takes Googlebot to download a full page.

As the average response time increases, the number of crawled URLs decreases. This isn’t necessarily a traffic killer, but it’s something you should consider as a potential cause.
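
If you keep raw server logs, you can cross-check this number yourself. Here’s a minimal sketch that averages Googlebot’s response time per day, assuming your access log records the request duration in seconds as the last field (as nginx’s $request_time can be configured to do); the file name and parsing are assumptions to adapt to your setup:

    # Sketch: average Googlebot response time per day, computed from raw
    # access logs as a rough cross-check on the Crawl Stats report.
    import re
    from collections import defaultdict
    from datetime import datetime

    totals = defaultdict(lambda: [0.0, 0])  # date -> [total_seconds, request_count]

    with open("access.log") as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            match = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
            if not match:
                continue
            try:
                duration = float(line.rsplit(" ", 1)[-1])  # assumes last field is request time
            except ValueError:
                continue
            day = datetime.strptime(match.group(1), "%d/%b/%Y").date()
            totals[day][0] += duration
            totals[day][1] += 1

    for day, (seconds, count) in sorted(totals.items()):
        print(f"{day}: {seconds / count * 1000:.0f} ms average over {count} requests")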

[Image: Crawl Stats report]

The Crawl Stats report can also help detect problems with hosting. It shows whether any hosts or subdomains on your site have had problems recently, for example serving 500 errors or other issues reported by Google.

The great thing about the GSC Page Experience, Core Web Vitals, and Crawl Stats reports is that they only take a minute or two to review. So, they’re a great way to get a quick read of the site and what issues might explain the drop in traffic.
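
If you prefer to pull this data programmatically, the PageSpeed Insights API exposes the same real-user Core Web Vitals field data. A minimal sketch; the example URL is a placeholder, and it’s worth inspecting the raw response, since the field data returned can vary by site:

    # Sketch: fetch real-user Core Web Vitals for a URL via the PageSpeed
    # Insights API. An API key is optional for light, ad-hoc usage.
    import requests

    PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

    resp = requests.get(
        PSI_ENDPOINT,
        params={"url": "https://www.example.com/", "strategy": "mobile"},
        timeout=60,
    )
    resp.raise_for_status()
    field_data = resp.json().get("loadingExperience", {})

    print("Overall category:", field_data.get("overall_category"))
    for metric, values in field_data.get("metrics", {}).items():
        # Metric keys look like LARGEST_CONTENTFUL_PAINT_MS, CUMULATIVE_LAYOUT_SHIFT_SCORE, ...
        print(f"{metric}: p75={values.get('percentile')} ({values.get('category')})")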

4. Compare Bing and Google Traffic

Here’s a quick way to find out if you’re the one responsible for the drop or if Google is: Look at your Bing organic traffic data.

[Image: Bing vs. Google organic traffic comparison]

If you see a drop in traffic on Google but not on Bing, it is likely that Google is responsible.

If you don’t see any differences and organic traffic has dropped on both Google and Bing, chances are you’ve done something.

Good news: When you are responsible, it is much easier to fix. You can reverse engineer what you did and get your site ranking again.

Bad news: If Google is responsible for the decline, you will need to do some further analysis to understand what has changed and why it is impacting you. This may require some big data solutions which we will discuss in the last section.
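
To make the comparison concrete, here’s a minimal sketch that computes the recent change for each engine from exported analytics data. The “organic_sessions.csv” file and its “Date”, “Google”, and “Bing” columns are assumptions you’d swap for however you export organic sessions by search engine:

    # Sketch: did organic traffic drop on Google only, or on Bing too?
    # Assumes at least eight weeks of daily organic sessions per engine.
    import pandas as pd

    sessions = pd.read_csv("organic_sessions.csv", parse_dates=["Date"]).sort_values("Date")
    recent = sessions.tail(28)          # last four weeks
    baseline = sessions.iloc[-56:-28]   # the four weeks before that

    for engine in ("Google", "Bing"):
        change = recent[engine].sum() / baseline[engine].sum() - 1
        print(f"{engine}: {change:+.1%} vs. the previous four weeks")
    # Google down while Bing is flat points at Google; both down points at the site.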

5. Use Archive.org to find the changes

Archive.org can be really useful if you don’t keep documentation of the site’s historical changes, which most sites don’t. In these cases, you can use Archive.org to see snapshots of each page and site template before and after the traffic drop.

[Image: Archive.org snapshot]

One major benefit is that Archive.org can go back years, whereas GSC only provides the last 16 months of data.
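
If you’d rather not scroll the calendar by hand, the Wayback Machine’s availability API can jump you straight to the snapshots closest to the drop. A quick sketch; the example URL and dates are placeholders:

    # Sketch: find the archived snapshot of a page closest to a given date
    # using the Wayback Machine availability API.
    import requests

    def closest_snapshot(url, timestamp):
        """timestamp is YYYYMMDD; returns the closest snapshot URL, or None."""
        resp = requests.get(
            "https://archive.org/wayback/available",
            params={"url": url, "timestamp": timestamp},
            timeout=30,
        )
        resp.raise_for_status()
        snap = resp.json().get("archived_snapshots", {}).get("closest", {})
        return snap.get("url") if snap.get("available") else None

    # Compare the page template just before and just after the drop.
    print("Before:", closest_snapshot("example.com/category/widgets", "20240401"))
    print("After: ", closest_snapshot("example.com/category/widgets", "20240601"))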

6. Crawl the website

To find technical problems, we recommend crawling the website. You can use tools like Screaming Frog or Sitebulb for this.

Crawling your site can help you find a number of technical issues, like broken links, nofollowed navigation links, search engine bots being blocked by robots.txt, and so on.
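
A dedicated crawler will give you the fullest picture, but even a few lines of Python can spot-check the basics, such as whether key URLs still return a 200 and are crawlable under robots.txt. A minimal sketch with placeholder URLs:

    # Sketch: spot-check a handful of priority URLs for robots.txt blocks
    # and non-200 responses. The site and URL paths are placeholders.
    import requests
    from urllib.robotparser import RobotFileParser

    SITE = "https://www.example.com"
    PRIORITY_URLS = [f"{SITE}/", f"{SITE}/category/widgets", f"{SITE}/blog/"]

    robots = RobotFileParser()
    robots.set_url(f"{SITE}/robots.txt")
    robots.read()

    for url in PRIORITY_URLS:
        allowed = robots.can_fetch("Googlebot", url)
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
        print(f"{url}: status={status}, crawlable by Googlebot={allowed}")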

7. Use automated tagging

If you’re not using automated tagging, you should. It is the best option when you have a large site and/or need to harness the power of big data to narrow down the keywords and pages that are causing the drop in traffic.

Automated tagging by category, intent, and page type (see the sketch below) allows you to:

  • Easily find patterns in traffic dips
  • Better understand ongoing traffic trends
  • Preserve knowledge from past analyses
  • Make it easier to predict the impact of future SEO projects
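
As a rough illustration of the idea, here’s a minimal sketch that tags landing pages by page type with simple URL rules and then aggregates the traffic change per tag. The patterns, file name, and column names are assumptions you’d replace with your own taxonomy and exports:

    # Sketch: rule-based tagging of landing pages, then traffic change by tag.
    # Assumes a CSV with "landing_page", "sessions_before" and "sessions_after"
    # columns (hypothetical; adapt to your own export).
    import re
    import pandas as pd

    PAGE_TYPE_RULES = {            # regex pattern -> tag (example taxonomy)
        r"/blog/": "blog",
        r"/product/": "product",
        r"/category/": "category",
    }

    def tag_page(url):
        for pattern, tag in PAGE_TYPE_RULES.items():
            if re.search(pattern, url):
                return tag
        return "other"

    pages = pd.read_csv("landing_pages.csv")
    pages["page_type"] = pages["landing_page"].apply(tag_page)

    by_type = pages.groupby("page_type")[["sessions_before", "sessions_after"]].sum()
    by_type["change"] = by_type["sessions_after"] / by_type["sessions_before"] - 1
    print(by_type.sort_values("change"))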

LSG’s recent SEO office hours have covered this topic, including a walkthrough of how we used automated tagging to uncover the cause of a traffic drop for a nationally known ecommerce site. You can check out our auto-tagging summary blog here.

[Previously published on Local SEO Guide’s LinkedIn Newsletter – Page 1: SEO Research & Tips. If you’re looking for more SEO insight, subscribe to our LinkedIn newsletter to get hot takes, new SEO research, and a treasure trove of useful search content.]


