How to use automatic tagging to diagnose traffic drops


In our most recent SEO office hours session we discussed the question uttered (often in panic) by stakeholders and SEOs alike:

Why is my f@#k%*g SEO tanking?!

Let’s recap the topics covered by Andrew Shotland, CEO of LSG, and Karl Kleinschmidt, VP of SEO Strategy, including:

  • How SEOs are currently analyzing a drop in traffic without big data
  • What automatic tagging can do to make it easier to diagnose traffic drops
  • Automated tagging: a step-by-step guide
  • A case study on how automated tagging helped our client

In this post, we’ll dive right into the automated tagging part of the SEO office hours session. However, they also talked about 7 simple hacks to diagnose traffic dips, which you can read about in our newsletter, or you can watch the full video of SEO office hours here.

How SEOs are currently analyzing a drop in traffic without Big Data

Analyzing what’s changed on your site will likely require some big data manipulation, but first we should understand how SEOs currently analyze a drop in traffic without big data, so let’s dive into that.

Without big data, chances are you’ll be tracking a small set of keywords using third-party rank trackers and/or using small samples of your keywords in GSC.

Problems with this:

  • You have to manually scroll through the data to spot patterns, so you can miss something
  • The cost of rank tracking can grow quite quickly when you use third-party trackers with large numbers of keywords
  • Even if you use free tools like GSC, this can take a long, long time when done manually

What automated tagging can do to make it easier to diagnose traffic drops

So what’s the best way to do it? Automated tagging. Using automatic tagging for categories, intent and page type, you can:

  • Find patterns in traffic dips more easily
  • Better understand ongoing traffic trends
  • Preserve knowledge from past analyses
  • Make it easier to predict the impact of future SEO projects

The automated tagging approach

First, let’s tag all keywords in GSC into 3 buckets: categories, intents and page types.

For example, for page types, one page type could be product pages, so we tag any page that has “.com/p/” in the URL.

An example of intent could be any query that contains question words such as what, how, who, etc.

Finally, an example of a category could be a “shoe” tag that catches all queries containing “shoe” or “shoes”.
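To make that concrete, here’s a minimal sketch of what rule-based tagging can look like, assuming you’ve exported GSC query and page data. The rule names and patterns are just illustrations of the three buckets above, not the exact rules we use.

```python
import re

# Illustrative regex rules for the three buckets: page types, intents, categories.
PAGE_TYPE_RULES = {"product": re.compile(r"/p/")}                              # any URL containing /p/
INTENT_RULES    = {"question": re.compile(r"\b(what|how|who|when|where|why)\b", re.I)}
CATEGORY_RULES  = {"shoe": re.compile(r"\bshoes?\b", re.I)}                    # "shoe" or "shoes"

def tag(text, rules):
    """Return the names of every rule whose pattern matches the text."""
    hits = [name for name, pattern in rules.items() if pattern.search(text)]
    return hits or ["untagged"]

# One hypothetical GSC row: (query, page URL)
query, url = "how to clean running shoes", "https://www.example.com/p/running-shoe-123"
print(tag(url, PAGE_TYPE_RULES))    # ['product']
print(tag(query, INTENT_RULES))     # ['question']
print(tag(query, CATEGORY_RULES))   # ['shoe']
```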

This approach allows you to store and tag all Google Search Console traffic for the past 16 months, and going forward every time you pull GSC data. You can save the data in Google Cloud, which gives you the largest possible body of data to analyze instead of working from samples of your total GSC data.
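As a rough sketch of that pipeline, pulling GSC rows via the Search Console API and appending them to BigQuery might look something like this. The project, dataset and date values are placeholders, and you’d still need to set up credentials and pagination for larger sites.

```python
import pandas as pd
from google.cloud import bigquery
from googleapiclient.discovery import build

def fetch_gsc_rows(credentials, site_url, start_date, end_date):
    """Pull date/query/page rows from the Search Console API into a DataFrame."""
    service = build("searchconsole", "v1", credentials=credentials)
    body = {
        "startDate": start_date,          # e.g. "2021-01-01"
        "endDate": end_date,
        "dimensions": ["date", "query", "page"],
        "rowLimit": 25000,                # page through startRow for bigger sites
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    return pd.DataFrame(
        [
            {
                "date": r["keys"][0],
                "query": r["keys"][1],
                "page": r["keys"][2],
                "clicks": r["clicks"],
                "impressions": r["impressions"],
            }
            for r in response.get("rows", [])
        ]
    )

def append_to_bigquery(df, table_id="my-project.seo.gsc_history"):
    """Append the (tagged) rows to a BigQuery table for long-term storage."""
    client = bigquery.Client()
    client.load_table_from_dataframe(df, table_id).result()  # waits for the load job
```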

This can make a big difference in how efficiently you can find patterns in your data when your traffic tanks.

Automated tagging is particularly useful because it preserves knowledge from previous analyses. So, if you are an agency working with multiple clients, you don’t have to remember a weird pattern, content release issue, etc. that occurred 9 months ago.

Tag it and save a lot of time and headache.

Best practices for automated tagging

  1. Tag keyword patterns: Tag any patterns you find in your keywords – the more patterns you tag, the better.
  2. Use regular expressions: It is also useful to use regular expressions (regex) for tagging instead of “if contains” logic. This gives you a lot more options.
  3. Tag industry-specific groups: Tag anything specific to your industry, such as pages and keyword groups that you know are relevant, for example:
    • Brands for eCommerce
    • Levels of taxonomy for eCommerce
    • Blog categories for a blog
    • Sales funnel levels (TOFU, MOFU, BOFU)
    • Cities/states for location-based businesses
  4. Use negative keywords if necessary: You may want to keep branded keywords out of your non-branded buckets. For example, if you work on the Nike site you might want a category tagged “shoes” but NOT “Nike shoes” if you are more interested in unbranded queries (see the regex sketch after this list).
  5. Start tagging now: Don’t wait for a drop in traffic; the earlier you start, the more data you have.
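For the negative-keyword case in best practice 4, one way to do it is a regex with a negative lookahead: match the category term only when the brand term is absent. The brand list here is just an example.

```python
import re

# Match "shoe(s)" only when "nike" does not appear anywhere in the query,
# keeping branded queries out of the unbranded bucket (brand list is illustrative).
UNBRANDED_SHOES = re.compile(r"^(?!.*\bnike\b).*\bshoes?\b", re.I)

print(bool(UNBRANDED_SHOES.search("best running shoes")))   # True  -> unbranded "shoes"
print(bool(UNBRANDED_SHOES.search("nike running shoes")))   # False -> branded, excluded
```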

The steps for automated tagging

  1. First, you can perform an n-gram analysis of the URLs and sort them by traffic. You want to tag directories and subdirectories.

What the hell does that mean?

Basically, an n-gram analysis simply splits the URL into its individual parts (i.e. directories and subdirectories).

So, as you can see in the graphic above, element 1 is /p/ and element 2 is /shoes/. In this analysis, /p/ pages are product pages and /shoes/ pages are shoe pages.

So, we can tag this type of page as shoe product pages. The same goes for pants, gloves, jackets, etc.

Where element 1 is /blog/ and element 2 is /shoes/, you guessed it, these are shoe blog pages, so let’s tag those too. /s/ is a search page on Nike – you get the idea.
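If you want to try this without a dedicated tool, a small pandas sketch of the URL n-gram idea could look like the following. The URLs and column names are made up for illustration.

```python
import pandas as pd

# Made-up GSC export: one row per page with its clicks.
df = pd.DataFrame({
    "page":   ["https://www.example.com/p/shoes/air-runner",
               "https://www.example.com/blog/shoes/cleaning-tips",
               "https://www.example.com/s/red+shoes"],
    "clicks": [120, 45, 8],
})

# Strip the domain, then split the path into its segments (the URL "n-grams").
segments = (df["page"]
            .str.replace(r"https?://[^/]+", "", regex=True)
            .str.strip("/")
            .str.split("/"))
df["element_1"] = segments.str[0]   # e.g. "p", "blog", "s"
df["element_2"] = segments.str[1]   # e.g. "shoes"

# Sort the directory combinations by traffic so the biggest buckets get tagged first.
print(df.groupby(["element_1", "element_2"], dropna=False)["clicks"]
        .sum()
        .sort_values(ascending=False))
```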

See! Not that scary.

  2. Then, do an n-gram analysis of the keywords and sort by traffic. Here, you want to tag categories and intent.

Then, we run the same analysis for the queries, tagging all queries containing “shoe” or “shoes”, as well as colors and gender. We can do this with regex statements, since anything containing “red”, “blue”, “black”, etc. gets tagged under a color category.
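A quick sketch of the keyword side, again with made-up data: count clicks per word to surface candidate patterns, then apply regex buckets for category, color and gender (the word lists are illustrative).

```python
import re
from collections import Counter
import pandas as pd

# Made-up query data.
queries = pd.DataFrame({
    "query":  ["red running shoes", "blue shoes for men", "black leather jacket"],
    "clicks": [90, 40, 25],
})

# Keyword 1-grams: clicks per individual word, to surface candidate patterns.
word_clicks = Counter()
for q, c in zip(queries["query"], queries["clicks"]):
    for word in q.split():
        word_clicks[word] += c
print(word_clicks.most_common(5))

# Regex buckets for category, color and gender (word lists are illustrative).
SHOES  = re.compile(r"\bshoes?\b", re.I)
COLOR  = re.compile(r"\b(red|blue|black|white|green)\b", re.I)
GENDER = re.compile(r"\b(men|mens|women|womens)\b", re.I)
queries["shoes"]  = queries["query"].str.contains(SHOES)
queries["color"]  = queries["query"].str.contains(COLOR)
queries["gender"] = queries["query"].str.contains(GENDER)
print(queries)
```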

  3. Then, look at the untagged keywords, sorted by traffic and by category/intent tags.

Once you get past the obvious clusters, you may notice patterns like this one. So, here we can see the unclassified “children’s pants” and “children’s hat” groups.

The best thing to do would be to create a regular expression so that synonymous variants like “kids”, “children”, “baby” and “babies” all end up under a single tag.
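For example, a single alternation regex can collapse those variants into one tag. The exact variant list is an assumption and should be adapted to your site’s vocabulary.

```python
import re

# Collapse synonymous variants under one tag (variant list is illustrative).
KIDS = re.compile(r"\b(kids?|child(?:ren)?(?:'s)?|bab(?:y|ies))\b", re.I)

for q in ["children's pants", "kids hat", "baby shoes", "mens jacket"]:
    print(q, "->", "kids" if KIDS.search(q) else "untagged")
```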

  4. Additionally, you can scroll through the untagged pages, sorted by traffic and by tagged page types.

Since these URLs end with “tall” and “large”, we know they are size-related pages. We can then use a regex statement to group these as a size page type as well.
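A similar end-anchored regex could group those size variants into one page type; the size terms and URLs below are just examples.

```python
import re

# Group size variants at the end of a URL under a single "size" page type.
SIZE_SUFFIX = re.compile(r"[-/](tall|large|big|petite|plus)/?$", re.I)

for url in ["https://www.example.com/p/mens-pants-tall",
            "https://www.example.com/p/mens-pants-large",
            "https://www.example.com/p/mens-pants"]:
    print(url, "->", "size page" if SIZE_SUFFIX.search(url) else "untagged")
```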

Tools we use for automatic tagging

  • URL n-gram tool
  • Keyword n-gram tool
  • Search Console tagging tool
  • BigQuery
  • Google Cloud
  • Tableau

The n-gram tools in combination with the Search Console tagging tool allow us to find patterns and tag them. We have built one in-house, but there are free ones available.

Once we have the data, it is uploaded to Google Cloud and we use BigQuery to access it. From there, you can use Tableau or Google Data Studio for data manipulation.

Case Study: How Automated Tagging Helped Our Client

Okay, to make the benefit of tagging more concrete, let’s look at a real case study of how this helped us quickly understand a drop in traffic.

Our client was an eCommerce site hit by the May Google core update. We started tagging after the drop and found the top 5 patterns in the ranking drops.

These tables show the difference in clicks compared with the 2 weeks before the update.

Page types (e.g. product pages, search pages, blogs)

As you can see, page type 1 is the most affected, but page types 2-5 are affected as well.

Query intent (e.g. purchase, questions, near me)

Intent 1 dropped sharply, but so did unclassified intent – keywords without a known intent tag. This tells us we needed to dig further into these uncategorized keywords to identify the drop, and also that we could rule out a lot of the keywords already tagged outside this bucket.

Brands (e.g. Nike, LG, etc.)

Again, we see that the uncategorized section is the hardest hit, while brand 1 was also heavily hit.

Categories (e.g. shoes, city)

For categories, we need to look at categories 1 and 2, as well as the unclassified part of the queries.

So far in our investigation, we have noticed that uncategorized keywords were hit hardest across our intent, brand and category tags, so we need to dig into them further.

Keywords with unknown category

Here, we’ve applied our page types to our unknown categories with a custom regex statement specific to this client.

For keywords with unknown categories that fell in rank, page types 1, 4 and 5 account for most of the decline.

Keywords with unknown brands

We have done the same thing here for unknown brands.

We used a regex statement to find where it was true and false when we overlaid our page types.

As you can see, it’s true for page types 1, 4 and 5 while it’s false for page type 2. So, for page type 2, we know we need to find keywords that contain that page type but where the regex statement is false.

This reduces the list of all keywords (in this case 1,000) to a few dozen that we have to review manually.
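To give a feel for how that overlay can be built, here’s a small pandas sketch with made-up numbers: pivot the change in clicks by page type and by whether the client-specific regex matched, then pull out the page type 2 keywords where it didn’t.

```python
import pandas as pd

# Made-up tagged rows: page type tag, whether the client-specific regex matched,
# and the change in clicks versus the prior period.
df = pd.DataFrame({
    "keyword":     ["kw1", "kw2", "kw3", "kw4", "kw5"],
    "page_type":   ["type_1", "type_2", "type_4", "type_5", "type_2"],
    "regex_match": [True,     False,    True,     True,     False],
    "click_delta": [-300,     -120,     -80,      -60,      -40],
})

# Overlay: where do the lost clicks concentrate by page type and regex match?
overlay = df.pivot_table(index="page_type", columns="regex_match",
                         values="click_delta", aggfunc="sum", fill_value=0)
print(overlay)

# Narrowed manual-review list: page type 2 keywords where the regex is false.
print(df[(df["page_type"] == "type_2") & (~df["regex_match"])])
```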

Keywords with unknown intent

Again, looking at the query intent, we can see problems with page types 1, 4 and 5, but the regex statement is false on page type 2. We can take this information and narrow the keyword list down to page type 2, as we did for unknown brands.

We now have a priority list of keyword patterns in ranking drops and a list of action items:

  1. Investigate page types 1, 3, 4 and 5 where the regex statement is true
  2. Look at page type 2 where the Regex statement is not true
  3. Investigate intent 1
  4. Investigate brand 1
  5. Investigate categories 1 and 2

Since we know how much traffic each of these has lost, this gives us a clear idea of what to send to the developers first while we investigate the smaller patterns further.

If you want more SEO insights, sign up for our LinkedIn newsletter to get exciting results, new SEO research, and a wealth of useful content.


