How to interpret Google Search Console crawl statistics to get the most out of SEO

I don't know about you, but I'm always looking for new ways to improve my SEO.

I've been in digital marketing for over ten years now. I've heard it a thousand times: "SEO is dead!"

And yet it thrives.

It is still a very important part of any website.

That's why I like to find new, interesting, and lesser-known methods to improve my SEO.

Today I am sharing an SEO technique with you that is little known, but can make a big difference.

The Google Search Console crawl statistics reports.

You probably already know that Google Search Console, also known as GSC, is one of my favorite tools for managing websites.

You've probably used it before too. You might even be a GSC expert.

But there is much more to the GSC than meets the eye.

I use a lot of GSC features that other people rarely use.

The crawl statistics are often overlooked.

The page is tiny and consists of only three charts and a few data points. But you shouldn't underestimate it.

Because here you can see how Google's crawler interacts with your website. If you want to get the most out of your SEO, this information is important.

Now I'll show you how you can use crawl statistics to improve your SEO and gain insights into values that you can't find anywhere else.

What are the crawl statistics?

First, I'd like to briefly explain what the crawl statistics are all about.

To access these statistics, go to Google Search Console and open the correct property.

In the left sidebar, click Crawling.

Then select Crawl statistics from the menu.

Now you're on the crawl statistics reports page! This page should look something like this:

I admit that doesn't look very helpful at first glance. You can't immediately see how this data can benefit your SEO.

So let's first look at what the values mean.

These are the measurements of your crawl rate.

The crawl rate tells us how often search engine bots crawl your website. In this case, we see the activity of Google's bots.

Here is Google's definition:

A fast crawl rate is almost always desirable because the bots can then index your website faster and more easily.

And when your website gets more attention from Google, your rankings in the SERPs can improve.

This is why crawling statistics are so important. When your crawl rate is low, your SEO suffers.

If your crawl rate suddenly shoots up, something might be wrong with your site.

That is why you should always keep an eye on your crawl rate.

The charts may not make a lot of sense right now, but that's okay!

Now let's look at how to interpret these values.

How to read the charts

The crawl statistics page is divided into three areas:

  • Pages crawled per day
  • Kilobytes downloaded per day
  • Download time of a page (in milliseconds)

All three statistics are equally important, so you should always look at all three together.

I'll go through all three areas below.

Pages crawled per day

This statistic shows you how many pages the Googlebot crawls every day. You can see the results for the last 90 days:

If you want to see a specific day, you can hover the mouse pointer over the chart to see the results:

On the right side you can see your high, average and low crawl rate.

Now comes the tricky part.

The crawl rate differs from other metrics, such as domain authority. You have no control over how often Google crawls your website.

But you can try to understand why there are deviations, especially if there are very large deviations within a few days.

Since the crawl rate also tells you how fast and bot-friendly your website is, it is a good indicator of whether your site is easy to crawl or not. (It should be easy to crawl!)

You want a constant crawl rate. It should look something like this:

There are always a few outliers, but overall the crawl rate remains constant.

When you see something like this:

Then you have a problem.

All three charts are similar, but when it comes to the crawl rate, the first chart, pages crawled per day, is the crucial one.

Strong outliers and irregularities could be a sign that something is wrong with your site.
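If you jot down the daily values from the chart, you can even flag suspicious days programmatically. Here's a minimal Python sketch (the numbers are made up purely for illustration) that flags days deviating sharply from the median:

```python
from statistics import median

# Hypothetical daily "pages crawled per day" values, read off the GSC chart.
daily_pages_crawled = [48, 52, 45, 51, 49, 3, 47, 50, 160, 46]

typical = median(daily_pages_crawled)

# Flag days that are less than half or more than double the typical value.
for day, pages in enumerate(daily_pages_crawled, start=1):
    if pages < 0.5 * typical or pages > 2 * typical:
        print(f"Day {day}: {pages} pages crawled (median {typical}) - worth a closer look")
```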

What should you do if this is the case with you?

Let's look at the different scenarios now.

If the chart suddenly drops, it could have one of the following causes:

1. Maybe you have broken code (e.g. HTML) or unsupported content on your website.

If you've recently edited your code, that could be the problem.

You can check your code with one of the W3C's validators to see whether it's valid.
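For example, here's a minimal Python sketch that sends a page to the W3C's Nu HTML Checker and prints any errors it reports. The URL is a placeholder, and the doc/out=json parameters reflect the checker's public interface at the time of writing, so double-check the validator's documentation if anything has changed:

```python
import requests

# Placeholder URL - replace with the page you want to validate.
page_to_check = "https://www.example.com/"

response = requests.get(
    "https://validator.w3.org/nu/",
    params={"doc": page_to_check, "out": "json"},
    headers={"User-Agent": "crawl-stats-check/0.1"},
    timeout=30,
)
response.raise_for_status()

# Print only the errors; warnings and info messages are skipped here.
for message in response.json().get("messages", []):
    if message.get("type") == "error":
        print(message.get("message"))
```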

2. Your robots.txt file could be blocking too much.

You should always edit your robots.txt file very carefully, as you can accidentally block content that Google needs in order to crawl a page.

If your robots.txt file is huge:

Maybe you need to revise it.
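A quick way to sanity-check your robots.txt, using only Python's standard library, is to ask whether Googlebot is allowed to fetch the pages you care about. The domain and paths below are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Point the parser at your live robots.txt file (placeholder domain).
robots = RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")
robots.read()

# A few pages you definitely want crawled (placeholders).
important_pages = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/products/",
]

for url in important_pages:
    allowed = robots.can_fetch("Googlebot", url)
    print(f"{url} -> {'allowed' if allowed else 'BLOCKED for Googlebot'}")
```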

3. Your website has old content.

It's no secret that Google loves fresh content.

And this is how it works (at least in theory): if you make a change to your website, Google finds out about it and crawls your site again.

Your website is re-indexed every time Googlebot crawls your pages.

And if your site is great, then you can hope for a ranking boost.

But if your website has too much old content, it won't be crawled as often and the crawl rate will drop.

You want to prevent that at all costs.

You should regularly fill your website with fresh and helpful information.

If your site's content is stale, it will likely be crawled less often and get fewer clicks.

Always keep in mind that you don't just have to spruce up your site for the search engines. You do it for the users too.

If you publish new content regularly, your website will be crawled more often by Google and seen by more people.

If the chart suddenly shoots up, it could have one of the following reasons:

1. You have published new content or revised the code of your site.

Even if you're not consciously trying to be crawled again, it can still happen, especially if you've published something new.

That could be new content or a new piece of code.

If you've just recently redesigned your page and your chart suddenly soars, it's probably because of the new information on the website.

2. Your robots.txt file allows the crawl bots to access a lot of your content.

If you've never edited your robots.txt file, you probably don't have one, or only have a very basic one.

This means the search bots will crawl all of your content.

Do we want that? Yes and no. It always depends on what you want from the bots. Read this article for details.

These are the main reasons for outliers in the chart.

As I said, you should always strive for a constant crawl rate.

So your chart should look something like this:

If your chart has some ups and downs, that's not a problem.

But if you notice inactivity for a longer period of time ...

... or see dramatic changes ...

... you probably have a problem that needs to be fixed.

One last thing: use the values on the right to determine the average crawl rate of your website.

There is no formula for this; you have to observe the values over a longer period of time.

Since the crawl statistics only contain data from the past 90 days, you can compare your values every 90 days to get a good overview and determine an average value.

I'll tell you now what I mean by that.

Say you recorded the values for four 90-day periods and the averages were 40, 52, 49, and 45.

In this case, the average for the year is 46.5.

You need to monitor both the crawl rate for the past 90 days and the average across all 90-day periods.
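Keeping track of this is trivial if you note down each 90-day average as it rolls over. A tiny Python sketch with the hypothetical numbers from above:

```python
# The four hypothetical 90-day averages from the example above.
period_averages = [40, 52, 49, 45]

yearly_average = sum(period_averages) / len(period_averages)
print(f"Average crawl rate for the year: {yearly_average}")  # 46.5
```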

My own average is 5,962.

My site is extensive because I publish new content several times a week. So don't get discouraged if your values are lower.

If your website is new or has fewer links, it is also likely to be crawled less frequently. As your website grows, so does your crawl rate.

The pages crawled per day chart is the most important area of the crawl statistics, or at least one of the most important.

How can you optimize your crawl rate?

If your crawl rate is low or has too many deviations, you should work on getting a constant crawl rate.

You should definitely eliminate outliers caused by problems, but you also need to optimize your website for a constant crawl rate over the long term.

We have already mentioned one way to optimize the crawl rate: fresh content.

The more often you publish new content, the higher your crawl rate.

But that's not enough.

Google's bots use complex algorithms to determine the quality of your website.

The better your content is, the more likely Googlebot is to give you an advantage in the SERPs.

The best way to achieve this is with long and detailed content. This content has been proven to rank better and your users will love it too.

You can use other tricks, however.

One of these tricks comes from Brian Dean of Backlinko.

He says you should republish your old content. The idea is really brilliant, because this way you get the most out of your already finished content.

Your users will then see your content as if it had just been published:

Still, it's old content.

But since you've revised it, Googlebot crawls it again and your crawl rate increases.

You can solve almost all of your problems by always posting fresh content. But if you don't see the results you want, you should revise and republish your older content.

Kilobytes downloaded per day

You're probably wondering what this is all about. What exactly is being downloaded?

Every time the search bot visits your website, it downloads your page for the indexing process.

So this value indicates how many kilobytes Googlebot has downloaded.

The value varies depending on how big your site is. If your website is small, Google doesn't have to download as much, and so on.

This statistic is probably the least helpful area of the crawl statistics, but you can still gain insights that help you analyze your site's performance.

If your value is always very high:

Then Googlebot is crawling your site very often.

Since Googlebot has to download your pages again each time, a high value indicates that Google is crawling your website frequently.

But a high value is also a double-edged sword.

It also means that Google needs more time to crawl your site. However, if your pages can be downloaded quickly, your site is easier and faster to crawl.

That's why you should take a closer look at this chart to find out how fast Googlebot can crawl and index your site.

You can also use this chart together with the first one to see if Google likes your website.

Download time of a page (in milliseconds)

The name of this statistic is a bit misleading.

You are probably thinking by now that it has something to do with your page speed.

Page speed is important, but it's by no means all that's being measured here.

According to Google's John Mueller, this value shows how long it takes Googlebot to make the HTTP requests needed to crawl your site.

Pretty disappointing, I know.

However, you can still use this information.

Low values are better here.

If your values are low, Googlebot doesn't spend as much time on your site and can therefore crawl and index it faster.

You have no direct influence on this metric, but it shows you how quickly Google can fetch your site.
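If you want a rough, local sense of how long a single HTTP request to your site takes, you can time a few requests yourself. This is not what Googlebot measures (its numbers come from its own crawling infrastructure), but consistently slow responses tend to show up as high values in this chart. A minimal Python sketch with a placeholder URL:

```python
import requests

# Placeholder URL - replace with a page on your own site.
url = "https://www.example.com/"

timings = []
for _ in range(5):
    response = requests.get(url, timeout=30)
    # .elapsed measures the time until the response headers arrived.
    timings.append(response.elapsed.total_seconds() * 1000)

average_ms = sum(timings) / len(timings)
print(f"Average response time over {len(timings)} requests: {average_ms:.0f} ms")
```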

You can use this information together with the Kilobytes downloaded per day chart. It works like this.

Look at both charts: Kilobytes downloaded per day and Download time of a page (in milliseconds).

Let's just say they look like this:

You need to think about how these two charts relate to each other.

If both averages are high, Googlebot is spending a lot of time on your site. That's not ideal.

Since you can't influence how long Googlebot takes to make an HTTP request, you have to influence how many kilobytes Google downloads.
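One back-of-the-envelope way to relate the charts is to combine their averages. The numbers below are hypothetical; the point is to see roughly how big your average crawled page is and how much time per day Googlebot spends just downloading your pages. If the average page size looks bloated, reducing what Google has to download is the lever you have:

```python
# Hypothetical averages read off the three crawl stats charts.
pages_per_day = 500          # Pages crawled per day (average)
kilobytes_per_day = 12_000   # Kilobytes downloaded per day (average)
ms_per_page = 800            # Download time of a page in ms (average)

# Rough derived numbers.
avg_page_size_kb = kilobytes_per_day / pages_per_day
download_minutes_per_day = pages_per_day * ms_per_page / 1000 / 60

print(f"Average downloaded page size: {avg_page_size_kb:.1f} KB")                         # 24.0 KB
print(f"Approx. time spent downloading per day: {download_minutes_per_day:.1f} minutes")  # ~6.7 minutes
```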

For example, you can exclude unimportant pages from crawling. You could revise your robots.txt file. (Read this article if you need help with this.)

You can remove unimportant pages and useless code from your website. You have to be careful though, because all of your content and code has an impact on your SEO.

On its own, though, it doesn't matter if Google downloads a lot of kilobytes. That shouldn't cause you sleepless nights.

Conclusion

I want to be completely honest with you.

When I first looked at the crawl statistics, I thought, "Is that all?"

They are not very impressive.

Perhaps you stumbled onto the page by chance and didn't think much of it. And since the page is hard to find, you probably never looked at it again.

But once you know what the statistics mean, it suddenly clicks. New possibilities open up to you.

The page is really extremely useful, despite the fact that it contains so little information.

It only contains three charts, and yet it's a gold mine of information. If you're into SEO, you have to look at this page.

And the best part is that it's free. So if you're not already using the Google Search Console, you should start now.

Once the crawl statistics become an integral part of your long-term SEO strategy, you can outdo your competitors and secure a unique advantage.

The crawl statistics reports help you better understand the relationship between your website and Googlebot.

Just think of it as some kind of relationship therapy. You can check how the relationship is doing and make changes if necessary.

You should also take a look at the other areas of Google Search Console. It's best to start with the crawl statistics and then work your way through the GSC bit by bit.

Have you ever used the crawl statistics? Do you want to start now?
