Using Google Search Console to see if Google AdsBot is getting 403 errors

PUBLISHED 4 Feb 2024

Is Google's AdsBot forbidden from seeing some of your site's pages? Here's why that matters, and how to find out using Google Search Console.

Circumventing systems and cloaking

Recently I was approached by a client who had received a Google Ads suspension for 'circumventing systems'. This serious violation has lots of potential causes, but their PPC consultant had already ruled most of those out.

What they were investigating now was cloaking, which Google defines in its advertising policies as:

"showing different content to certain users, including Google, than to other users"

The client certainly wasn't engaging in cloaking deliberately, but the PPC consultant wondered whether Google was inadvertently being forbidden from accessing particular pages or resources on the site. This would give Google a different experience to that enjoyed by human visitors, so might fall under the umbrella of cloaking.

After all, Google's advertising policies also mention that Google needs to be "able to review a version of the content" - an understandable requirement!

Finding 403s in Google Search Console

So how can you tell whether Google is being forbidden from accessing certain pages because of permissions? In this situation, it would receive a '403 Forbidden' server response. And the easiest place to see this is in Google Search Console.

First up, Search Console's Page Indexing report (Indexing > Pages) will show you whether Google has left any pages unindexed because of this server response (the following image is just an example, not from the client site in question):

List of reasons why pages aren't indexed in Google Search Console

But this relates to the crawler that Google uses for Google Search, known as Googlebot. If you're investigating a Google Ads violation, you'll be more interested in the experience received by AdsBot, the crawler whose job is to visit advertisers' sites and check ad quality. (Google Search Central has a full list of Google's various crawlers if you're interested.)

If AdsBot wasn't able to access a page, it wouldn't be able to check ad quality, and might flag this up as cloaking - which could trigger a 'circumventing systems' violation.

I appreciate there are lots of mights and coulds in that previous sentence, but Google doesn't provide a huge amount of detail as to what it considers cloaking in the context of advertising. Understandably so, because unscrupulous advertisers could take advantage of that to bend the rules without breaking them.

(Incidentally, there is a little bit more info about cloaking in Google's general search documentation - it would be safe to assume that this also applies to the paid advertising side of things.)
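Before turning to Search Console's reports, you can also spot-check a page yourself by requesting it with AdsBot's user-agent string and seeing what status code comes back. This is only a rough sketch: it tests user-agent-based blocking, and some servers or firewalls block by IP address instead, so it won't necessarily reproduce exactly what AdsBot sees.

```python
# Spot-check: request a URL with an AdsBot-like user-agent string and
# return the HTTP status code the server sends back. This only detects
# user-agent-based blocking, not IP-based blocking.
import urllib.error
import urllib.request

ADSBOT_UA = "AdsBot-Google (+http://www.google.com/adsbot.html)"

def check_as_adsbot(url: str) -> int:
    """Return the HTTP status code served to an AdsBot-like request."""
    req = urllib.request.Request(url, headers={"User-Agent": ADSBOT_UA})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # a '403 Forbidden' would surface here

# Usage (point this at a page from your own site):
# print(check_as_adsbot("https://www.example.com/"))
```

If this returns 403 for a landing page while a normal browser request returns 200, that's a strong hint the server is treating Google differently from human visitors.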

Using the Crawl Stats report to focus on AdsBot

To see server responses received by AdsBot as opposed to Googlebot, you'll need Search Console's Crawl Stats report.

  1. In Google Search Console, go to Settings
  2. Under 'Crawl stats', click OPEN REPORT
  3. Scroll down to 'By Googlebot type'

Crawls by Googlebot type from Google Search Console

  4. Click on AdsBot

This will show you crawl activity for AdsBot specifically. (If you don't currently advertise on Google Ads, you'll see little or no activity from AdsBot.)

Below the graph are some example requests and responses. You could add a filter here to only show 403s:

  1. Click on the filter button
  2. Tick 'Response'
  3. Set the filter to 'Contains 403'

Filtering Google Search Console's crawl report for 403 server responses

  4. Click DONE

Working with the AdsBot Crawl Stats data

My preferred way of working is to EXPORT the data instead, then manipulate it in Google Sheets or Excel.

When you export the data, you'll be given three files (if you choose CSV) or three tabs (if you choose Google Sheets or Excel). Either way, the one listing the specific requests and responses is called 'Table':

Three CSV files exported from Google Search Console's crawl stats report

You can sort, filter, or pivot this data to isolate the 403 responses.
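If you'd rather script it, here's a minimal sketch of filtering the exported 'Table' data with Python's csv module. The column names ("URL", "Response") and the response format in the sample are assumptions for illustration, so check the header row of your own export and adjust to match.

```python
# Filter Google Search Console's exported 'Table' data for 403 responses.
# Column names ("URL", "Response") are assumptions - check your export's
# header row and adjust.
import csv
import io

# Illustrative stand-in for the export; in practice you would use
# open("Table.csv", newline="") instead.
sample = """URL,Response,Time
https://www.example.com/,OK (200),2024-02-01T10:00:00Z
https://www.example.com/private/,Forbidden (403),2024-02-01T10:05:00Z
"""

def rows_with_403(csv_file) -> list[dict]:
    """Return the rows whose Response column mentions a 403."""
    return [row for row in csv.DictReader(csv_file) if "403" in row["Response"]]

forbidden = rows_with_403(io.StringIO(sample))
for row in forbidden:
    print(row["URL"], row["Response"])
```

The same filter takes seconds in a spreadsheet, of course; a script is mainly useful if you're repeating the check across several exports.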

In the case of my client, AdsBot wasn't getting any 403s. Disappointing, but at least we could rule it out as a cause of cloaking and the Google Ads violation.

We could now move on to looking at whether specific resources on the page were being blocked from Google, substantively changing the content it saw... but that's for another post.

James Clark
Hi! I'm James Clark and I'm a freelance web analyst from the UK. I'm here to help with your analytics, ad operations, and SEO issues.