Just this week I began getting e-mails from Google introducing their all-new Search Console:
A quick refresher on what the Search Console does for webmasters and marketers:
- Checks whether Google can access your content for indexing
- Lets you submit updated XML sitemaps for indexing
- Identifies missing or unlinked pages and other errors
- Gives you data on impressions, clicks, clickthrough rate and SERP position in Google organic search
- Can be linked to your AdWords account to import organic search data into it
There are 2 basic views in the new Search Console interface: Status and Sitemaps.
Here's the default Status overview screen:
What you have there immediately is (1) the Performance view - a time plot of Clicks - and (2) Index Coverage - a plot on the same time axis of Errors and Valid pages indexed by Google.
The plot of Clicks isn't terribly exciting - it just shows growth in clicks beginning during Christmas school vacation, which is typical for a tourism site.
The Index Coverage plot is interesting though. It shows that some event happened on 30 November 2017. (In the live version, you can see the date and the numbers of errors and valid pages anywhere on the plot just by mousing over the line.) That event caused one error and 73 valid pages to magically appear. What could do that? The webmaster had uploaded a new XML sitemap that included 73 pages that were online, and one that was not. Already we've learned something useful.
Notice the mention of Enhancements at the bottom of the page. I'm not going to go there now. It's a subject unto itself. Maybe later.
Clicking OPEN REPORT on the Performance overview, we get this:
There's a lot of stuff in that view:
- The Search Type switch (top left) lets you choose to see data from Web pages, images or videos, and apply various filters.
- The Date switch lets you choose one of several time spans.
- The 4 big tabs at the top let you display plots of clicks, impressions, CTR and/or average position - one at a time or in any combination. (The screencap is showing the CTR plot.)
- The small tabs below the plot let you display information on Queries (search terms), Pages (landing pages), Countries (where clicks came from) and Devices (mobile, desktop, tablet). In the Queries view (shown), at the bottom of the screen there's a list of search terms with click counts. Click on one search term, and the plot will show the pattern of clicks, impressions (etc.) for just that search term. (You can also pull the same data out programmatically - see the sketch below.)
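For anyone who'd rather have those queries, clicks, impressions, CTR and position numbers in a script or a spreadsheet, the same Performance data is available through Google's Search Console (Webmasters v3) API. Here's a minimal Python sketch, assuming you've already installed google-api-python-client and authorized OAuth credentials for the property; the site URL, dates and token file are placeholders, not anything from this site:

```python
from googleapiclient.discovery import build
from google.oauth2.credentials import Credentials

# Assumes credentials already authorized for the
# https://www.googleapis.com/auth/webmasters.readonly scope.
creds = Credentials.from_authorized_user_file("token.json")

# Search Console data lives in the "webmasters" v3 API.
service = build("webmasters", "v3", credentials=creds)

# Placeholder property URL - use the exact URL registered in Search Console.
site_url = "https://www.example.com/"

request = {
    "startDate": "2017-11-01",
    "endDate": "2018-01-31",
    "dimensions": ["query"],   # also: "page", "country", "device", "date"
    "rowLimit": 25,
}

response = service.searchanalytics().query(siteUrl=site_url, body=request).execute()

# Each row carries the same four metrics as the big tabs in the UI.
for row in response.get("rows", []):
    query = row["keys"][0]
    print(f"{query}: clicks={row['clicks']}, impressions={row['impressions']}, "
          f"ctr={row['ctr']:.2%}, position={row['position']:.1f}")
```

Swap "query" for "page", "country", "device" or "date" in the dimensions list to mirror the other small tabs.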
Clicking OPEN REPORT on the Index Coverage overview produces a screen like this:
Once again, by clicking on the big tabs at the top you can see several different collections of data: Errors, Valid Pages w/Warnings, Valid Pages and/or Excluded Pages.
That's the Warning view in the screencap. It tells us 2 things: (1) only 1 page is involved (the vertical scale at left is # of Pages, and it reads 1) and (2) the error first occurred on 30 November (time is on the horizontal axis) and has never varied.
Note the Error Status at the bottom of that screen. "Submitted URL, not found (404)" means that the XML sitemap the webmaster uploaded on 30 November included the URL of a page that Google couldn't find on the site, perhaps one that existed earlier but was removed.
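One way to catch that kind of 404 before Google does is to run every URL in the sitemap through a quick status check before submitting it. A minimal Python sketch, assuming the requests library is installed and using a made-up sitemap address rather than anything from this site:

```python
import xml.etree.ElementTree as ET
import requests

# Hypothetical sitemap location - substitute the real one.
SITEMAP_URL = "https://www.example.com/sitemap.xml"

# The sitemaps.org schema namespace used in standard XML sitemaps.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Fetch and parse the sitemap, then pull out every <loc> entry.
sitemap = requests.get(SITEMAP_URL, timeout=10)
root = ET.fromstring(sitemap.content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

# HEAD-request each URL and flag anything that doesn't return 200 -
# exactly the condition behind "Submitted URL not found (404)".
for url in urls:
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status != 200:
        print(f"{status}  {url}")
```

Anything that prints here should be put back online, redirected, or dropped from the sitemap before you submit.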
The Valid with Warnings and Valid views are pretty humdrum.
But the Excluded report is interesting:
Here Google has given us a summary of various reasons for which it has excluded some page URLs from its crawl results - duplicate page without canonical tag, etc. - with a count of pages in each category.
Clicking on "duplicate page without canonical tag" produces a list of the duplicate URLs (which in this case are URLs that were created in an earlier AdWords campaign with tracking query strings (e.g., /about.html?source=google is a duplicate of /about.html, etc.)
In the detail view of that report (not shown here), Google provides a link to a nice help page that explains what all the reasons for exclusions mean, and more.
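The usual cure for those tracking-string duplicates, by the way, is a rel="canonical" link in the head of each variant pointing back at the clean URL. Here's a quick Python sketch for auditing that, assuming the requests and beautifulsoup4 libraries are installed; the URLs are hypothetical stand-ins for the ones in the report:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical URL variants of the kind the Excluded report lists.
variants = [
    "https://www.example.com/about.html",
    "https://www.example.com/about.html?source=google",
]

for url in variants:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # The fix is a tag like this in the <head> of every variant:
    # <link rel="canonical" href="https://www.example.com/about.html">
    link = soup.find("link", rel="canonical")
    canonical = link["href"] if link else "(no canonical tag)"
    print(f"{url} -> {canonical}")
```

If every variant reports the same clean canonical URL, Google will fold the duplicates together on its own.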
Finally, opening the Sitemaps report produces a screen like this:
Here you can add a new sitemap - actually add a link to a new sitemap, which you'll have to get created elsewhere - for Google to crawl. This is something that needs doing any time you change, add or remove URLs.
And Google gives you a link to the sitemap it's currently using, with the submission date, the date it was last read, the status of the crawl on that date (Success or Failure), and the total number of URLs Googlebot found.
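As for "get created elsewhere": a standard XML sitemap is simple enough that a short script can build one. A minimal Python sketch with placeholder URLs - it writes a sitemaps.org-format sitemap.xml you'd upload to the site root and then submit on this screen:

```python
import xml.etree.ElementTree as ET
from datetime import date

# Placeholder list of live URLs - in practice, pull these from your CMS
# or a crawl, and keep the list in sync as pages come and go.
pages = [
    "https://www.example.com/",
    "https://www.example.com/about.html",
    "https://www.example.com/contact.html",
]

# Standard sitemaps.org namespace declaration.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")

for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page
    ET.SubElement(url, "lastmod").text = date.today().isoformat()

# Write sitemap.xml; upload it to the site, then submit its URL here.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```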
Definitely a big improvement over the previous version of the Search Console.
So far we have the new version for only 2 of the sites we manage - hope they complete the rollout soon.
Questions or comments? Please send.
Later...