A Guide to Google Search Console


Using GSC to its full potential may seem complicated, but once you learn the basics you'll be able to spot SEO opportunities quite easily. This guide covers how, when, and why to use GSC for your site or a client's.

Overview:

1. How your site appears in search engines

2. Traffic from searches

3. Fetching, rendering, and indexing

4. Google's robots and their crawl activity

Search Engine Appearance

How you show up in search results is critical to getting people to your site. There are many different ways to change how you appear, from AMP (Accelerated Mobile Pages) to Rich Snippets and Structured Data.

Structured Data

Structured Data can add all sorts of great things to your search engine results. Products, prices, events, pictures, almost anything you can think of! But what happens when you implement it and you don't see a difference in the SERPs? By clicking Structured Data under the Search Appearance tab, you'll see something similar to the screenshot below.

Graphing Data from Google Analytics

Remember: you've already tried to implement structured data, but it isn't working the way it's intended to.

Structured Data is incredibly important for the user. It allows Google to show the viewer what is on your site and that it's trusted (via reviews), as well as a host of other things. Making sure it's all firing correctly will allow you to increase page views, click-through rates, and potentially sales.

As you can see, we have 1,700 errors because our hentry structured data doesn't function properly. How do we test this? Use Google's Structured Data Testing Tool to see what errors and warnings are currently being thrown. See below for an example of a functioning hentry structured data markup.
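
To make this concrete, below is a sketch of minimal blog-post markup built with Python. It uses JSON-LD, the format Google recommends, rather than the hAtom hentry classes from the report; the headline, author name, and dates are hypothetical:

```python
import json

# Hypothetical JSON-LD for a blog post. hentry errors typically come
# from a theme's hAtom markup missing fields like author, title, or
# date -- the same core fields that are present here.
blog_posting = {
    "@context": "https://schema.org",
    "@type": "BlogPosting",
    "headline": "A Guide to Google Search Console",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2018-01-15",
    "dateModified": "2018-01-20",
}

# Wrap the JSON in the script tag that belongs in the page's <head>.
script_tag = (
    '<script type="application/ld+json">'
    + json.dumps(blog_posting)
    + "</script>"
)
print(script_tag)
```

Whatever format you use, the markup needs the same core fields (title, author, published date) that hentry errors typically flag as missing; after adding it, re-run the Structured Data Testing Tool.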

You can go through and check each individual type of structured data here. If you are working with a developer, always provide screenshots that pinpoint the problem. If you are fixing the data yourself, then either you already know how to fix the problem or you need to read more about Structured Data.

Rich Cards

Ever Googled a movie and seen, at the very top, a single box laying out the details of your favorite movie, MacGruber? Perhaps you're a cook and need a recipe for flapjacks. A rich card appears showing you the basics of that recipe.

Rich Cards can be one of four types: movies, recipes, courses, and jobs. The Rich Cards section under Search Appearance will show you what is being indexed by Google and which pages are showing up as errors in Rich Cards.

Rich Cards are created by implementing the correct Structured Data, so that mobile users have an easy-to-read and engaging format at their fingertips.

Data Highlighter

Data Highlighter, found under Search Appearance, is a substitute for building out your own structured data. It's mostly used by people who have been in the SEO industry since at least the early 2000s; the majority of people will implement structured data directly on the site. But if you don't want to go that route, you can use the Data Highlighter to apply markup directly in GSC.

Accelerated Mobile Pages

AMP is a great way to increase your site's SEO value across the board. If you're just starting out, you'll most likely see a screen like this. See image below:

You can check whether your AMP pages are valid here. Pro tip: bookmark that bad boy in your browser's bookmarks bar.

AMP is served as a subset of HTML known as AMP HTML. The primary thing you need to know is that it strips away everything that would slow the page from loading quickly, allowing the reader to get to the meat of the page much faster.

You'll see any errors on your pages laid out here. A quick guide to the most common issues is below. AMP HTML documents must:

·  start with the doctype <!doctype html>.

·  contain a top-level <html ⚡> tag (<html amp> is accepted as well).

·  contain <head> and <body> tags (they are optional in HTML).

·  contain a <link rel="canonical" href="$SOME_URL"> tag inside their head that points to the regular HTML version of the AMP HTML document, or to itself if no such HTML version exists.

·  contain a <meta charset="utf-8"> tag as the first child of their head tag.

·  contain a <meta name="viewport" content="width=device-width,minimum-scale=1"> tag inside their head tag. It's also recommended to include initial-scale=1.

·  contain a <script async src="https://cdn.ampproject.org/v0.js"></script> tag inside their head tag.

·  contain the AMP boilerplate code (head > style[amp-boilerplate] and noscript > style[amp-boilerplate]) in their head tag.
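
The checklist above lends itself to a quick self-check. Here is a rough sketch in Python (my own illustration using simple string and regex matching, not a substitute for the official AMP validator) that flags a page missing several of the required pieces:

```python
import re

def amp_issues(html: str) -> list[str]:
    """Return human-readable problems for a few of the AMP
    requirements listed above. Rough string checks only."""
    issues = []
    if not html.lstrip().lower().startswith("<!doctype html>"):
        issues.append("missing <!doctype html> at the top")
    if not re.search(r"<html[^>]*(\samp[\s>]|⚡)", html):
        issues.append("missing top-level <html ⚡> / <html amp> tag")
    if '<link rel="canonical"' not in html:
        issues.append("missing canonical link in head")
    if '<meta charset="utf-8">' not in html:
        issues.append('missing <meta charset="utf-8">')
    if 'name="viewport"' not in html:
        issues.append("missing viewport meta tag")
    if "https://cdn.ampproject.org/v0.js" not in html:
        issues.append("missing AMP runtime script")
    return issues

# A hypothetical page that satisfies the rough checks.
page = """<!doctype html>
<html ⚡>
<head>
  <meta charset="utf-8">
  <link rel="canonical" href="https://example.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <script async src="https://cdn.ampproject.org/v0.js"></script>
</head>
<body>Hello AMP</body>
</html>"""

print(amp_issues(page))  # an empty list means the rough checks pass
```

Run anything that fails these checks through the real AMP validator linked above before shipping it.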

 

Search Traffic

The more people you can get in the door, the better off you'll be. That's what the Search Traffic tab is all about. Who's linking to your page, how many times, and to which URL? Want to know your top page? You can find it all here.

Search Analytics

Search Analytics is pretty straightforward. It measures clicks, impressions, CTR (click-through rate), and average keyword position.

You can then filter each of these metrics by queries, pages, countries, devices, search type, search appearance, and date. (Dates are being updated so you can compare year over year in 2018.) It will look like this.

Most people use this as a way to pull reports, but with Google Data Studio this isn't necessary. As you can see, it gives a quick eagle-eye view, allowing you to home in on what is working and what isn't.

Links To Your Site

As mentioned at the beginning of this section, Search Traffic lets you check your links under Links To Your Site. This section is pretty easy to read. See image below:

This allows you to take a dive into your link building process. Are you receiving links from high-quality sites? What's being linked to the most? As you can see, it's the Return Policy, which could indicate there's an actual product problem. /dealer-log-in/ shows that people are interested in helping others sell your merchandise, though, which is good.

This is also a great place to check for toxic links by downloading the list and looking through it. I advise doing this once your site has reached several hundred links; don't go picking through 50 links when you should have larger SEO items to deal with. Using a disavow tool will save you a lot of time here as well.

Internal Links

Straight to the point: this section lists which pages on your site you link to the most from other pages. If you sell beanbags, you want /beanbags/ to be the most-linked-to URL, not your homepage or the About Us section of your site. The Internal Links section highlights what is important to you.

Manual Actions

If anything is in here, it means Google personally went in and t-boned your website for doing something incorrectly. Perhaps you have a bunch of backlinks from Russian directories, a two-minute load time, or something illegal.

Follow Google's guidelines and don't do anything shady, and you'll be on your way to never seeing anything in here, just like it should be.

International Targeting

If you are an international company, your site will need to exist in different languages. You can handle this by adding hreflang tags. en-us is English (United States), but there are also en-ca (English, Canada) and en-gb (English, Great Britain), as well as Spanish, French, and so on. If you are targeting other countries, you have to have hreflang tags, and they will show up here.
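
As a sketch of what those tags look like (the locales and URLs here are hypothetical), each language version of a page carries a set of alternate links such as:

```python
def hreflang_tags(url_by_locale: dict[str, str]) -> list[str]:
    """Build the <link rel="alternate" hreflang=...> tags for each
    locale -> URL mapping. Every language version of the page should
    carry the full set, including a tag pointing to itself."""
    return [
        f'<link rel="alternate" hreflang="{locale}" href="{url}">'
        for locale, url in sorted(url_by_locale.items())
    ]

# Hypothetical English variants of the same page.
tags = hreflang_tags({
    "en-us": "https://example.com/",
    "en-ca": "https://example.com/ca/",
    "en-gb": "https://example.com/uk/",
})
for tag in tags:
    print(tag)
```

These go in each page's head; GSC's International Targeting report then tells you whether the set is consistent across all the language versions.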

If you’ve ever tried to do this you know it’s a pain and hard to do correctly. If you haven’t tried this before and you need to I recommend hiring an expert.

Mobile Usability

Mobile Usability shows you any issues that are giving users a negative experience. If you have issues, you will not rank as well as you would without them. This is becoming increasingly important, and by the end of 2018 I believe mobile users will be the most important audience. Google is already shifting weight toward mobile and away from desktop. Below is an example of the Mobile Usability report with a few errors.

As you can see, we have a few issues.

1. Clickable elements are too close together, which can cause misclicks on phones.

2. Some content isn't responsive and does not adjust its size to a mobile screen, causing parts of it to be cut off, missing, or too wide, and creating a negative experience.

3. A meta viewport tag needs to be included. This adjusts the page's dimensions based on the device the visitor is using.

 

Google Index

The Google Index section is separated into three tabs: Index Status, Blocked Resources, and Remove URLs. This is one of the most important features of GSC; if you read anything in this guide, it should be this.

Index Status

Index Status shows the total number of pages that have been indexed. Simple stuff, right? Wrong! Click on the Advanced tab and you can also see what is being blocked by robots.txt. This can be huge: I had a client who had disallowed his entire site because of one /. This is a quick way to find out if you have a spike in pages being blocked by robots.txt. See the example below:

The most important thing to know about the index report is this: compare GSC against GA, and check whether the number of pages in Index Status in GSC matches the number of landing pages receiving organic traffic in GA. If they don't match, or aren't very close, it means only a small percentage of your indexed pages are receiving any traffic.
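
A quick way to think about that comparison, with made-up numbers for illustration:

```python
def organic_coverage(indexed_pages: int, organic_landing_pages: int) -> float:
    """Share of indexed pages that actually receive organic traffic:
    GA organic landing pages divided by GSC indexed pages."""
    if indexed_pages == 0:
        return 0.0
    return organic_landing_pages / indexed_pages

# Hypothetical numbers: GSC reports 1,200 indexed pages, but GA shows
# only 300 landing pages getting organic visits.
coverage = organic_coverage(1200, 300)
print(f"{coverage:.0%} of indexed pages earn organic traffic")
```

A low ratio like this one suggests most of what Google has indexed is dead weight: thin, duplicate, or simply unranked pages worth investigating.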

Blocked Resources

Blocked Resources does one very crucial thing: it shows which pages and resources Google is being told to ignore. As stated, this can be misused and can block your entire site if handled improperly. Used correctly, it ensures no page is indexed that you wouldn't want to share with the rest of the world or have show up in a Google search.

You can block a page with a nofollow or noindex tag. If you want pages to show up again, simply remove the tags.

Remove URLs

Why would you ever want to Remove URLs?

1.     Duplicate Content

2.     Thin Content (under 300 words)

Google will remove the URLs from search results for 90 days to give you a chance to get a handle on everything. This should only be used when you have an action plan for reducing the duplicate content and filling in the thin content you are being punished for.

Crawl

All those robots crawling around your site. Do they do you any good? Only if you look here! The Crawl section is broken up into several subsections that you can use to make your website a better place to hang out, and easier to find!

Crawl Errors

This section is simple: it shows you what site errors and URL errors you may have. You'll use it to make sure you don't have any 404s, that your HTTP-to-HTTPS move succeeded, and so on. See image below:

As you can see, we have 18 server errors, meaning that when a user tries to connect, 18 pages are not being served. This could be for several reasons: the pages no longer exist but haven't been properly redirected or 404'd; the pages do exist but the server data has been corrupted; or the page exists on the server but is unreachable.

Large upticks are usually caused by site migrations to different platforms or servers, but they can come from other sources.

DNS errors are caused by your server, so contact your host.

Soft 404s mean your page isn't up to scratch. Make sure it doesn't have thin or duplicate content, or that it is being properly 404'd or 301-redirected.

Server errors mean there's more traffic than your site can handle. Dive in and see if your site is having connection issues or timeouts caused by an influx of traffic.

404 errors are usually the error you will see over and over again. Always redirect a page that no longer exists to the closest URL possible: if it's about red sweaters, redirect it to the sweaters category page. Whatever you do, make sure that you DON'T redirect 404s to the homepage.

Crawl Stats

Simply put, this is the data that shows when Google is crawling your site: how many pages are crawled per day, how many kilobytes are downloaded, and how much time is spent downloading a page. Here is a great example of what crawl stats look like. See image below:

This is exactly what you want to see on the red and blue lines: not much activity. The big spike is due to a server migration. The green line shows time spent downloading each page; it was quite erratic, but now that the site has been properly transitioned to a new server as well as HTTPS, we see it level out.

 

The only time you want to see high activity is when it's on purpose. Perhaps you fixed your robots.txt file, transferred to HTTPS, or just uploaded a large amount of content.

Fetch as Google

Fetch and render as Google is a great tool for seeing what's going on with your site through Google's eyes. It allows you to see exactly what Google sees from a design standpoint.

You can also fetch pages as Google when you do any of the following:

1.     Update an old webpage

2.     Launch a new part of your site

3.     Update your robots.txt file

4.     Make sure canonical tags are implemented

5.     Update your site to HTTPS

Robots.txt Tester

Robots.txt testing is crucial to making sure you know what is and isn't blocked on your site. A single / in the wrong place can completely turn off your organic search traffic.

If your site is getting zero or very minimal organic traffic, this is the first place you should check.
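
You can reproduce that check offline with Python's standard-library robots.txt parser. The rules below are hypothetical, showing how one stray / after Disallow: shuts out every crawler:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt with the classic catastrophic typo:
# a bare "Disallow: /" blocks the entire site for every user agent.
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/"))           # False
print(parser.can_fetch("Googlebot", "https://example.com/sweaters/"))  # False
```

Delete that one slash (so the line reads "Disallow:") and everything becomes crawlable again, which is exactly the kind of difference the GSC tester surfaces.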

Sitemaps

Sitemaps are an easy way to make sure Google can read your site and knows which pages are the most important. Anytime you add or remove a category or section of the site, you need to check your sitemaps.

You can upload sitemaps individually, or you can use a /sitemap_index.xml that references all of your sitemaps. Most of the time you will upload them individually until your site gets too large (sitemap files can only be up to 10 MB each).
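
A sitemap index is just a small XML file that lists your other sitemap files. Here is a sketch, with hypothetical URLs, using Python's standard library:

```python
import xml.etree.ElementTree as ET

def sitemap_index(sitemap_urls: list[str]) -> str:
    """Build a /sitemap_index.xml document listing child sitemaps."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    root = ET.Element("sitemapindex", xmlns=ns)
    for url in sitemap_urls:
        sitemap = ET.SubElement(root, "sitemap")
        ET.SubElement(sitemap, "loc").text = url
    return ET.tostring(root, encoding="unicode", xml_declaration=True)

# Hypothetical child sitemaps, split by content type.
xml_doc = sitemap_index([
    "https://example.com/sitemap-posts.xml",
    "https://example.com/sitemap-products.xml",
])
print(xml_doc)
```

Splitting sitemaps by section like this also makes GSC's per-sitemap indexing stats more useful: you can see which part of the site Google is struggling with.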

See image below for an example:

GSC warns you about things for a reason. The site above has three warnings; these need to be investigated to find out which problems can be fixed.

The main issue with this site is that 5xx (server-side) errors are being thrown. That's a clear red flag that the server is running slow, needs to be updated, or needs a higher allocation of resources for the traffic the site is getting.

URL Parameters

URL parameters aren't anything to play with: if you don't know what you're doing, you can seriously mess things up. Most small websites won't need to deal with this, but as your site grows larger and larger, you will use it. Everything from AMP parameters to page, type, variant, and utm_ parameters will show up here. You can let Googlebot decide what to do with them, or you can set the action manually yourself.

The main reason you would want to go in and edit how a parameter is crawled is to stop Google from seeing duplicate content. There shouldn't be any on your site, but sometimes comment pages can be seen as duplicate content too.

 

Conclusion:

There you have it! A step-by-step guide to everything that is Google Search Console.