The Complete Guide to Google Search Console for SEO Pros


Google Search Console provides the data necessary to monitor and improve a website’s performance in search, and it is the only place this data is available.

Publishers and online businesses that want to maximize their success should make full use of it.

Its free reports and tools make it easy to manage your search presence.

Google Search Console: What’s it all about?

Google Search Console is a free service from Google that helps publishers and search marketers monitor the overall health and performance of their websites relative to Google Search.

It offers an overview of search performance metrics that helps publishers improve their websites and drive more traffic.

Search Console also lets Google notify site owners of security issues (like hacking vulnerabilities) and of any manual penalty imposed by the search quality team.

Key uses include:

  • Monitoring crawling and indexing.
  • Correcting errors.
  • Getting an overview of search performance.
  • Requesting indexing of updated pages.
  • Reviewing internal and external links.

Using Search Console isn’t necessary to rank better, nor is it a ranking factor.

However, its usefulness makes it essential for improving search performance and attracting more visitors to your site.

What are the steps to get going?

You must verify site ownership before Search Console can be used.

Google provides several ways to accomplish site verification, depending on whether you’re verifying a website, a domain, a Google site, or a Blogger-hosted site.

Domains registered with Google Domains are automatically verified by Search Console.

The majority of site owners will verify their sites using one of these four methods:

  1. HTML file upload.
  2. Meta tag.
  3. Google Analytics tracking code.
  4. Google Tag Manager.

Some site hosting platforms limit what can be uploaded, which may require a workaround to verify site ownership.

But that’s becoming less of an issue, as many hosted site services offer an easy-to-follow verification process, covered below.

How do I verify site ownership?

A standard website, such as a self-hosted WordPress site, can verify ownership using either of the two methods described below:

  1. Upload HTML files
  2. Meta tag

When verifying a site using either of these two methods, you’ll be choosing the URL-prefix properties process.

Let’s stop here and acknowledge that the phrase “URL-prefix properties” means absolutely nothing to anyone but the Googler who came up with that phrase.

Don’t let that make you feel like you’re about to enter a labyrinth blindfolded. Verifying a website with Google is easy.

HTML File Upload

Step 1. Go to the Search Console and open the Property Selector dropdown that’s visible in the top left-hand corner on any Search Console page.

Screenshot by author, May 2022

Step 2: In the pop-up labeled Select Property Type, enter the URL of your site, then click the Continue button.

Screenshot by author, May 2022

Step 3: Click the HTML file upload option and download the verification file.

Step 4: Upload the HTML file to the root directory of your website.

Root means https://example.com/, so if the downloaded file is named verification.html, the uploaded file should be reachable at https://example.com/verification.html.

Step 5: Finish the process by going back to Search Console and clicking the Verify button.
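To make the upload step concrete: if the downloaded file were named google1234567890abcdef.html (a hypothetical name; Google generates a unique one per property), it would sit directly in the root so it resolves at the site’s top level:

```
public_html/                      ← site root, i.e. https://example.com/
└── google1234567890abcdef.html   ← reachable at https://example.com/google1234567890abcdef.html
```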

Verification on hosted site builder platforms like Wix and Weebly is similar to the steps above, except that you’ll be adding a meta tag to your site instead of uploading a file.

Duda’s approach is simple: a Search Console app quickly verifies the site so you can get going.
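For reference, the meta tag method places a verification tag that Google generates for your property into the head section of your home page. The content token below is a hypothetical placeholder:

```html
<head>
  <!-- Search Console verification tag; the content token here is a hypothetical placeholder -->
  <meta name="google-site-verification" content="AbC123exampleTOKEN" />
</head>
```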

GSC Issues

Ranking in search results depends on Google’s ability to crawl and index webpages.

The Search Console URL Inspection Tool informs you of problems with crawling and indexing, which is important to catch before they grow serious and search results drop.

URL Inspection Tool

The URL Inspection Tool shows whether a URL has been indexed.

For each submitted URL, you can:

  • Request indexing for a new webpage.
  • See how Google discovered the page (sitemaps and referring pages).
  • View the date of Google’s most recent crawl of the URL.
  • Check whether Google is using the declared canonical URL or a different one.
  • Check the page’s mobile usability status.
  • Check the status of enhancements like breadcrumbs.

Coverage

The Coverage section includes Discovery (how Google discovered the URL), Crawl (whether Google successfully crawled the URL), and Enhancements (the status of structured data).

Coverage is accessible from the menu on the left:

Screenshot by author, May 2022

Coverage Error Reports

While these reports are labeled as errors, it doesn’t necessarily mean that something is wrong. Sometimes it is simply an indication that indexing can be improved.

The screenshot below shows that nearly 6,000 URLs returned a 403 Forbidden server response.

A 403 response means the server is telling Googlebot that it is forbidden from crawling those URLs.

Coverage report showing 403 server error responses. Screenshot by author, May 2022

In this case, the errors were caused by Googlebot being blocked from crawling the member pages of a web forum.

Each forum member gets a personal member page that includes a list of their latest posts and other stats.

The report lists the URLs that are generating the error.

Clicking any of the URLs opens a panel on the right with the option to inspect that URL.

There’s also a contextual menu to the right of the URL itself in the form of a magnifying glass icon that also provides the option to Inspect URL.

Screenshot by author, May 2022

Clicking Inspect URL reveals how the page was discovered.

You can find the following data points:

  • Last crawl.
  • Crawled as.
  • Crawl allowed?
  • Page fetch (which shows an error code if the fetch failed).
  • Indexing allowed?

Information about the canonical URL is also provided:

  • User-declared canonical.
  • Google-selected canonical.

For the forum website, the relevant information is found in the Discovery section.

It shows which pages revealed the member profile URLs to Googlebot.

With this information, the publisher can write a PHP statement that removes the links to the member pages.

You could also fix the issue by adding an entry to robots.txt so that Google stops trying to crawl those pages.
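For example, the robots.txt entry might look like this (the /members/ path is an assumption; use the actual path of the member pages on your site):

```
# Hypothetical robots.txt entry blocking crawling of forum member pages
User-agent: *
Disallow: /members/
```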

Fixing the 403 errors frees Googlebot to spend its crawl resources on the rest of the site.

Google Search Console’s coverage report makes it possible to diagnose Googlebot crawling issues and fix them.
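The triage described above can be sketched in code. This is a minimal sketch, not a Search Console feature: it assumes you have exported the affected URLs with their status codes (the sample URLs below are hypothetical) and simply groups them by error class for easier scanning.

```python
from collections import defaultdict

def group_by_status(rows):
    """Bucket (url, status_code) pairs so crawl problems are easy to scan."""
    buckets = defaultdict(list)
    for url, status in rows:
        if 200 <= status < 300:
            buckets["ok"].append(url)
        elif 400 <= status < 500:
            buckets["client_error"].append(url)  # 403 Forbidden, 404 Not Found, etc.
        elif status >= 500:
            buckets["server_error"].append(url)
    return dict(buckets)

# Hypothetical export: two forbidden member pages, one missing page, one OK page.
sample = [
    ("https://example.com/members/alice", 403),
    ("https://example.com/members/bob", 403),
    ("https://example.com/old-page", 404),
    ("https://example.com/", 200),
]
print(group_by_status(sample))
```

A report like this makes it obvious when thousands of errors share one cause, such as a whole directory of member pages being forbidden.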

Fixing 404 Errors

The coverage report also notifies publishers of 404- and 500-series server error responses.

A 404 response is called an error only because the browser or crawler requested a page that does not exist.

It doesn’t mean that your site is in error.

If another site (or an internal link) links to a page that doesn’t exist, the coverage report will show a 404 response.

You can inspect an affected URL by clicking on it and then selecting Inspect URL.

Then you can decide if the link should be fixed (in the case of an internal link) or redirect to the correct page (if an external link is coming from another website).

Or the linking site may simply have gotten the URL wrong.

If the page doesn’t exist anymore or it never existed at all, then it’s fine to show a 404 response.
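If you choose the redirect route for a page that still receives valuable external links, a single rule in the server configuration does the job. This is a sketch for an Apache .htaccess file, with hypothetical paths:

```
# Hypothetical .htaccess rule: 301-redirect a removed page that still
# receives external links to its closest existing replacement.
Redirect 301 /old-page/ https://example.com/new-page/
```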

GSC: Benefits

The Performance Report

The Search Console Performance Report provides multiple insights into how a site performs in search, including data on search features like featured snippets.

The Performance Report lets you explore four types of search:

  1. Web.
  2. Image.
  3. Video.
  4. News.

Search Console defaults to showing web search data.

To change the type of search shown, click on the Search Type button

Default search type. Screenshot by author, May 2022

The menu that appears will allow you to select the search type to be viewed.

Search Types menu. Screenshot by author, May 2022

A useful feature is the ability to compare two search types within one graph.

Four metrics are prominently displayed at the top of the Performance Report:

  1. Total clicks.
  2. Total impressions.
  3. Average CTR (click-through rate).
  4. Average position.

Top section of the Performance page. Screenshot by author, May 2022

By default, total clicks and total impressions are selected.

By clicking each metric’s tab, you can choose whether it is displayed on the bar graph.

Impressions

An impression is counted each time the website appears in the search results. The user doesn’t have to click the link for it to count as an impression.

Additionally, if a URL is ranked at the bottom of the page and the user doesn’t scroll to that section of the search results, it still counts as an impression.

High impressions are a good sign: they mean Google is displaying the site in the search results.

The impressions metric becomes more meaningful when viewed alongside the clicks and average position metrics.

Clicks

Clicks measure how many times users clicked a search result link to visit the site. A high number of clicks relative to impressions is what you want.

Low clicks with high impressions aren’t necessarily bad; they could be a sign that the site’s search listing needs improvement in order to draw more visitors.

The clicks metric is more meaningful when considered together with the average CTR and average position metrics.

Average CTR

CTR means click-through rate: the percentage of impressions that resulted in a click through to the site.

A low CTR is an indication that the site’s appearance in the search results can be improved.

A high CTR means that the site is performing well.

Combining this number with the Average Position makes it even more valuable.
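As a quick illustration of the arithmetic, average CTR is simply clicks divided by impressions, expressed as a percentage (the figures below are hypothetical):

```python
def average_ctr(clicks, impressions):
    """Click-through rate: clicks divided by impressions, as a percentage."""
    if impressions == 0:
        return 0.0
    return clicks / impressions * 100

# Hypothetical figures: 150 clicks from 3,000 impressions.
print(f"{average_ctr(150, 3000):.1f}%")  # prints "5.0%"
```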

Average Position

This represents the average site ranking within the search results.

An average position of 1 to 10 is great; it means the site typically appears on page one of the search results.

An average position in the twenties (20 – 29) means that the site is appearing on page two or three of the search results. This isn’t too bad. To reach the top 10, it needs some improvements.

An average position of 30 or greater could be a sign of serious problems.

Or it could mean the site ranks highly for a few keywords and very low for many others, dragging the average down.

It could also be a sign that the content is not up to par, or of a content gap on the website: the content that ranks for certain keywords isn’t strong enough and may need a dedicated page devoted to that keyword phrase to rank better.

All four metrics can be viewed in conjunction to provide a complete picture of site performance.

The Performance Report is a quick tool that can help you understand the website’s performance with search.

It’s like a mirror that reflects back how well or poorly the site is doing.

Dimensions of Performance Report

Scrolling down to the second part of the Performance page reveals several of what’s called Dimensions of a website’s performance data.

There are six dimensions.

1. Queries: This page lists the top searched terms as well as the impressions and clicks for each phrase.

2. Pages: This page shows the most visited web pages plus impressions and clicks.

3. Countries: Top countries by clicks and impressions.

4. Devices: Top devices, segmented by mobile, desktop, and tablet.

5. Search Appearance: Lists the kinds of rich results the site appears in, along with their clicks and impressions, including video results and Web Light results (results optimized for very slow devices).

6. Dates: The Dates tab organizes clicks and impressions by date; they can be sorted in ascending or descending order.

Keywords

Keywords appear under Queries, one of the dimensions of the Performance Report, which displays the traffic for the top 1,000 search queries.

Particularly important are low-performing queries.

Some of these queries generate little traffic simply because they are rare: so-called long-tail traffic.

But other queries underperform because the pages ranking for them could be improved, perhaps with more internal links, or because the keyword phrase deserves a dedicated page of its own.

It’s always a good idea to review the low-performing keywords because some of them may be quick wins that, when the issue is addressed, can result in significantly increased traffic.

Links

Search Console will allow you to view a full list of hyperlinks which lead to your website.

However, it’s important to point out that the links report does not represent links that are helping the site rank.

It simply shows all the links to this site.

The list includes links that are not helping the site rank as well as links with nofollow attributes.

The Links report is accessible  from the bottom of the left-hand menu:

Links report. Screenshot by author, May 2022

The Links report has two columns: External Links and Internal Links.

External Links are the links that lead to the site via external sources.

Internal Links are links that lead to other parts of the site from within the same website.

The External Links column contains three reports:

  1. Top linked pages.
  2. Top linking sites.
  3. Top linking text.

The Internal Links column contains one report: Top Linked Pages.

Each report (Top linked pages, Top linking sites, etc.) has a More link at the bottom; click it to expand the report.

For example, expanding the Top Linked Pages report reveals Top Target Pages, the pages on the site with the most links.

Clicking a URL in the report displays all the links pointing to that page.

Note, however, that the report does not show every link to the website.

Sitemaps

Sitemaps are XML files that include a list of URLs. These help search engines find pages on a site.
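A minimal sitemap follows the sitemaps.org XML protocol and looks like this (the URLs and date below are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2022-05-20</lastmod>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
  </url>
</urlset>
```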

Sitemaps are especially useful for large sites that require complex crawling, and particularly when there is frequent new content.

A sitemap can’t guarantee crawling or indexing; that depends on factors such as page quality, overall site quality, and link quality.

Sitemaps simply make it easy for search engines to discover those pages and that’s all.

You can easily create sitemaps using many different templates, plugins, and the hosting platform.

Hosted website platforms automatically generate a sitemap for every website and update it when the site’s content changes.

Search Console makes it easy to submit sitemaps and provides reporting on them.

You can access the Sitemaps function from the left-hand menu.


If there are issues with your sitemap, the Sitemap Section will notify you.

You can remove a sitemap from the Search Console reports, but it’s important to also remove it from the website itself; otherwise Google may remember it and visit it again.

Once a sitemap has been submitted and processed, the Coverage report gains a sitemap section that helps in troubleshooting the URLs submitted via the sitemap.

Search Console Page Experience Report

The Page Experience report provides data about site speed and the experience of visitors to your site.

Search Console offers information on Core Web Vitals and Mobile Usability, as well as other pertinent topics.

If you are looking for a quick overview on site speed, this page is the best place to begin.

Rich Result Status Reports

Search Console reports on rich results through the Performance Report: Search Appearance is one of the six dimensions listed below the graph displayed at the top of the page.

Search Appearance tabs show clicks, impressions data, and other information regarding rich search results.

This report highlights how important rich results traffic is to the website and helps you identify trends in that traffic.

You can use Search Appearance to diagnose problems with structured data.

For example, a drop in rich results traffic could be an indication that Google has changed its structured data requirements and that your structured data needs updating.

It’s a starting point for diagnosing a change in rich results traffic patterns.

The Search Console Is a Great SEO Tool

Search Console provides many advantages to SEOs and publishers. You can also upload link disavow lists, resolve penalties (manual actions), and monitor security events like site hackings. These capabilities all contribute to improved search visibility.

It is a valuable service that every web publisher must utilize if they want to improve search visibility.



Featured image: Bunny Pixar/Shutterstock
