1. Ensure that your pages can be indexed
An index is another name for the database used by a search engine such as Google or Bing. Indexes contain information on all the websites that Google (or any other search engine) was able to find. If a website is not in a search engine’s index, users will not be able to find it when they search for a keyword or your business name.

Use the Index Status report in Google Search Console to see the number of pages Google has crawled from your domain. You can also enter site:example.com into Google search to see which pages are indexed.

You want the data to indicate the page count is increasing over time.

To test whether you have a robots.txt file, append /robots.txt to your website URL.

Example:
https://domainname.com/robots.txt
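You can also check robots.txt rules programmatically. The sketch below uses Python's standard urllib.robotparser against a hypothetical robots.txt; your own file's rules will differ.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, for illustration only.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether a general-purpose crawler may fetch these paths.
print(parser.can_fetch("*", "https://domainname.com/"))        # allowed
print(parser.can_fetch("*", "https://domainname.com/admin/"))  # disallowed
```

If pages you want indexed are disallowed here, crawlers will skip them no matter how good the content is.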

2. Secure your website
Google favours websites that are served over HTTPS.

You can secure your website with an SSL certificate, which is often included in your website's hosting package.

HTTPS also encrypts communication between the browser and the web server.

You may ask: how do I know that a website is secure? Look for https:// at the start of the URL and/or a small padlock icon next to it in the browser's address bar.
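When auditing many URLs at once, you can at least verify the scheme in code. This is a minimal sketch, assuming you only need to confirm that a URL uses HTTPS (it does not validate the certificate itself):

```python
from urllib.parse import urlparse

def is_https(url: str) -> bool:
    """Return True if the URL uses the encrypted HTTPS scheme."""
    return urlparse(url).scheme == "https"

print(is_https("https://domainname.com/"))  # True
print(is_https("http://domainname.com/"))   # False
```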


3. Crawl your site and look for any crawl errors

First, let's look at what web crawlers are. Web crawlers are known by many names, including spiders, robots, and bots. They crawl across the World Wide Web to index pages for Google and other search engines.

Search engines such as Google and Bing don’t know a website exists on the Internet unless it has been crawled and indexed.

Think of it like shopping for a computer in a new store. You have to walk the aisles and look at the products (computers, printers, etc.) before you can pick out what you need.
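At its core, a crawler simply fetches a page and follows its links. A minimal sketch of the link-extraction step, using Python's standard html.parser with hypothetical page content:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag, as a crawler would."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page snippet, for illustration only.
html = '<a href="/about">About</a> <a href="/contact">Contact</a>'
extractor = LinkExtractor()
extractor.feed(html)
print(extractor.links)  # ['/about', '/contact']
```

A real crawler would then fetch each of those URLs in turn, which is why broken internal links stop parts of your site from being discovered.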

4. URLs must have a clean structure
According to Google, “A site’s URL structure should be as simple as possible.” If your site has a high number of URLs that point to identical or similar content, it can cause problems for crawlers. Google may also mark it as duplicate content, which can lead to penalties that are difficult to resolve.

As a result, the crawler may be unable to completely index all the content on your site.

Examples of problematic URLs:

Sorting parameters
Some large shopping sites provide multiple ways to sort the same items, resulting in a much greater number of URLs. For example:

http://www.example.com/results?search_type=search_videos&search_query=tpb&search_sort=relevance&search_category=25

Irrelevant parameters
Irrelevant parameters in the URL, such as referral parameters. For example:

http://www.example.com/hotel-search-results.jsp?Ne=292&N=461+4294967240+4294967270

See Google’s documentation on keeping URL structure simple to read more.

Where possible, shorten URLs by trimming these unnecessary parameters. For SEO purposes, aim to keep a page URL to around 75 characters. Documents with longer URLs (over 75-120 characters) will still be indexed, but indexation may be slower.
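Where the extra parameters are purely for tracking or referral, you can strip them programmatically. A sketch using Python's urllib.parse; the UNWANTED parameter names below are hypothetical examples, so adjust them for your own site:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical set of parameters to drop; tailor this to your site.
UNWANTED = {"utm_source", "utm_medium", "ref", "sessionid"}

def clean_url(url: str) -> str:
    """Rebuild a URL without tracking/referral query parameters."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in UNWANTED]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(clean_url("http://www.example.com/results?search_query=tpb&ref=homepage"))
# http://www.example.com/results?search_query=tpb
```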

5. Metadata
When pages are crawled, a noindex meta tag on a page tells Google not to index it, which will keep that page out of search results.

A variety of SEO audit tools can flag pages carrying this tag, and a technical SEO audit should include this check as well.
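As a sketch of what such tools check, the following uses Python's standard html.parser to detect a noindex robots meta tag in a page's HTML (the sample head content is hypothetical):

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flag a page carrying <meta name="robots" content="noindex">."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots" and \
               "noindex" in (d.get("content") or "").lower():
                self.noindex = True

# Hypothetical page head, for illustration only.
detector = NoindexDetector()
detector.feed('<head><meta name="robots" content="noindex, nofollow"></head>')
print(detector.noindex)  # True
```

Run a check like this across the pages you expect to rank; a stray noindex on an important page is one of the easiest technical SEO wins to fix.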

6. Website speed
Search engines favour faster sites, and potential visitors are likely to leave a slow-loading website. Aim for your website pages to load in 2 seconds or less.

What can you do?
Use speed testing tools such as Google PageSpeed Insights
Reduce image file sizes for optimised performance

7. Mobile speed
Given Google’s modern “mobile-first” indexing approach, it is critical to appear in mobile searches.

Use responsive design
Update your website to use a responsive design that creates an optimum experience regardless of screen size. This also makes your site more mobile-friendly.

Some optimisations include:
Compress images
Optimise your website for local SEO
Focus on long-tail keywords and voice optimisation


8. Create useful links
Your search rankings will benefit from a well-planned internal link structure.

9. Master the basics
Implement the fundamental optimisation techniques to improve your pages’ potential to rank in search engines.

Use unique title tags:
Write unique title tags that include a keyword phrase and accurately describe the page’s topic.

Understand that title length has been expanded recently, and titles in search engine results may run as long as 75 characters. Free online tools can calculate your title’s character count.
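A character-count check like this is easy to script yourself. The 75-character limit below follows the guideline above and is a hypothetical cutoff rather than a hard rule:

```python
# Hypothetical display limit based on the guideline above.
TITLE_LIMIT = 75

def check_title(title: str) -> str:
    """Report whether a title tag fits within the display limit."""
    n = len(title)
    if n <= TITLE_LIMIT:
        return f"OK: {n} characters"
    return f"Too long: {n} characters (limit {TITLE_LIMIT})"

print(check_title("Affordable Web Design Services | Example Agency"))
```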

Emphasize keywords
Include the keywords you’ve selected in your headings, subheadings, body copy and URL appropriately.

Write meta descriptions
Indexed pages should include meta descriptions featuring keywords to encourage readers to click through. Avoid duplicating your page content in your meta description.

In conclusion:

Using this checklist helps you identify the areas that need attention or improvement so that your website is favoured by Google.

Mastering these basics will contribute to a positive technical SEO report.

Adapt this checklist according to the warnings or errors detected in your own report.
