On-Site SEO Checklist Part 1


On-page SEO is about resolving a website's technical issues and ensuring the best possible user experience. Website structure and content are important search engine ranking factors, and following on-page SEO best practices is essential for the success of your off-page SEO strategies.

15 To-Dos for On-Page SEO

1) Site Verification- Google, Bing and Yandex

Get your site verified in all three major search engines: Google, Bing, and Yandex. The verification process is easy; you just need to place an HTML tag on your website's homepage. Verifying your site in these webmaster tools helps get your website indexed faster in search results. The webmaster tools also provide important information about your website's technical issues, offer recommendations for SEO improvements, and let you target specific countries and set the preferred version of your website.
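
For reference, the verification tags for the three search engines look like the snippet below, placed inside the <head> section of your homepage. The content values here are placeholders; each webmaster tool generates a unique code for your site.

<meta name="google-site-verification" content="YOUR_GOOGLE_CODE" />
<meta name="msvalidate.01" content="YOUR_BING_CODE" />
<meta name="yandex-verification" content="YOUR_YANDEX_CODE" />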

List of to-dos - Google Search Console, Bing Webmaster Tools, and Yandex Webmaster

1) Submit an XML sitemap.

2) Set a preferred domain version. Google Search Console gives you three options: www.example.com, example.com, and don't set a preferred domain. It is best to pick one and stick to it. Pro tip: set a rel=canonical tag on your homepage. Homepage duplicates are very common because people might link to your homepage in many ways (example.com, www.example.com, http://example.com, https://example.com). You need a canonical tag on your homepage to get the proper benefit of those backlinks (see the example after this list).

3) Set international targeting in Google Search Console. Set the country whose users you want to target the most.

4) Use the Data Highlighter in Google Search Console for richer SERP results.

5) Check for HTML improvements/SEO recommendations in the webmaster tools, for example duplicate meta tags, SEO-unfriendly URLs, etc.

6) Check the number of website pages indexed.

7) Check for crawl errors and make sure the robots.txt file is working.

8) Remove unwanted/spam/low-quality backlinks using the disavow links feature.

9) Check website statistics such as the number of impressions, clicks, search queries, sites linking to you, etc.

10) Submit URLs for indexing.

11) Check site speed and mobile-friendliness.
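
As an example of the canonical tag mentioned in point 2, the snippet below goes inside the <head> of your homepage and tells search engines which URL version is the preferred one (example.com is a placeholder domain):

<link rel="canonical" href="https://www.example.com/" />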

2) Google Analytics Account-

Set up a Google Analytics account to understand customer demographics, customer behavior, and marketing channel performance. Google Analytics gives you a clear picture of who your customers are, where they come from, what keywords they use to find you, which content they like the most, and which landing pages perform best. You can also monitor your organic, paid, referral, and social media marketing channels. Google Analytics gives you the ammunition to optimize your website content and marketing strategies, and it is the most reliable way of measuring your website's performance and SEO efforts. It is best to link your Google Analytics account with Google Search Console to gain better insights for search engine optimization.
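
Setting up Google Analytics involves pasting a tracking snippet into the <head> of every page on your site. A typical gtag.js snippet looks like the sketch below; GA_MEASUREMENT_ID is a placeholder for the ID that Google Analytics generates for your property.

<!-- Google Analytics tracking snippet (GA_MEASUREMENT_ID is a placeholder) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'GA_MEASUREMENT_ID');
</script>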

3) Meta Titles and Meta Descriptions-

The meta title is one of the most important on-page SEO factors. Each page of your website must have a unique meta title of up to 60 characters. Meta titles help search engines understand the topic of the page, so include the most important keywords in the meta title. Since meta titles also appear in search results, it is good practice to include a call to action in the meta title for a better CTR (click-through rate).

Search engine users see the meta description below the meta title in search results. Meta descriptions are no longer a ranking factor, but a descriptive, unique meta description of up to 150 characters for each webpage makes the search result easier for users to understand and more clickable.
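
Together, the meta title and meta description sit in the HTML <head> of the page. The example below is illustrative only (the store and wording are made up):

<title>Buy Running Shoes Online - Free Shipping | Example Store</title>
<meta name="description" content="Shop our range of running shoes for men and women. Free shipping on all orders - browse the collection today." />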

4) H1, H2, H3 tags-

Proper page formatting is crucial for on-page SEO. You need to provide properly tagged headings with keywords in them, for example a main heading (H1 tag) and subheadings (H2 tags). Use a larger font size for the H1 tag (at least 12px) and for H2 tags (at least 9px). Split the text between two consecutive headings into small paragraphs of no more than 4 to 5 lines. Prefer bullet points and numbered lists over long paragraphs for better readability.
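
A properly formatted page follows a simple heading hierarchy, for example (the topic and headings below are illustrative):

<h1>On-Page SEO Checklist</h1>
<h2>Site Verification</h2>
<h3>Verifying in Google Search Console</h3>
<h2>XML Sitemaps</h2>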

5) XML Sitemap-

Create and submit an XML sitemap in all major webmaster tools, namely Google Search Console, Bing Webmaster Tools, and Yandex Webmaster. An XML (Extensible Markup Language) sitemap is, in simple terms, the blueprint of your website's structure and content: it contains the list of your website's URLs. The XML sitemap helps search engine crawlers easily find your website's URLs and index them quickly in search results. It tells search engines the nature of the content on your website, its importance, how frequently it is updated, and the location of dynamically generated content. An XML sitemap also mitigates the effects of weak internal linking and a lack of strong backlinks.

If your website is on WordPress, I would recommend the Yoast SEO plugin for XML sitemap generation; you can also use third-party tools for XML sitemap creation.
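
For reference, a minimal XML sitemap with a single URL entry looks like this (the domain, date, and values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>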

6) Robots.txt Exclusions-

The robots.txt protocol lets webmasters tell Googlebot and other web crawlers which pages and directories on a website should not be accessed. When a page is blocked using the robots.txt file, search engines don't index the contents and links of that page; therefore, any inbound link to the blocked page will not pass SEO juice on to other pages. Any exclusion via robots.txt should be well thought out.

You can access the robots.txt file of any website, a plain-text, case-sensitive file found in the root of the domain, by simply adding /robots.txt to the end of the domain name (e.g. www.example.com/robots.txt). Since robots.txt files are publicly available for every domain that has one, it doesn't make sense to use them to hide secret files or private user information.

Also, each subdomain on a root domain must use its own robots.txt file. For example, news.example.com and example.com should have separate robots.txt files (news.example.com/robots.txt and example.com/robots.txt). For a robots.txt file to be found and function properly, it must be placed in the website's top-level directory.

The basic format of robots.txt is-

User-agent: [user-agent name]

Disallow: [URL string not to be crawled]

Example usage-

1) Using the syntax below in a robots.txt file instructs web crawlers not to crawl any pages on the website, including the homepage.

User-agent: *

Disallow: /

2) Using the syntax below in a robots.txt file instructs web crawlers to crawl all pages on the website, including the homepage.

User-agent: *

Disallow:

3) To disallow a specific page or folder, you need to give the root-relative URL of that file or folder.

User-agent: *

Disallow: /info_test.php (to disallow bots from crawling the info_test.php page in the root folder)

Disallow: /info/test_info.html (to disallow the test_info.html page under the 'info' folder)

Disallow: /info/ (to disallow crawling of the entire 'info' folder)

7) SSL Certificate-

SSL certificates are used to secure payment transactions, customer data, and login details. Is SSL good for SEO? The simple answer is yes. According to Google, HTTPS is one of its ranking factors, and search engines like Google incentivize HTTPS adoption for the sake of customer data security. HTTPS is crucial for ecommerce websites that plan to advertise their products on AdWords or Bing.

HTTPS gives users confidence that the website is secure and safe to use. Though HTTPS is a small ranking signal and won't boost rankings for highly competitive keywords, in cases where the SEO scores of two websites are similar, HTTPS can help you outrank the other site.
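
After installing an SSL certificate, you should redirect all HTTP traffic to HTTPS so that users and crawlers always land on the secure version of the site. On an Apache server with mod_rewrite enabled, a common .htaccess sketch looks like this (adjust the rules for your own server setup):

# Redirect all HTTP requests to HTTPS with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]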