Google SEO Ranking Factors Part-4


46. Sitemap 

Google generally crawls the inner pages of a website only up to about level four. With the help of a sitemap, however, you can let Googlebot crawl all inner pages. What is a sitemap? A sitemap is a separate file containing all the URLs of your website, uploaded to the root directory of your website on the server. There are two types of sitemap:

XML sitemap 
An XML sitemap is created for crawlers and gives some additional information (such as the last modification date) along with the URLs. It helps the crawler find and crawl all URLs.
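For illustration, here is a minimal sketch of what an XML sitemap file looks like; the URLs and dates are placeholders, and the optional <lastmod> tag is one example of the extra information you can pass to crawlers:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/web-design/</loc>
    <lastmod>2024-01-20</lastmod>
  </url>
</urlset>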

HTML sitemap 
An HTML sitemap is created for users, so a visitor can find his or her intended page quickly; otherwise the user may have to visit multiple pages to reach it. It contains only the URLs of the website. 
It is better to link the sitemap file in the footer.
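As a rough sketch, an HTML sitemap is just an ordinary page of links that you can link from the footer; the page names below are made up:

<!-- sitemap.html: a simple page listing the site's important URLs for visitors -->
<ul>
  <li><a href="/about/">About Us</a></li>
  <li><a href="/services/">Services</a></li>
  <li><a href="/blog/">Blog</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>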
You can use an online sitemap generator such as XML-Sitemap. An XML sitemap for a Google blog can be generated here.

47. SEO friendly URLs 


48. Title tag 

  • The title tag should contain your keyword(s) 
  • It should be placed near the top of the page, inside the <head> 
  • Do not repeat keywords in the title tag 
  • The title should be unique; if it is too generic for Google, such as <title>Home page</title>, Google may replace it with a title of its own 
  • Use actionable words such as get, take, boost, learn or make to prompt the user to click and increase CTR 
  • Put the brand name at the end of the title 
  • The title should contain the primary keyword, a secondary keyword, the brand and, if the business is local, the location 
  • Do not exceed the length limit, or Google will trim the last words and show an ellipsis (…) instead; keep the title between 50 and 60 characters 
  • Put the primary keyword at the start of the title 
  • Every page should have a unique title (see the example after this list) 
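For illustration only, here is a title built around a made-up keyword, location and brand: it leads with the keyword, uses an actionable word, puts the brand at the end, and at about 51 characters fits within the 50-60 character range:

<title>Buy Handmade Leather Shoes in Lahore | ExampleBrand</title>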

49. Meta SEO Tags 

Each meta tag has two attributes: name and content. Four popular meta tags are the description, author, keywords and viewport meta tags.

Description meta tag 
Optimize the description tag so that roughly half of your focus is on SEO, to improve ranking, and half is on users, to increase CTR. The meta description should be between 150 and 160 characters. Don't repeat the keyword in the description; use a synonym in the other place and you will get traffic for both keywords.
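A sketch of what such a description might look like; the copy and the shop are invented, the keyword appears once with a synonym (footwear) instead of a repeat, and the length is roughly in the 150-160 character range:

<meta name="description" content="Buy handmade leather shoes in Lahore. Browse our footwear collection, compare prices and order online with free delivery and easy returns across Pakistan.">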

Viewport meta tag 
The viewport is the part of a web page that is visible to the user. It varies with screen size, becoming smaller on mobile and larger on desktop. The viewport tag tells the browser how to control the page's dimensions and scaling. You should include the viewport meta tag on every webpage, as below:
<meta name="viewport" content="width=device-width, initial-scale=1.0">

Author meta tag 
It tells Google the name of the page's author.
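For example (the name is a placeholder):

<meta name="author" content="John Doe">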
Google has deprecated the keywords, revisit-after, cache-control, geo and expires meta tags.
You can generate meta tags with SEO Mofo (best) or SEOChat.

50. Breadcrumbs navigation 

Use breadcrumbs on your website, as Google tends to rank such websites higher in the SERPs. With the help of breadcrumb navigation, Google can easily understand and crawl all the parent pages of the current page. Google has also moved away from showing plain URL paths in search results in favor of breadcrumbs and treats breadcrumb navigation as a ranking signal.
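If you also want the breadcrumb trail to appear in search results, the usual approach is BreadcrumbList structured data. A minimal JSON-LD sketch, with placeholder page names and URLs:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "SEO", "item": "https://www.example.com/seo/" },
    { "@type": "ListItem", "position": 3, "name": "Breadcrumb Navigation" }
  ]
}
</script>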

51. Robots.txt 

When a search engine crawler visits your website, it first looks for the robots.txt file. What is robots.txt? Robots.txt is a simple text file in which you can tell search engines not to crawl specific pages. You should also list your website's XML sitemap in robots.txt so that the crawler gets all the URLs of your website at the start of the crawl. 
Robots.txt uses directives such as the following:
User-agent: * 
The rules that follow apply to all search engine crawlers.

User-agent: Bingbot 
The rules that follow apply only to Bing's crawler.

Disallow: /folder/ 
Crawlers may not crawl this folder.

Disallow: /file.html 
Crawlers may not crawl this file.

Disallow: /image.png 
Crawlers may not crawl this image file.

User-agent: * 
Disallow: / 
Blocks all search engines from crawling any content (note the slash; an empty Disallow blocks nothing). 

User-agent: * 
Disallow: 
Allows all search engines to crawl all content.

Sitemap: http://www.techilm.com/sitemap.xml 
Tells search engines where to find your XML sitemap.

By default, all URLs of a website are allowed to be crawled, so there is no need for an Allow directive. Be careful when creating the robots.txt file: one wrong command, such as Disallow: /, can make your entire website disappear from Google's index. Large organizations also use it to keep pages meant only for their paid customers out of the crawl.
A robots.txt file can be created online using Robotstxt or SEOBook. 
Upload it to the root directory of your website and submit it in Google Search Console. 
Disallowing private links also helps optimize your Google crawl budget.
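Putting the directives above together, a complete robots.txt might look like the sketch below; the folder and file names are placeholders, while the sitemap line reuses the example URL from above:

User-agent: *
Disallow: /private-folder/
Disallow: /draft-page.html

User-agent: Bingbot
Disallow: /bing-only-folder/

Sitemap: http://www.techilm.com/sitemap.xml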

52. Optimize Google crawl budget

The Google crawl budget is the time Google's crawler allocates to crawling your website's pages on any given day; the number of pages crawled may vary. You can optimize the crawl budget by: 
  • Keeping your important content and URLs near the top of your pages so that the crawler can capture at least the most important content and URLs of your website within the given budget 
  • Keeping your most important pages at the start and at the first level of the site structure
