SEO Tutorial and Digital Marketing Guide: Material to Earn Money Online

with tips & tricks for getting a website to the top of the search engines


SEO Technical Subjects

SEO Modules:-
1) On Page Optimisation: making a website completely SEO friendly as per Google's guidelines.
2) Off Page Optimisation: the set of methods we follow to build backlinks to a page.

3) Reporting: by means of Google's webmaster tools, i.e. Google Search Console and Google Analytics (Adobe Analytics is used by big corporations).

Timeline for SEO Improvement:-
Can be estimated based on 3 factors:
a) The keyword competition,
b) Who the SEO competitors are and what their strengths are,
c) Our SEO plan of action, i.e. the quality of the work we plan to implement.

Core SEO Technical Terms

1) SEO Strategy: page by page and keyword by keyword.
As per the user's requirement, the relevant page and then its data are shown.
SEO is done on each page, and the SEO score is then calculated for each keyword.
2) SEO Techniques:
White Hat: ethical SEO techniques implemented on the page to rank it high on Google.
Done as per Google's guidelines (GG) and will naturally take longer.

Black Hat: unethical SEO techniques implemented on the page to rank it high on Google within a short period of time.

Not done as per Google's guidelines.

Blue Hat & Grey Hat…?

3) Google Crawlers / Spiders / Robots:
The Google software that scans the internet by following backlinks.
In Google, type cache:"website address" to find out whether Google's crawlers have visited the website, and when.
The visiting frequency of the crawlers depends upon how often the website is updated.
A frequently updated website gets a higher SEO score.

Auto Crawling: Google will automatically find your website if it is hyperlinked from any other existing website that has already been crawled.

Manual Method: go to Google Search Console -> URL Inspection tab -> check whether the page has been crawled -> if it has not (or if it has been updated and needs re-crawling), press Request Indexing.
- In case a full site with multiple pages needs to be crawled, the Sitemaps option is to be selected.

4) Backlinks:
Links are of 2 types: inbound (links on other pages that forward traffic to your page) and outbound (links that forward traffic to other pages).
If a website link is available in hypertext format (internet address format, in blue and underlined) then that link is called a backlink.
Backlink checker tools: G Search,
Ways of backlinking:

1. Directory submission
2. Social Bookmarking
3. Article Submission,
4. Press release.
5. Periodical Blog Updates.
6. Questions raised and answered in forums.
7. Classified submissions (Paid)

5) Google PageRank:
A numerical value between 0 and 10 given by Google to each and every page; it tells us how important the page is on the web.
Google gives this rank based upon the number of quality backlinks and the domain age (how old the website is).
- A Google PageRank checker tool is to be used.
- It has no direct significance to the overall SEO score.

6) Domain Authority and Page Authority (DA & PA):
A measure of how good, trustworthy, and high-quality the website is overall.
The score runs from 0 to 100, with 100 being the best.
Use Moz's website authority checker for checking the DA or PA.
A quality backlink from a higher-scoring website gives a higher SEO score.

7) Alexa Traffic Rank:
A rank given to a website based upon the volume of traffic coming to it, i.e. how popular it is.
Lots of other important traffic information is also shown, to help understand user / potential customer behaviour.
A backlink from a higher-rated website helps the SEO score profoundly.

8) Blogs:
We can create a free blog page.
Log in -> create a Blogger profile -> create a blog -> Title (target keyword) -> Address ____ -> Theme -> Subject.

The set of methods we follow to build backlinks (hyperlinks) is off page optimisation.
2 types of backlinks: quality & spam.
Quality backlinks -> White Hat -> positive SEO score.
Spam backlinks -> Black Hat -> negative SEO score.

1) If a backlink comes from a page in the same line of business, it is a quality backlink; if it comes from an unrelated page, it is considered a spam backlink.
2) Natural backlinks come free (White Hat); paid backlinks count as spam if Google feels they are not natural (e.g. placed in the wrong category).
3) Backlinks coming from good-quality sites are White Hat; backlinks coming from low-quality sites are considered spam.

Backlink checker tools -> multiple free tools exist for getting the total number of backlinks.
SEMrush is a good and popular paid SEO tool for detailed analysis of quality vs. spam backlinks.
It produces reports such as rankings, site audits, on page optimization errors & suggestions, and a social media tracker.
Link farms -> sites containing only links, for ad purposes.
Low-quality sites include porn, drug, and dating sites, virus-infected sites, link farms, directory submission sites, social bookmarking sites, blog comments, and forum posting sites. Any backlink coming from these sites is considered SPAM!

How to remove a spam link ->
1) Contact the website owner; but most of the time it is our competitors who create spam links to our website.
2) Use the Disavow Links tool, part of Search Console:
Paste all the spam backlinks into a plain-text file and submit it to have the spam backlinks ignored.
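The disavow submission above can be sketched as a small script that assembles the plain-text file Search Console expects (one URL or `domain:` entry per line, `#` for comments). The domains and URLs below are hypothetical examples, not real spam sources:

```python
# Sketch: build a disavow file from a list of spam backlinks.
# The format (one URL or "domain:" entry per line, "#" comments)
# follows Google's documented disavow-file format.
spam_links = [
    "http://spam-directory.example/page1",
    "http://link-farm.example/links.html",
]
spam_domains = ["bad-neighborhood.example"]  # disavow the whole domain

lines = ["# Disavow file generated for Search Console"]
lines += [f"domain:{d}" for d in spam_domains]
lines += spam_links

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```

The resulting disavow.txt is what gets uploaded in the Disavow Links tool.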

Every day thousands of websites are created by users from all over the world. It is mandatory to promote a website after designing and development are done, in order to get business online. Website presence on the 1st page of a search engine like Google is very important to get more visibility and more clicks from users who are searching for a particular query in the Google search box. Each website owner has to create a website based upon their business requirements and the products or services they are going to offer. When it comes to promoting a website, any site owner can promote their site using SEO. So what is SEO? And how can you benefit from it for free?
SEO: Search Engine Optimization
"It is a process of getting any website onto the 1st page of search engines like Google naturally, for the targeted keywords, using ON page optimization and Off page optimization."
ON Page: making changes within the website, like the URL structure, content, Meta tags, etc.
Off Page: building back links for a website on high-authority sites. Search engine robots crawl millions of web pages every day and add the information to a database; whenever a user types a query in the Google search box, it retrieves the search results based on the authority of the sites.
Benefits of SEO:
Increases organic traffic for a site. Increases rankings for high-search-volume keywords. Generates leads and turns visitors into buyers. Expands your business throughout the world. Makes your website stand in the No. 1 position, ahead of your competitors' ranking positions.
Read more in the ultimate guide to SEO ON Page implementation.
For any website, ON Page plays a major role in ranking and in influencing search engine crawlers. ON Page is the process of making changes within the website, like the coding, the URL structure, and many more.
Important SEO ON Page factors:
1. Title Tag: the title for the webpage; it appears in search results and drives the user's initial click. Google allows about 512 pixels (if the title is wider than 512 pixels, it shows dots in the search results).
2. Description Tag: the summary of the webpage. Google allows about 900 pixels in search results. Write the description as a meaningful sentence with the targeted keywords, without stuffing.
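As a rough sketch of checking tag lengths before publishing: Google truncates by pixel width, not by character count, so the character limits used here (60 for the title, 160 for the description) are only a common approximation, not official values:

```python
# Approximate snippet-length check. Google truncates snippets by
# pixel width; the character limits below are a common heuristic
# stand-in for those pixel limits, not official numbers.
def snippet_warnings(title: str, description: str,
                     title_max: int = 60, desc_max: int = 160) -> list:
    warnings = []
    if len(title) > title_max:
        warnings.append(f"Title may be truncated ({len(title)} chars)")
    if len(description) > desc_max:
        warnings.append(f"Description may be truncated ({len(description)} chars)")
    return warnings

print(snippet_warnings("Short title", "x" * 200))
```

A snippet-preview tool like the one mentioned below remains the more accurate check, since it simulates the actual pixel widths.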
Use the SEO Mofo snippet optimizer tool before you actually add the tags to the source code, as it shows a preview of the snippet.
3. Header Tags: H1 to H6 are the header tags for a webpage. Each webpage should have web content, divided into paragraphs, and each paragraph should have a heading. H1 and H2 are the most important header tags used on a webpage; use targeted keywords in H1 & H2.

Alt Tags: alternative text shown to users; it helps identify images in slow-loading browsers.
4. URL Structure: avoid long URL structures with unnecessary special characters.
Bad Ex:
Good Ex:
Depth of URL structure: after the domain name, the maximum depth of directories should not be more than 4.
Bad Example:
Good Example:
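A minimal sketch of building an SEO-friendly URL segment from a page title, following the advice above (lowercase, hyphens, no special characters); the example title is made up:

```python
import re

# Illustrative slug helper: turns a page title into a short,
# lowercase, hyphen-separated URL segment with no special characters.
def slugify(title: str) -> str:
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse special chars to hyphens
    return slug.strip("-")

print(slugify("Best SEO Tips & Tricks (2024)!"))  # best-seo-tips-tricks-2024
```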
5. Robots.txt: a text file which tells crawlers whether to crawl the website or whether parts of it are blocked.
Rules are termed Allow or Disallow -> code to be used:
a) User-agent: *
Allow: /          (allows all crawlers to access all content)
b) User-agent: *
Disallow: /       (blocks all crawlers from all content)
c) User-agent: Googlebot
Disallow: /Books  (blocks a specific web crawler from a specific folder)
d) User-agent: Googlebot
Disallow: /services/advertising.html  (blocks a specific web crawler from a specific webpage)
Add the required code, save the file as robots.txt, and upload the file to the root folder of the server.
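Before uploading, the rules can be sanity-checked locally with Python's standard-library robots.txt parser. This sketch uses a rule similar to example (c); the folder and URLs are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Sketch: test robots.txt rules locally before uploading the file.
rules = """\
User-agent: Googlebot
Disallow: /books/
"""
rp = RobotFileParser()
rp.parse(rules.splitlines())

# The blocked folder is disallowed, everything else stays allowed.
print(rp.can_fetch("Googlebot", "https://example.com/books/seo.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/services/"))       # True
```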
Sitemap: an XML file which consists of all the URLs of a website. The sitemap is for crawlers, to help them visit the website and add the URLs to the database; it makes indexation of all URLs in the search results faster. Update the sitemap in Webmasters whenever new URLs are added or old URLs are removed.
Procedure: go to an XML sitemap generator, enter the homepage of the website, and click start. Download the uncompressed XML sitemap and upload it to the root folder of the server.
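As an alternative to an online generator, a minimal sitemap.xml can be produced with a few lines of Python using the standard sitemaps.org namespace; the URLs here are placeholders:

```python
import xml.etree.ElementTree as ET

# Minimal sitemap.xml generator using the sitemaps.org namespace.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urls = ["https://example.com/", "https://example.com/services/"]

urlset = ET.Element("urlset", xmlns=NS)
for u in urls:
    # each URL gets a <url><loc>…</loc></url> entry
    loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
    loc.text = u

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
```

The generated file is then uploaded to the root folder as described above.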
Canonicalization: the same page loading from multiple URLs; the canonical tag allows this without creating a duplication error. A) B)

Solution: place a <link rel="canonical" href=""/> tag in the source code of your preferred URL.
Case-sensitive URLs: a)
The above 2 URLs load the same page, which is duplicate content in the view of crawlers; the important fix here is to remove capital letters from the URL structure and always use lowercase letters.
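A small sketch of normalizing a mixed-case URL to the all-lowercase form recommended above, using only the standard library; the example URL is hypothetical:

```python
from urllib.parse import urlsplit, urlunsplit

# Normalize the host and path of a URL to lowercase so mixed-case
# variants collapse into a single canonical form.
def lowercase_path(url: str) -> str:
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc.lower(),
                       parts.path.lower(), parts.query, parts.fragment))

print(lowercase_path("https://Example.com/Services/SEO.html"))
```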
Bread Crumb Structure: the navigation that lets the user move forwards and backwards between pages on the website. To give the user a clear path, all ecommerce sites now use bread crumb navigation; it also increases internal link strength.
Code to be used:
<a href=''>Home</a> > <a href=''>Clothes & Fashion Accessories</a> > <a href=''>Shorts, Skirts & Jeans</a>
Output: Home > Clothes & Fashion Accessories > Shorts, Skirts & Jeans
301 Redirection: a 301 redirect is a permanent redirect, i.e. the page has moved from an old URL to a new URL. Whenever there are 404 broken links on the site, a 301 has to be done to give a rich user experience. From the search engine's point of view too, important broken links or renamed URLs have to be redirected to working pages or the new URLs.
Procedure: go to a 301 redirection tag generator tool.
Enter the old URL and the new URL.
Generate the code, copy the code into Notepad, and save the file as ".htaccess".
Upload the .htaccess file to the root folder of the server.
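Instead of an online generator, the .htaccess rules can also be written by a short script using Apache's `Redirect 301` directive (mod_alias syntax); the old and new URLs below are hypothetical:

```python
# Sketch: generate Apache .htaccess 301 rules from old/new URL pairs,
# using the mod_alias "Redirect 301 <old-path> <new-url>" directive.
redirects = {
    "/old-page.html": "https://example.com/new-page.html",
    "/services.php": "https://example.com/services/",
}

rules = [f"Redirect 301 {old} {new}" for old, new in redirects.items()]
with open(".htaccess", "w") as f:
    f.write("\n".join(rules) + "\n")

print(rules[0])
```

The generated .htaccess is then uploaded to the root folder as described above.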
Custom 404 Error: a custom 404 error page is required for all wrongly typed URLs (pages which do not exist on the site). When typed in the browser, these URLs show a "page not found" error from the server, so a custom 404 page has to be designed, with links to working pages or the home page.
1. Give a link back to the home page or to internal pages for all the incoming wrong URL structures. Ex: a) b)
Another method: permanently redirect all the incoming wrong URL structures to the home page. Ex:
Note: (it immediately redirects to the home page, i.e.
Code to be added in the error document file to create the custom 404 (assuming the custom page is saved as 404.html in the root folder):

ErrorDocument 404 /404.html
Web Content Optimization: check the web content on every webpage for plagiarism. Ex:
Spam content types:
Content scraping: taking the content from one site and reusing it on another site.
Same-site duplication: duplicate content present in the same site, on different pages and in internal blogs.
To tell crawlers not to index the content of a page but still follow the links on it:
<meta name="robots" content="noindex, follow">
Importance of hyphens in the URL structure: redirect URL structures from underscores to hyphens. Ex: to
Google considers each hyphen as a word separator.
Set Preferred Domain in webmasters:
Website Loading Speed: check the loading speed of your website. The ideal loading time is 2-3 seconds; if it's more than that, check the size of the images, minimize the JavaScript, and avoid excessive Flash images in the web pages.
Text-to-HTML Ratio: minimum 20%.
Myths: Google does not index pages with a low text-to-HTML ratio; Google will penalize my site if my page is more than 100 KB.
Fact: a higher text-to-HTML ratio simply improves the user experience.
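A rough way to measure the text-to-HTML ratio mentioned above is to strip the tags and compare lengths. This sketch uses the standard-library HTML parser and ignores subtleties (scripts, styles) that a real audit tool would handle:

```python
from html.parser import HTMLParser

# Collect only the visible text nodes from an HTML document.
class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.text = []

    def handle_data(self, data):
        self.text.append(data)

# Ratio = visible text length / total source length, as a percentage.
def text_to_html_ratio(html: str) -> float:
    p = TextExtractor()
    p.feed(html)
    visible = "".join(p.text)
    return len(visible) / len(html) * 100

html = "<html><body><h1>SEO</h1><p>Content optimization basics.</p></body></html>"
print(f"{text_to_html_ratio(html):.1f}% text")
```

Anything well above the 20% minimum noted above is comfortable; heavy markup pushes the ratio down.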
Getting a reputation for a website depends on linking with authority sites, and this is possible using Off-Page SEO.
Off-Page is basically used to build back links for any website from high-authority sites.
Off-Page plays a major role in gaining ranking and reputation for a site.
The Alexa Rank, Moz Rank, and PageRank of any site will gradually improve if there is a proper off page optimization strategy.
Every search result is a snippet; changes in the SERPs (search engine results pages) come from both ON page and Off page work.

Important methods used for promoting a website in SEO Off Page.
1. Directory Submissions: the process of adding a website or a website URL to a web directory; it is one of the most popular forms of link building. Top directory submission sites: Dmoz, Yahoo Directory, Aol, etc.
2. Social Bookmarking: saving favorite URLs online is termed social bookmarking. Most of these sites give instant back links, which increases the authority of a website. These services allow you to tag news stories, blogs, videos, and audio files.
Top bookmarking sites:
3. Forum Submission:
Forum link building / posting is an SEO technique which helps in building back links to your website.
Forum posting is an internet marketing service which uses forum communities to build inbound links.
Find relevant forums using a search syntax like keyword: forums
Ex: web design: forums
Forum search engine:
Threads are called questions.
Replies are called answers.
Register on the forum site.
Click the user control panel.
Fill in the profile details and add the URL to get a back link.
Click save changes and take the destination URL.
Participate in relevant forums and reply to questions to get more traffic and back links.
4. Business Profile Creation:
To get more traffic from business listing sites, website owners need to analyze the type of business they are in and submit it to top local business listing sites like Yellow Pages, Justdial, IndiaMART, Hotfrog, Yelp, etc.
Register on the site and enter the business details, like the company name, working hours, services, top-priority keywords, address, contact number, images, videos, etc.

Local people who are looking for a particular product or service in local search engines find these listings very suitable, and there are more chances that they will enquire.
You get a permanent back link from a high-authority site.
It generates more online leads from the business listing portal.
It gives faster indexation in the search results for the targeted keywords.
5. Article Submission:
Article submission is the process of submitting an article about a product or service to article submission sites.
It is a part of content marketing.
An article consists of a title, an article body of about 500 words, and an author biography.
Rules for writing an article:
Select the title of the article based on the topic you have chosen.
Tool: HubSpot blog topic generator -> enter a keyword in the tool and it will give you different titles for your article.
The article should be a minimum of 500 words with a 1% keyword density (1 targeted keyword per 100 words). Use combinations of keywords to get faster indexation, instead of stuffing the same keyword 5 times.
Register on the article submission site and select the proper category in which to submit the article.
Enter the title, a summary of the article, the category, the article body, and the author biography.
2 hyperlinks can be given in each article, using the href tag or the insert-hyperlink option.
Give links on the targeted keywords to get back links from the high-authority article submission sites, and take the destination URL after submission.
Check any article for plagiarism using online plagiarism checker tools.
Promote unique content in the article submission sites to get approval for the original articles.
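The 1% keyword-density guideline above can be checked on a draft with a short script. This is a sketch; the sample article and keyword are made up:

```python
import re

# Keyword density = occurrences of the (possibly multi-word) keyword
# per 100 words of text, matching the ~1% guideline.
def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    # count every position where the keyword's words appear in sequence
    hits = sum(1 for i in range(len(words) - len(kw) + 1)
               if words[i:i + len(kw)] == kw)
    return hits / len(words) * 100 if words else 0.0

article = "SEO services improve ranking. Our SEO services are affordable."
print(f"{keyword_density(article, 'SEO services'):.1f}%")
```

For a real 500-word article, a density near 1% (about 5 occurrences) would match the rule above.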
Different types of links (a link is a connection between two web pages, or between a webpage and a document):
Back links: links coming back from other sites to your site (also termed inbound links).
Internal links: links present within the site.
External links: links which go away from your site; commonly these are social media plug-ins or other websites' links in the footer (also termed outbound links).

Footer links: the links which are present in the common footer of the website. Implement footer linking for the targeted keywords.
How to do competitor back link analysis for your website:
Every website holds some position in the search results for some targeted keywords.
Ex: a site coming in 6th position for the keyword "billing machine manufacturers".
All the other sites which come up in the search results are your competitors, but consider only the top 5 competitors, because they rank ahead of your search snippet.
Popular paid tools can be used to find competitor back links.
How to check your competitor back links for free:
Type links: in Google to see any website's back links.
Let's discuss a paid tool and explore its important features.
Create an account in the tool and log in.
Click Site Explorer and enter the site URL.
Under inbound links, select the "new" option -> referring domains: the domains from which back links are coming to your website.
Important metrics to check in the tool:
the number of referring domains, the number of referring pages, the anchor text used by the competitors, links coming from ccTLDs (country-code top-level domains), and the link score or authority score of each back link (find the Domain Authority, PageRank, and Alexa Rank of each back link to analyze it before submitting). Also check do-follow vs. no-follow back links.
Doing Off page and ON page alone is not sufficient; it is also necessary to analyze how the competitor sites reach the top positions. Instead of searching for new sites for off page, you can "steal" competitor back links and submit your website's links to their referring domains.
A comprehensive guide to Google Webmasters, explained like never before
Google Webmasters is a mediator, or communicator, between the webmaster (site owner) and the search engine. How can you know what's happening on your website in the view of search engines? You are doing SEO ON page and Off page continuously, but how are the search engine crawlers treating it?

You are optimizing your site with Meta tags, content, a sitemap, robots.txt, and many more factors, but how can you find out whether the search engine is facing any issue while crawling your site?
You do a lot of off page submissions for your site, like directory, bookmarking, profile creation, article submission, blog commenting, and many more. Do the crawlers really see all those links?
Is Googlebot able to identify your off page links without any errors?
Am I unknowingly doing any spam submissions?
Am I submitting to low-quality sites?
The answer to all the above questions is given by a tool called Google Webmasters (a tool for the webmaster to interact with the search engine). Note: to show information about all the above points, Google needs to know about your site, so you need to add your website using the procedure below.
Configuration of Google Webmasters Account:
1. Open the Webmasters tool and log in with Gmail.
2. Click "Add a property" in the top left corner.
3. On the next screen, add the website URL and click continue.
4. It will ask for verification of the website.
Let's say I have added my site; to prove that it is my website, I need to verify ownership.
Recommended method: download the HTML file, upload it to the root folder of your website, and click verify.
Alternate methods:
A) Add an HTML tag in the header (<head>) of the home page: copy and paste it into the common header source code of the home page (the header.php file).
B) Google Analytics: check the Google Analytics radio button and click verify (use this verification method if a Google Analytics account already exists but Webmasters does not).
Note: Use any one of the above methods to verify your website ownership (No need to use all)
OK, now you have verified your site; let's dig into the tool and see what information the search engine crawler is going to show us. After verification is completed, open the dashboard on the left side. Search Traffic, Google Index, and Crawl are the important tabs to discuss here.

Select Search traffic:
1. Performance: this data is also available in Google Analytics.
Here you can find the queries which are typed by users in the Google search box.
Find the number of clicks, impressions, and CTR for the users' queries, and observe which queries are getting fewer and which are getting more clicks over a period of time (select a date range, or compare dates); also analyze by search type, device, country, etc.
Note: if a query has thousands of impressions but no clicks, there might be a fault in the Title tag and Description tag which appear in the search results. Change them to meaningful sentences without keyword stuffing, so that your snippet gets more clicks.
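The clicks-vs-impressions check described in the note above amounts to computing CTR per query on exported data; the sample queries and numbers here are hypothetical:

```python
# Sketch: compute CTR (clicks / impressions) per query from exported
# Search Console data, flagging queries with many impressions but a
# low click-through rate as candidates for title/description rework.
queries = {
    "seo tutorial": {"clicks": 120, "impressions": 4000},
    "digital marketing guide": {"clicks": 2, "impressions": 9000},
}

for q, stats in queries.items():
    ctr = stats["clicks"] / stats["impressions"] * 100
    flag = "  <- review title/description" if ctr < 1 else ""
    print(f"{q}: CTR {ctr:.2f}%{flag}")
```

The flagged threshold (1%) is an illustrative cutoff, not a Google-defined value.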
2. Links: here you can see the back links your site has received.
Tip: download all the back links and analyze the quality of each back link using the MozBar or PageRank. If you have done ethical back link building then there is no need to check all the back links manually; it is only required to check whether any spam links are coming from low-quality sites.
3. Internal links: these are the links present within the website. Here we can check, for each page, how many other pages within the website link to it.
4. Manual Actions: by default it will show "no manual webspam actions found", so no need to worry!
5. International Targeting: enable the targeted country here to be favoured by the local search engines. Identify the targeted country of your website's business, i.e. on which country's Google domain you want your website to reach the 1st page (specify the targeted country here).
What happens if you have a website in 20 countries?
Ex: a separate sub domain for each country, and many others.
Solution: add all the sub domains, verify each sub domain using the above site-verification process, and enable international targeting for each country.
Google Index:
1. Index Coverage Status: check the indexation status of your website date by date.
Don't worry! The number of URLs indexed will vary every day; just watch the graph.
How to find the recent indexation of my website?
Type cache: followed by your site address in Google to find the most recent indexation date of your website.

2. Content Keywords: see the significance of your keywords in the web content.
3. Blocked Resources: no blocked resources, by default.
4. Remove URLs: here you can tell Google to remove a search snippet from the search results.
Enter the URL and click continue.
1. Crawl Errors: check the crawl errors regularly to find the broken links of the website.
Solve the crawl errors manually and click "mark as fixed" to remove the errors.
2. URL Inspection Tool (Fetch as Google): forcibly tells Google to fetch the non-indexed URLs.
Enter the URL and click fetch. Use this option if important URLs are not getting indexed in the search results for the targeted keywords (maximum URLs you can fetch: 20 per website).
3. Robots.txt: enter the robots.txt and click test to check the allow or disallow mode.
4. Sitemaps: here you can see the number of URLs submitted and indexed (ideally, the submitted and indexed counts are the same). Update the sitemap in the root folder and in Webmasters whenever URLs are removed from or added to the sitemap.
Click "add/test sitemap" and add sitemap.xml to refresh the sitemap.
Final words: monitor the Search Console account regularly to find issues with your site.
Maintain all your websites' Webmasters accounts under one Gmail account (e.g. if you have 10 sites then add those 10 sites under one common Gmail account). Check all the above factors at least three times a week for better results.