Search Engine Optimization (SEO)
UNIT-I: Overview: Performance: total clicks, total impressions, avg. CTR, avg. position -
URL Inspection: URL on Google (view), crawled page, view source, learn more option
Introduction to SEO
Search Engine Optimization (SEO) is the process of improving a website’s visibility
on search engines like Google, Bing, and Yahoo. The goal of SEO is to increase organic
(non-paid) traffic by ranking higher in search engine results pages (SERPs). A well-
optimized website attracts more visitors, improves user experience, and enhances credibility.
Why is SEO Important?
Increases Website Traffic: Higher rankings lead to more visibility and clicks.
Builds Trust and Credibility: Users trust websites that rank high on Google.
Improves User Experience: SEO involves optimizing website speed, mobile-
friendliness, and navigation.
Cost-Effective Marketing: Organic traffic is free compared to paid ads.
Gives a Competitive Advantage: Businesses with strong SEO outperform
competitors in online searches.
Types of SEO
1. On-Page SEO
On-Page SEO focuses on optimizing elements within a website to improve its search
rankings. Key aspects include (a markup sketch follows the list):
Keyword Optimization: Using relevant keywords in titles, headings, and content.
Meta Tags: Optimizing title tags and meta descriptions to improve CTR.
URL Structure: Keeping URLs short, descriptive, and keyword-rich.
Internal Linking: Linking related pages to improve navigation and SEO strength.
Content Quality: Creating high-quality, original, and engaging content.
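As a quick illustration of these elements, a minimal on-page markup sketch; the title, description, and heading text below are illustrative placeholders, not values from this course:

<head>
  <title>Best Running Shoes 2025 | Example Store</title>
  <meta name="description" content="Compare this year's best running shoes by price, cushioning, and durability.">
</head>
<body>
  <h1>Best Running Shoes 2025</h1>
  <p>Original content that uses the target keyword naturally and links to related pages on the site.</p>
</body>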
2. Off-Page SEO
Off-Page SEO refers to activities outside the website that influence rankings. Major
strategies include:
Backlinks: Getting high-quality links from other reputable websites.
Social Media Signals: Increased engagement on platforms like Facebook and Twitter
can impact SEO.
Brand Mentions: Positive mentions across the web boost credibility.
3. Technical SEO
Technical SEO improves website infrastructure to help search engines crawl and
index it efficiently. Key elements include:
Site Speed Optimization: Faster websites rank better.
Mobile-Friendliness: Websites should be responsive on all devices.
XML Sitemaps: Helps search engines understand website structure.
Secure Website (HTTPS): Secure sites are favored by search engines.
Introduction
In the digital marketing and SEO world, performance metrics play a crucial role in analyzing
the effectiveness of online campaigns. Four key performance indicators (KPIs) commonly
used to evaluate performance are:
1. Total Clicks
2. Total Impressions
3. Average Click-Through Rate (CTR)
4. Average Position
These metrics help businesses and website owners assess how well their website is
performing in search engine results and online advertising campaigns. In this document, we
will provide a detailed explanation of each metric, their significance, how they are calculated,
and examples to illustrate their use.
1. Total Clicks
Google Search Console Definition: Total clicks refer to the number of times users have
clicked on a website link in search engine results.
Google Ads Definition: The number of times users have clicked on an advertisement
in a PPC campaign.
SEO Definition: The count of user interactions where they click on an organic search
result.
Importance:
Example 1:
If a website appears in Google's search results 50,000 times and users click on the link 5,000
times, the Total Clicks value is 5,000.
Example 2:
Increase Clicks:
2. Total Impressions
Importance:
Example 1:
If a webpage is shown 100,000 times in search results, the Total Impressions value is
100,000, even if no one clicks on it.
Example 2:
A company runs an ad campaign for a new product. The ad appears 500,000 times across
various websites but only receives 5,000 clicks. The Total Impressions is 500,000.
Increase Impressions:
3. Average Click-Through Rate (CTR)
Marketing Definition: The percentage of users who click on a link after seeing it in
search results or an ad.
Google Ads Definition: The number of clicks divided by the number of impressions
for a specific ad.
SEO Definition: The percentage of searchers who click on a webpage in the search
engine results page (SERP).
Formula: CTR = (Total Clicks ÷ Total Impressions) × 100
Importance:
Example 1:
If a webpage receives 2,000 clicks from 40,000 impressions, the CTR is calculated as
follows: CTR = (2,000 ÷ 40,000) × 100 = 5%.
Example 2:
Improve CTR:
4. Average Position
SEO Definition: The ranking position of a webpage in search engine results pages
(SERPs).
Google Search Console Definition: The mean ranking of a webpage for all search
queries.
PPC Definition: The average ranking of a paid ad in search results.
Importance:
Example 1:
If a webpage ranks at position 3 for some keywords and at position 7 for others, the Average
Position is calculated as (3 + 7) ÷ 2 = 5. Thus, the Average Position is 5.
Example 2:
A website ranks in position 1 for 10 keywords, position 3 for 15 keywords, and position 5
for 5 keywords. The average position, weighted by the number of keywords, is
(1×10 + 3×15 + 5×5) ÷ 30 = 80 ÷ 30 ≈ 2.67. Thus, the Average Position is 2.67.
Improve Average Position:
1. Improve On-Page SEO – Optimizing headings, meta tags, and images helps in
rankings.
2. Use High-Quality Backlinks – Authoritative links improve credibility and rankings.
3. Target Long-Tail Keywords – Less competitive keywords improve visibility.
4. Enhance User Experience – Faster websites and better UI contribute to better
rankings.
5. Optimize for Voice Search – Many users search via voice, so optimizing for voice
queries helps.
URL Inspection: Google View, Crawled Page, View Source, and Learn
More Option
Introduction
URL Inspection is a critical tool in search engine optimization (SEO) and web
development that allows website owners to analyze how Google perceives a webpage. It
provides insights into indexing, crawling, and page rendering, helping webmasters optimize
their content for search engines.
We will explore each of these aspects in detail, providing multiple definitions, explanations,
examples, and strategies for effective usage.
1. URL on Google (View)
Google Search Console Definition: The URL on Google View refers to the indexed
version of a webpage as seen by Google.
SEO Definition: The state of a webpage within Google’s search index, determining
its visibility in search results.
Importance:
Example 1:
A website owner submits a blog post URL to Google Search Console and uses the URL
Inspection tool. The report shows that the page is indexed and displays in search results.
Example 2:
A business updates a product page but finds that Google is still displaying an older version.
Checking the URL on Google View confirms that Google has not yet re-crawled and indexed
the updated version.
Optimize:
2. Crawled Page
SEO Definition: The analysis of how Googlebot crawls and retrieves a webpage’s
content.
Googlebot Definition: A record of when and how Googlebot last accessed a page.
Technical SEO Definition: The study of HTTP status codes, JavaScript rendering,
and blocked resources during crawling.
Importance:
Example 1:
A developer finds that their JavaScript-heavy webpage isn’t appearing in search results. The
Crawled Page Report shows that Googlebot couldn’t execute JavaScript properly.
Example 2:
A website with multiple language versions experiences indexing issues. Crawled Page
Analysis reveals that the hreflang tags are incorrectly implemented, causing confusion for
Googlebot.
Improve Crawling:
1. Fix Broken Links – 404 errors prevent Googlebot from accessing pages.
2. Ensure Robots.txt is Correctly Configured – Avoid blocking important pages (see
the sketch after this list).
3. Use Internal Linking – Helps Google discover and crawl pages efficiently.
4. Optimize Server Response Time – Slow servers can limit Googlebot’s crawling
frequency.
5. Implement Structured Data – Enhances understanding of webpage content.
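As referenced in point 2 above, a minimal robots.txt sketch; the /admin/ path and the sitemap URL are hypothetical examples, not prescribed values. It blocks a private section while leaving the rest of the site crawlable, and points crawlers at the sitemap:

User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml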
3. View Source
Browser Definition: The HTML, CSS, and JavaScript code that constructs a
webpage.
Developer Definition: The raw markup that browsers and crawlers read to render a
webpage.
SEO Definition: A way to analyze metadata, structured data, and canonical tags.
Importance:
Example 1:
A webpage isn’t appearing in search results. Inspecting the View Source reveals a mistakenly
added <meta name="robots" content="noindex"> tag.
Example 2:
A website with rich snippets isn’t displaying star ratings in search results. Checking the View
Source confirms missing structured data for reviews.
1. Inspect Meta Tags – Ensure proper implementation of title, description, and robots
tags.
2. Check Canonical URLs – Avoid duplicate content issues.
3. Analyze Structured Data – Validate schema.org markup.
4. Ensure Proper H1-H6 Usage – Optimize header structure for SEO.
5. Look for Hidden Content – Some scripts might block important elements.
Example (simplified view-source of a page):
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>Google</title>
</head>
<body>
  <div id="main">
    <input type="text" placeholder="Search Google">
  </div>
</body>
</html>
4. Learn More Option for Troubleshooting
Importance:
Example 1:
A website owner sees a "Crawled – Not Indexed" warning in Search Console. Clicking
Learn More directs them to Google’s guidelines on resolving indexing issues.
Example 2:
An e-commerce store experiences slow loading times. The Learn More option provides tips
on optimizing images and using lazy loading.
UNIT-II: Index: Coverage: valid, excluded, valid with warnings, submitted and indexed,
discovery, referring page, pages with errors, valid pages - Sitemaps: add new sitemap,
submitted sitemaps, type, submitted, last read, status, discovered URLs.
Index Coverage refers to the process by which Google tracks the indexing status of
URLs on a website. This feature in Google Search Console helps webmasters understand
which pages are successfully indexed, which are excluded, and which have warnings or
errors.
Index Coverage is a report in Google Search Console that provides insights into the
URLs Googlebot encounters, categorizing them into valid, excluded, valid with warnings,
and errors.
The term "Index Coverage" indicates the extent to which Google has indexed a
website’s content. It identifies how pages are processed, highlighting issues affecting
visibility in search results.
Example:
A website owner notices that some of their blog posts are missing from Google search
results. They check the Index Coverage Report in Search Console and find that some pages
are marked "Excluded" due to canonicalization issues. They resolve this by correctly
implementing canonical tags.
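As a sketch of that fix (the URL is a placeholder), the preferred version of a page is declared with a canonical tag inside the page’s <head>:

<link rel="canonical" href="https://example.com/blog/original-post" />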
1. Valid Pages
Valid pages are successfully indexed and appear in Google Search results without any
errors or warnings.
A "Valid" status in the Index Coverage Report means Google has crawled and
indexed the page successfully, making it available for search rankings.
A valid page is any web page that meets Google's quality guidelines, is properly
structured, and is considered useful and relevant to users.
Example:
A website has an e-commerce product page that is correctly indexed, appears in search
results, and receives organic traffic from Google.
2. Excluded Pages
The "Excluded" status in Index Coverage means Google found the page but decided
not to index it, either due to website settings or algorithmic determinations.
Excluded pages are web pages that Googlebot has crawled but determined they should
not appear in search results, often due to duplication, poor quality, or explicit exclusion
settings.
Example:
A page containing duplicate content is set with a noindex tag, preventing it from appearing
in search results.
3. Valid with Warnings
Pages marked as "Valid with Warnings" are indexed but may have issues affecting
their search performance.
A "Valid with Warnings" page is an indexed page that Googlebot processed but
flagged for potential problems that might impact search rankings or user experience.
Example:
A blog post is indexed even though the robots.txt file disallows crawling, leading to a "Valid
with Warning" status.
Possible Causes:
Mismatched Canonical Tags: Google selects a different canonical than the one
specified.
HTTPS Issues: Mixed content errors affect indexing.
"Submitted and Indexed" refers to URLs submitted via a sitemap that Google
successfully indexed.
This status confirms that a URL included in an XML sitemap has been processed and
indexed by Google.
"Submitted and Indexed" means the page was manually reported to Google through a
sitemap or Search Console request, and Google determined it fit for inclusion in the index.
Example:
A new article submitted in the sitemap appears in Google Search after a few days.
Common Problems:
5. Discovery
Discovery refers to how Googlebot finds new URLs on a website, either through
sitemaps, internal links, or external links.
A page is in "Discovery" status when Google knows about it but hasn’t crawled or
indexed it yet.
Discovery is the phase where Google identifies a page but has not yet determined if or
when it should be crawled and indexed.
Example:
A newly published blog post that hasn't been crawled appears under "Discovered – currently
not indexed."
6. Referring Page
A "Referring Page" is the URL that links to the discovered page, helping Googlebot
find new content.
Referring pages act as pathways for Googlebot and users, assisting in page discovery
and link equity distribution.
Example:
A homepage linking to a new service page helps Google discover and index the new content.
7. Pages with Errors
Pages with errors are those Google attempted to index but encountered issues that
prevented successful processing.
An "Error" status in Index Coverage means Googlebot encountered problems like 404
errors, server issues, or incorrect redirects.
Error pages are URLs Googlebot could not successfully crawl and index due to client-
side or server-side issues, reducing their visibility in search.
Example:
A deleted blog post URL returning a "404 Not Found" error appears under "Pages with
Errors."
404 (Not Found): The page has been deleted or the URL is incorrect.
500 (Server Error): The server is down or misconfigured.
Redirect Loops: Pages stuck in infinite redirect chains.
Crawl Anomalies: Googlebot encountered unexpected issues while crawling.
8. Valid Pages
Valid pages are web pages that have been successfully crawled and indexed by Google.
These pages appear in search results and contribute to a website's visibility.
Valid pages fall into two sub-types:
Indexed, not submitted in sitemap: Google found and indexed the page even though
it was not included in the sitemap.
Example: A blog post linked internally but missing from the XML sitemap gets indexed.
Submitted and indexed: The page was submitted through the sitemap and
successfully indexed.
Example: A product page submitted in the sitemap appears in search results.
Introduction to Sitemaps
A sitemap is an XML or HTML file that provides search engines with a structured list
of a website’s URLs, helping crawlers discover and index content efficiently. It acts as a
guide that tells search engines which pages are most important and should be indexed.
A sitemap is a file that acts as a roadmap for search engine crawlers, listing pages,
images, videos, and other resources to improve website visibility. It ensures search engines
do not miss any essential content on a site, even if the internal linking structure is weak.
Example: An online store hosts a sitemap at https://example.com/sitemap.xml listing every
category and product URL, so crawlers can reach deep product pages even when internal
linking is weak.
Importance of Sitemaps
Sitemaps play a crucial role in search engine optimization (SEO). Here’s why they are
essential:
1. Faster Indexing: Search engines can quickly find and index new or updated pages.
2. Better Crawl Efficiency: Googlebot and other crawlers efficiently navigate a
website, ensuring all critical content is considered.
3. Improved SEO Performance: Pages listed in a sitemap have a higher chance of
being crawled and indexed, which supports ranking in search results.
4. Handling Large Websites: Large sites or those with poor internal linking structures
benefit greatly from a sitemap.
5. Helps in Google Discover: Sites that want better visibility in Google Discover need a
well-maintained sitemap.
Types of Sitemaps
1. XML Sitemaps
An XML (Extensible Markup Language) sitemap is a structured file that helps search
engines understand the URLs available for crawling. An XML sitemap is a structured
document designed for search engines, listing URLs along with metadata like last modified
date, change frequency, and priority. It provides a clear structure of the site’s content for
crawlers.
It is a machine-readable file that lists the URLs of a website along with metadata (e.g.,
last modified date, priority, and update frequency).
Example:
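A minimal XML sitemap sketch showing the metadata fields described above; the URL and date values are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2025-03-22</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>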
2. HTML Sitemaps
An HTML sitemap is a structured page designed to help visitors find content easily by
organizing links in a logical manner. It acts as a user-friendly roadmap, listing important links
that help visitors find content quickly.
An HTML sitemap serves as a secondary navigation tool for users, ensuring that
important pages remain easily accessible.
Example:
A blog site creates an HTML sitemap with categories and posts, improving internal linking
and enhancing user experience.
3. Video Sitemaps
A video sitemap contains metadata about video files, enabling search engines to
display video-rich results in search rankings. It helps Google understand video details such as
duration, category, and thumbnail location for better indexing in Google Video Search.
A video sitemap is an XML file specifically structured to provide search engines with
important information about embedded video content.
Example:
A news website submits a video sitemap to Google, ensuring video clips appear in search
results with thumbnails.
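A sketch of a single entry in a video sitemap using Google’s sitemap-video namespace; all URLs, the title, description, and duration are placeholders:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://example.com/news/video-page</loc>
    <video:video>
      <video:thumbnail_loc>https://example.com/thumbs/clip.jpg</video:thumbnail_loc>
      <video:title>News Clip Title</video:title>
      <video:description>Short description of the clip.</video:description>
      <video:content_loc>https://example.com/videos/clip.mp4</video:content_loc>
      <video:duration>120</video:duration>
    </video:video>
  </url>
</urlset>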
4. Image Sitemaps
An image sitemap lists image URLs, helping search engines index and display them
in image search results. An image sitemap helps search engines discover and index images
from a website, improving visibility in Google Image Search.
An image sitemap provides a structured format for listing image assets, ensuring
better indexing and visibility in image search results. It is an XML-based file that contains
direct links to images along with attributes such as captions, titles, and licenses.
An image sitemap helps search engines understand the context of images on a site by
associating them with their respective web pages.
Example:
A travel blog submits an image sitemap to ensure high-quality images appear in Google
Images search results.
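A sketch of an image sitemap entry using Google’s sitemap-image namespace; the page and image URLs are placeholders:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/travel/beach-guide</loc>
    <image:image>
      <image:loc>https://example.com/images/beach.jpg</image:loc>
    </image:image>
  </url>
</urlset>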
5. News Sitemaps
News sitemaps are designed for Google News, ensuring timely indexing of news
articles. A news sitemap is designed for Google News, helping search engines quickly
discover time-sensitive news articles.
A news sitemap helps Google prioritize fresh news content, ensuring it appears in
news-specific search results. It provides additional metadata such as publication date, title,
and language, ensuring that news articles appear promptly in search results.
A news sitemap is a specialized XML format used by publishers to inform search
engines about recently published articles.
Example:
A news portal submits a news sitemap containing breaking stories, ensuring they appear in
Google News results within minutes.
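A sketch of a news sitemap entry using Google’s sitemap-news namespace; the publication name, URL, date, and headline are placeholders:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://example.com/breaking-story</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2025-03-22</news:publication_date>
      <news:title>Breaking Story Headline</news:title>
    </news:news>
  </url>
</urlset>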
6. Mobile Sitemaps
Mobile sitemaps are a legacy format for feature-phone content; each URL entry carries a
<mobile:mobile/> tag so search engines can serve it to mobile-only crawlers.
7. RSS and Atom Sitemaps
RSS (Really Simple Syndication) and Atom sitemaps automatically update search engines
with new blog posts or articles. They are typically used for news sites or blogs where
content updates frequently.
Submitted
"Submitted" refers to the total number of URLs included in a sitemap that has been
submitted to Google Search Console. It represents the pages a website owner wants Google
to crawl and index.
Explanation:
When a sitemap is uploaded to Google Search Console, it contains a list of URLs that
the website owner wants Google to discover.
These URLs can include web pages, images, videos, or news articles depending on
the type of sitemap submitted.
Google does not guarantee that all submitted URLs will be indexed. It evaluates
each URL based on factors like content quality, uniqueness, and crawlability.
Example:
Imagine a website owner submits a sitemap with 1,000 URLs to Google. However, due to
duplicate content or restricted pages (e.g., robots.txt blocking), Google may choose to index
only 800 URLs out of the 1,000 submitted.
Last Read
"Last Read" refers to the last date and time Googlebot accessed and processed a
submitted sitemap.
Explanation:
Google periodically checks sitemaps to see if there are new or updated URLs.
If a sitemap is frequently updated (e.g., for a news website), Google will read it more
often.
Websites with static content (pages that don’t change often) may have their sitemap
read less frequently.
Example Scenario:
A news website submits a sitemap daily with newly published articles. Google’s Last Read
date might show “March 22, 2025” if the sitemap was processed on that day. If no updates
are made, the Last Read date may remain the same for weeks.
Reasons the Last Read date may be stale:
Google Not Reading the Sitemap Frequently – This can happen if the website has
low authority or few updates.
Sitemap Errors – If the sitemap has incorrect formatting, Google may not process it.
Server Issues – If the sitemap URL is slow to load or returns an error (e.g., 404 Not
Found), Google won’t read it.
Status
Explanation:
The sitemap status tells the website owner whether Google accepted or rejected the sitemap.
It can have one of the following states: Success, Has Issues, or Couldn’t Fetch.
Example:
A website submits a sitemap, and Google Search Console shows Status: Success. This means
all URLs were processed correctly. However, if the status says Has Issues, some URLs
might be blocked or unreachable.
"Couldn’t fetch" – Google can’t access the sitemap due to a broken URL.
"Invalid XML format" – The sitemap has incorrect XML structure.
"URLs not allowed" – The sitemap contains URLs blocked by robots.txt.
"Large sitemap file" – Google allows a maximum of 50,000 URLs per sitemap. If
exceeded, Google may reject it.
Discovered URLs
"Discovered URLs" refers to the total number of URLs Google found in the
submitted sitemap. These are potential pages that could be crawled and indexed.
Explanation:
The number of discovered URLs can be equal to or less than the submitted URLs.
Just because a URL is "discovered" doesn’t mean it will be indexed. Google decides
which URLs to index based on relevance and quality.
If discovered URLs are significantly lower than submitted URLs, it may indicate
issues with the sitemap or website structure.
Example:
A sitemap containing 2,000 pages is submitted. Google discovers 1,800 URLs from the
sitemap, meaning some URLs were ignored due to errors like broken links, redirects, or
restricted pages.
Reasons URLs may not be discovered:
Redirect Loops – If a page redirects multiple times, Google may ignore it.
Thin Content – Pages with little or no content may not be discovered.
Server Overload – If the website’s server is slow, Google may crawl fewer URLs.
Each of these elements contributes to better user engagement, faster load times, and
higher search rankings. Below is an in-depth explanation of each enhancement.
Core Web Vitals are key performance indicators (KPIs) used by Google to evaluate
the loading speed, interactivity, and visual stability of a webpage: Largest Contentful Paint
(LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS). These have been
ranking factors since Google’s Page Experience Update (2021).
4. Fast Loading – Mobile users expect instant loading, so page speed should be
optimized.
5. Touch-Friendly Navigation – Buttons and links should be easy to tap.
6. Readable Fonts & Layout – Text should be large enough without zooming.
Improve Mobile Usability
Use Google’s Mobile-Friendly Test to check your site.
Enable responsive design using CSS media queries (see the sketch after this list).
Avoid intrusive pop-ups that can negatively impact mobile UX.
Optimize images and use browser caching to reduce loading times.
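As referenced in the list above, a minimal sketch of the responsive basics; the class names and breakpoint are illustrative assumptions:

<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* Collapse a two-column layout into a single column on narrow screens */
  @media (max-width: 600px) {
    .sidebar { display: none; }
    .content { width: 100%; }
  }
</style>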
3. AMP (Accelerated Mobile Pages) – Faster Mobile Experience
AMP (Accelerated Mobile Pages) is an open-source framework by Google for building
fast-loading mobile web pages. It restricts heavy HTML and JavaScript elements, keeping
mobile pages lightweight and very fast.
How AMP Works
Uses a stripped-down version of HTML (AMP HTML).
Loads pages instantly by removing unnecessary elements like JavaScript.
Google caches AMP pages to make them load even faster in search results.
Advantages of AMP:
Faster Page Load – Improves user experience and reduces bounce rate.
Higher SEO Ranking – AMP pages often rank better in mobile searches.
Increased Ad Revenue – Faster pages mean users stay longer, improving ad
impressions.
Implement AMP
Convert existing pages into AMP by adding the amp attribute to the <html> tag (see
the sketch after this list).
Replace standard elements like <img> with <amp-img>.
Use Google AMP Validator to test AMP pages.
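As referenced above, a simplified AMP page sketch; the URLs are placeholders, and the mandatory AMP boilerplate <style> block is omitted here for brevity (a real page must include it to validate):

<!doctype html>
<html amp lang="en">
<head>
  <meta charset="utf-8">
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <link rel="canonical" href="https://example.com/article.html">
  <meta name="viewport" content="width=device-width">
  <!-- Mandatory AMP boilerplate <style> omitted for brevity -->
</head>
<body>
  <amp-img src="https://example.com/hero.jpg" width="600" height="400" layout="responsive"></amp-img>
</body>
</html>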
Breadcrumbs are navigational elements that show the hierarchy of a website’s pages. They
help users track their location within a website.
Types of Breadcrumbs:
Benefits of Breadcrumbs:
Add Breadcrumbs
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Home",
      "item": "https://example.com/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Electronics",
      "item": "https://example.com/electronics"
    },
    {
      "@type": "ListItem",
      "position": 3,
      "name": "Laptops",
      "item": "https://example.com/electronics/laptops"
    }
  ]
}
FAQ Schema is structured data that allows frequently asked questions to appear directly in
Google search results.
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is SEO?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "SEO stands for Search Engine Optimization, which helps improve website ranking in search engines."
      }
    }
  ]
}
HowTo Schema is structured data that marks up step-by-step instructions so they can
appear as rich results.
{
  "@context": "https://schema.org",
  "@type": "HowTo",
  "name": "How to Make Coffee",
"step": [
{
"@type": "HowToStep",
"name": "Boil Water",
"text": "Heat water to 95°C."
},
{
"@type": "HowToStep",
"name": "Add Coffee",
"text": "Pour hot water over ground coffee."
}
]
}
A Sitelinks Search Box allows users to search within a website directly from Google.
How to implement (JSON-LD example):
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
UNIT-IV: Security & Manual Actions: Manual actions-How do I remove Manual Actions in
Search Engine Optimisation-security issues and its report-
We will explore manual actions, their causes, how to remove them, security
issues, and their reports in SEO.
Manual actions are penalties applied by Google's human reviewers (not algorithms)
when a website violates Google’s Webmaster Guidelines. These penalties result in lower
rankings, deindexing (removal from search results), or warnings in Google Search
Console.
Unlike algorithmic penalties (such as Google’s Panda or Penguin updates, which are
automated), manual actions require a Google employee to review the site before applying
the penalty.
If a website receives a manual action, the website owner is notified through Google Search
Console (GSC).
When Google detects violations, it applies manual actions and notifies the website owner
through Google Search Console under the "Security & Manual Actions" section.
Lower Search Rankings – Affected pages rank lower in Google Search results.
Deindexing – In severe cases, the entire website is removed from Google Search.
Warnings in Search Console – Affected websites receive alerts in Google Search
Console.
Loss of Organic Traffic – A drop in rankings leads to reduced website traffic.
To check if your site has received a manual action, follow these steps:
1. Open Google Search Console.
2. Go to the "Security & Manual Actions" section.
3. Click "Manual Actions".
4. If a penalty exists, Google will provide details about the issue and the affected URLs.
Manual actions are penalties imposed by Google’s human reviewers when a website
violates Google’s Webmaster Guidelines. These actions can result in a drop in rankings,
removal from search results, or warnings in Google Search Console.
Below is a detailed breakdown of the different types of manual actions, their causes, and
examples of how they impact websites.
1. Pure Spam
This manual action is applied to websites that engage in aggressive spam tactics, including:
Example: A website creates thousands of pages using a bot, stuffing them with random
keywords and copied text from other websites. Google detects this and applies a Pure Spam
manual action, removing the site from search results.
How to Fix:
2. User-Generated Spam
This action is applied when users (not the website owner) create spammy content on a
website. This often happens on:
How to Fix:
3. Spammy Free Hosts
If Google finds that many spam websites are hosted on the same free hosting platform, it
may penalize the entire host or domain provider.
Example: A free website hosting service allows users to create unlimited websites. Over
time, hundreds of spammy, low-quality websites appear. Google deindexes the entire
hosting provider, affecting all websites on it.
How to Fix:
4. Unnatural Links to Your Site
Google considers backlinks a major ranking factor, but manipulating links violates
guidelines. This penalty occurs when a website acquires low-quality or paid backlinks to
manipulate rankings.
Example: A website owner buys 10,000 backlinks from spammy websites to boost rankings.
Google detects the unnatural pattern and penalizes the site.
How to Fix:
5. Unnatural Links from Your Site
Example: A blog owner charges money for adding “do-follow” links to unrelated websites.
Google detects this as a link scheme and applies a manual action.
How to Fix:
6. Thin Content with Little or No Added Value
Example: A website creates 100 pages with slightly reworded content but provides no
unique value. Google detects this and applies a penalty.
How to Fix:
7. Keyword Stuffing and Hidden Text
Google penalizes websites that overuse keywords unnaturally or hide text to manipulate
rankings.
How to Fix:
Security issues affect user safety and trust, leading to manual actions or security
warnings.
Hacked Site: If a website is hacked and used for spamming or distributing malware, Google
applies a manual action.
Example: A hacker inserts hidden links to spam websites into a blog. Google detects this
and penalizes the site.
How to Fix:
Cloaking and Sneaky Redirects: Cloaking is when a website shows different content to
Google and users. Redirects can also be deceptive.
Example: A page appears as an educational blog post to Google but redirects users to an
adult website.
How to Fix:
Google applies manual actions to local businesses and websites using misleading
structured data.
Businesses that use fake addresses or duplicate listings can receive penalties.
Example: A business creates multiple fake locations to dominate Google Maps results.
How to Fix:
Step 4: Remove Bad Backlinks Using Google’s Disavow Tool (If Needed)
If your manual action is due to unnatural links to your site, you need to remove or disavow those
bad backlinks.
4.1 How to Identify Bad Backlinks
1. Open Google Search Console
2. Go to Links > Top Linking Sites
3. Download your backlink report
4. Identify spammy, irrelevant, or paid backlinks
4.2 How to Disavow Bad Links
1. Create a Disavow File
o Open a Notepad (.txt) file
o List all spammy domain names like this:
domain:spamdomain1.com
domain:spamdomain2.com
2. Upload the Disavow File to Google
Visit Google Disavow Tool
Upload your .txt file
Click Submit
Step 5: Submit a Reconsideration Request to Google
After fixing the issue, you must ask Google to review your site and remove the manual action.
When a website is flagged as unsafe, users may receive warnings such as "Deceptive site
ahead" or "This site may be hacked."
Google Search Console (GSC) helps website owners identify and fix security issues.
A hacked website means that an attacker has gained unauthorized access to modify
the content, insert harmful scripts, or add spam links.
Affects SEO:
Example:
Fix It:
Scan your website using security tools (e.g., Sucuri, Wordfence, or Google Safe
Browsing).
Check Google Search Console > "Security Issues" for a list of hacked pages.
Restore your website from a clean backup (if available).
Change all admin passwords and remove unknown users.
Update WordPress, plugins, and themes (outdated versions are vulnerable).
Request a review from Google after fixing the issue.
Malware is malicious software that infects a website and affects its visitors. Phishing
is a deceptive practice where attackers create fake login pages to steal user credentials.
Example:
A hacker creates a fake "PayPal login page" on your website to steal user credentials.
Visitors who enter their details are unknowingly sending information to the hacker.
Deceptive content includes fake ads, misleading information, or hidden redirects that
take users to unrelated websites.
Example:
A webpage claims to provide free software downloads, but clicking the button
installs spyware or ransomware instead.
Fix It:
Use Google Search Console > "Security Issues" to identify deceptive content.
Scan your website files for unauthorized changes.
Manually check your website for any auto-redirects to suspicious sites.
Remove hidden scripts and fake ads that lead users to malicious sites.
Update website security and enforce strong passwords.
SQL Injection and XSS attacks occur when attackers exploit security vulnerabilities
in web forms to steal data or inject malicious code.
Example:
A hacker inserts an SQL command in a website’s login form to access the database.
An attacker uses JavaScript code to steal cookies and session tokens.
An SSL (Secure Sockets Layer) certificate encrypts data between a user’s browser
and the website. Google marks websites without SSL as “Not Secure.”
Example:
Google will list any security threats such as hacked pages, malware, deceptive
content, or phishing.
Click on "More Details" to view affected URLs.
Once the issue is resolved, request a security review in Google Search Console:
Google will review the website and remove the security warning if no threats remain.
UNIT-V: Legacy Tools and Reports: Links - settings - submit feedback - about new version -
international targeting - messages - URL parameters - web tools
Google Search Console (GSC) is a free tool provided by Google to help webmasters
monitor, maintain, and troubleshoot their site’s performance in Google Search results. Over
the years, Google has introduced new features and discontinued some older ones. However,
some of the older tools are still available under the Legacy Tools and Reports section.
Even though Google has gradually phased out some of these tools, they still offer
valuable insights for website owners. In this guide, we will deeply explore each of these
legacy tools and reports, their features, their purpose, and how they impact SEO.
Reduce deeply buried pages (pages that require many clicks to access).
Use descriptive anchor text when linking internally.
5. How to Analyze the Links Report for SEO
A. External Links Analysis (Backlink Profile)
1. Find Your Most Linked Pages
o Check if your most important pages have backlinks.
o If a low-value page has many backlinks, consider redirecting it to a more
relevant page.
2. Analyze Linking Websites
o Are they high-authority websites or spammy sites?
o If spammy, disavow bad links.
3. Check Anchor Text Distribution
o Ensure anchor text is relevant and not over-optimized.
B. Internal Links Analysis
1. Ensure Every Page Has Internal Links
o Pages with few or no internal links are harder for Google to find.
2. Improve Link Structure
o Use internal links to pass link equity to important pages.
3. Use Relevant Anchor Text
o Instead of "click here", use "read our SEO guide".
6. How to Fix Link-Related Issues
A. External Links Issues
1. Ownership Verification
Ownership verification ensures that only authorized users can access a website’s
performance data, indexing status, and SEO reports.
Verification Methods (see the sketch at the end of this subsection):
HTML File Upload: Download and upload a verification file to your website.
HTML Meta Tag: Add a meta tag inside the <head> section of your website’s
homepage.
Google Analytics (GA) Account: If you have GA access, you can verify ownership
via GA.
Google Tag Manager (GTM): If your site is set up with GTM, verification can be
done through it.
Domain Name Provider (DNS Verification): Add a TXT record to your domain’s
DNS settings.
How to Verify Ownership?
1. Open Google Search Console > Settings.
2. Click on "Ownership Verification".
3. Choose a verification method.
4. Follow the instructions provided by Google.
5. Click Verify.
6. Once verified, your site will be successfully added.
Note: If ownership is lost (e.g., DNS records are deleted), verification needs to be redone.
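As a sketch of two of the methods listed above (the token value is a placeholder that Google issues per property): the HTML meta tag goes in the homepage’s <head>, and the DNS method adds a TXT record at the domain:

<meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN" />

DNS TXT record value: google-site-verification=YOUR_VERIFICATION_TOKEN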
2. Users and Permissions
The Users and Permissions feature allows the owner to add or remove users and
control their level of access to Google Search Console data.
Google actively collects feedback from users to enhance the platform, fix bugs, and
improve user experience. Let’s dive deep into this feature and explore its importance, how
to use it, and best practices.
The About New Version section in Google Search Console (GSC) provides
information on updates, new features, and changes introduced in the latest versions of GSC.
Over the years, Google has replaced many old features with improved versions, offering
better usability, insights, and automation.
This section helps users understand the differences between the old Search Console and the
new version, including:
Newly added tools
User interface (UI) updates
Deprecated or removed features
How to adapt to the changes
Features
(a) Enhanced Reports & Insights
The new version provides 16 months of data (previously only 3 months).
Improved Performance Reports with better filtering options.
More detailed reports on indexing, mobile usability, and search enhancements.
(b) Improved Index Coverage Report
Shows which pages are indexed and which have errors.
Provides detailed reasons for indexing failures (e.g., crawl errors, blocked by
robots.txt, duplicate content).
Suggests fixes for indexing issues.
(c) Better URL Inspection Tool
Allows checking the live indexing status of a URL.
Shows how Googlebot sees your page.
Helps in identifying and fixing structured data errors.
(d) Simplified UI (User Interface)
More user-friendly design with easy navigation.
Mobile-friendly dashboard for managing sites on the go.
Clearer reports and graphs with better readability.
(e) Actionable Fix Recommendations
The new version suggests specific SEO fixes for:
Indexing issues
Mobile usability problems
Hreflang is an HTML attribute that tells search engines which language and regional
version of a page should be shown to users.
Example of Hreflang Implementation (for English and French pages):
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/" />
Country Targeting
If your website primarily serves users from a specific country, you can set a target
country in Google Search Console.
How to Set a Target Country
1. Open Google Search Console.
3. Manual Action Penalties
Google may issue a manual penalty if your site violates its guidelines. This can
result in lower rankings or deindexing from Google Search.
Common Messages:
"A manual action has been applied to your site" → Your site has been penalized
for violating Google’s policies.
"Unnatural links detected" → Spammy or paid links pointing to your site may lead
to ranking penalties.
"Thin content with little or no added value" → Your content is considered low
quality or duplicate.
4. Mobile Usability Issues
Google sends these messages if it detects issues that make your site unfriendly to
mobile users.
Common Messages:
"Text too small to read on mobile devices"
"Clickable elements too close together"
"Viewport not configured"
5. Performance and Ranking Alerts
These messages inform webmasters about changes in search traffic, structured data
issues, or ranking drops.
Common Messages:
"Your site’s traffic has dropped significantly" → A sudden drop in visitors from
Google Search.
"Errors in structured data markup detected" → Issues with schema markup (e.g.,
missing fields in rich snippets).
"New Search Console features available" → Notifications about updates and new
tools.
6. URL Removal Requests and Indexing Requests
If you or someone from your team requests the removal of a URL, Google will send a
confirmation message.
Common Messages:
"A request to remove URLs from Google Search has been processed"
"Your indexing request has been approved/rejected"
The Web Tools section in Google Search Console (GSC) consists of various utilities
that help webmasters monitor and optimize their websites. These tools assist in improving
site performance, mobile-friendliness, structured data implementation, and page speed.
Although many of these tools are now available as standalone services, they are still listed
under Legacy Tools and Reports in GSC.
1. Mobile-Friendly Test
The Mobile-Friendly Test checks whether a webpage is optimized for mobile
devices. Since Google uses mobile-first indexing, a mobile-friendly site improves SEO
rankings and user experience.
Features:
Detects responsive design compatibility.
Identifies mobile usability issues (e.g., small fonts, touch elements too close).
Provides screenshot preview of how the page appears on mobile.
Highlights page loading issues that affect mobile performance.
2. Rich Results Test
The Rich Results Test checks whether a webpage supports structured data for
enhanced search results. Structured data helps Google display rich snippets like star ratings,
FAQs, event details, recipes, etc.
Features:
Tests Schema.org markup (e.g., JSON-LD, Microdata, RDFa).
Identifies errors in structured data implementation.
Shows how the page will appear in search results.
Supports testing for Breadcrumbs, Reviews, FAQs, Products, Jobs, etc.
3. AMP Test (Accelerated Mobile Pages Test)
AMP (Accelerated Mobile Pages) is a framework designed to load web pages faster
on mobile devices. The AMP Test checks if a webpage is properly implemented with AMP
specifications.
Features:
Validates AMP HTML structure.
Detects errors in AMP implementation.
Provides a preview of how the AMP page will appear in search results.
Checks for AMP-specific issues (e.g., missing required tags, script errors).
4. Page Speed Insights
The Page Speed Insights (PSI) tool analyzes webpage loading speed and provides
performance improvement suggestions. It measures speed on both mobile and desktop
devices.
Features:
Provides Core Web Vitals (Largest Contentful Paint, First Input Delay, Cumulative
Layout Shift).