Introduction to SEO

The document provides a comprehensive overview of Search Engine Optimization (SEO), detailing its importance, types (On-Page, Off-Page, Technical), and key performance metrics (Total Clicks, Total Impressions, Average CTR, Average Position). It also discusses URL Inspection tools and Index Coverage in Google Search Console, which help webmasters analyze and optimize their websites for better visibility in search engine results. The content emphasizes strategies for improving SEO performance and understanding how search engines index web pages.

Uploaded by Rama Krishna

Search Engine Optimization (SEO)

UNIT-I: Overview: Performance: total clicks, total impressions, avg. CTR, avg. position -
URL Inspection: URL on Google - view crawled page - view source - learn more option

What is Search Engine Optimization (SEO)?

Introduction to SEO
Search Engine Optimization (SEO) is the process of improving a website’s visibility
on search engines like Google, Bing, and Yahoo. The goal of SEO is to increase organic
(non-paid) traffic by ranking higher in search engine results pages (SERPs). A well-
optimized website attracts more visitors, improves user experience, and enhances credibility.
Why is SEO Important?
 Increases Website Traffic: Higher rankings lead to more visibility and clicks.
 Builds Trust and Credibility: Users trust websites that rank high on Google.
 Improves User Experience: SEO involves optimizing website speed, mobile-
friendliness, and navigation.
 Cost-Effective Marketing: Organic traffic is free compared to paid ads.
 Gives a Competitive Advantage: Businesses with strong SEO outperform
competitors in online searches.
Types of SEO
1. On-Page SEO
On-Page SEO focuses on optimizing elements within a website to improve its search
rankings. Key aspects include:
 Keyword Optimization: Using relevant keywords in titles, headings, and content.
 Meta Tags: Optimizing title tags and meta descriptions to improve CTR.
 URL Structure: Keeping URLs short, descriptive, and keyword-rich.
 Internal Linking: Linking related pages to improve navigation and SEO strength.
 Content Quality: Creating high-quality, original, and engaging content.
2. Off-Page SEO
Off-Page SEO refers to activities outside the website that influence rankings. Major
strategies include:
 Backlinks: Getting high-quality links from other reputable websites.
 Social Media Signals: Increased engagement on platforms like Facebook and Twitter
can impact SEO.
 Brand Mentions: Positive mentions across the web boost credibility.

3. Technical SEO
Technical SEO improves website infrastructure to help search engines crawl and
index it efficiently. Key elements include:
 Site Speed Optimization: Faster websites rank better.
 Mobile-Friendliness: Websites should be responsive on all devices.
 XML Sitemaps: Helps search engines understand website structure.
 Secure Website (HTTPS): Secure sites are favored by search engines.


Overview of Performance Metrics:

Introduction

In the digital marketing and SEO world, performance metrics play a crucial role in analyzing
the effectiveness of online campaigns. Four key performance indicators (KPIs) commonly
used to evaluate performance are:

1. Total Clicks
2. Total Impressions
3. Average Click-Through Rate (CTR)
4. Average Position

These metrics help businesses and website owners assess how well their website is
performing in search engine results and online advertising campaigns. In this document, we
will provide a detailed explanation of each metric, their significance, how they are calculated,
and examples to illustrate their use.

1. Total Clicks

 Google Analytics Definition: Total clicks refer to the number of times users have
clicked on a website link in search engine results.
 Google Ads Definition: The number of times users have clicked on an advertisement
in a PPC campaign.
 SEO Definition: The count of user interactions where they click on an organic search
result.

Importance:

 Measures user engagement with the website
 Helps in evaluating the effectiveness of search rankings
 Determines whether the content is compelling enough to attract users

Example 1:

If a website appears in Google's search results 50,000 times and users click on the link 5,000
times, the Total Clicks value is 5,000.

Example 2:

A business running a PPC (Pay-Per-Click) advertisement receives 10,000 impressions on Google Ads. If 500 users click on the ad, then the Total Clicks is 500.

Increase Clicks:

1. Improve Meta Descriptions – A well-written, compelling meta description can increase click rates.
2. Use Power Words in Titles – Words like "best," "ultimate," and "essential" attract
user attention.


3. Optimize for Featured Snippets – Appearing in position zero increases visibility and clicks.
4. Use Rich Snippets – Schema markup enhances search results with additional
information like star ratings.
5. A/B Testing Headlines – Testing different variations of titles helps in understanding
what resonates with the audience.
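As a sketch of the first two tips, the title tag and meta description live in the page's HTML head. The wording and URL below are purely illustrative, not taken from any real site:

```html
<head>
  <!-- Title tag: shown as the clickable headline in search results -->
  <title>10 Essential SEO Tips for Beginners | Example Blog</title>
  <!-- Meta description: shown as the snippet below the headline; a compelling
       summary here can raise CTR even without a ranking change -->
  <meta name="description"
        content="Learn the essential on-page SEO techniques every beginner needs,
                 with step-by-step examples.">
</head>
```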

2. Total Impressions

 SEO Definition: The number of times a webpage is displayed in search engine results.
 Google Ads Definition: The number of times an ad is shown on the Google Display
Network or search results.
 Social Media Definition: The total number of times a post or ad appears in a user’s
feed.

Importance:

 Indicates how often a website or ad is being displayed
 Helps in understanding the visibility of a webpage
 Aids in comparing the performance of different keywords

Example 1:

If a webpage is shown 100,000 times in search results, the Total Impressions value is
100,000, even if no one clicks on it.

Example 2:

A company runs an ad campaign for a new product. The ad appears 500,000 times across
various websites but only receives 5,000 clicks. The Total Impressions is 500,000.

Increase Impressions:

1. Target High-Search Volume Keywords – Optimizing content with popular keywords ensures more visibility.
2. Increase Content Frequency – Publishing more quality content can improve the
reach.
3. Leverage Social Media Promotion – Sharing content across platforms can increase
visibility.
4. Optimize for Google Discover – Ensuring mobile-friendly, high-quality content
helps in appearing in Discover feeds.
5. Use Google Ads Extensions – Ad extensions enhance visibility in search ads.

3. Average Click-Through Rate (CTR)

 Marketing Definition: The percentage of users who click on a link after seeing it in
search results or an ad.


 Google Ads Definition: The number of clicks divided by the number of impressions
for a specific ad.
 SEO Definition: The percentage of searchers who click on a webpage in the search
engine results page (SERP).

Formula:

CTR = (Total Clicks / Total Impressions) × 100

Importance:

 Determines the effectiveness of keywords and metadata
 Helps in improving search engine optimization (SEO) strategies
 Assists in evaluating ad performance

Example 1:

If a webpage receives 2,000 clicks from 40,000 impressions, the CTR is calculated as
follows:

CTR = (2,000 / 40,000) × 100 = 5%

Thus, the Average CTR is 5%.

Example 2:

An e-commerce website’s ad campaign receives 1,000,000 impressions and gets 50,000
clicks. The CTR is calculated as:

CTR = (50,000 / 1,000,000) × 100 = 5%
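The two worked examples above can be reproduced with a short sketch. The function name is ours, not part of any SEO tool:

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """Return CTR as a percentage: (clicks / impressions) * 100."""
    if impressions == 0:
        return 0.0
    # Multiply before dividing so whole-number cases come out exact.
    return clicks * 100 / impressions

# Example 1: 2,000 clicks from 40,000 impressions -> 5.0 (%)
print(click_through_rate(2_000, 40_000))
# Example 2: 50,000 clicks from 1,000,000 impressions -> 5.0 (%)
print(click_through_rate(50_000, 1_000_000))
```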

Improve CTR:

1. Optimize Title Tags and Meta Descriptions – Compelling titles improve engagement.
2. Use Emotional Triggers in Content – Words like "amazing," "secret," and
"exclusive" attract attention.
3. Ensure Mobile-Friendliness – Mobile-optimized pages perform better in search.
4. Improve Page Load Speed – Faster pages result in higher engagement and clicks.
5. Use Descriptive URLs – URLs with keywords can improve CTR.

4. Average Position

 SEO Definition: The ranking position of a webpage in search engine results pages
(SERPs).
 Google Search Console Definition: The mean ranking of a webpage for all search
queries.
 PPC Definition: The average ranking of a paid ad in search results.

Importance:

 Helps in tracking search ranking trends
 Indicates the effectiveness of SEO efforts
 Allows businesses to optimize keywords for better visibility


Example 1:

If a webpage ranks at position 3 for some keywords and at position 7 for others (with equal
keyword counts), the Average Position is calculated as: (3 + 7) / 2 = 5. Thus, the Average
Position is 5.

Example 2:

A website ranks in position 1 for 10 keywords, position 3 for 15 keywords, and position 5
for 5 keywords. The average position is: (1 × 10 + 3 × 15 + 5 × 5) / 30 = 80 / 30 ≈ 2.67.
Thus, the Average Position is 2.67.
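The weighted average used in these examples can be sketched as follows; the function and its input format are ours, not Search Console's:

```python
def average_position(rankings):
    """Weighted mean SERP position.

    rankings: list of (position, keyword_count) pairs, e.g. the page
    ranks at `position` for `keyword_count` different queries.
    """
    total_keywords = sum(count for _, count in rankings)
    weighted_sum = sum(pos * count for pos, count in rankings)
    return weighted_sum / total_keywords

# Example 1: positions 3 and 7, equally weighted -> 5.0
print(average_position([(3, 1), (7, 1)]))
# Example 2: (1*10 + 3*15 + 5*5) / 30 -> approximately 2.67
print(round(average_position([(1, 10), (3, 15), (5, 5)]), 2))
```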

Improve Average Position:

1. Improve On-Page SEO – Optimizing headings, meta tags, and images helps in
rankings.
2. Use High-Quality Backlinks – Authoritative links improve credibility and rankings.
3. Target Long-Tail Keywords – Less competitive keywords improve visibility.
4. Enhance User Experience – Faster websites and better UI contribute to better
rankings.
5. Optimize for Voice Search – Many users search via voice, so optimizing for voice
queries helps.

URL Inspection: Google View, Crawled Page, View Source, and Learn
More Option

Introduction

URL Inspection is a critical tool in search engine optimization (SEO) and web
development that allows website owners to analyze how Google perceives a webpage. It
provides insights into indexing, crawling, and page rendering, helping webmasters optimize
their content for search engines.

The key components of URL Inspection include:

 URL on Google View
 Crawled Page Analysis
 Viewing the Source Code
 Learn More Option for Troubleshooting

We will explore each of these aspects in detail, providing multiple definitions, explanations,
examples, and strategies for effective usage.

1. URL on Google View

 Google Search Console Definition: The URL on Google View refers to the indexed
version of a webpage as seen by Google.
 SEO Definition: The state of a webpage within Google’s search index, determining
its visibility in search results.


 Web Developer’s Definition: A snapshot of how Google renders and caches a webpage after crawling.

Importance:

 Confirms whether a webpage is indexed by Google.
 Helps diagnose indexing and rendering issues.
 Ensures that the correct version of the page appears in search results.

Example 1:

A website owner submits a blog post URL to Google Search Console and uses the URL
Inspection tool. The report shows that the page is indexed and displays in search results.

Example 2:

A business updates a product page but finds that Google is still displaying an older version.
Checking the URL on Google View confirms that Google has not yet re-crawled and indexed
the updated version.

Optimize:

1. Ensure Mobile-Friendliness – Google prioritizes mobile-first indexing.
2. Use Canonical Tags – Avoid duplicate content issues.
3. Submit Sitemap to Google – Speed up indexing.
4. Request Indexing Manually – Use Google Search Console for immediate indexing.
5. Improve Page Load Speed – Faster pages get indexed and ranked better.

2. Crawled Page Analysis

 SEO Definition: The analysis of how Googlebot crawls and retrieves a webpage’s
content.
 Googlebot Definition: A record of when and how Googlebot last accessed a page.
 Technical SEO Definition: The study of HTTP status codes, JavaScript rendering,
and blocked resources during crawling.

Importance:

 Determines if Google can access and understand webpage content.
 Identifies crawling errors like 404 (Not Found) or 500 (Server Error).
 Ensures critical resources (CSS, JavaScript) are correctly loaded.

Example 1:

A developer finds that their JavaScript-heavy webpage isn’t appearing in search results. The
Crawled Page Report shows that Googlebot couldn’t execute JavaScript properly.


Example 2:

A website with multiple language versions experiences indexing issues. Crawled Page
Analysis reveals that the hreflang tags are incorrectly implemented, causing confusion for
Googlebot.

Improve Crawlability:

1. Fix Broken Links – 404 errors prevent Googlebot from accessing pages.
2. Ensure Robots.txt is Correctly Configured – Avoid blocking important pages.
3. Use Internal Linking – Helps Google discover and crawl pages efficiently.
4. Optimize Server Response Time – Slow servers can limit Googlebot’s crawling
frequency.
5. Implement Structured Data – Enhances understanding of webpage content.
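As a sketch of tip 2, a minimal robots.txt that blocks only a private section while pointing crawlers at the sitemap might look like this (the paths and domain are illustrative, not from any real site):

```text
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

A common mistake is a blanket `Disallow: /`, which blocks Googlebot from the entire site and leads to the crawl errors described above.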

3. Viewing Source Code (View Source)

 Browser Definition: The HTML, CSS, and JavaScript code that constructs a
webpage.
 Developer Definition: The raw markup that browsers and crawlers read to render a
webpage.
 SEO Definition: A way to analyze metadata, structured data, and canonical tags.

Importance:

 Verifies meta tags, Open Graph data, and schema markup.
 Identifies hidden elements affecting SEO (e.g., noindex tags).
 Helps debug issues related to JavaScript rendering.

Example 1:

A webpage isn’t appearing in search results. Inspecting the View Source reveals a mistakenly
added <meta name="robots" content="noindex"> tag.

Example 2:

A website with rich snippets isn’t displaying star ratings in search results. Checking the View
Source confirms missing structured data for reviews.

How to Use View Source Effectively:

1. Inspect Meta Tags – Ensure proper implementation of title, description, and robots
tags.
2. Check Canonical URLs – Avoid duplicate content issues.
3. Analyze Structured Data – Validate schema.org markup.
4. Ensure Proper H1-H6 Usage – Optimize header structure for SEO.
5. Look for Hidden Content – Some scripts might block important elements.


For instance, viewing the source of a minimal search page might show markup like this:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>Google</title>
</head>
<body>
  <div id="main">
    <input type="text" placeholder="Search Google">
  </div>
</body>
</html>
4. Learn More Option for Troubleshooting

 Google Support Definition: A feature in Google Search Console that provides additional guidance for errors and warnings.
 SEO Definition: A knowledge base offering solutions for indexing, crawling, and
ranking issues.
 Webmaster Definition: A help section with links to Google documentation, forums,
and troubleshooting steps.

Importance:

 Offers step-by-step solutions for resolving SEO issues.
 Provides Google’s official best practices.
 Helps webmasters understand complex indexing problems.

Example 1:

A website owner sees a "Crawled – Not Indexed" warning in Search Console. Clicking
Learn More directs them to Google’s guidelines on resolving indexing issues.

Example 2:

An e-commerce store experiences slow loading times. The Learn More option provides tips
on optimizing images and using lazy loading.

Use the Learn More Feature:

1. Click on Errors in Search Console – Access Google’s recommended fixes.
2. Follow Google’s Best Practices – Ensure compliance with guidelines.
3. Participate in Google Webmaster Forums – Seek community support.
4. Stay Updated on Algorithm Changes – Google frequently updates ranking factors.
5. Use Google’s Mobile-Friendly Test – Identify mobile usability issues.


UNIT-II: Index: Coverage: valid, excluded, valid with warnings, submitted and indexed,
discovery, referring page, pages with errors, valid pages - Sitemaps - add new sitemap,
submitted sitemaps, type, submitted, last read, status, discovered URLs.

Introduction to Index Coverage

Index Coverage refers to the process by which Google tracks the indexing status of
URLs on a website. This feature in Google Search Console helps webmasters understand
which pages are successfully indexed, which are excluded, and which have warnings or
errors.

Index Coverage is a report in Google Search Console that provides insights into the
URLs Googlebot encounters, categorizing them into valid, excluded, valid with warnings,
and errors.

The term "Index Coverage" indicates the extent to which Google has indexed a
website’s content. It identifies how pages are processed, highlighting issues affecting
visibility in search results.

Example:

A website owner notices that some of their blog posts are missing from Google search
results. They check the Index Coverage Report in Search Console and find that some pages
are marked "Excluded" due to canonicalization issues. They resolve this by correctly
implementing canonical tags.
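A canonical tag of the kind mentioned above is a single link element in the page head; it tells Google which URL is the preferred version when several pages carry the same content (the URL is illustrative):

```html
<link rel="canonical" href="https://www.example.com/blog/seo-basics/">
```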

Types of Index Coverage Statuses

1. Valid Pages

Valid pages are successfully indexed and appear in Google Search results without any
errors or warnings.

A "Valid" status in the Index Coverage Report means Google has crawled and
indexed the page successfully, making it available for search rankings.

A valid page is any web page that meets Google's quality guidelines, is properly
structured, and is considered useful and relevant to users.

Example:

A website has an e-commerce product page that is correctly indexed, appears in search
results, and receives organic traffic from Google.

Common Issues That Can Affect Valid Pages:

 Incorrect robots.txt rules blocking crawling
 Canonicalization pointing to a different page
 Server response issues (5xx errors)


2. Excluded Pages

Excluded pages are those intentionally or unintentionally omitted from Google’s index due to directives such as noindex or robots.txt restrictions.

The "Excluded" status in Index Coverage means Google found the page but decided
not to index it, either due to website settings or algorithmic determinations.

Excluded pages are web pages that Googlebot has crawled but determined they should
not appear in search results, often due to duplication, poor quality, or explicit exclusion
settings.

Example:

A page containing duplicate content is set with a noindex tag, preventing it from appearing
in search results.

Common Reasons for Exclusion:

 Noindex Meta Tag: Pages explicitly marked with <meta name="robots" content="noindex">
 Canonicalization: Google selects a different URL as the preferred version.
 Soft 404 Errors: Google thinks a page doesn’t provide unique value.
 Blocked by robots.txt: The page is restricted from crawling.

3. Valid with Warnings

Pages marked as "Valid with Warnings" are indexed but may have issues affecting
their search performance.

A page is classified as "Valid with Warnings" when Google indexed it despite conflicting signals, such as being blocked by robots.txt.

A "Valid with Warnings" page is an indexed page that Googlebot processed but
flagged for potential problems that might impact search rankings or user experience.

Example:

A blog post is indexed even though the robots.txt file disallows crawling, leading to a "Valid
with Warning" status.

Possible Causes:

 Indexing despite robots.txt restrictions
 Canonical conflicts
 Mobile usability issues

Common Errors That Lead to Warnings:

 Blocked Resources: Some page elements (CSS, JavaScript) are blocked.


 Mismatched Canonical Tags: Google selects a different canonical than the one
specified.
 HTTPS Issues: Mixed content errors affect indexing.

4. Submitted and Indexed

"Submitted and Indexed" refers to URLs submitted via a sitemap that Google
successfully indexed.

This status confirms that a URL included in an XML sitemap has been processed and
indexed by Google.

"Submitted and Indexed" means the page was manually reported to Google through a
sitemap or Search Console request, and Google determined it fit for inclusion in the index.

Example:

A new article submitted in the sitemap appears in Google Search after a few days.

Common Problems:

 Sitemap errors preventing correct submission
 Duplicate URLs leading to incorrect indexing
 Slow crawling due to low page authority

5. Discovery

Discovery refers to how Googlebot finds new URLs on a website, either through
sitemaps, internal links, or external links.

A page is in "Discovery" status when Google knows about it but hasn’t crawled or
indexed it yet.

Discovery is the phase where Google identifies a page but has not yet determined if or
when it should be crawled and indexed.

Example:

A newly published blog post that hasn't been crawled appears under "Discovered – currently
not indexed."

Common Causes of Delayed Discovery:

 Poor internal linking structure
 Slow crawling due to server issues
 Low-quality backlinks affecting authority


6. Referring Page

A "Referring Page" is the URL that links to the discovered page, helping Googlebot
find new content.

A referring page is an existing indexed page that contains a hyperlink pointing to another page.

Referring pages act as pathways for Googlebot and users, assisting in page discovery
and link equity distribution.

Example:

A homepage linking to a new service page helps Google discover and index the new content.

Common Issues with Referring Pages:

 Broken links leading to 404 errors
 Orphan pages with no internal links
 Redirect chains causing indexing delays

7. Pages with Errors

Pages with errors are those Google attempted to index but encountered issues that
prevented successful processing.

An "Error" status in Index Coverage means Googlebot encountered problems like 404
errors, server issues, or incorrect redirects.

Error pages are URLs Googlebot could not successfully crawl and index due to client-
side or server-side issues, reducing their visibility in search.

Example:

A deleted blog post URL returning a "404 Not Found" error appears under "Pages with
Errors."

Common Errors and Their Causes:

 404 (Not Found): The page has been deleted or the URL is incorrect.
 500 (Server Error): The server is down or misconfigured.
 Redirect Loops: Pages stuck in infinite redirect chains.
 Crawl Anomalies: Googlebot encountered unexpected issues while crawling.

8. Valid Pages

Valid pages are web pages that have been successfully crawled and indexed by Google.
These pages appear in search results and contribute to a website's visibility.


Subtypes of Valid Pages:

1. Indexed, Not Submitted in Sitemap

 Google found and indexed the page even though it was not included in the
sitemap.
 Example: A blog post linked internally but missing from the XML sitemap
gets indexed.

2. Submitted and Indexed

 The page was submitted through the sitemap and successfully indexed.
 Example: A product page submitted in the sitemap appears in search
results.

3. Indexed, But Blocked by Robots.txt

 Google indexed the page despite a robots.txt rule blocking crawling.
 Example: A test page meant to be hidden appears in Google Search due to
external links.

4. Canonical Page Indexed

 The preferred version of a duplicate page is indexed based on the canonical tag.
 Example: A webpage with multiple language versions has its primary
version indexed.

5. Indexed, Serving Alternate Page with Proper Canonical Tag

 Google indexed a page but decided to show another version as the canonical one.
 Example: A mobile and desktop version exists, but only one is shown in
search results.

Introduction to Sitemaps

A sitemap is an XML or HTML file that provides search engines with a structured list
of a website’s URLs, helping crawlers discover and index content efficiently. It acts as a
guide that tells search engines which pages are most important and should be indexed.

A sitemap is a file that acts as a roadmap for search engine crawlers, listing pages,
images, videos, and other resources to improve website visibility. It ensures search engines
do not miss any essential content on a site, even if the internal linking structure is weak.

Sitemaps are structured representations of a website’s content hierarchy, helping
search engines understand relationships between pages and prioritize indexing. They can also
contain metadata, such as when a page was last updated and how often it changes.

Example:


An e-commerce website with thousands of product pages submits a sitemap to Google
Search Console to ensure all pages get indexed properly. Without a sitemap, Google might
struggle to discover deep product pages, reducing organic search visibility.

Importance of Sitemaps

Sitemaps play a crucial role in search engine optimization (SEO). Here’s why they are
essential:

1. Faster Indexing: Search engines can quickly find and index new or updated pages.
2. Better Crawl Efficiency: Googlebot and other crawlers efficiently navigate a
website, ensuring all critical content is considered.
3. Improved SEO Performance: Pages listed in a sitemap have a higher chance of
ranking in search results.
4. Handling Large Websites: Large sites or those with poor internal linking structures
benefit greatly from a sitemap.
5. Helps in Google Discover: Sites that want better visibility in Google Discover need a
well-maintained sitemap.

Types of Sitemaps

1. XML Sitemaps

An XML (Extensible Markup Language) sitemap is a structured file that helps search
engines understand the URLs available for crawling. An XML sitemap is a structured
document designed for search engines, listing URLs along with metadata like last modified
date, change frequency, and priority. It provides a clear structure of the site’s content for
crawlers.

It is a machine-readable file that lists the URLs of a website along with metadata (e.g.,
last modified date, priority, and update frequency).

Example of an XML Sitemap:
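The original figure is missing here; the sketch below shows the standard sitemaps.org structure with the metadata fields described above. The URLs and dates are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2025-03-22</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
    <lastmod>2025-03-20</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```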


2. HTML Sitemaps

An HTML sitemap is a user-friendly page listing all important website links to
improve navigation and user experience. An HTML sitemap is a web page that provides a
structured list of website pages for users to navigate easily.

An HTML sitemap is a structured page designed to help visitors find content easily by
organizing links in a logical manner. It acts as a user-friendly roadmap, listing important links
that help visitors find content quickly.

An HTML sitemap serves as a secondary navigation tool for users, ensuring that
important pages remain easily accessible.

Example:

A blog site creates an HTML sitemap with categories and posts, improving internal linking
and enhancing user experience.
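A blog's HTML sitemap of this kind might be a plain page of nested links; the categories and paths below are illustrative:

```html
<!-- Simplified HTML sitemap page -->
<h1>Sitemap</h1>
<ul>
  <li><a href="/category/seo/">SEO</a>
    <ul>
      <li><a href="/seo/on-page-basics/">On-Page SEO Basics</a></li>
      <li><a href="/seo/link-building/">Link Building Guide</a></li>
    </ul>
  </li>
  <li><a href="/category/analytics/">Analytics</a></li>
  <li><a href="/about/">About Us</a></li>
</ul>
```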


3. Video Sitemaps

A video sitemap helps search engines discover video content on a website by
providing additional details like title, description, and duration. A video sitemap provides
search engines with metadata about video content hosted on a website.

A video sitemap contains metadata about video files, enabling search engines to
display video-rich results in search rankings. It helps Google understand video details such as
duration, category, and thumbnail location for better indexing in Google Video Search.

A video sitemap is an XML file specifically structured to provide search engines with
important information about embedded video content.

Example:

A news website submits a video sitemap to Google, ensuring video clips appear in search
results with thumbnails.
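A video sitemap for the news-site scenario above might look like the sketch below, which uses Google's video sitemap extension namespace; all URLs and titles are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/news/video-report</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/report.jpg</video:thumbnail_loc>
      <video:title>Breaking News Report</video:title>
      <video:description>A short clip covering today's top story.</video:description>
      <video:duration>120</video:duration>
    </video:video>
  </url>
</urlset>
```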

4. Image Sitemaps

An image sitemap lists image URLs, helping search engines index and display them
in image search results. An image sitemap helps search engines discover and index images
from a website, improving visibility in Google Image Search.

An image sitemap provides a structured format for listing image assets, ensuring
better indexing and visibility in image search results. It is an XML-based file that contains
direct links to images along with attributes such as captions, titles, and licenses.

An image sitemap helps search engines understand the context of images on a site by
associating them with their respective web pages.

Example:


A travel blog submits an image sitemap to ensure high-quality images appear in Google
Images search results.
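The travel-blog example above could use an image sitemap like the sketch below, built on Google's image sitemap extension namespace; the page and image URLs are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/destinations/bali</loc>
    <image:image>
      <image:loc>https://www.example.com/images/bali-beach.jpg</image:loc>
    </image:image>
    <image:image>
      <image:loc>https://www.example.com/images/bali-temple.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```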
5. News Sitemaps

News sitemaps are designed for Google News, ensuring timely indexing of news
articles. A news sitemap is designed for Google News, helping search engines quickly
discover time-sensitive news articles.

A news sitemap helps Google prioritize fresh news content, ensuring it appears in
news-specific search results. It provides additional metadata such as publication date, title,
and language, ensuring that news articles appear promptly in search results.

A news sitemap is a specialized XML format used by publishers to inform search
engines about recently published articles.

Example:


A news portal submits a news sitemap containing breaking stories, ensuring they appear in
Google News results within minutes.
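Such a news sitemap might look like the sketch below, which uses Google's news sitemap extension namespace; the publication name, URL, and headline are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.example.com/news/breaking-story</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2025-03-22T09:30:00+00:00</news:publication_date>
      <news:title>Breaking Story Headline</news:title>
    </news:news>
  </url>
</urlset>
```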

6. Mobile Sitemaps

A mobile sitemap is specifically designed for websites with mobile-friendly content,
ensuring better indexing for mobile users. It helps Google understand which pages are
optimized for mobile devices, especially for Google Mobile Search.

Example of a Mobile Sitemap:
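The original example is missing here; the sketch below uses Google's (now legacy) mobile sitemap namespace, which marked pages serving feature-phone content. Modern responsive sites generally do not need a separate mobile sitemap, and the URL is illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>https://m.example.com/page</loc>
    <mobile:mobile/>
  </url>
</urlset>
```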

7. RSS/Atom Feed Sitemaps

RSS (Really Simple Syndication) and Atom sitemaps automatically update search engines
with new blog posts or articles.

They are typically used for news sites or blogs where content updates frequently.

Example of an RSS Feed Sitemap:
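The original example is missing here; the sketch below is a minimal RSS 2.0 feed of the kind a blog would submit, with illustrative titles and URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>https://www.example.com/</link>
    <description>Latest posts from Example Blog</description>
    <item>
      <title>New Article on SEO</title>
      <link>https://www.example.com/new-article-on-seo</link>
      <pubDate>Sat, 22 Mar 2025 09:30:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```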


Submitted

"Submitted" refers to the total number of URLs included in a sitemap that has been
submitted to Google Search Console. It represents the pages a website owner wants Google
to crawl and index.

Explanation:

 When a sitemap is uploaded to Google Search Console, it contains a list of URLs that
the website owner wants Google to discover.
 These URLs can include web pages, images, videos, or news articles depending on
the type of sitemap submitted.
 Google does not guarantee that all submitted URLs will be indexed. It evaluates
each URL based on factors like content quality, uniqueness, and crawlability.

Example:

Imagine a website owner submits a sitemap with 1,000 URLs to Google. However, due to
duplicate content or restricted pages (e.g., robots.txt blocking), Google may choose to index
only 800 URLs out of the 1,000 submitted.

Common Issues with Submitted URLs:

 Blocked by robots.txt – If a page is disallowed in robots.txt, Google won’t crawl it.
 Noindex Meta Tag – If the page has <meta name="robots" content="noindex">,
Google won’t index it.
 Duplicate Content – If Google finds identical content on multiple pages, it may not
index all of them.
 Crawl Budget Limitations – Large websites may experience limits on how many
pages Google crawls.

Last Read

"Last Read" refers to the last date and time Googlebot accessed and processed a
submitted sitemap.

Explanation:

 Google periodically checks sitemaps to see if there are new or updated URLs.
 If a sitemap is frequently updated (e.g., for a news website), Google will read it more
often.
 Websites with static content (pages that don’t change often) may have their sitemap
read less frequently.

Example Scenario:

A news website submits a sitemap daily with newly published articles. Google’s Last Read
date might show “March 22, 2025” if the sitemap was processed on that day. If no updates
are made, the Last Read date may remain the same for weeks.


Common Issues with Last Read:

 Google Not Reading the Sitemap Frequently – This can happen if the website has
low authority or few updates.
 Sitemap Errors – If the sitemap has incorrect formatting, Google may not process it.
 Server Issues – If the sitemap URL is slow to load or returns an error (e.g., 404 Not
Found), Google won’t read it.

Status

"Status" indicates whether a sitemap submission was successfully processed or if
there were errors that need to be fixed.

Explanation:

The sitemap status tells the website owner whether Google accepted or rejected the sitemap.
It can have one of the following states:

Possible Sitemap Statuses in Google Search Console:

 Success – The sitemap was fetched and processed without errors.
 Has issues – The sitemap was read, but some of its URLs could not be processed.
 Couldn’t fetch – Google could not retrieve the sitemap at all (e.g., the URL is broken or blocked).

Example:

A website submits a sitemap, and Google Search Console shows Status: Success. This means
all URLs were processed correctly. However, if the status says Has Issues, some URLs
might be blocked or unreachable.

Common Status Errors:

 "Couldn’t fetch" – Google can’t access the sitemap due to a broken URL.
 "Invalid XML format" – The sitemap has incorrect XML structure.
 "URLs not allowed" – The sitemap contains URLs blocked by robots.txt.
 "Large sitemap file" – Google allows a maximum of 50,000 URLs per sitemap. If
exceeded, Google may reject it.

Discovered URLs

"Discovered URLs" refers to the total number of URLs Google found in the
submitted sitemap. These are potential pages that could be crawled and indexed.


Explanation:

 The number of discovered URLs can be equal to or less than the submitted URLs.
 Just because a URL is "discovered" doesn’t mean it will be indexed. Google decides
which URLs to index based on relevance and quality.
 If discovered URLs are significantly lower than submitted URLs, it may indicate
issues with the sitemap or website structure.

Example:

A sitemap containing 2,000 pages is submitted. Google discovers 1,800 URLs from the
sitemap, meaning some URLs were ignored due to errors like broken links, redirects, or
restricted pages.

Common Issues with Discovered URLs:

 Redirect Loops – If a page redirects multiple times, Google may ignore it.
 Thin Content – Pages with little or no content may not be discovered.
 Server Overload – If the website’s server is slow, Google may crawl fewer URLs.


UNIT-III: Enhancements: Core Web Vitals - Mobile Usability - AMP - Breadcrumbs - FAQ -
How-To - Logos - Review Snippets - Site Links Search Box

Enhancements in SEO (Search Engine Optimization)

Search Engine Optimization (SEO) enhancements play a crucial role in improving
website visibility, user experience, and rankings on search engines like Google. These
enhancements include Core Web Vitals, Mobile Usability, AMP, Breadcrumbs, FAQ
Schema, How-To Schema, Logos, Review Snippets, and Site Links Search Box.

Each of these elements contributes to better user engagement, faster load times, and
higher search rankings. Below is an in-depth explanation of each enhancement.

1. Core Web Vitals – Google's Page Experience Metrics

Core Web Vitals are key performance indicators (KPIs) used by Google to evaluate
the loading speed, interactivity, and visual stability of a webpage. These are crucial ranking
factors since Google’s Page Experience Update (2021).

Core Web Vitals Metrics:

 LCP (Largest Contentful Paint) – Measures loading performance; a good score is 2.5 seconds or less.
 FID (First Input Delay) – Measures interactivity; a good score is 100 milliseconds or less (Google replaced FID with INP, Interaction to Next Paint, in March 2024).
 CLS (Cumulative Layout Shift) – Measures visual stability; a good score is 0.1 or less.

Why Core Web Vitals Are Important
 Affects Google rankings since page experience is a ranking factor.
 Faster websites retain more users, reducing bounce rates.
 Better UX leads to higher conversion rates.
How to Improve Core Web Vitals
 Optimize images using WebP format to reduce size.
 Use lazy loading for off-screen images to load them only when needed.
 Minimize JavaScript and defer unused scripts to reduce execution time.
 Use a Content Delivery Network (CDN) to serve pages faster.
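As a sketch of the first two tips, an image served as WebP with lazy loading might be marked up like this (the file names are illustrative):

```html
<!-- Prefer the smaller WebP file; fall back to JPEG in older browsers.
     loading="lazy" defers the download until the image nears the viewport,
     and explicit width/height reserve space, which also helps CLS. -->
<picture>
  <source srcset="hero.webp" type="image/webp">
  <img src="hero.jpg" alt="Hero banner" width="800" height="400" loading="lazy">
</picture>
```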
2. Mobile Usability – Mobile-First Indexing
With Google’s Mobile-First Indexing, Google ranks websites based on their mobile
versions rather than desktop versions. This makes mobile usability a key ranking factor.
Factors Affecting Mobile Usability:
1. Responsive Design – The website should adapt to different screen sizes.
2. Text Readability – Fonts should be large enough to read without zooming.
3. Clickable Elements – Buttons and links should be easy to tap on mobile screens.


4. Fast Loading – Mobile users expect instant loading, so page speed should be
optimized.
How to Improve Mobile Usability
 Use Google’s Mobile-Friendly Test to check your site.
 Enable responsive design using CSS media queries.
 Avoid intrusive pop-ups that can negatively impact mobile UX.
 Optimize images and use browser caching to reduce loading times.
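As a sketch, a mobile-first CSS media query of the kind mentioned above could look like this (the class name and breakpoint are illustrative):

```css
/* Mobile-first: a single column by default */
.container {
  display: block;
}

/* Wider screens get a two-column grid */
@media (min-width: 768px) {
  .container {
    display: grid;
    grid-template-columns: 1fr 1fr;
  }
}
```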
3. AMP (Accelerated Mobile Pages) – Faster Mobile Experience
AMP (Accelerated Mobile Pages) is an open-source framework by Google for building
lightweight, fast-loading mobile web pages. It strips out heavy HTML and JavaScript
elements so that mobile pages render almost instantly.
How AMP Works
 Uses a stripped-down version of HTML (AMP HTML).
 Loads pages instantly by removing unnecessary elements like JavaScript.
 Google caches AMP pages to make them load even faster in search results.
Advantages of AMP:
 Faster Page Load – Improves user experience and reduces bounce rate.
 Higher SEO Ranking – AMP pages often rank better in mobile searches.
 Increased Ad Revenue – Faster pages mean users stay longer, improving ad
impressions.
How to Implement AMP
 Convert existing pages into AMP HTML by adding the amp attribute to the <html> tag.
 Replace standard elements like <img> with <amp-img>.
 Use Google AMP Validator to test AMP pages.

Example of AMP Code:
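The original example image is not reproduced here; a minimal AMP page follows this shape (the URLs are placeholders, and the long required amp-boilerplate style block is indicated by a comment rather than written out):

```html
<!doctype html>
<html amp lang="en">
<head>
  <meta charset="utf-8">
  <!-- The AMP runtime must be loaded on every AMP page -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <title>Hello AMP</title>
  <link rel="canonical" href="https://example.com/hello">
  <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
  <!-- AMP also requires the standard <style amp-boilerplate> block here -->
</head>
<body>
  <h1>Hello, AMP</h1>
  <!-- <amp-img> replaces the standard <img> tag -->
  <amp-img src="photo.jpg" width="400" height="300" layout="responsive"></amp-img>
</body>
</html>
```

The Google AMP Validator mentioned above will flag any missing required element, so it is worth running it on every converted page.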


4. Breadcrumbs – Better Navigation & SEO

Breadcrumbs are navigational elements that show the hierarchy of a website’s pages. They
help users track their location within a website.

Types of Breadcrumbs:

1. Location-Based Breadcrumbs: Shows site structure.


Example: Home > Electronics > Laptops
2. Attribute-Based Breadcrumbs: Used in e-commerce sites.
Example: Home > Shoes > Men’s > Running Shoes
3. History-Based Breadcrumbs: Tracks the user’s navigation path.
Example: Back to Previous Page

Benefits of Breadcrumbs:

 Improves user navigation and experience.


 Increases SEO visibility in search results.
 Enhances internal linking and reduces bounce rates.

How to Add Breadcrumbs

 Implement structured data markup in JSON-LD format.


 Use Google’s Rich Results Test to verify breadcrumbs.

Example of Breadcrumb Schema Markup:

{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Home",
      "item": "https://example.com/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Electronics",
      "item": "https://example.com/electronics"
    },
    {
      "@type": "ListItem",
      "position": 3,
      "name": "Laptops",
      "item": "https://example.com/electronics/laptops"
    }
  ]
}


5. FAQ Schema – Displaying Questions in Search Results

FAQ Schema is structured data that allows frequently asked questions to appear directly in
Google search results.

Benefits of FAQ Schema:

 Increases visibility in search results.


 Helps answer user queries before they even visit the site.
 Boosts click-through rate (CTR).

How to Implement FAQ Schema

 Use FAQPage structured data in JSON-LD format.

Example of FAQ Schema:

{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is SEO?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "SEO stands for Search Engine Optimization, which helps improve website ranking in search engines."
      }
    }
  ]
}

6. How-To Schema – Instructions in Search Results

How-To Schema provides step-by-step instructions in Google search results.

How to Add How-To Schema

 Use HowTo structured data.


 Define steps with descriptions and images.

Example of How-To Schema:

{
  "@context": "https://schema.org",
  "@type": "HowTo",
  "name": "How to Make Coffee",
  "step": [
    {
      "@type": "HowToStep",
      "name": "Boil Water",
      "text": "Heat water to 95°C."
    },
    {
      "@type": "HowToStep",
      "name": "Add Coffee",
      "text": "Pour hot water over ground coffee."
    }
  ]
}

7. Site Links Search Box – Google Search Within a Website

A Site Links Search Box allows users to search within a website directly from Google.

How to Implement

 Add WebSite structured data with a SearchAction.

Example:

{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}


UNIT-IV: Security & Manual Actions: Manual actions - How do I remove manual actions in
Search Engine Optimization - security issues and their reports

Security & Manual Actions in SEO

Search Engine Optimization (SEO) is essential for improving a website’s visibility on
search engines like Google, Bing, and Yahoo. However, search engines have strict guidelines
that websites must follow. If a website violates these guidelines, it may receive manual
actions or security warnings, which can significantly impact rankings and organic traffic.

We will explore manual actions, their causes, how to remove them, security
issues, and their reports in SEO.

Search Engine Optimization (SEO) is vital for improving a website’s visibility on
search engines. However, violating search engine guidelines can result in penalties, also
known as manual actions. These penalties, applied manually by Google’s reviewers, can
lower search rankings, remove pages from search results, or even deindex an entire
website.

Additionally, security issues can negatively impact a website’s credibility, causing
users to receive warnings before accessing your site. This guide provides guidance on manual
actions, how to remove them, security threats, and Google’s security reports.

1. Manual Actions in SEO

1.1 What are Manual Actions?

Manual actions are penalties applied by Google's human reviewers (not algorithms)
when a website violates Google’s Webmaster Guidelines. These penalties result in lower
rankings, deindexing (removal from search results), or warnings in Google Search
Console.

Common violations that trigger manual actions include:

 Manipulating search rankings with spammy content
 Buying or selling unnatural backlinks
 Using deceptive practices such as cloaking or redirects
 Security threats like hacking and malware

Unlike algorithmic penalties (such as Google’s Panda or Penguin updates, which are
automated), manual actions require a Google employee to review the site before applying
the penalty.

If a website receives a manual action, the website owner is notified through Google Search
Console (GSC).


When Google detects violations, it applies manual actions and notifies the website owner
through Google Search Console under the "Security & Manual Actions" section.


Consequences of Manual Actions:

 Lower Search Rankings – Affected pages rank lower in Google Search results.
 Deindexing – In severe cases, the entire website is removed from Google Search.
 Warnings in Search Console – Affected websites receive alerts in Google Search
Console.
 Loss of Organic Traffic – A drop in rankings leads to reduced website traffic.

1.2 Where to Find Manual Actions?

To check if your site has received a manual action, follow these steps:

1. Log in to Google Search Console → Go to Google Search Console


2. Click on ‘Security & Manual Actions’ on the left-hand menu.
3. Select ‘Manual Actions’ → If your site has no manual actions, you will see:

 “No issues detected.”

4. If a penalty exists, Google will provide details about the issue and the affected URLs,
including:

 Reason for the penalty
 Affected pages
 Severity level (site-wide or partial penalty)

1.3 Types of Manual Actions & Causes

Manual actions are penalties imposed by Google’s human reviewers when a website
violates Google’s Webmaster Guidelines. These actions can result in a drop in rankings,
removal from search results, or warnings in Google Search Console.

Below is a detailed breakdown of the different types of manual actions, their causes, and
examples of how they impact websites.

1. Spam-Related Manual Actions

Google considers spam to be any manipulative, deceptive, or low-quality content that
disrupts the search experience.

1.1 Pure Spam

This manual action is applied to websites that engage in aggressive spam tactics, including:

 Auto-generated content (content created by bots without human input)


 Scraped content (copying content from other sites without adding value)
 Cloaking (showing different content to users and search engines)

Example: A website creates thousands of pages using a bot, stuffing them with random
keywords and copied text from other websites. Google detects this and applies a Pure Spam
manual action, removing the site from search results.


How to Fix:

 Remove auto-generated and copied content


 Ensure original, high-quality content is added
 Stop cloaking and provide the same content to search engines and users

1.2 User-Generated Spam

This action is applied when users (not the website owner) create spammy content on a
website. This often happens on:

 Forums (users posting irrelevant links)


 Blog comments (spammy advertisements)
 Profile pages (users creating fake accounts for link-building)

Example: A blog has a comment section where spammers post:


“Get free money now! Click here: www.spamlink.com”
Google detects the spam and penalizes the website.

How to Fix:

 Enable comment moderation (approve comments before they appear)


 Use CAPTCHA to prevent bots from spamming the site
 Block suspicious users and remove spam content

1.3 Spammy Free Hosts & Domains

If Google finds that many spam websites are hosted on the same free hosting platform, it
may penalize the entire host or domain provider.

Example: A free website hosting service allows users to create unlimited websites. Over
time, hundreds of spammy, low-quality websites appear. Google deindexes the entire
hosting provider, affecting all websites on it.

How to Fix:

 Avoid free hosting services that attract spam websites


 Choose a reliable web hosting provider

2. Link-Related Manual Actions

Google considers backlinks a major ranking factor, but manipulating links violates
guidelines.

2.1 Unnatural Links TO Your Site

This penalty occurs when a website acquires low-quality or paid backlinks to manipulate
rankings.


Example: A website owner buys 10,000 backlinks from spammy websites to boost rankings.
Google detects the unnatural pattern and penalizes the site.

How to Fix:

 Identify bad backlinks using Google Search Console > Links


 Use Google’s Disavow Tool to ignore those links

2.2 Unnatural Links FROM Your Site

If a website sells links or adds unnatural outbound links, it can be penalized.

Example: A blog owner charges money for adding “do-follow” links to unrelated websites.
Google detects this as a link scheme and applies a manual action.

How to Fix:

 Remove or add nofollow to unnatural outbound links


 Stop selling or exchanging backlinks

3. Content-Related Manual Actions

Google penalizes websites with low-quality, duplicate, or deceptive content.

3.1 Thin Content with Little or No Value

This penalty is applied when a website has low-quality, auto-generated, or duplicate
content.

Example: A website creates 100 pages with slightly reworded content but provides no
unique value. Google detects this and applies a penalty.

How to Fix:

 Remove duplicate or low-quality content


 Add unique and valuable content

3.2 Keyword Stuffing & Hidden Text

Google penalizes websites that overuse keywords unnaturally or hide text to manipulate
rankings.

Example: A website writes:


“Buy cheap laptops online. Cheap laptops for sale. Best cheap laptops online.”
This repeats the keyword unnaturally, leading to a penalty.

How to Fix:

 Write natural, user-friendly content


 Avoid hiding text or stuffing excessive keywords

4. Security & Technical Violations

Security issues affect user safety and trust, leading to manual actions or security
warnings.

4.1 Hacked Website with Spam

If a website is hacked and used for spamming or distributing malware, Google applies a
manual action.

Example: A hacker inserts hidden links to spam websites into a blog. Google detects this
and penalizes the site.

How to Fix:

 Scan for malware and remove hacked content


 Update passwords and security settings

4.2 Cloaking & Sneaky Redirects

Cloaking is when a website shows different content to Google and users. Redirects can
also be deceptive.

Example: A page appears as an educational blog post to Google but redirects users to an
adult website.

How to Fix:

 Remove sneaky redirects


 Ensure users and search engines see the same content

5. Local SEO & Structured Data Violations

Google applies manual actions to local businesses and websites using misleading
structured data.

5.1 Fake Business Listings (Local SEO Violations)

Businesses that use fake addresses or duplicate listings can receive penalties.

Example: A business creates multiple fake locations to dominate Google Maps results.

How to Fix:

 Remove fake business listings


 Use genuine addresses and information


5.2 Misleading Structured Data Markup


Websites that use false schema markup to mislead users get penalized.
Example: A blog marks a normal article as a “5-star reviewed product” to trick search
engines.
How to Fix:
 Use accurate structured data
 Validate with Google’s Structured Data Testing Tool
2. How to Remove Manual Actions in SEO
To remove a manual action, you must first confirm that Google has penalized your website.
Google provides notifications about manual actions in Google Search Console (GSC).
1.1 How to Access Manual Actions in Google Search Console
1. Log in to Google Search Console
o Visit Google Search Console
o Sign in with your Google account
o Select your website (property)
2. Go to the Manual Actions Report
o On the left-hand menu, click "Security & Manual Actions"
o Then select "Manual Actions"
3. Review the Manual Action Details
o If your website has been penalized, you will see a message explaining the type of
manual action, affected pages, and the reason for the penalty.
o If there are no manual actions, you will see the message: “No issues detected”
Step 2: Understand the Type of Manual Action
Google applies different types of manual actions depending on the severity of the violation. The
first step in fixing a manual action is understanding its type and why it was applied.
2.1 Common Types of Manual Actions and Their Causes

The most common types – pure spam, user-generated spam, unnatural links to or from your
site, thin content, keyword stuffing, cloaking, and hacked content – and their causes are
described in detail in Section 1.3 above.


Step 3: Fix the Manual Action Issue


Once you have identified the type of manual action, you must take corrective measures to fix the
problem.
3.1 How to Fix Different Manual Actions

The corrective measures for each type of manual action are covered under the "How to Fix"
notes in Section 1.3 (for example, removing spammy content, disavowing unnatural links,
improving thin content, and cleaning hacked pages).

Step 4: Remove Bad Backlinks Using Google’s Disavow Tool (If Needed)
If your manual action is due to unnatural links to your site, you need to remove or disavow those
bad backlinks.
4.1 How to Identify Bad Backlinks
1. Open Google Search Console
2. Go to Links > Top Linking Sites
3. Download your backlink report
4. Identify spammy, irrelevant, or paid backlinks
4.2 How to Disavow Bad Links
1. Create a Disavow File
o Open a Notepad (.txt) file
o List all spammy domain names like this:
domain:spamdomain1.com
domain:spamdomain2.com
2. Upload the Disavow File to Google
 Visit Google Disavow Tool
 Upload your .txt file
 Click Submit
Step 5: Submit a Reconsideration Request to Google
After fixing the issue, you must ask Google to review your site and remove the manual action.


5.1 How to Submit a Reconsideration Request


1. Open Google Search Console
2. Go to "Manual Actions"
3. Click "Request a Review"
4. Write a detailed reconsideration request, explaining:
o What caused the issue
o What actions you took to fix it
o What steps you implemented to prevent future violations
5. Click Submit
Step 6: Wait for Google’s Response
 Google reviews reconsideration requests manually. This can take a few days to a few
weeks.
 If your request is approved, the manual action will be lifted, and rankings may recover over
time.
 If your request is rejected, Google will provide feedback, and you must reattempt the fixes.
Step 7: Monitor Your Website to Prevent Future Manual Actions
To avoid future penalties, follow these best practices:
✔ Follow Google’s Webmaster Guidelines – Never use black-hat SEO techniques.
✔ Audit Your Backlinks Regularly – Use tools like Ahrefs, SEMrush, and Google Search
Console to identify bad links.
✔ Improve Content Quality – Ensure all pages have valuable, original, and user-focused content.
✔ Secure Your Website – Use SSL certificates, firewalls, and security plugins to prevent hacking.

3. Security Issues in SEO & How to Fix Them


1. What are Security Issues in SEO?

Security issues in SEO occur when a website is compromised due to hacking,
malware, phishing, deceptive content, or SSL (Secure Socket Layer) issues. These issues can
lead to manual actions by Google, warnings in Google Search Console, and a decline in
search rankings.

When a website is flagged as unsafe, users may receive warnings such as:

 "This site may be hacked" (appears in Google search results)


 "Deceptive site ahead" (shown by browsers like Google Chrome)
 "The site ahead contains malware"

Impact of Security Issues on SEO:


 Drop in Rankings: Google may lower the rankings of compromised websites.


 Removal from Search Index: If security issues persist, the website might be
deindexed.
 Loss of Trust: Users may avoid visiting an insecure website.
 Traffic Reduction: Warnings discourage visitors, causing high bounce rates.

Google Search Console (GSC) helps website owners identify and fix security issues.

2. Common Security Issues in SEO

2.1 Hacked Website & Unwanted Content

A hacked website means that an attacker has gained unauthorized access to modify
the content, insert harmful scripts, or add spam links.

How it Affects SEO:

 Hackers inject spammy pages, links, and redirects to manipulate rankings.


 Google flags the website with a “This site may be hacked” warning.
 Visitors may be redirected to malicious or phishing websites.

Example:

 A WordPress website is hacked, and new pages about "cheap medicines" or


"gambling" are added without the owner's knowledge.

Fix It:

 Scan your website using security tools (e.g., Sucuri, Wordfence, or Google Safe
Browsing).
 Check Google Search Console > "Security Issues" for a list of hacked pages.
 Restore your website from a clean backup (if available).
 Change all admin passwords and remove unknown users.
 Update WordPress, plugins, and themes (outdated versions are vulnerable).
 Request a review from Google after fixing the issue.

2.2 Malware & Phishing Attacks

Malware is malicious software that infects a website and affects its visitors. Phishing
is a deceptive practice where attackers create fake login pages to steal user credentials.

How it affects SEO:

 Google blacklists websites containing malware or phishing pages.


 Browsers display a "Deceptive site ahead" warning.
 Google Safe Browsing blocks access to the site.

Example:


 A hacker creates a fake "PayPal login page" on your website to steal user credentials.
 Visitors who enter their details are unknowingly sending information to the hacker.

How to Fix It:

 Use Google Safe Browsing to find malicious links.


 Use a malware scanner (Sucuri, Norton Safe Web, or Wordfence) to detect threats.
 Remove infected files and any suspicious scripts.
 Reset all passwords and enable two-factor authentication (2FA).
 Request a security review from Google once cleaned.

2.3 Deceptive Content & Unwanted Redirects

Deceptive content includes fake ads, misleading information, or hidden redirects that
take users to unrelated websites.

How it affects SEO:

 Google applies a manual action for deceptive content.


 Users lose trust and report the site as fraudulent.
 Search rankings drop due to policy violations.

Example:

 A webpage claims to provide free software downloads, but clicking the button
installs spyware or ransomware instead.

How to Fix It:

 Use Google Search Console > "Security Issues" to identify deceptive content.
 Scan your website files for unauthorized changes.
 Manually check your website for any auto-redirects to suspicious sites.
 Remove hidden scripts and fake ads that lead users to malicious sites.
 Update website security and enforce strong passwords.

2.4 SQL Injection & Cross-Site Scripting (XSS)

SQL Injection and XSS attacks occur when attackers exploit security vulnerabilities
in web forms to steal data or inject malicious code.

How it affects SEO:

 Attackers can alter database content, leading to data leaks.


 Malicious scripts can be used to redirect users to phishing sites.

Example:

 A hacker inserts an SQL command in a website’s login form to access the database.
 An attacker uses JavaScript code to steal cookies and session tokens.


How to Fix It:

 Use Web Application Firewalls (WAFs) like Cloudflare or Sucuri.


 Sanitize user input in forms (preventing attackers from injecting code).
 Use HTTPS & Content Security Policies (CSPs) to secure data transmission.
 Regularly update software and plugins to patch security flaws.
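As an illustrative sketch of input sanitization (not tied to any particular site or framework), the difference between string-built SQL and a parameterized query can be shown with Python's built-in sqlite3 module; the table and data are hypothetical:

```python
import sqlite3

# In-memory database standing in for a real site's database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

def find_user_unsafe(name):
    # Vulnerable: user input is pasted straight into the SQL string,
    # so a crafted value can change the query's logic.
    return conn.execute(
        f"SELECT name FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(name):
    # Parameterized query: the driver treats the value as data only,
    # so an injection payload matches nothing.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # the injection returns every row
print(find_user_safe(payload))    # payload treated as a literal name: no rows
```

The same principle applies in any language: never concatenate user input into a query; always bind it as a parameter.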

2.5 SSL Certificate Issues (HTTPS Not Enabled)

An SSL (Secure Socket Layer) certificate encrypts data between a user’s browser
and the website. Google marks websites without SSL as “Not Secure.”

How it affects SEO:

 Websites without SSL show security warnings in browsers.


 Google gives a ranking boost to HTTPS sites.
 Users may abandon the site due to security concerns.

Example:

 A website still uses HTTP instead of HTTPS, leading to an unsecured connection
warning.

How to Fix It:

 Purchase and install an SSL certificate (from Let’s Encrypt, GoDaddy, or
Cloudflare).
 Update internal links to use HTTPS instead of HTTP.
 Redirect all HTTP traffic to HTTPS using .htaccess or server settings.
 Check Google Search Console to ensure Google indexes the HTTPS version.
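On an Apache server, the HTTP-to-HTTPS redirect mentioned above can be sketched in .htaccess as follows (this assumes mod_rewrite is enabled; adapt to your host):

```
# Redirect all HTTP requests to HTTPS with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```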

3. Security Issues in Google Search Console

Step 1: Open Google Search Console

 Visit Google Search Console


 Select your website property

Step 2: Navigate to the Security Issues Report

 Click on “Security & Manual Actions”


 Select “Security Issues”

Step 3: Review Identified Issues

 Google will list any security threats such as hacked pages, malware, deceptive
content, or phishing.
 Click on "More Details" to view affected URLs.


4. How to Fix Security Issues & Request a Google Review

Once the issue is resolved, request a security review in Google Search Console:

Step 1: Fix the Identified Security Issues

 Remove malware, phishing content, and spam links.


 Secure website files, update plugins, and scan for vulnerabilities.

Step 2: Request a Review in Google Search Console

1. Open Google Search Console


2. Go to "Security Issues"
3. Click "Request a Review"
4. Write a detailed explanation covering:
o What caused the issue
o How you fixed it
o Steps taken to prevent future issues
5. Click Submit

Google will review the website and remove the security warning if no threats remain.


UNIT-V: Legacy Tools and Reports: Links - Settings - Submit Feedback - About New Version -
International Targeting - Messages - URL Parameters - Web Tools

Legacy Tools and Reports in Google Search Console (GSC)

Google Search Console (GSC) is a free tool provided by Google to help webmasters
monitor, maintain, and troubleshoot their site’s performance in Google Search results. Over
the years, Google has introduced new features and discontinued some older ones. However,
some of the older tools are still available under the Legacy Tools and Reports section.

Even though Google has gradually phased out some of these tools, they still offer
valuable insights for website owners. In this guide, we will deeply explore each of these
legacy tools and reports, their features, their purpose, and how they impact SEO.

List of Legacy Tools and Reports in GSC


1. Links
2. Settings
3. Submit Feedback
4. About New Version
5. International Targeting
6. Messages
7. URL Parameters
8. Web Tools

Links Report in Google Search Console


The Links Report in Google Search Console (GSC) provides valuable insights into
how other websites link to your site (external links) and how your site links to its own pages
(internal links). Links play a crucial role in SEO (Search Engine Optimization) because
they help search engines understand the structure and authority of a website.
We will cover:
1. What is the Links Report?
2. Types of Links in the Report
3. How to Access the Links Report
4. Understanding the Different Sections
5. How to Analyze the Links Report for SEO
6. How to Fix Link-Related Issues
1. What is the Links Report?
The Links Report in Google Search Console helps website owners see:


 Which websites are linking to their pages (backlinks).


 How frequently different pages on their site are linked (internal linking).
 What anchor texts are commonly used in links (anchor text distribution).
Since links are one of the most important ranking factors, this report helps webmasters
improve their website’s authority, internal linking structure, and SEO performance.
2. Types of Links in the Report
The Links Report is divided into two main categories:
1. External Links (Links from other websites to your site)
2. Internal Links (Links between pages on your own site)
Each of these categories is further divided into sub-sections to provide deeper insights.
A. External Links (Backlinks)
These are links from other websites pointing to your website. Google uses backlinks as an
indicator of a page’s authority and relevance.
Sub-sections of External Links:
 Top linked pages: The most frequently linked pages on your site.
 Top linking sites: Websites that link to your site the most.
 Top linking text (Anchor text): The most common anchor text used in backlinks.
B. Internal Links
These are links between pages on your own website. Good internal linking improves
navigation and helps search engines discover and rank pages more effectively.
Sub-sections of Internal Links:
 Top internally linked pages: Shows which pages receive the most internal links
from your website.
3. How to Access the Links Report in Google Search Console
Step-by-Step Process:
1. Log in to Google Search Console
o Go to Google Search Console.
o Sign in with your Google account.
2. Select Your Website
o Click on the property (your website) you want to analyze.
3. Open the Links Report
o In the left-hand menu, click on Links under the “Legacy tools and reports”
section.


4. Explore the Links Data


o You will now see the External Links and Internal Links sections.
4. Understanding the Different Sections of the Links Report
A. External Links Section
These links come from other websites (backlinks) and play a crucial role in SEO rankings.
1. Top Linked Pages (Externally)
 This section lists the pages on your website that receive the most backlinks.
 Pages with a high number of backlinks generally rank higher in search results.
How to Use This Data:
 Identify your most popular content based on backlinks.
 Ensure that your highly linked pages are optimized with good content and
keywords.
 If a low-priority page is getting many backlinks, consider improving its content to
capture more traffic.
2. Top Linking Sites
 This section shows which websites link to you the most.
 High-quality backlinks from authoritative sites boost your search rankings.
How to Use This Data:
 Identify high-authority sites linking to you.
 Build relationships with sites linking to you and acquire more backlinks.
 If you see spammy or unwanted links, consider using the Google Disavow Tool.
3. Top Linking Text (Anchor Text)
 This shows the most common anchor text used in backlinks.
 Anchor text helps Google understand the content of your pages.
How to Use This Data:
 Ensure anchor text is relevant and descriptive (avoid generic text like “click here”).
 If many sites use spammy or misleading anchor text, disavow those links.
B. Internal Links Section
These links help Google understand your website structure and distribute link equity
(ranking power) across pages.
1. Top Internally Linked Pages
 This section lists which pages on your site receive the most internal links.
How to Use This Data:
 Ensure that important pages have enough internal links for better visibility.


 Reduce deeply buried pages (pages that require many clicks to access).
 Use descriptive anchor text when linking internally.
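The idea of "deeply buried pages" can be made concrete as click depth: the minimum number of links a crawler must follow from the homepage to reach a page. A minimal sketch (the link graph below is a made-up example, not real Search Console data):

```python
from collections import deque

def click_depth(link_graph, start="/"):
    """Breadth-first search over internal links; returns each page's
    minimum click depth from the start page (the homepage)."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depth:          # first visit = shortest path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical internal-link graph: page -> pages it links to
site = {
    "/":               ["/blog", "/products"],
    "/blog":           ["/blog/seo-guide"],
    "/products":       ["/products/shoes"],
    "/blog/seo-guide": ["/products/shoes"],
}
print(click_depth(site))
```

Pages at depth 3 or more are the ones usually considered "deeply buried"; adding internal links from shallow pages reduces their depth.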
5. How to Analyze the Links Report for SEO
A. External Links Analysis (Backlink Profile)
1. Find Your Most Linked Pages
o Check if your most important pages have backlinks.
o If a low-value page has many backlinks, consider redirecting it to a more
relevant page.
2. Analyze Linking Websites
o Are they high-authority websites or spammy sites?
o If spammy, disavow bad links.
3. Check Anchor Text Distribution
o Ensure anchor text is relevant and not over-optimized.
B. Internal Links Analysis
1. Ensure Every Page Has Internal Links
o Pages with few or no internal links are harder for Google to find.
2. Improve Link Structure
o Use internal links to pass link equity to important pages.
3. Use Relevant Anchor Text
o Instead of "click here", use "read our SEO guide".
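The anchor-text check described above can be sketched as a simple frequency count (the backlink list and the set of "generic" phrases are illustrative):

```python
from collections import Counter

# Hypothetical backlink data: (anchor text, target URL)
backlinks = [
    ("read our SEO guide", "/blog/seo-guide"),
    ("click here",         "/blog/seo-guide"),
    ("SEO guide",          "/blog/seo-guide"),
    ("click here",         "/products"),
]

GENERIC = {"click here", "here", "read more", "this link"}

# Count how often each anchor text appears (case-insensitive)
anchors = Counter(text.lower() for text, _ in backlinks)

# Share of links using generic, non-descriptive anchor text
generic_share = sum(n for text, n in anchors.items() if text in GENERIC) / len(backlinks)

print(anchors.most_common(3))
print(f"Generic anchors: {generic_share:.0%}")
```

A high generic share suggests the anchor text tells Google little about the target pages.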
6. How to Fix Link-Related Issues
A. External Links Issues
 Disavow spammy or low-quality backlinks with the Google Disavow Tool.
 Redirect heavily linked but low-value pages to more relevant content.
B. Internal Links Issues
 Add internal links to orphan pages so Google can discover and rank them.
 Replace generic anchor text with descriptive phrases when linking internally.


Settings Section in Google Search Console


The Settings section in Google Search Console (GSC) is a crucial area where
webmasters can manage their site preferences, ownership verification, user permissions, and
Googlebot crawling preferences. Proper configuration in this section ensures accurate website
monitoring, indexing, and access control.

Where to Find the Settings Section


1. Open Google Search Console: https://search.google.com/search-console
2. Select your website (property).
3. Scroll down the left-hand sidebar.
4. Click on "Settings".

Features of the Settings Section


The Settings page consists of the following important components:
1. Ownership Verification
2. Users and Permissions
3. Crawling and Indexing Settings
4. Associations
5. Property Removal
6. Change of Address Tool
7. About Google Search Console Version

1. Ownership Verification
Ownership verification ensures that only authorized users can access a website’s
performance data, indexing status, and SEO reports.
Verification Methods:
 HTML File Upload: Download and upload a verification file to your website.
 HTML Meta Tag: Add a meta tag inside the <head> section of your website’s
homepage.
 Google Analytics (GA) Account: If you have GA access, you can verify ownership
via GA.
 Google Tag Manager (GTM): If your site is set up with GTM, verification can be
done through it.
 Domain Name Provider (DNS Verification): Add a TXT record to your domain’s
DNS settings.
How to Verify Ownership?
1. Open Google Search Console > Settings.
2. Click on "Ownership Verification".
3. Choose a verification method.
4. Follow the instructions provided by Google.
5. Click Verify.
6. Once verified, your site will be successfully added.
Note: If ownership is lost (e.g., DNS records are deleted), verification needs to be redone.
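For the HTML meta tag method, the tag Google issues has the form `<meta name="google-site-verification" content="...">`. A rough check that a page's source contains such a tag (the sample HTML and the token are placeholders, and the regex assumes the common `name` then `content` attribute order):

```python
import re

def has_verification_tag(html: str) -> bool:
    """Look for Google's site-verification meta tag in the page source."""
    pattern = (r'<meta\s+name=["\']google-site-verification["\']'
               r'\s+content=["\'][^"\']+["\']')
    return re.search(pattern, html, re.IGNORECASE) is not None

page = '''<html><head>
<meta name="google-site-verification" content="abc123token" />
</head><body>Hello</body></html>'''

print(has_verification_tag(page))             # tag present
print(has_verification_tag("<html></html>"))  # tag missing
```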
2. Users and Permissions
The Users and Permissions feature allows the owner to add or remove users and
control their level of access to Google Search Console data.


Types of User Roles:
 Owner: Full control over the property, including managing users, settings, and verification.
 Full user: Can view all data and take most actions.
 Restricted user: Can view most data but cannot take actions or manage users.
How to Add a User?


1. Open Google Search Console > Settings.
2. Click on Users and Permissions.
3. Click "Add User".
4. Enter the user’s email address.
5. Select their access level (Full or Restricted).
6. Click "Add".
3. Crawling and Indexing Settings
Google crawls websites using Googlebot to analyze and index web pages. This setting helps
you monitor and control Google's crawl behavior.
Crawl Stats Report
 Shows how often Google crawls your site.
 Provides details on crawl requests, response times, and issues.
 Helps in troubleshooting slow-loading pages.
How to View Crawl Stats?
1. Go to Settings > Crawl Stats.
2. Check how often Googlebot visits your site.
3. Identify crawl errors or slow response times.
4. Optimize your server speed and robots.txt file if needed.
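Crawl problems often trace back to robots.txt rules. Python's standard library can simulate whether Googlebot is allowed to fetch a path; the rules below are an invented example:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content (not from a real site)
robots_txt = """\
User-agent: Googlebot
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Public page: allowed; anything under /private/: blocked
print(rp.can_fetch("Googlebot", "https://example.com/blog/"))
print(rp.can_fetch("Googlebot", "https://example.com/private/report"))
```

If an important page shows up as blocked here, that is a likely cause of "Googlebot can't access your site" style crawl errors.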
4. Associations
The Associations section in GSC allows you to connect your website to other Google
services for better insights and analysis.
Supported Associations:
 Google Analytics (GA4): Connects GSC with Google Analytics for better traffic
insights.
 Google Ads: Links with Google Ads to track paid and organic search performance.
 YouTube Channel: If your site is connected to YouTube, you can track video-related
searches.
 Google Merchant Center: Links your e-commerce website for product search
visibility.
How to Set Up Associations?
1. Open Settings > Associations.
2. Click "Associate" next to the service (e.g., Google Analytics).
3. Follow the Google authorization process.
4. Once approved, your website is linked.
5. Property Removal
If you no longer want to track a website in Google Search Console, you can remove it.
How to Remove a Property?
1. Open Google Search Console.


2. Go to Settings > Property Settings.


3. Click "Remove Property".
Note: Removing a property does not remove your site from Google search results. It only
removes the site from your Search Console account.
6. Change of Address Tool
If you migrate your website to a new domain, you need to inform Google using this tool.
How to Use the Change of Address Tool?
1. Go to Settings > Change of Address.
2. Enter the new domain name.
3. Verify the new domain.
4. Submit the request to Google.
7. About Google Search Console Version
This section provides information about:
 The current version of Google Search Console.
 New features and updates.
 Deprecated features that have been removed.

Submit Feedback in Google Search Console

The Submit Feedback feature in Google Search Console allows webmasters to report issues, suggest improvements, and provide feedback directly to Google. This tool is helpful for users who encounter technical problems, usability concerns, or have suggestions for new features in Google Search Console.

Google actively collects feedback from users to enhance the platform, fix bugs, and
improve user experience. Let’s dive deep into this feature and explore its importance, how
to use it, and best practices.

With this feature, users can:
 Report bugs or technical issues.
 Suggest new features or improvements.
 Share user experience concerns (such as interface difficulties).
 Provide direct feedback to Google’s Search Console team.

Why Submitting Feedback Matters
Helps Google Fix Bugs


 If GSC reports incorrect data, users can report the issue immediately.
 Helps Google identify recurring technical problems.
Improves the User Experience
 Users can suggest interface improvements to make navigation easier.
 Provides Google with real-world insights on how people use GSC.
Suggests New Features
 Users can request new tools or improvements.
 Google prioritizes updates based on user demand.
Ensures Better SEO Tools for Webmasters
 Feedback helps Google refine reports, enhance data accuracy, and improve
functionality.
 Allows users to shape future updates.

When Should You Submit Feedback?
1. Reporting Bugs or Errors


 If search performance reports show incorrect data.
 If indexing or coverage reports aren’t updating correctly.
 If Search Console is not loading or crashing frequently.


2. Issues with Google’s Crawling and Indexing Reports


 If the URL Inspection tool gives incorrect results.
 If valid pages are mistakenly reported as "Excluded".
 If Googlebot fails to crawl important pages.
3. Problems with User Interface (UI) or Navigation
 If the GSC layout is difficult to use.
 If buttons, filters, or menus are not working correctly.
4. Incorrect Warnings or Alerts
 If you receive false security warnings about malware or manual actions.
 If Google flags valid structured data as incorrect.
5. Suggestions for New Features
 If you want a new SEO tool in GSC.
 If you need better filters for performance reports.
 If you think Google should improve link tracking, search queries, or analytics.

How to Submit Feedback in Google Search Console


Step 1: Open Google Search Console
1. Go to Google Search Console.
2. Log in with your Google account.
Step 2: Navigate to the Issue
1. Find the page or report where you are facing an issue.
2. If the issue is in Performance Reports, Indexing, or Coverage, open that specific
section.
Step 3: Click on "Submit Feedback"
1. Click on the "?" (Help) icon in the top right corner.
2. From the drop-down menu, select “Send Feedback”.
Step 4: Describe the Issue in Detail
1. A new window will open.
2. Clearly explain the issue. Example:
o ❌ Bad Example: "My report isn’t working."
o ✅ Good Example: "The Performance Report is showing zero clicks for the
last 10 days, but my Google Analytics data shows traffic. Please check this
discrepancy."
3. If you're suggesting a new feature, explain:
o Why it would be useful.
o How it would improve SEO insights.
Step 5: Attach Screenshots (Optional but Recommended)
1. Click “Include screenshot” if you want to show Google what the issue looks like.
2. GSC automatically takes a screenshot of the page you are on.
3. You can also manually highlight parts of the screenshot to point out specific issues.
Step 6: Submit the Feedback
1. Click "Send".
2. Google will receive your feedback.
Step 7: Wait for a Response (If Applicable)
 Google does not respond to all feedback individually.
 If the issue is critical, it may be fixed in future updates.
 You can check Google’s official support pages for updates.


About New Version in Google Search Console

The About New Version section in Google Search Console (GSC) provides
information on updates, new features, and changes introduced in the latest versions of GSC.
Over the years, Google has replaced many old features with improved versions, offering
better usability, insights, and automation.

This section helps users understand the differences between the old Search Console and the
new version, including:
 Newly added tools
 User interface (UI) updates
 Deprecated or removed features
 How to adapt to the changes

What is the "About New Version" Section?


The About New Version section acts as an update log for webmasters, showing what has
changed in the latest GSC updates.
Why It Exists?
 Google introduced the new Google Search Console (GSC) in 2018 with a modern
design and improved reports.
 Some legacy tools were removed, while others were replaced with better
alternatives.
 Google continues to enhance GSC features, and this section helps webmasters stay
informed.

Key Features of the New Version
(a) Enhanced Reports & Insights
 The new version provides 16 months of data (previously only 3 months).
 Improved Performance Reports with better filtering options.
 More detailed reports on indexing, mobile usability, and search enhancements.
(b) Improved Index Coverage Report
 Shows which pages are indexed and which have errors.
 Provides detailed reasons for indexing failures (e.g., crawl errors, blocked by
robots.txt, duplicate content).
 Suggests fixes for indexing issues.
(c) Better URL Inspection Tool
 Allows checking the live indexing status of a URL.
 Shows how Googlebot sees your page.
 Helps in identifying and fixing structured data errors.
(d) Simplified UI (User Interface)
 More user-friendly design with easy navigation.
 Mobile-friendly dashboard for managing sites on the go.
 Clearer reports and graphs with better readability.
(e) Actionable Fix Recommendations
 The new version suggests specific SEO fixes for:
 Indexing issues
 Mobile usability problems
 Security & manual actions
 Structured data errors

Differences Between the Old and New Search Console
 Data history: the new version keeps 16 months of data versus 3 months in the old one.
 Fetch as Google was replaced by the URL Inspection Tool.
 The Crawl Errors Report was merged into the Index Coverage Report.
Deprecated (Removed) Features in the New Version


 HTML Improvements Report → No longer needed because Google
automatically detects issues.
 Property Sets → Removed because Google Analytics integration is better.
 Blocked Resources Tool → No longer needed because indexing tools are
improved.
 Crawl Errors Report → Integrated into Index Coverage Report.
 Fetch as Google → Replaced by URL Inspection Tool.
 Old Sitemaps Report → Replaced by new Sitemaps Report.

International Targeting in Google Search Console

What is International Targeting?


The International Targeting tool in Google Search Console allows webmasters to
optimize their websites for specific countries and languages. It is particularly useful for
multilingual and multi-regional websites that want to ensure the correct content is
displayed for users in different locations.
There are two main aspects of international targeting:
1. Language Targeting – Uses hreflang tags to help Google serve the right language
version of a webpage to users.
2. Country Targeting – Allows webmasters to specify a preferred country for their
website in Google Search.
To access the International Targeting tool:
1. Open Google Search Console.
2. Navigate to Legacy Tools & Reports in the left-hand menu.
3. Click on International Targeting.
4. You will see two tabs:
o Language (for hreflang settings)
o Country (for selecting a target country)

What are Hreflang Tags?


Hreflang is an HTML attribute that tells search engines which language and regional
version of a page should be shown to users.
Example of Hreflang Implementation (for English and French pages):
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/" />

How Google Uses Hreflang Tags


 If a French user searches for content, Google will show the French version of the
webpage (https://example.com/fr/).
 If an English user searches for the same content, Google will show the English
version (https://example.com/en/).
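Each language version of a page should carry the full set of alternate tags, including a self-reference; an x-default entry covers users who match no listed locale. A small generator script (the URLs reuse the example.com placeholders above, and mapping x-default to the English version is an assumption for illustration):

```python
def hreflang_tags(base="https://example.com", locales=("en", "fr", "x-default")):
    """Build the <link rel="alternate"> tag set that every language
    version of the page should carry (including a self-reference)."""
    tags = []
    for code in locales:
        # Assumption: the x-default fallback points at the English version
        path = "en" if code == "x-default" else code
        tags.append(f'<link rel="alternate" hreflang="{code}" '
                    f'href="{base}/{path}/" />')
    return "\n".join(tags)

print(hreflang_tags())
```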

Hreflang Attributes & Their Meanings
 hreflang="en" – targets all English speakers, regardless of region.
 hreflang="en-US" – targets English speakers in the United States specifically.
 hreflang="x-default" – fallback version shown when no other language/region matches.
Check Hreflang Errors in Google Search Console


1. Open Google Search Console.
2. Go to Legacy Tools & Reports > International Targeting.
3. Click the Language tab.
4. Review the Errors & Warnings section.
5. Fix any incorrect hreflang implementations (e.g., missing return links, incorrect
language codes).
Common Hreflang Issues & Fixes
 Missing return links: every alternate page must link back to the others; add the reciprocal tags.
 Incorrect language codes: use valid ISO language codes (with optional region codes), e.g. "en", not "english".

Country Targeting
If your website primarily serves users from a specific country, you can set a target
country in Google Search Console.
How to Set a Target Country
1. Open Google Search Console.


2. Go to Legacy Tools & Reports > International Targeting.


3. Click the Country tab.
4. Check the "Target users in" box.
5. Select your preferred country from the dropdown list.
Example Scenario
 If your website primarily serves users in Canada, selecting "Canada" in the
Country Targeting tool signals to Google that the site is more relevant to Canadian
users.
When to Use Country Targeting
✅ Your business only serves a specific country (e.g., a local business).
✅ Your content is primarily intended for users from one region.
✅ You have a country-specific domain (e.g., .ca for Canada, .uk for the United
Kingdom).
When NOT to Use Country Targeting
❌ Your website serves a global audience.
❌ You use multiple languages for different countries (use hreflang instead).
❌ You have a generic top-level domain (TLD) like .com, .org, or .net (Google
automatically determines the audience).

Messages in Google Search Console


The Messages section in Google Search Console (GSC) is a critical feature that
provides important notifications and alerts about your website’s health, performance, and
security. These messages come directly from the Google Search team and serve as a way to
inform webmasters about indexing issues, penalties, security threats, and best practices.

Types of Messages in Google Search Console


Google sends various types of messages, which can be categorized as follows:
1. Indexing and Crawling Issues
These messages inform webmasters about problems preventing Google from indexing
or crawling their pages properly.
Common Messages:
 "Googlebot can't access your site" → Indicates Google’s crawler is blocked due to
server errors or robots.txt restrictions.
 "Pages not indexed due to a crawl issue" → Some pages couldn’t be indexed
because of errors.
 "Your website is experiencing an increase in 404 errors" → Too many broken
links can affect user experience and rankings.
 "Redirect errors detected" → Issues with incorrect or looping redirects.
2. Security Issues and Hacking Alerts
If Google detects malware, phishing, or hacked content on your site, it will send an
immediate alert.
Common Messages:
 "Hacked content detected on your site" → Google found hacked pages, potentially
harming your site’s ranking.
 "Malware detected on your site" → Indicates that your site has been infected with
malware.
 "Social engineering attack warning" → Your site contains deceptive content (e.g.,
fake login pages).
3. Manual Actions and Penalties


Google may issue a manual penalty if your site violates its guidelines. This can
result in lower rankings or deindexing from Google Search.
Common Messages:
 "A manual action has been applied to your site" → Your site has been penalized
for violating Google’s policies.
 "Unnatural links detected" → Spammy or paid links pointing to your site may lead
to ranking penalties.
 "Thin content with little or no added value" → Your content is considered low
quality or duplicate.
4. Mobile Usability Issues
Google sends these messages if it detects issues that make your site unfriendly to
mobile users.
Common Messages:
 "Text too small to read on mobile devices"
 "Clickable elements too close together"
 "Viewport not configured"
5. Performance and Ranking Alerts
These messages inform webmasters about changes in search traffic, structured data
issues, or ranking drops.
Common Messages:
 "Your site’s traffic has dropped significantly" → A sudden drop in visitors from
Google Search.
 "Errors in structured data markup detected" → Issues with schema markup (e.g.,
missing fields in rich snippets).
 "New Search Console features available" → Notifications about updates and new
tools.
6. URL Removal Requests and Indexing Requests
If you or someone from your team requests the removal of a URL, Google will send a
confirmation message.
Common Messages:
 "A request to remove URLs from Google Search has been processed"
 "Your indexing request has been approved/rejected"

URL Parameters in Google Search Console


The URL Parameters tool in Google Search Console (GSC) helps webmasters
manage how Googlebot crawls and indexes URLs that contain parameters.
What Are URL Parameters?
A URL parameter is a part of a web address (URL) that comes after a ? and modifies
how content is displayed or tracked. Parameters are commonly used for:
 Tracking (e.g., session IDs, UTM codes)
 Filtering and sorting (e.g., sorting products on an e-commerce site)
 Navigation (e.g., switching between pages of results)
Example URLs with parameters:
1. https://example.com/products?category=shoes → Category filter
2. https://example.com/products?sort=price_asc → Sorting by price
3. https://example.com/blog?page=2 → Pagination
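URLs like the three examples above can be inspected with Python's standard library; the split into "active" and "passive" parameters below uses an illustrative list of common tracking prefixes:

```python
from urllib.parse import urlparse, parse_qs

url = "https://example.com/products?category=shoes&sort=price_asc&utm_source=google"

# parse_qs maps each parameter name to a list of its values
params = parse_qs(urlparse(url).query)
print(params)

# Rough split into "active" (change content) vs "passive" (tracking) parameters
TRACKING_PREFIXES = ("utm_", "gclid", "fbclid")
passive = {k for k in params if k.startswith(TRACKING_PREFIXES)}
active = set(params) - passive
print("active:", sorted(active), "passive:", sorted(passive))
```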

Access the URL Parameters Tool


1. Open Google Search Console (https://search.google.com/search-console/).
2. Select your property (website) from the list.


3. In the left menu, click on Legacy Tools and Reports.


4. Select URL Parameters.
5. You will see a list of parameters detected by Google, or you can add new
parameters manually.

Types of URL Parameters


There are two main types of URL parameters:
1. Active Parameters (Modify Page Content) – These change what is displayed on a
page (e.g., filtering, sorting, pagination).
2. Passive Parameters (Tracking or Analytics) – These do not change content but are
used for tracking purposes (e.g., UTM codes).

Managing URL Parameters in Google Search Console


When adding a new URL parameter in GSC, Google asks:
1. What does this parameter do? (Purpose)
Google provides five options:
 Sorts → Orders content (e.g., sort=price_asc).
 Narrows → Filters content (e.g., category=shoes).
 Specifies → Determines a page version (e.g., color=red).
 Translates → Changes language (e.g., lang=en).
 Tracks → Used for tracking (e.g., utm_source=google).
2. How should Google handle URLs with this parameter?
You can instruct Googlebot to:
 Let Googlebot decide (Default) → Google tries to figure out if the parameter matters.
 No URLs → Tells Googlebot to ignore this parameter.
 Only URLs with specific values → Google only indexes selected values.
 Every URL → Google treats each variation as a separate page.
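Telling Google to ignore a parameter is similar in spirit to canonicalizing URLs yourself: keep content-changing parameters and drop tracking ones. A sketch (the ignored-parameter list is an assumption for illustration, not an official set):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical list of passive/tracking parameters to strip
IGNORED = {"utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid"}

def canonicalize(url):
    """Rebuild a URL without tracking parameters, keeping content-changing ones."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonicalize("https://example.com/products?category=shoes&utm_source=google"))
```

This collapses tracking variants of the same page onto a single URL, which is the outcome the "No URLs" setting aims for.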
Example Settings in GSC
 sort=price_asc → Purpose: Sorts → Let Googlebot decide.
 utm_source=google → Purpose: Tracks → No URLs (ignore).
 category=shoes → Purpose: Narrows → Every URL.

Web Tools in Google Search Console


The Web Tools section in Google Search Console (GSC) consists of various utilities
that help webmasters monitor and optimize their websites. These tools assist in improving
site performance, mobile-friendliness, structured data implementation, and page speed.
Although many of these tools are now available as standalone services, they are still listed
under Legacy Tools and Reports in GSC.

Types of Web Tools


1. Mobile-Friendly Test
2. Rich Results Test
3. AMP Test
4. PageSpeed Insights
5. Safe Browsing Report
6. Ad Experience Report
7. Abusive Experiences Report

1. Mobile-Friendly Test
The Mobile-Friendly Test checks whether a webpage is optimized for mobile
devices. Since Google uses mobile-first indexing, a mobile-friendly site improves SEO
rankings and user experience.
Features:
 Detects responsive design compatibility.
 Identifies mobile usability issues (e.g., small fonts, touch elements too close).
 Provides screenshot preview of how the page appears on mobile.
 Highlights page loading issues that affect mobile performance.
2. Rich Results Test
The Rich Results Test checks whether a webpage supports structured data for
enhanced search results. Structured data helps Google display rich snippets like star ratings,
FAQs, event details, recipes, etc.
Features:
 Tests Schema.org markup (e.g., JSON-LD, Microdata, RDFa).
 Identifies errors in structured data implementation.
 Shows how the page will appear in search results.
 Supports testing for Breadcrumbs, Reviews, FAQs, Products, Jobs, etc.
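As a concrete illustration, a minimal JSON-LD block for the FAQ rich result can be built with Python's json module; the question and answer text are placeholders:

```python
import json

# Minimal Schema.org FAQPage markup as a Python dict
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        "name": "What is SEO?",
        "acceptedAnswer": {
            "@type": "Answer",
            "text": "Search Engine Optimization improves a site's visibility in search results.",
        },
    }],
}

# Embed the output inside <script type="application/ld+json"> ... </script> in the page head
print(json.dumps(faq, indent=2))
```

Pasting the resulting JSON into the Rich Results Test shows whether Google can read it as an FAQ rich result.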
3. AMP Test (Accelerated Mobile Pages Test)
AMP (Accelerated Mobile Pages) is a framework designed to load web pages faster
on mobile devices. The AMP Test checks if a webpage is properly implemented with AMP
specifications.
Features:
 Validates AMP HTML structure.
 Detects errors in AMP implementation.
 Provides a preview of how the AMP page will appear in search results.
 Checks for AMP-specific issues (e.g., missing required tags, script errors).
4. PageSpeed Insights
The PageSpeed Insights (PSI) tool analyzes webpage loading speed and provides
performance improvement suggestions. It measures speed on both mobile and desktop
devices.
Features:
 Provides Core Web Vitals (Largest Contentful Paint, First Input Delay, Cumulative
Layout Shift).


 Analyzes server response time, render-blocking resources, and JavaScript execution.
 Offers optimization tips like image compression, caching, and lazy loading.
 Assigns a performance score based on speed and usability.
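Google publishes "good" thresholds for the Core Web Vitals (LCP ≤ 2.5 s, FID ≤ 100 ms, CLS ≤ 0.1), which a simple pass/fail check can encode:

```python
# Published "good" thresholds for the three Core Web Vitals
THRESHOLDS = {"lcp_s": 2.5, "fid_ms": 100, "cls": 0.1}

def passes_cwv(lcp_s, fid_ms, cls):
    """True only if all three metrics fall within the 'good' range."""
    return (lcp_s <= THRESHOLDS["lcp_s"]
            and fid_ms <= THRESHOLDS["fid_ms"]
            and cls <= THRESHOLDS["cls"])

print(passes_cwv(lcp_s=2.1, fid_ms=80, cls=0.05))  # all metrics good
print(passes_cwv(lcp_s=3.4, fid_ms=80, cls=0.05))  # fails: LCP too slow
```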
5. Safe Browsing Report
The Safe Browsing Report checks whether a site has security issues like malware,
phishing, or deceptive content.
Features:
 Scans for hacked content, phishing attacks, and malware infections.
 Sends alerts if Google detects security threats on your site.
 Provides recommendations to clean up hacked sites.
6. Ad Experience Report
The Ad Experience Report evaluates if a website follows Google’s Better Ads
Standards. Websites that violate these guidelines risk losing ad revenue.
Features:
 Detects intrusive or disruptive ads (e.g., pop-ups, auto-play videos).
 Provides recommendations to improve ad quality.
 Ensures compliance with Google Ads policies.
7. Abusive Experiences Report
The Abusive Experiences Report detects elements that mislead users, such as fake
system warnings, deceptive buttons, or unexpected redirects.
Features:
 Flags misleading or abusive website elements.
 Helps websites comply with Google’s security standards.
 Provides insights on user-friendly website practices.
