A - B Testing
Uploaded by janmeshsingh02

Assignment 1: A/B Testing Framework Design for landing page of a website

1. What is A/B testing?


● A/B testing is a framework that helps business owners optimize a website's
performance by comparing two variations of a web page on real visitors to find
the better-performing variant.

2. Why A/B testing?


● Suppose an eCommerce website is not making sufficient sales because of
complicated navigation. With A/B testing, one can try alternative navigation
methods and see whether they make a measurable difference.

3. How to setup an A/B test?


● There are various steps involved in setting up an A/B test:
1. Defining objectives and goals: The foundational first step in
developing a landing page A/B testing framework is establishing a clearly
defined conversion objective. Essentially, one must identify the desired
outcome before proceeding. This is vital as it provides a precise direction for
designing and refining the landing page in line with the objectives one has
established.
For example, the objective could be to increase the CTR (click-through rate) on
the landing page.
2. Draft Hypothesis: A hypothesis is an assumption that serves as a proposed
solution or explanation for a given problem.
For example: changing the color of the call-to-action (CTA) button to red will
lead to higher CTRs compared to the current blue button. This hypothesis
comprises two key variables: the cause (the action we want to test) and the
effect (the anticipated outcome).
3. Select Variants to Test: A variant is another version of the current landing
page that we want to test on the basis of the hypothesis. It is important to
select a variant that matches the hypothesis so that one can find what works
best for the web page.
For example: the current landing page has a blue CTA button; the variant is
another landing page version with a red CTA button.
4. Implement Testing Infrastructure: One can use tools like VWO, Optimizely,
or AB Tasty to set up the testing environment and run the tests. To set up the
analytics, one needs to integrate the testing tool's code snippet into the
landing page HTML for proper tracking of data.
5. Set Up the Test: One can either randomly assign the traffic division
(50% to the current page and 50% to Variant A) or specifically define these
divisions to avoid sampling bias. Note: it is important to consider factors
like seasonality, where fluctuations in traffic due to the holiday season
might distort the analysis, and SUTVA (the Stable Unit Treatment Value
Assumption), under which changes shown to Variant A's audience should not
affect the current page's audience. Setting the test duration is also an
important step; the duration of an A/B test depends on factors such as sample
size, statistical significance, and site traffic.
6. Collect Data: Tools like heatmaps, Google Analytics, or session
recordings may provide deeper insights into user behavior. These tools
analyze various metrics and data collected from users.
For example: measuring click-through rates, conversion rates, bounce rates,
and other relevant metrics for both variants.
7. Analyze Results: Use statistical analysis to compare the performance of
the variants and determine whether any observed differences are statistically
significant. One needs to determine whether the observed increase in CTRs for
Variant A is statistically significant and not due to chance.
8. Determine Success: Evaluate the results of the A/B test to determine
whether any variant outperformed the others in achieving the defined goals.
For example: if Variant A (red button) shows a statistically significant
increase in CTRs compared to the current variant (blue button), consider it a
successful outcome. If there's no significant difference, or if Variant A
performs worse, retain the original design or iterate with further variations.
9. Iterate or Implement: Based on the results of the A/B test, decide whether
to implement the changes permanently on the landing page or iterate further
on the winning variant. Note: one should always document the analysis from
the test to inform future optimization efforts.
For example: if Variant A is successful, implement the changes permanently on
the landing page. If Variant A isn't successful, analyze the results to
understand why, and consider testing other hypotheses or iterations.
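
The traffic-splitting part of step 5 can be sketched in a few lines of Python. This is a minimal illustration, not a production experiment framework; the function and experiment names are hypothetical. Hash-based bucketing ensures a returning visitor always sees the same version.

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "landing-page-cta") -> str:
    """Deterministically bucket a visitor into the control or Variant A.

    Hashing the user id together with the experiment name yields a
    stable, roughly uniform 50/50 split, so a returning visitor is
    always shown the same version of the page.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a number in 0..99
    return "variant_a" if bucket < 50 else "control"

# A visitor always lands in the same bucket on repeat visits:
assert assign_variant("visitor-123") == assign_variant("visitor-123")
```

Deterministic assignment also helps with SUTVA: each user is exposed to exactly one treatment for the lifetime of the test.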

4. What are the Key metrics to optimize a landing page?


● The key metrics are as follows:
1. CTR: Click-through rate is the percentage of clicks on a specific link
compared to the total number of times the link was shown (impressions).
Optimizing the placement and visibility of clickable elements on the landing
page helps increase the CTR, for example a search bar placed at the top of
the landing page.
CTR = (Clicks / Impressions) x 100
2. Bounce rate: Bounce rate represents the percentage of visitors who enter
the website but quickly leave without taking any action, such as interacting
with elements or clicking a link. These instances are called single-page
sessions.
3. Conversion rate: Conversion rate is the percentage of users who take a
desired action on the website, such as signing up for a service or buying a
product.
Conversion rate = (Number of conversions / Total number of visitors) x 100
4. Scroll depth: Scroll depth measures how far down a web page a user scrolls,
revealing a page’s most engaging parts and its drop-off points.
5. Abandonment rate: Abandonment rate refers to the percentage of tasks users
start but don't complete, such as adding an item to an online shopping cart
but not purchasing, or filling in a job application form but not submitting it.
Cart abandonment rate = (Number of carts abandoned / Number of carts created)
x 100. Note: simplifying the checkout process or reducing form fields may
help decrease the abandonment rate.
6. ASD: Average session duration refers to the time a user spends on a website
during a single visit. It measures each user's session from the moment they
enter the site until they leave or become inactive. Providing engaging
content helps increase the average session duration.
7. Average order value (AOV): It is the average amount a customer spends
during a single purchase on a website. It's an important metric for ecommerce
brands as modifying user experience may encourage customers to spend
more.
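
The percentage formulas above translate directly into code. A minimal Python sketch, using made-up daily numbers for a landing page:

```python
def ctr(clicks: int, impressions: int) -> float:
    """CTR = (Clicks / Impressions) x 100."""
    return clicks / impressions * 100

def conversion_rate(conversions: int, visitors: int) -> float:
    """Conversion rate = (Conversions / Total visitors) x 100."""
    return conversions / visitors * 100

def cart_abandonment_rate(abandoned: int, carts_created: int) -> float:
    """Cart abandonment rate = (Carts abandoned / Carts created) x 100."""
    return abandoned / carts_created * 100

def average_order_value(revenue: float, orders: int) -> float:
    """AOV = total revenue / number of orders."""
    return revenue / orders

# Hypothetical numbers for one day of landing-page traffic:
print(ctr(250, 10_000))                 # 2.5
print(conversion_rate(120, 10_000))     # 1.2
print(cart_abandonment_rate(300, 450))  # ~66.7
print(average_order_value(9_000, 120))  # 75.0
```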
5. How to determine the success of variants?
● To determine the success of the variants, follow these steps:
1. Check for statistical significance and winning variant: Determine if the
observed differences in key metrics between variants are statistically
significant. This ensures that any observed improvements or declines are not
due to random chance.
For example: if Variant A shows a 20% increase in conversion rate compared to
the current variant, conduct statistical tests (e.g., t-tests, chi-square
tests) to assess whether this difference is statistically significant at a
chosen confidence level.
2. Business Impact: Assess the practical significance and business impact of
observed changes in metrics. Consider factors such as revenue generated,
customer acquisition costs, and overall return on investment (ROI).
For example: while Variant A may show a statistically significant increase in
conversion rates, evaluate whether this translates into tangible business
outcomes such as higher sales revenue or improved customer lifetime value (LTV).
3. Consistency Across Segments: Ensure that observed improvements or
declines in metrics are consistent across different user segments or traffic
sources. Variants that perform well universally are more likely to be
successful.
For example: compare the performance of Variant A and the current variant
across different demographic groups (e.g., age, location) or traffic channels
(e.g., organic search, paid advertising) to identify any segment-specific
trends or anomalies.
4. Qualitative Feedback: Gather qualitative feedback from users through
surveys to complement quantitative data. This provides insights into users'
perceptions, preferences, and pain points.
For example: conduct user surveys to understand why certain variants resonate
more with users, or use heatmaps to visualize user interaction patterns and
identify areas of friction on the landing page.
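
The significance check in point 1 can be illustrated with a two-proportion z-test, one common choice for comparing conversion rates between two variants (testing tools normally report this for you; the counts below are invented):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for comparing conversion rates.

    Returns (z, two-sided p-value) using the pooled standard error
    and the normal CDF via math.erf.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant A: 600 conversions from 10,000 visitors (6.0%)
# Current:   500 conversions from 10,000 visitors (5.0%)
z, p = two_proportion_ztest(600, 10_000, 500, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Difference is statistically significant at the 95% level")
```

With these made-up counts the p-value falls well below 0.05, so the 1-point lift would not be attributed to chance.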

6. What are the tools and technologies used to implement and manage A/B testing?
● Google Optimize was considered one of the best A/B testing tools but was
discontinued in September 2023. However, various other tools offer strong
features for conducting A/B tests.
1. A/B testing tools: VWO, Optimizely, AB Tasty
Suggested: VWO for large-scale enterprises, Optimizely for SMEs
Rationale: Both provide testing options with advanced targeting, including
heatmaps, multivariate testing, and behavioral targeting.
Extra: OptiMonk's A/B testing tool incorporates AI: one only needs to select
what to optimize on the landing page, and the AI decides the test variants,
duration, etc. itself, largely automating the A/B testing process.
2. Web Analytics Tools: Google Analytics is one of the most widely used
analytics tools.
Alternatives: Mixpanel, Kissmetrics.
3. User Feedback Tools: Hotjar
Rationale: Hotjar lets companies gain insights into how users behave on their
website through heatmaps, recordings, and surveys. It helps discover which
parts of the website get the most attention.
Alternatives: Podium, Canny

7. Let’s take a scenario of an ecommerce landing page to understand the process.

Scenario:
An eCommerce website is launching a new line of shoes and wants to optimize the landing
page for better conversion rates.

Objective:
To determine if changing the layout of product images on the landing page affects user
engagement and conversion rates.

Steps:

1. Hypothesis: Changing the layout of product images from a grid view to a carousel
view will increase user engagement and conversion rates.

2. Test Setup with VWO: Use VWO to set up the A/B test by creating two variants of the
landing page:
Variant A: Landing page with a grid view layout displaying multiple product images at
once.
Variant B: Landing page with a carousel view layout showcasing product images one
at a time.
Randomly assign visitors to each variant (50-50) to ensure unbiased results.

3. Data Collection with Google Analytics: Set up goals in Google Analytics to track
conversions, such as adding products to the cart or completing purchases.
Monitor key metrics including conversion rate, bounce rate, and average session
duration for each variant.

4. User Behavior Analysis with Hotjar: Use Hotjar to create heatmaps and session
recordings of user interactions on both variants.
Analyze user behavior to identify any patterns or areas of interest, such as where
users are clicking or scrolling.

5. Run the Test: Launch the A/B test and monitor its progress over a predetermined
timeframe (2 weeks), ensuring sufficient sample size for statistical significance.
6. Data Analysis: Use VWO's built-in reporting tools to analyze the results of the A/B
test, comparing conversion rates between Variant A and Variant B.
Verify statistical significance using VWO's statistical analysis features or external
tools like Google Analytics.

7. Results Interpretation: If Variant B (carousel view) shows a statistically
significant increase in conversion rates compared to Variant A (grid view),
consider it a successful outcome. If there's no significant difference or if
Variant B performs worse, retain the original design or iterate further on
the winning variant.

8. Implementation: If Variant B is successful, implement the changes
permanently on the landing page based on the insights gained from the A/B test.
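
As a rough sanity check on the predetermined two-week timeframe in step 5, the required sample size per variant can be estimated with the standard two-proportion formula. This sketch hardcodes a 95% confidence level (z = 1.96) and 80% power (z = 0.84); the baseline rate and target lift are assumptions for illustration:

```python
import math

def sample_size_per_variant(p_base: float, lift: float) -> int:
    """Per-variant sample size needed to detect an absolute `lift`
    over a baseline conversion rate `p_base`, assuming a 95%
    confidence level (z = 1.96) and 80% power (z = 0.84).
    """
    z_alpha, z_beta = 1.96, 0.84
    p_var = p_base + lift
    variance = p_base * (1 - p_base) + p_var * (1 - p_var)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / lift ** 2)

# Detecting a 1-point lift over a 5% baseline (5% -> 6%):
needed = sample_size_per_variant(0.05, 0.01)
print(needed)  # visitors required in EACH variant
```

Dividing the result by the page's daily traffic per variant gives a quick estimate of whether two weeks is actually enough to reach significance.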

8. Sources and References


1. https://www.youtube.com/watch?v=eiIhTbFP0ls
2. https://www.linkedin.com/blog/engineering/ab-testing-experimentation/detecting-interference-an-a-b-test-of-a-b-tests
3. https://vwo.com/testing/web/
4. https://www.hotjar.com/ab-testing/metrics/
5. https://www.apexure.com/blog/landing-page-ab-testing-framework-to-maximize-conversions
6. https://userpilot.com/blog/ab-testing-metrics/
7. https://www.webfx.com/digital-marketing/learn/landing-page-ab-testing-tips/
8. https://www.leadpages.com/blog/ab-testing-split-testing
9. https://docs.prepr.io/ab-testing
10. https://www.figpii.com/blog/how-to-setup-and-run-an-a-b-test-a-step-by-step-guide/
11. https://linkdoctor.io/landing-page-testing-tools/
12. https://www.optimonk.com/analyze-a-b-test-results/
13. https://blog.hubspot.com/service/customer-feedback-tool
14. https://goodui.org/
15. Kohavi, Ron, Deng, Alex, Longbotham, Roger & Xu, Ya (2014). Seven Rules of
Thumb for Web Site Experimenters. Proceedings of the ACM SIGKDD International
Conference on Knowledge Discovery and Data Mining. doi:10.1145/2623330.2623341.
