

GSoC 2020 Proposal: Evaluate WebdriverIO replacements for Wikimedia's browser automation framework
Closed, DeclinedPublic

Description

Profile Information

Name: Bello Gbadebo
Email: gbahdeybohbello@gmail.com
IRC nick: Gbahdeyboh
Resume: Download Resume
Location: Lagos, Nigeria
Typical working hours: Between 6pm and 2am WAT(UTC+01:00)

Synopsis

Wikimedia has a number of projects that are tested regularly, either via Jenkins or as part of continuous integration on each check-in to Gerrit. Some of these tests are fully automated using a browser automation framework: Wikimedia currently tests its repositories with a framework built on top of WebdriverIO and Selenium, which runs tests over the WebDriver protocol. While WebdriverIO is a capable tool, there are useful alternatives that provide features it either lacks entirely or implements less efficiently, and that are also easier to set up and work with.

It is therefore worth considering a switch to another tool. Currently, the two most promising alternatives are Puppeteer and Cypress.
Puppeteer is a high-level browser automation API built on top of the Chrome DevTools Protocol, while Cypress is a front-end end-to-end testing framework.

browser-testing-frameworks.png (672×922 px, 105 KB)

The chart above compares the usage of Puppeteer, Cypress, and WebdriverIO, and suggests that Puppeteer and Cypress are viable alternatives to WebdriverIO.

I aim to evaluate Puppeteer and Cypress extensively by integrating both of them deeply with some of Wikimedia's repositories. The evaluation will take into account several factors to help determine which is the better replacement for WebdriverIO and how much work would be required to migrate all repositories to the selected tool.

The following are some of the cons of WebdriverIO:

Difficult to set up

WebdriverIO is difficult to set up: the current setup requires installing a separate browser driver for each browser the tests will run against, in addition to installing WebdriverIO itself. Tools like Puppeteer and Cypress, by contrast, can be installed with a single command.

Difficult to upgrade

When a new version of WebdriverIO is released, migrating to it is often a difficult process, because WebdriverIO does not keep pace well with updates in Selenium.

Device Emulation

Puppeteer and Cypress allow the emulation of a range of mobile devices, including tablets, Android phones, and iPhones, and they handle device emulation differently from WebdriverIO. With WebdriverIO, you simply resize the browser window to the dimensions of the device you want to emulate. This approach has problems: some webpage functionality may depend on the device's OS or user agent rather than its screen size, and in such scenarios WebdriverIO handles device-emulated tests poorly.
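As a sketch of the difference: a Puppeteer device profile (e.g. `puppeteer.devices['iPhone X']`) bundles both a viewport and a user agent, whereas a bare resize only changes the viewport. The `applyDevice` and `resizeOnly` helpers below are hypothetical stand-ins (not real library APIs) that run against plain objects, so the example works without launching a browser; the iPhone X values are approximate.

```javascript
// Illustrative device descriptor, modeled on Puppeteer's built-in
// puppeteer.devices['iPhone X'] entry (values are approximate).
const iPhoneX = {
  name: 'iPhone X',
  userAgent:
    'Mozilla/5.0 (iPhone; CPU iPhone OS 11_0 like Mac OS X) ' +
    'AppleWebKit/604.1.38 (KHTML, like Gecko) Version/11.0 Mobile/15A372 Safari/604.1',
  viewport: { width: 375, height: 812, isMobile: true, hasTouch: true, deviceScaleFactor: 3 },
};

// Hypothetical helper mirroring what page.emulate(device) does: it sets
// BOTH the viewport and the user agent, so OS-dependent behaviour
// (user-agent sniffing) is exercised, not just layout.
function applyDevice(page, device) {
  page.viewport = { ...device.viewport };
  page.userAgent = device.userAgent;
  return page;
}

// A bare "resize" (the WebdriverIO-style approach described above)
// only touches the viewport, leaving the desktop user agent in place.
function resizeOnly(page, width, height) {
  page.viewport = { width, height };
  return page;
}

const emulated = applyDevice({}, iPhoneX);
const resized = resizeOnly({ userAgent: 'desktop-UA' }, 375, 812);
console.log(emulated.userAgent.includes('iPhone')); // true
console.log(resized.userAgent.includes('iPhone'));  // false
```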

Access to Chrome Dev Tools

Puppeteer and Cypress by default have full access to and control over the Chrome DevTools functionality, which is very useful when writing tests: it gives access to features such as network throttling, code coverage, and the browser console. WebdriverIO, however, needs to use Puppeteer under the hood to access Chrome DevTools.
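To illustrate what DevTools access looks like in practice, the sketch below builds a network-throttling command for the Chrome DevTools Protocol. `Network.emulateNetworkConditions` is a real CDP command, but the session object here is a stub that only records what would be sent, so the example runs without a browser; the "Slow 3G" numbers are illustrative, not an official preset.

```javascript
// Stub CDP session: records calls instead of talking to a browser.
function makeStubCdpSession() {
  const sent = [];
  return {
    sent,
    send(method, params) { sent.push({ method, params }); },
  };
}

// Rough "Slow 3G" parameters (bytes per second and milliseconds).
function throttleToSlow3G(session) {
  session.send('Network.emulateNetworkConditions', {
    offline: false,
    latency: 400,                        // added round-trip latency, ms
    downloadThroughput: 500 * 1024 / 8,  // ~500 kbit/s down
    uploadThroughput: 500 * 1024 / 8,    // ~500 kbit/s up
  });
}

const session = makeStubCdpSession();
throttleToSlow3G(session);
console.log(session.sent[0].method); // 'Network.emulateNetworkConditions'
```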

Simpler JavaScript Execution

Puppeteer makes it very easy to run JavaScript code within the browser or the current page context. This feature is very useful and comes in handy when testing.
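A minimal sketch of that pattern: Puppeteer's `page.evaluate(fn, ...args)` runs `fn` inside the page and returns its serialisable result. The `fakePage` below is a stub with a plain object standing in for the page's `document`, so the sketch runs without a browser; note that the real `page.evaluate` is async and returns a Promise, while this stub is synchronous purely to stay self-contained.

```javascript
// Stub of Puppeteer-style in-page JavaScript execution.
const fakePage = {
  document: { title: 'MediaWiki' },
  // Real page.evaluate serialises fn and runs it in the browser context;
  // here we simply call it against the stubbed document.
  evaluate(fn, ...args) {
    return fn(this.document, ...args);
  },
};

// e.g. assert on the page title from "inside" the page context.
const title = fakePage.evaluate(doc => doc.title);
console.log(title); // 'MediaWiki'
```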

Mentor(s): @zeljkofilipin, @Jpita
Have you contacted your mentor already? Yes, I have already contacted them.

Deliverables

  • Getting familiar with the test codebases of the Wikimedia projects I'll be working on.
  • Working with mentors to define what needs to be tested on each project.
  • Re-writing automated tests for some of the Wikimedia projects as part of the evaluation.
  • Providing well-detailed documentation for these tests.
  • Working with mentors to define standards for the tests.
  • Working with mentors to deploy these tests to various platforms, e.g. Jenkins and other CI/CD systems.
  • Giving the pros and cons of each framework the tests were written in.
Phase I evaluation
  • Understand the workflow of MediaWiki and run manual tests on it.
  • Identify what needs to be tested, and new tests that need to be added that weren't previously implemented with WebdriverIO
  • Document the tests that were identified for future reference
  • Re-write all of MediaWiki's tests in Puppeteer
  • Evaluate the performance of the Puppeteer tests critically with the help of mentors
  • Document and analyze the evaluation results
Phase II evaluation
  • Re-write all of MediaWiki's tests in Cypress
  • Evaluate the performance of the Cypress tests critically with the guidance of mentors
  • Document and analyze the evaluation results
  • Understand the workflow of the MobileFrontend extension repository
  • Re-write all MobileFrontend extension tests in Puppeteer
Final evaluation
  • Evaluate the performance of the MobileFrontend tests with Puppeteer critically
  • Document and analyze the evaluation results with the guidance of mentors
  • Re-write all MobileFrontend extension tests in Cypress
  • Evaluate the performance of the tests with Cypress critically
  • Document and analyze the evaluation results
  • Collate all of the evaluation results from Cypress and Puppeteer
  • Compare the evaluation results and draw conclusions
  • Document these conclusions and findings
  • Study and evaluate the test coverage of other repositories relative to MediaWiki (which is the largest)
  • Use the effort required to write MediaWiki's tests in Puppeteer and Cypress as a benchmark to estimate how much work would be needed to write tests for the other repositories, based on their test coverage relative to MediaWiki.
  • Document the overall results of the evaluation.
  • Submit a final report

Timeline

May 4 - May 31

  • Community bonding
  • Familiarizing myself with Wikimedia codebase and their deployment and testing processes
  • Learning more advanced concepts in Puppeteer and Cypress that would be useful for the projects

June 1 - June 6

  • Understand the workflow of MediaWiki and run manual tests on it.
  • Identify what needs to be tested, and new tests that need to be added that weren't previously implemented with WebdriverIO
  • Document the tests that were identified for future reference

June 7 - June 13

  • Start re-writing MediaWiki's tests in Puppeteer

June 14 - June 20

  • Continue re-writing MediaWiki's tests in Puppeteer

June 21 - 29

  • Evaluate the performance of the Puppeteer tests critically with the help of mentors
  • Document and analyze the evaluation results
  • Write a documentation for the Puppeteer tests

June 29 - July 3

  • First phase evaluation

July 4 - July 10

  • Start re-writing MediaWiki's tests in Cypress

July 11 - 17

  • Continue re-writing MediaWiki's tests in Cypress

July 18 - 24

  • Evaluate the performance of the Cypress tests critically with the guidance of mentors
  • Document and analyze the evaluation results

July 24 - 31

  • Second phase evaluation

August 1 - 7

  • Start writing MobileFrontend extension tests in Puppeteer
  • Evaluate the performance of the MobileFrontend tests with Puppeteer critically
  • Document and analyze the evaluation results with the guidance of mentors

August 7 - 13

  • Start writing MobileFrontend extension tests in Cypress
  • Evaluate the performance of the tests with Cypress critically
  • Document and analyze the evaluation results

August 13 - 15

  • Collate all of the evaluation results from Cypress and Puppeteer

August 16 - 23

  • Compare the overall evaluation results and draw conclusions
  • Document these conclusions and findings
  • Study and evaluate the test coverage of other repositories relative to MediaWiki (which is the largest)
  • Use the effort required to write MediaWiki's tests in Puppeteer and Cypress as a benchmark to estimate how much work would be needed to write tests for the other repositories, based on their test coverage relative to MediaWiki.

August 24 - 31

  • Document the overall results of the evaluation.
  • Submit a final report
  • Final evaluation phase

After GSoC

  • Be in close contact with mentors regarding the continuation of the project
  • Identify a replacement tool with the help of mentors based on the evaluation carried out
  • Re-write all of Wikimedia's repository automated tests in the selected testing framework.
  • Deploy the tests with the help of mentors
  • Keep maintaining the tests across all repositories

Participation

  • I will be online on Zulip during my working hours (6:00 pm to 2:00 am WAT (UTC+01:00)).
  • I will use Phabricator for managing bugs.
  • I will be easily reachable via my Gmail during my non-working hours.
  • I will publish my weekly reports as tasks on Phabricator.
  • All my code will be pushed to a new branch on Gerrit.

About Me

I'm currently in my 3rd year of a B.Tech in Electrical and Electronics Engineering at the Federal University of Agriculture, Abeokuta.
I heard about GSoC from a friend and found a project that interests me on Wikimedia's Project Ideas page.
I'm eligible to participate in GSoC, and I believe the program would help me kick-start my career in open-source development.
I'm a very enthusiastic person, and I'm usually very excited when learning something new or improving on something I already know. I've had some past experience with Puppeteer, and I feel this project could improve my automation testing skills and let me write code that actually gets used by a lot of people.

Past Experience

  • I wrote a crawler that scrapes similar data off of 8 different websites, wrangles and correlates the data, and serves it over a RESTful API (Node.js, Puppeteer).
  • I contributed to the Open Collective API and fixed an issue they had with rounding monetary amounts here
  • I built a simple tool that does a deep comparison of objects in JavaScript here
Micro Tasks Completed

Event Timeline

Hi @Aklapper, I'm sorry I sent in an incomplete proposal. I was still typing in a draft and mistakenly submitted it.

I'll edit and send in a complete application soon.

Gbahdeyboh renamed this task from "Evaluate Puppeteer as a WebdriverIO replacement for wikimedia's browser automation framework" to "GSoC 2020 Proposal: Evaluate WebdriverIO replacements for Wikimedia's browser automation framework". Mar 29 2020, 10:27 AM
Gbahdeyboh updated the task description.

Hi @zeljkofilipin @Jpita and @Aklapper.

I just finished writing my GSoC proposal. Could you kindly review it and give feedback on what I wrote? I would really appreciate it. Thanks.

Pavithraes subscribed.

@Gbahdeyboh We are sorry to say that we could not allocate a slot for you this time. Please do not consider the rejection to be an assessment of your proposal. We received over 100 quality applications, and we could only accept 14 students. We were not able to give a slot to every applicant who would have deserved one, and these were some very tough decisions to make. Please know that you are still a valued member of our community and we by no means want to exclude you. Many students who we did not accept in 2019 have become Wikimedia maintainers, contractors, and even GSoC students and mentors this year!

If you would like a de-brief on why your proposal was not accepted, please let me know as a reply to this comment or on the 'Feedback on Proposals' topic of the Zulip stream #gsoc20-outreachy20. I will respond to you within a week or so. :)

Your ideas and contributions to our projects are still welcome! As a next step, you could consider finishing up any pending pull requests or inform us that someone has to take them over. Here is the recommended place for you to get started as a newcomer: https://www.mediawiki.org/wiki/New_Developers.

If you would still be eligible for GSoC next year, we look forward to your participation!