This document outlines a preliminary project plan for the W3C QA Manager position. It also identifies the stakeholders, their needs, and a set of tasks for the QA Manager to kick-start the project. I’ve assumed that the project will run for approximately 2 years. Some aspects may require additional funding from the W3C or its membership.
This is a draft/personal proposal for review by the general public. It has no official standing and is not endorsed by the W3C.
Having spoken to various stakeholders and W3C members, I’d like to propose that the QA manager have the following responsibilities (these cover the responsibilities listed on the W3C's description of the position, but go into a bit more detail on each):
- Gather an understanding of the requirements of the various industry sectors involved at the W3C, particularly the telecommunications, entertainment/media, social media, and gaming industries.
- Coordinate the allocation of resources to create test suites (or similar applications) that meet the testing and conformance requirements of the various industry sectors involved at the W3C.
- Collaborate with the appropriate working groups to ensure that test suites have enough coverage to achieve technological interoperability and satisfy W3C Process requirements. The aim is to make sure these specifications can progress to “Recommendation” status in a timely manner.
- Where possible, push to harmonize the format, quality, and means of accessing tests that pertain to the Web Platform (i.e., those technologies that come under the HTML5 umbrella).
- Manage improvements to existing testing tools – particularly testharness.js and the Web IDL harness.
- Leverage existing infrastructure (within and outside the W3C) to make tests readily accessible to the W3C membership and the public at large.
- Track testing efforts across working groups and report to stakeholders on progress. Where progress has stalled, find ways to get things moving again.
- Establish an active, informed testing community across the W3C membership, in which the public can also participate.
- Document the location of test suites, as well as relevant details pertaining to those test suites. Where possible, encourage Editors to provide pointers to test suites in their specifications, along with relevant markers of stability.
Given the set of responsibilities, the QA Manager will need to coordinate with the following stakeholders:
- Industry representatives, particularly those involved in:
- the Core Mob CG.
- the Web and TV IG.
- the Tools and Tests WG.
- Browser vendors.
- Developers of testharness.js and Web IDL Harness.
- Coordinators of Test the Web Forward.
- W3C management and team.
- W3C Chairs, editors, and community at large.
By coordinating tasks across all these stakeholders, the W3C can create a complete quality assurance solution to meet their needs.
Broadly speaking, the needs of these stakeholders are as follows:
- Have a documented shared understanding of where we want to be in the short, medium, and long term (i.e., to fit into the HTMLWG’s 2014 Plan).
- Have test suites that over time can result in interoperable implementations of Web standards.
- Secure IPR commitments for documents reaching Recommendation status.
- Put in place resources and infrastructure that allow the membership to collaborate effectively on QA-related matters.
- Establish services or tangible deliverables (e.g., downloadable/modular test suites) that allow industry to assure the quality of their own products and services.
To meet the needs of stakeholders, the following section describes an initial set of tasks that need to be undertaken by the QA manager.
Each industry has its own acceptance testing that is performed to ascertain if a product is ready to go to market. In order to better service major industry sectors involved at the W3C, it is important to understand and document how these industries undertake their testing. Having that knowledge can potentially help the W3C better serve its membership in various industry sectors.
Task: Meet with various industry representatives to understand and document their requirements. If possible, gain access to their current testing frameworks to get a technical understanding of how they currently operate.
As is evident from the number of participants in the Core Mob Community Group, there is a strong desire to rapidly achieve a standardized and competitive Web Platform. The value of Core Mob is that it contains an active community of industry players, including hardware manufacturers, mobile network operators, news media publishers, browser vendors, game publishers, web developers, and representatives from large e-commerce websites.
Unfortunately, due to various factors, Core Mob has made limited progress towards meeting its goals. As a consequence, the Core Mob CG has now shifted towards creating scenario/persona-based use cases and requirements. This is a positive step forward, but it does not yet cover the needs of the industries currently represented in the community group. This gives an opportunity for the QA Manager to steer the group towards defining a set of scenarios that meet the needs of the represented industries.
If consensus can be reached in the group, the Core Mob CG could define actual applications that would serve as test suites. With appropriate funding, these applications could be professionally developed. This approach is akin to Acid Test-style testing – but unlike the Acid tests, the applications would be created to specifically exercise certain capabilities under realistic conditions, as opposed to focusing on edge cases.
Note that testing scenarios would complement traditional test suites, not supersede them.
Task: Coordinate with the Core Mob CG to define usage scenarios and create a set of applications that meet the testing needs of industry (i.e., make sure key industries are represented and that they are satisfied with the requirements).
Task: Manage the creation of test suites or applications that can meet industry requirements. Depending on the applications that the group would build, the W3C should seek approximately $20,000 per application (it will probably need 4-5 distinct applications to cover all the requirements). A budget can be drafted once we have more information and consensus to move forward with the project.
### Web and TV IG

The Web and TV IG recently formed a testing Task Force (TF) that aims to:
- reach out to industry to understand where tests are currently lacking or inconsistent, and which test suites need to be prioritized.
- gather requirements to enable a single point of access from where one or many test suites can be run.
- explore how different devices can be tested in a way that meets industry requirements.
See also the presentation from TPAC 2012.
Task: Help the Web and TV IG testing task force with its outreach to industry members.
Task: Help with requirements gathering and documentation.
A critical component that is being developed by the Tools and Tests WG is Web Driver. Web Driver facilitates automation of dynamic data and user interaction in Web applications. This piece of technology is critical for testing parts of the Web platform that cannot otherwise be tested without human intervention.
Additionally, the Tools and Tests WG contains subject matter experts with significant experience in testing browser-based software on a large scale (e.g., Wilhelm Joys Andersen, formerly of Opera Software). Given their experience, it would be helpful to have such individuals involved throughout the lifetime of this project.
Task: Investigate and document the limitations of Web Driver.
Task: Speak to various industry representatives about the feasibility of using Web Driver on various devices and platforms (e.g., on TVs and mobile).
Task: Track progress on Web Driver and reach out to any browser vendor not yet fully supporting the standard.
Task: Discuss infrastructure requirements, as well as general QA issues, with key members of the working group. In particular, work out the pros and cons of the current testharness.js approach.
Browser vendors contribute the majority of tests that make up the test suites at the W3C. As such, the QA Manager will need to liaise with browser vendors to source the majority of tests for the Web Platform.
Task: Identify and co-ordinate with key QA staff working for browser vendors to source tests. Where possible, and under confidentiality if necessary, work with those individuals to find ways to reduce duplication (and to source the test suites once browser vendors are ready to hand them over).
Task: Once test suites are sourced, coordinate their review and cleanup (i.e., remove any vendor-specific material and identify gaps).
Task: Communicate and promote the sourcing of a test suite with the rest of the W3C community.
Task: Where no browser vendor is working on test suites, seek funding from industry to allow other resources to create a test suite.
testharness.js is rapidly becoming the testing framework of choice for those involved with HTML5. As such, this resource must be managed in a way that meets the needs of stakeholders (currently, it’s primarily serving the needs of browser vendors).
Another important side project that builds on testharness.js is IDL Harness. This application converts Web IDL fragments into testharness.js compatible tests automatically. Effectively, IDL harness can generate thousands of tests thus providing more complete coverage of specifications – while saving potentially hundreds of hours in manual test creation and verification. IDL harness was developed by Aryeh Gregor, but unfortunately, it is not yet complete and its development needs to be managed (perhaps even funded).
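To give a sense of the style of testing involved, here is a minimal sketch of a testharness.js-style test. In a browser, `test()` and `assert_equals()` are provided by testharness.js itself; the small stand-ins below are included only so the sketch runs standalone and are not part of the real harness.

```javascript
// Stand-ins for the real testharness.js functions, only so this
// sketch runs outside a browser. Real tests load testharness.js
// (and testharnessreport.js) instead of defining these.
function assert_equals(actual, expected, message) {
  if (actual !== expected) {
    throw new Error(message + ": expected " + expected + ", got " + actual);
  }
}
function test(fn, name) {
  try {
    fn();
    console.log("PASS " + name);
  } catch (e) {
    console.log("FAIL " + name + " (" + e.message + ")");
  }
}

// A typical synchronous test: exercise a feature, assert on the result.
test(function () {
  var url = "http://example.com/a/b";
  assert_equals(url.split("/").length, 5, "path segment count");
}, "String.prototype.split produces the expected segments");
```

The real harness also provides `async_test()` for event-driven features and reports results to the page (and to automation) rather than to the console.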
Task: Coordinate with James Graham and make sure he has everything he needs to keep testharness.js going.
Task: Coordinate with members using testharness.js to make sure they have everything they need to use it effectively.
Task: Make sure IDL harness is completed and continues to match Web IDL. Continue to find resources to improve documentation.
Task: Talk to Aryeh Gregor and see what he needs to finish IDL harness. If possible, find funds and other resources to help maintain the project.
The W3C currently requires that all specifications meet certain quality criteria before being eligible for publication on /TR/. However, the “Pub Rules” system currently lacks any checks for conformance to Web IDL. This is a problem with some specifications that have reached /TR/, because their IDL does not conform to Web IDL (and hence is not machine processable).
For specifications that use Web IDL, when a specification reaches a certain publication status, Pub Rules should enforce conformance to Web IDL. This could be done through Robin Berjon’s Web IDL parser.
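For illustration, here is a hypothetical fragment showing the kind of non-conformance an automated check could catch: the first interface uses the `in` parameter keyword inherited from OMG IDL (which some older specifications used), and a conformant Web IDL parser would reject it.

```webidl
// Non-conforming: Web IDL has no "in" keyword on arguments,
// so this fragment is not machine processable as Web IDL.
interface Widget {
  void doTask(in DOMString name);
};

// Conforming equivalent:
interface Widget {
  void doTask(DOMString name);
};
```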
Task: Coordinate with the W3C systems team to see if they can add optional conformance checking for Web IDL. If possible, work out how to make it user friendly.
Task: Coordinate with Robin Berjon to make sure that the Web IDL parser can be used as a service and outputs human-friendly error messages.
Task: Communicate with spec Editors to make sure they put the right hooks in place to allow Pub Rules to find their Web IDL.
Although the majority of tests come from browser vendors, other members sometimes contribute a small number of tests. Where possible, documentation should be made available to allow other W3C members to contribute tests.
Task: Gather documentation for how to contribute tests, as well as any templates, etc.
Task: Establish a centralized place where people can find out about testing.
Task: Help members who want to do testing get started.
Test the Web Forward, and similar events, play an important role in educating the development community about browser testing and reporting bugs. Unfortunately, the majority of tests sourced from Test the Web Forward are not of sufficient quality to be incorporated into an HTML test suite. Regardless, some percentage of tests from each event can be salvaged for integration into the test suites of related specifications.
Task: Investigate ways to increase the quality of tests generated at these events (so fewer tests are discarded). This will include discussing the limitations with organizers and test reviewers, and making sure that these are documented and articulated to those writing tests.