Results Tracker

A repository of test results for reporting the history of tests and their relation to hardware devices and package versions.

As a member of the community
I want to provide test results from my computer
so that Ubuntu and upstream developers can know which hardware is compatible and which packages work.

As a member of a team working with hardware vendors
I want to provide test results in a private repository
so that other members in the team can work on fixing problems together.

Consider clarifying the feature by describing what it is not.

  • The Results Tracker does not run tests; it is only a repository of results from tests run elsewhere.

Rationale

Why are we doing this now?

Many teams have expressed the need for a repository of test results for reporting purposes. These teams have either not provided any reports, created their own custom database, or plan to create one. The result is a fragmentation of the data which will only continue to grow unless a solution is provided:

  • The community has been submitting test results to Launchpad since Hoary, with the original hwdb client later replaced with Checkbox, but this information has never been used.
  • The QA community has created their own website, ISO Testing Tracker, to track test results from testing ISO images.
  • The Hardware Certification team has also created their own website, Hardware Certification, to track and manage test results from hardware vendors.
  • The ARM and Kernel teams might have to create their own website as well unless a solution is provided to support their use cases.

What value does this give our users? Which users?

  • For the community of users, this will finally enable them to reap the rewards of running tests on their computers by making this information available to Ubuntu and upstream developers.
  • For the community of developers, this will provide them with feedback from the community to determine which hardware is compatible and which packages work.
  • For Hardware Certification and OEM, this could provide a central repository where test results from the community and hardware vendors can be compared against each other.
  • For the ARM and Kernel teams, this would provide the generic repository they need to share test results between team members.

Stakeholders

Who really cares about this feature? When did you last talk to them?

In general, anyone interested in the results from testing hardware compatibility and whether packages work. In particular, the following people have expressed interest during the Maverick UDS and attended at least one session about enhancing Launchpad with test results:

  • ARM team: Paul Larson and Zygmunt Krynicki
  • Community: James Tatum, Paolo Sammicheli and Stéphane Graber
  • Hardware Certification: Marc Tardif and Ronald McCollam
  • Kernel team: Hugh Blemings and Jeremy Kerr
  • OEM team: Chris Gregan and Javier Collado
  • QA team: Ara Pulido and Mathieu Trudel
  • Server team: Carlos de Avillez, Mathias Gug and Scott Moser

And, the following people have specifically expressed interest in contributing to the design and implementation:

  • Jeremy Kerr
  • Marc Tardif
  • Mathias Gug
  • Scott Moser
  • Zygmunt Krynicki

Constraints

What MUST the new behaviour provide?

  • Interface to submit new test results and append to existing submissions in the tracker (see the sketch after this list).
  • Interface to retrieve sets of results across multiple submissions from the tracker.
  • Interface to query results based on hardware devices and/or package versions.
  • Interface to get reporting information from the tracker.
  • Ability to make public test results private and vice versa.
  • Ability to remove test results which might contain sensitive information.
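
To make these constraints concrete, here is a minimal in-memory sketch of what such interfaces might look like, assuming a Python service. It is an illustration only; none of the names (ResultsTracker, Submission, TestResult) exist in Launchpad today.

# Hypothetical sketch only: these names are invented for this proposal,
# not an existing Launchpad API.
from dataclasses import dataclass, field
from itertools import count
from typing import Dict, List, Optional


@dataclass
class TestResult:
    test_name: str
    outcome: str                                       # e.g. "pass" or "fail"
    output: str = ""                                   # captured test output
    devices: List[str] = field(default_factory=list)   # hardware device ids
    packages: Dict[str, str] = field(default_factory=dict)  # name -> version


@dataclass
class Submission:
    id: int
    results: List[TestResult]
    private: bool = False


class ResultsTracker:
    """In-memory stand-in for the repository described by the constraints."""

    def __init__(self) -> None:
        self._submissions: Dict[int, Submission] = {}
        self._next_id = count(1)

    def submit(self, results: List[TestResult], private: bool = False) -> int:
        """Create a new submission and return its id."""
        submission = Submission(next(self._next_id), list(results), private)
        self._submissions[submission.id] = submission
        return submission.id

    def append(self, submission_id: int, results: List[TestResult]) -> None:
        """Append results to an existing submission."""
        self._submissions[submission_id].results.extend(results)

    def retrieve(self, submission_ids: List[int]) -> List[TestResult]:
        """Retrieve sets of results across multiple submissions."""
        return [result for sid in submission_ids
                for result in self._submissions[sid].results]

    def query(self, device: Optional[str] = None,
              package: Optional[str] = None) -> List[TestResult]:
        """Query results by hardware device and/or package name."""
        return [result for submission in self._submissions.values()
                for result in submission.results
                if (device is None or device in result.devices)
                and (package is None or package in result.packages)]

    def set_private(self, submission_id: int, private: bool) -> None:
        """Make a public submission private, or vice versa."""
        self._submissions[submission_id].private = private

    def remove(self, submission_id: int) -> None:
        """Remove a submission which might contain sensitive information."""
        del self._submissions[submission_id]

For example, tracker.query(package="linux") would return every result whose submission recorded some version of the linux package, whether it came from the community or a hardware vendor.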

What MUST it not do?

  • No HTML interface should be provided at this point; reporting should be done using the programmatic interfaces above.
  • No upper bound should be enforced when submitting test results. The objective is to encourage the community to submit test results as much as possible.

Workflows

What are the workflows for this feature?

  • TODO

Provide mockups for each workflow.

  • TODO

Success

How will we know when we are done?

  • When the community of users and developers can finally see reports from the tests they have run on their computers.
  • When the ARM and Kernel teams use the Results Tracker as a repository for their test results.
  • When the Hardware Certification website uses the Results Tracker instead of its own database.
  • When the ISO Tracker uses the Results Tracker instead of its own database.

How will we measure how well we have done?

  • By the simplicity of the interfaces for generating reports from test results submitted by the community.
  • By the completeness of the interfaces for integration with the other sites, e.g. Hardware Certification and the ISO Tracker.

Thoughts?

Put everything else here. Better out than in.

  • The Results Tracker should be implemented as a satellite service which could run on a separate server and database.
  • The existing Hardware Database (hwdb) in Launchpad should be migrated to the Results Tracker for integration with test results.
  • The initial web service could simply fork the Launchpad core and run standalone on the dogfood server in order to start profiling the models as early as possible.
  • The eventual web service should consider frameworks other than Zope3, such as Twisted, which might be better suited for this service.
  • The storage backend should consider alternatives to an SQL database, such as a MapReduce framework like Hadoop, for scalability and reporting.
  • The interface should support the optional concept of test versions, and results should be extensible with attachments in addition to the output (a data-model sketch follows this list).
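
To illustrate that last bullet, a possible data model might treat the test version as optional and let a result carry attachments alongside its output. Again, this is a hypothetical sketch; Test, Attachment and Result are invented names, not part of any existing schema.

# Hypothetical data model for the last bullet: test versions are optional,
# and a result carries attachments in addition to its output.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Test:
    name: str
    version: Optional[str] = None   # test versions are an optional concept


@dataclass
class Attachment:
    name: str                       # e.g. "dmesg.log"
    data: bytes                     # raw file content


@dataclass
class Result:
    test: Test
    output: str                                            # primary output
    attachments: List[Attachment] = field(default_factory=list)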
