= Deployment-oriented QA =
Make the QA process more convenient and useful for those requesting deployment.
'''Contact:''' AaronBentley <<BR>>
'''On Launchpad:''' [[https://bugs.launchpad.net/qa-tagger/+bugs?field.tag=deployment-qa|deployment-qa]]
As a Launchpad Developer requesting a nodowntime deployment, <<BR>>
I want to know which revisions are known to be safe to deploy '''as tip''' of a nodowntime deployment, <<BR>>
so that I can decide which revision to deploy.

As a Release Manager, <<BR>>
I want to know which revisions are known to be safe to deploy '''as tip''' of a release, <<BR>>
so that I can decide which revision to deploy.

As a Release Manager or Launchpad Developer, <<BR>>
I want to know why the other revisions are not known to be safe, <<BR>>
so that I can work to make them or subsequent revisions deployable.

As a Release Manager, <<BR>>
I want to know how much time it takes to deploy all the database patches present in a revision, <<BR>>
so that I know which db-stable revisions are suitable to merge into devel prior to release.

As a Release Manager, <<BR>>
I want to know which revisions have not been QAed yet, and who can QA them, <<BR>>
so that I can work to ensure QA is performed.

As a Launchpad contributor, <<BR>>
I want a simple system for indicating when I have done QA, <<BR>>
so that I can get my work done and track my progress.

This LEP is ''not'' about adding QA as a feature of Launchpad. Rather, it's about updating the Launchpad development process, potentially adjusting our data model to support this.
== Rationale ==
We are doing this now because simplifying the process will let us do releases more safely. It also gives release managers and contributors a simpler, less error-prone workflow.
== Stakeholders ==
* Release managers (I was one until recently)
* Launchpad developers who request nodowntime deployments
* QA engineers (not yet discussed)
* Launchpad contributors (I am one)
== Constraints and Requirements ==
=== Must ===
* Clearly indicate which revisions are deployable as tip
* Show which revisions need QA and who can QA them (e.g. by linking to the merge proposal)
* Be documented clearly and accurately
* Allow mistakes in landing messages (such as a missing [rollback=]) to be corrected; see the sketch after this list.
* Consider whether a revision is deployable based on whether it introduces problems (regressions) rather than whether it fixes a bug (i.e. "safe to deploy", rather than "meets requirements").
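
As a rough illustration of the correction requirement (a minimal Python sketch; every name is hypothetical, and the only tag syntax taken from this page is [rollback=]), corrections could be layered over the original landing message instead of being stored in it:

{{{#!python
import re

# Tags in a landing message are assumed to look like [key=value],
# e.g. [rollback=1230]; the exact vocabulary is illustrative only.
TAG_RE = re.compile(r'\[(?P<key>[a-z-]+)=(?P<value>[^\]]*)\]')

def parse_landing_tags(message):
    """Return the [key=value] tags found in a landing message."""
    return {m.group('key'): m.group('value')
            for m in TAG_RE.finditer(message)}

# Corrections live in a side table keyed by revision, so fixing a
# missing or wrong tag never requires rewriting branch history.
corrections = {}  # revision_id -> dict of corrected tags

def effective_tags(revision_id, message):
    tags = parse_landing_tags(message)
    tags.update(corrections.get(revision_id, {}))
    return tags
}}}

Under this shape, a forgotten [rollback=] is fixed by adding an entry to the corrections table rather than by amending the revision.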
=== Nice to have ===
* Provide database update times for each revision.
* Explain why revisions are not deployable.
* Reusing the same branch for multiple landings doesn't confuse the system.
=== Must not ===
* Changes in the QA status of a revision must not affect the QA status of any other revision. (In the current system, changing the QA status of a bug changes the QA status of all revisions linked to that bug.) A sketch of the required independence follows this list.
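
A minimal sketch of that independence, assuming status is keyed by revision rather than by bug (hypothetical names, not an actual Launchpad data model):

{{{#!python
from enum import Enum

class QAStatus(Enum):
    UNTESTED = 'untested'
    OK = 'qa-ok'
    BAD = 'qa-bad'

# Status is recorded against the revision itself, never against a bug,
# so two revisions linked to the same bug keep independent statuses.
qa_status = {}  # revision_id -> QAStatus

def set_status(revision_id, status):
    qa_status[revision_id] = status

def deployable_as_tip(revision_ids):
    """A revision qualifies as tip only if it and every earlier
    revision in the series are known to be safe."""
    return all(qa_status.get(r, QAStatus.UNTESTED) is QAStatus.OK
               for r in revision_ids)
}}}

The deployable_as_tip helper also reflects the first Must item above: a revision only qualifies when everything beneath it is known safe.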
=== Out of scope ===
== Subfeatures ==
None
== Workflows ==
As a release manager planning to merge db-stable into devel, I use the deployment-db-stable report to determine what needs to be done in order to deploy the latest revision possible, and who can do it. I check in with those people.
As a release manager merging db-stable into devel, I use the deployment-db-stable report to determine which revision I can safely merge into devel.
As a release manager who has merged db-stable into devel, I use the deployment-stable report to determine what needs to be done in order to deploy the revision that merged db-stable and who can do it. I check in with those people.
As a release manager who is ready to release, I use the deployment-stable report to determine which revision to use as the basis for the rollout.
As a Launchpad Developer requesting a nodowntime deployment, I use the deployment-stable report to determine whether the revision I want to deploy is deployable. I also use it to determine which revision to use as the basis for the deployment.
== Success ==
'''Bugs are at:''' [[https://bugs.launchpad.net/qa-tagger/+bugs?field.tag=deployment-qa|deployment-qa]]
=== How will we know when we are done? ===
* It's obvious which revisions can be used as tip of a deployment.
* It's easy to chase down QA.
* It's rarely necessary to chase down QA.
=== How will we measure how well we have done? ===
* Interview outgoing release managers about what made their work difficult.
== Thoughts? ==
The obvious solution would be to allow tagging revisions. Another option would be to use merge proposals, but there may be multiple proposals per revision, e.g. when a pipeline is landed all at once.
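
As a sketch of that trade-off (hypothetical data, plain Python): tagging yields exactly one status record per revision, while merge proposals can attach several candidate statuses to a single revision:

{{{#!python
# Option 1: tag revisions directly -- each revision carries its own tags.
revision_tags = {}  # revision_id -> set of tags

def tag_revision(revision_id, tag):
    revision_tags.setdefault(revision_id, set()).add(tag)

# Option 2: derive status from merge proposals. When a whole pipeline
# lands as a single revision, several proposals map to it, and their
# statuses can disagree:
proposals_for_revision = {
    1234: [('mp-1', 'qa-ok'), ('mp-2', 'untested')],  # which one wins?
}
}}}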