
= Trying out template generation on dogfood/staging =

Right now we have no good test platform for this: staging has no build farm, and dogfood has no codehosting. Expect this page to change a lot as we work and learn. We already have pages about doing this on a development system, which may also help.

See also the internal Soyuz documentation on dealing with dogfood.

== Approach ==

For now, we'll have to live with a split test setup:

 * Push changes to generate jobs on staging.
 * Copy the jobs over to the dogfood database.
 * Let the dogfood build farm generate the templates.

== Known Problems ==

We currently [[Bug:569108|fail while cleaning up the job]] from the database.

When re-running on a non-virtualized slave (so we can get to the logs and filesystem), the builder log gets stuck at the chroot unpack stage, even though [[http://pastebin.ubuntu.com/423233/|the slave log says it continues to do its job]].

=== intltool recognition ===

It looks like gettext has changed, and our check for intltool branches now also accepts some plain gettext branches. This happens with `lp:gawk`, which pottery mistakenly recognizes as intltool but doesn't process successfully because there is no `GETTEXT_PACKAGE` definition.

== Checklist ==

=== Firewall ===

The slaves in the `dogfood` build farm must be allowed to make `http` connections to whatever codehosting server you will be using.

'''TODO: Ensure this ([[Bug:499407|bug 499407]]).'''

=== Config ===

Make sure that generating templates is enabled in the Launchpad lazr config.
 * Enabled for `dogfood`.
 * Enabled on development systems.
 * Being enabled for `staging`. '''TODO: what `production-configs` revision? How will we know it's landed?'''

The configuration item in question is `generate_templates` in the `[rosetta]` section.
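As a rough sketch of what that stanza looks like (only the section and key names come from this page; the value syntax is an assumption about the lazr config format):

{{{
# Hypothetical lazr config stanza; the section and key are as named
# above, the value syntax is assumed.
[rosetta]
generate_templates: True
}}}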

=== Suitable branches ===

Make sure you have a branch that is pottery-compatible. ("Pottery" is our name for the code that generates templates from a branch's code.) For testing, it should normally not contain any `.pot` files; otherwise we'd just end up importing those instead of the ones we generated.

We don't yet know for sure which real-world branches will work right out of the box. We must fix that!

According to wgrant, on many branches, editing `configure.ac` to set `GETTEXT_PACKAGE` does the trick. See [[Bug:568355|bug 568355]].
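A quick manual sanity check along those lines (a sketch only; which files to inspect is an assumption based on the note above):

{{{
# Run from the root of a branch checkout.
# A testing branch should have no .pot files...
find . -name '*.pot'
# ...and an intltool branch needs a GETTEXT_PACKAGE definition.
grep -n 'GETTEXT_PACKAGE' configure.ac configure.in 2>/dev/null
}}}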

You can easily check whether a branch will work by generating a template locally. From the package's root directory, run the following script from the Launchpad tree:

{{{
scripts/rosetta/pottery-generate-intltool.py
}}}

It outputs the template files it creates, using the same code as the build slave.
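For example (a sketch: the branch is taken from the candidate list below, and the path to your Launchpad tree is an assumption):

{{{
# Hypothetical session; adjust the Launchpad tree location to taste.
bzr branch lp:gconf
cd gconf
~/canonical/lp-branches/devel/scripts/rosetta/pottery-generate-intltool.py
}}}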

'''TODO: Find a good real-world sample branch for testing, preferably small but with multiple templates.'''

Candidate branches:

 * gimp, but it's large.
 * `lp:ubuntu/gawk` does not use intltool, but pottery seems to think that it does. See [[Bug:568372|bug 568372]].
 * `lp:libgnomeui`: OK, one template.
 * `lp:gconf`: OK, one template.

Once you have a suitable branch, you can copy it to staging's codehosting:

{{{
bzr push -d lp:<branch> lp://staging/<branch> --use-existing-dir
}}}
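For instance, with the `lp:gconf` candidate from the list above, that would be:

{{{
bzr push -d lp:gconf lp://staging/gconf --use-existing-dir
}}}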

=== Builders ===

There must be at least one ''virtualized'' 386 builder active.

With a bit of hacking, Julian can get a job to run on a non-virtualized builder. This makes the tail of the build log visible on the builder page.

== Procedure ==

 1. Set up a product series on both staging and dogfood.
 1. Set up a pottery-compatible branch on staging; set it as the product series' development branch.
 1. Register a branch on dogfood, but do not push it (dogfood has no codehosting). Set it as the product series' development branch.
 1. Configure translation template imports for both product series.
 1. Push a change to your branch on staging, and wait for it to be scanned.

Once the branch is scanned, a triplet of matching records should appear in the staging database: a `BranchJob`, a `Job`, and a `BuildQueue`. They are joined on `BranchJob.job`, `Job.id`, and `BuildQueue.job`. In SQL, the triplet should show up at the top of:

{{{
SELECT *
FROM BranchJob
JOIN Job ON Job.id = BranchJob.job
JOIN BuildQueue ON BuildQueue.job = Job.id
WHERE
    BranchJob.job_type = 6 AND
    BuildQueue.job_type = 4 AND
    Job.status = 0
ORDER BY Job.id DESC;
}}}

Copy what you find directly into the dogfood database. Start with the Job:

{{{
INSERT INTO Job (status) VALUES (0) RETURNING id;
}}}

Note the id that this returns. Now, create a BranchJob using that job id, the id of your branch on dogfood, and a copy of the json_data as created on staging:

{{{
INSERT INTO BranchJob (job, branch, job_type, json_data)
VALUES (<jobid>, <branchid>, 6, <json>);
}}}

Finally, the BuildQueue entry:

{{{
INSERT INTO BuildQueue (job, job_type)
VALUES (<jobid>, 4);
}}}
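Taken together, a session might look like this (a sketch: the ids and the `json_data` literal are made-up placeholders):

{{{
-- Hypothetical psql session on the dogfood database; the ids and the
-- json_data literal below are made-up placeholders.
BEGIN;
INSERT INTO Job (status) VALUES (0) RETURNING id;
-- suppose this returned id 12345
INSERT INTO BranchJob (job, branch, job_type, json_data)
VALUES (12345, 678, 6, '<json_data copied from staging>');
INSERT INTO BuildQueue (job, job_type) VALUES (12345, 4);
COMMIT;
}}}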

This should be enough to get the job onto the build farm. Keep an eye on https://dogfood.launchpad.net/builders, where it should appear shortly (and then, after processing, disappear again).

Once the job completes successfully, any generated templates should appear in the product series' translations import queue on dogfood.
