
Translation Templates from Branch

This is an attempt to guide you through a manual test of the nascent mechanism to produce translation templates based on a bzr branch.

It still needs lots of work.

Patches

  • Slave build id generation is broken; until bug 539499 is fixed, merge lp:~jtv/launchpad/bug-539499. In the long run, the whole slave build id system needs to be replaced with something simpler and better.

  • For debugging, edit lib/lp/buildmaster/manager.py and replace logging.INFO with logging.DEBUG.
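That substitution can be scripted. A minimal sketch, assuming GNU sed and that the string appears literally in the file; it is demonstrated here on a scratch file, so substitute lib/lp/buildmaster/manager.py in your real tree:

```shell
# Scratch stand-in for lib/lp/buildmaster/manager.py.
printf 'logger = logging.getLogger("slave-scanner")\nlogger.setLevel(logging.INFO)\n' > manager_demo.py
# In-place substitution (GNU sed -i); point this at the real file.
sed -i 's/logging\.INFO/logging.DEBUG/' manager_demo.py
grep -q 'logging.DEBUG' manager_demo.py && echo 'log level switched'
```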

Enabling local branch access

mkdir -p /var/tmp/bazaar.launchpad.dev/static
mkdir -p /var/tmp/ppa

There's a bug somewhere in Windmill that may break access to local branches through http. This bug is scheduled to be fixed soon.

To see if this is biting you, look in your Apache error log (/var/log/apache2/error.log) for a traceback ending with something like this (your home directory and the version numbers will vary):

  File "/home/me/canonical/lp-sourcedeps/eggs/windmill-1.3beta3_lp_r1440-py2.5.egg/windmill/dep/_mozrunner/global_settings.py", line 42, in <module>
    def findInPath(fileName, path=os.environ['PATH']):
  File "/usr/lib/python2.5/UserDict.py", line 22, in __getitem__
    raise KeyError(key)
KeyError: 'PATH'

Near the top of the traceback you'll find mention of a Launchpad branch, probably canonical/lp-branches/devel or canonical/lp-branches/trunk. From that branch, remove two imports:

  • From lib/lp/testing/__init__.py remove this import:

from windmill.authoring import WindmillTestClient

  • From lib/canonical/testing/layers.py remove this:

from windmill.bin.admin_lib import (
    start_windmill, teardown as windmill_teardown)

Now restart Apache.
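The KeyError above is worth a closer look: Python evaluates default-argument expressions once, when the def statement runs at import time, so importing Windmill in an Apache environment that has no PATH variable fails immediately. A minimal illustration, using a plain dict to stand in for os.environ:

```python
def import_windmill_like_module(env):
    """Simulate importing a module whose function signature reads env['PATH'].

    Default-argument expressions are evaluated when 'def' executes (i.e. at
    import time), not when the function is later called.
    """
    try:
        def find_in_path(file_name, path=env['PATH']):
            return file_name in path
        return find_in_path
    except KeyError:
        return None  # the import would die with KeyError: 'PATH'

assert import_windmill_like_module({}) is None          # no PATH: import fails
assert import_windmill_like_module({'PATH': '/usr/bin'}) is not None
```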

Setting up the testing environment in a branch

  • Create a Launchpad source tree, and cd into it.
  • Run the following, substituting your own email address (one that does not also appear in the sample data); some parts may fail, but that's usually harmless:

make schema
make run_codehosting >/dev/null 2>&1 &
# Wait until that's all up and running...
utilities/start-dev-soyuz.sh
utilities/soyuz-sampledata-setup.py -e <your@email.address>
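The "wait until that's all up and running" step can be automated by polling for an open port. A hedged sketch in Python; the helper is generic, and which host and port your codehosting setup actually listens on is for you to check:

```python
import socket
import time

def wait_for_port(host, port, timeout=60.0):
    """Poll until host:port accepts TCP connections, or raise after timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            time.sleep(0.5)  # not listening yet; retry
    raise TimeoutError('%s:%d did not come up in time' % (host, port))
```

For example, wait_for_port('launchpad.dev', 80) before moving on to the next step.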

Preparing the slave

Follow the instructions for setting up a local buildd slave at BuildFarm/TryOutBuildSlave. Skip the part where it tells you to run Launchpad, since you're already running it. You should end up with a shell open in a chroot for your simulated build slave. Keep it running; we're going to use that slave!

Did you obtain the Lucid chroot tarball as described on that page? Good. Give it to your running Launchpad instance and tell it to use it for Lucid:

  scripts/ftpmaster-tools/manage-chroot.py -s lucid -a i386 add -f chroot-ubuntu-lucid.tar.bz2

Making some branch updates to generate templates from

    pushd /tmp
    bzr init bf-test
    cd bf-test
    cat <<EOF >test.c
#include <libintl.h>
#include <stdio.h>
#define _(msgid) gettext(msgid)
int main(void) { puts(_("Hi")); return 0; }
EOF
    mkdir po
    echo test.c >po/POTFILES.in
    bzr add test.c
    bzr add po
    bzr add po/POTFILES.in
    bzr commit -m "Setting up fake intltool branch."
    bzr push lp://dev/bf-test --remember --use-existing-dir
    popd
  • Back in your branch: make sync_branches
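To make the intent of this branch concrete: an intltool-based template build reads po/POTFILES.in and scans each listed source file for _("...") markers. A much-simplified sketch of that extraction (a crude regex stand-in, not the code Launchpad or intltool actually runs):

```python
import re

def extract_messages(potfiles_text, read_source):
    """Collect _("...") strings from the files listed in a POTFILES.in.

    read_source maps a filename to its contents; the regex is a rough
    approximation of what intltool/xgettext do.
    """
    messages = []
    for line in potfiles_text.splitlines():
        filename = line.strip()
        if not filename or filename.startswith('#'):
            continue  # POTFILES.in allows comments and blank lines
        messages.extend(re.findall(r'_\("([^"]*)"\)', read_source(filename)))
    return messages

sources = {'test.c': '#include <stdio.h>\nint main(void){puts(_("Hi"));return 0;}\n'}
assert extract_messages('test.c\n', sources.get) == ['Hi']
```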

Now you should have a TranslationTemplatesBuildJob sitting in the BranchJob table. You should see a very wide line of data coming out of:

psql launchpad_dev -c '
    select *
    from branchjob
    join job on job.id=branchjob.job
    join buildqueue on buildqueue.job=job.id
    where branchjob.job_type=6
    order by job.id'

It will show a branch_url parameter to be passed to the slave, pointing to your branch: http://bazaar.launchpad.dev/~ppa-user/bf-test/bf-test
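If you want to convince yourself what that three-way join returns, here is a schematic reproduction with SQLite; the columns are drastically simplified stand-ins for Launchpad's real schema:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript("""
    CREATE TABLE job (id INTEGER PRIMARY KEY);
    CREATE TABLE branchjob (id INTEGER PRIMARY KEY, job INTEGER, job_type INTEGER);
    CREATE TABLE buildqueue (id INTEGER PRIMARY KEY, job INTEGER);
    INSERT INTO job VALUES (1);
    INSERT INTO branchjob VALUES (1, 1, 6);  -- job_type 6: translation templates
    INSERT INTO buildqueue VALUES (1, 1);
""")
rows = conn.execute("""
    SELECT branchjob.id, job.id, buildqueue.id
    FROM branchjob
    JOIN job ON job.id = branchjob.job
    JOIN buildqueue ON buildqueue.job = job.id
    WHERE branchjob.job_type = 6
    ORDER BY job.id
""").fetchall()
assert rows == [(1, 1, 1)]  # one pending translation-templates job
```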

/!\ Some system setups apparently can't access this branch. Try this:

pushd /tmp
bzr export export-experiment http://bazaar.launchpad.dev/~ppa-user/bf-test/bf-test

If that succeeds and creates a copy of bf-test in a directory called export-experiment, you're fine. Otherwise, re-do your rocketfuel setup by running rocketfuel-setup, remove your branch, and start over. Sorry!

Simulating a builder

Now we tell Soyuz that your local buildd slave is in fact Bob the Builder.

Logged in as an admin (I suggest using a different browser so as not to break your ongoing session), check the Virtualized checkbox at https://launchpad.dev/builders/bob/+edit and enter an arbitrary hostname under Virtual Machine Host. Clear the "Builder State OK" checkbox to convince it not to continue building firefox for hoary, as it's been trying to do for the past few years.

Submit. Wait for a few seconds. Go back to the form and re-check "Builder State OK." Submit.

Running the job on the slave

After a while, the buildd master should dispatch your job to Bob the Builder, running in your slave. Keep an eye on these pages:

/!\ The job currently starts, but does not complete.

Stuckage

We now get as far as the slave doing the job, and seemingly completing it; but the master never quite picks up on this. According to /var/tmp/development-buildd-manager.log the master finds no build_status entry in the slave status.
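To clarify what "no build_status entry" means here: the master periodically asks the slave for a status dictionary, and only treats the build as finished once that dictionary carries both a waiting builder_status and a build_status. A schematic sketch of that decision (the field names follow launchpad-buildd's status call, but this is not the real manager code):

```python
# Schematic only: illustrates why a status dict without 'build_status'
# leaves the master unable to conclude anything about the build.
def interpret_slave_status(status):
    if status.get('builder_status') != 'BuilderStatus.WAITING':
        return None  # still building; poll again next scan cycle
    build_status = status.get('build_status')
    if build_status is None:
        # The failure described above: the job looks finished, but with no
        # build_status the master cannot tell success from failure.
        return None
    return build_status

assert interpret_slave_status({'builder_status': 'BuilderStatus.BUILDING'}) is None
assert interpret_slave_status({'builder_status': 'BuilderStatus.WAITING'}) is None
assert interpret_slave_status(
    {'builder_status': 'BuilderStatus.WAITING',
     'build_status': 'BuildStatus.OK'}) == 'BuildStatus.OK'
```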

Watch these logs:

  • /var/log/postgresql/postgresql-8.*-main.log

  • /var/tmp/development-buildd-manager.log

  • On the slave: /var/log/launchpad-buildd/default.log

Cleanup

To undo the changes made by utilities/soyuz-sampledata-setup.py:

  • rm -f /var/tmp/zeca/*

  • make schema

Translations/GenerateTemplatesOnLocalBuildFarm (last edited 2019-05-28 23:55:19 by cjwatson)