Merge lp:~stevenk/launchpad/db-add-derivedistroseries-api into lp:launchpad/db-devel
- db-add-derivedistroseries-api
- Merge into db-devel
Status: Merged
Approved by: Graham Binns
Approved revision: no longer in the source branch
Merged at revision: 9903
Proposed branch: lp:~stevenk/launchpad/db-add-derivedistroseries-api
Merge into: lp:launchpad/db-devel
Prerequisite: lp:~stevenk/launchpad/db-add-parameters-to-idsjob
Diff against target: 2430 lines (+563/-752), 42 files modified

cronscripts/garbo-daily.py (+1/-1)
cronscripts/garbo-hourly.py (+1/-1)
lib/canonical/config/tests/test_database_config.py (+10/-6)
lib/canonical/database/ftests/test_postgresql.py (+5/-3)
lib/canonical/database/ftests/test_sqlbaseconnect.txt (+2/-2)
lib/canonical/ftests/pgsql.py (+11/-30)
lib/canonical/ftests/test_pgsql.py (+60/-63)
lib/canonical/launchpad/doc/canonical-config.txt (+6/-4)
lib/canonical/launchpad/doc/old-testing.txt (+19/-127)
lib/canonical/launchpad/doc/security-proxies.txt (+0/-8)
lib/canonical/launchpad/ftests/harness.py (+0/-84)
lib/canonical/launchpad/interfaces/_schema_circular_imports.py (+2/-0)
lib/canonical/launchpad/pagetests/standalone/xx-dbpolicy.txt (+5/-3)
lib/canonical/launchpad/tests/test_sampledata.py (+2/-6)
lib/canonical/launchpad/webapp/ftests/test_adapter.txt (+6/-2)
lib/canonical/lp/ftests/test_zopeless.py (+14/-9)
lib/canonical/testing/ftests/test_layers.py (+26/-21)
lib/canonical/testing/layers.py (+53/-55)
lib/lp/bugs/doc/bug-heat.txt (+1/-1)
lib/lp/bugs/tests/test_bugwatch.py (+1/-1)
lib/lp/code/model/tests/test_revision.py (+1/-1)
lib/lp/code/model/tests/test_revisionauthor.py (+1/-1)
lib/lp/code/scripts/tests/test_revisionkarma.py (+1/-1)
lib/lp/codehosting/tests/test_acceptance.py (+20/-21)
lib/lp/hardwaredb/doc/hwdb.txt (+1/-1)
lib/lp/poppy/tests/test_poppy.py (+3/-3)
lib/lp/registry/interfaces/distroseries.py (+80/-4)
lib/lp/registry/model/distroseries.py (+62/-0)
lib/lp/registry/stories/webservice/xx-derivedistroseries.txt (+68/-0)
lib/lp/registry/tests/test_derivedistroseries.py (+76/-0)
lib/lp/scripts/tests/test_garbo.py (+1/-1)
lib/lp/scripts/utilities/importfascist.py (+1/-1)
lib/lp/soyuz/configure.zcml (+5/-2)
lib/lp/soyuz/doc/sampledata-setup.txt (+2/-2)
lib/lp/soyuz/scripts/initialise_distroseries.py (+2/-2)
lib/lp/soyuz/scripts/tests/test_buildd_cronscripts.py (+2/-3)
lib/lp/testing/__init__.py (+1/-15)
lib/lp/testing/fixture.py (+7/-118)
lib/lp/testing/tests/test_fixture.py (+0/-138)
lib/lp/translations/doc/fix_translation_credits.txt (+2/-2)
lib/lp/translations/doc/message-sharing-merge-script.txt (+2/-2)
lib/lp/translations/doc/request_country.txt (+0/-7)
To merge this branch: bzr merge lp:~stevenk/launchpad/db-add-derivedistroseries-api
Related bugs: (none)

Reviewer: Graham Binns (community), review type: code, status: Approve
Review via email: mp+38189@code.launchpad.net
Commit message
Allow derivers to create a derived distroseries via the API.
Description of the change
This branch adds a new method to IDistroSeries: deriveDistroSeries.
This method is intended to be called from API scripts, for both the distro use case and the Linaro use case, to initialise a new distroseries (which may or may not already exist) from an existing one. If the new distroseries does not exist, it is created under the specified distribution; if no distribution is specified, the distribution of the current distroseries (.parent in the interface) is used.
After it has verified everything, and created the distroseries if need be, it checks the details submitted via the InitialiseDistroSeriesJob.
I have written two separate tests: one that calls the method directly, and a lighter doctest that checks it via the API.
I have discussed this change with both Julian and Michael, and they agreed that it sounded good.
To test: bin/test -vvt test_derivedistroseries
This branch also fixes, as a drive-by, two misspellings of "distribution" that I noticed.
This MP supersedes the one linked at https:/
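The create-if-missing flow described above can be sketched roughly as follows. This is an illustrative, self-contained model only, not the actual Launchpad implementation: the function name, the use of plain dicts in place of DistroSeries objects, and the `existing` parameter are all assumptions made for the sketch.

```python
class DerivationError(Exception):
    """Raised when derivation arguments are missing or inconsistent."""


def derive_distro_series(parent, name, distribution=None, existing=()):
    """Sketch of deriving a new series from `parent`.

    `parent` and the entries of `existing` are plain dicts standing in
    for real DistroSeries objects.
    """
    # Default to the parent's distribution when none is specified.
    if distribution is None:
        distribution = parent['distribution']
    # Reuse an already-registered series if one matches.
    for series in existing:
        if series['name'] == name and series['distribution'] == distribution:
            return series
    # Otherwise create the new series, verifying required details first.
    if not name:
        raise DerivationError(
            "Name needs to be set when creating a distroseries.")
    return {'name': name, 'distribution': distribution, 'parent': parent}
```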
Steve Kowalik (stevenk) wrote:
Hi Graham,
The reason I'm checking for those arguments is because the API call can be used one of two ways. Firstly, it can create the distroseries for the caller, and secondly, it can use the distroseries that already exists, so we only need to check if arguments are set if we are creating a distroseries.
Graham Binns (gmb) wrote:
On 12 October 2010 11:43, Steve Kowalik <email address hidden> wrote:
> Hi Graham,
>
> The reason I'm checking for those arguments is because the API call can be used one of two ways. Firstly, it can create the distroseries for the caller, and secondly, it can use the distroseries that already exists, so we only need to check if arguments are set if we are creating a distroseries.
Okay, I see your reasoning. However, I think that the code still needs
clarification. I'd rather see:
if not displayname:
raise DerivationError
if not title:
raise DerivationError
...
Etc. That way it's clearer upon reading what's going on.
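Spelled out for several fields, the pattern suggested above might look like this. The field names are assumptions based on the snippets quoted in this review, and the messages follow the wording quoted from the diff:

```python
class DerivationError(Exception):
    """Raised when a required field is missing during derivation."""


def check_derivation_arguments(displayname, title, summary):
    # One clause per field makes each failure mode obvious on reading,
    # instead of a single combined condition.
    if not displayname:
        raise DerivationError(
            "Display Name needs to be set when creating a distroseries.")
    if not title:
        raise DerivationError(
            "Title needs to be set when creating a distroseries.")
    if not summary:
        raise DerivationError(
            "Summary needs to be set when creating a distroseries.")
```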
Graham Binns (gmb) wrote:
Final nitpick, otherwise r=me:
> 210 + if displayname is None or len(displayname) == 0:
Why not just use `if not displayname`? (Lather, rinse, repeat for other conditions in this method).
> 211 + raise DerivationError
> 212 + " creating a distroseries.")
When you have to wrap a call, always start wrapping after the opening parenthesis:
raise DerivationError(
"Display Name needs to be set when"
" creating a distroseries.")
(And if the string will fit on one line thereafter, make it so.)
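The `if not displayname` form works because both values the original condition tested for, `None` and the empty string, are falsy in Python:

```python
# Both "unset" representations the original check spelled out are falsy,
# so a single `not` covers `is None` as well as `len(...) == 0`.
assert not None
assert not ''

# A populated display name is truthy, so the guard does not fire.
displayname = 'Natty Narwhal'
assert bool(displayname)
```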
Preview Diff
1 | === modified file 'cronscripts/garbo-daily.py' |
2 | --- cronscripts/garbo-daily.py 2010-04-27 19:48:39 +0000 |
3 | +++ cronscripts/garbo-daily.py 2010-10-18 06:17:50 +0000 |
4 | @@ -13,7 +13,7 @@ |
5 | __all__ = [] |
6 | |
7 | import _pythonpath |
8 | -from canonical.launchpad.scripts.garbo import DailyDatabaseGarbageCollector |
9 | +from lp.scripts.garbo import DailyDatabaseGarbageCollector |
10 | |
11 | if __name__ == '__main__': |
12 | script = DailyDatabaseGarbageCollector() |
13 | |
14 | === modified file 'cronscripts/garbo-hourly.py' |
15 | --- cronscripts/garbo-hourly.py 2010-04-27 19:48:39 +0000 |
16 | +++ cronscripts/garbo-hourly.py 2010-10-18 06:17:50 +0000 |
17 | @@ -13,7 +13,7 @@ |
18 | __all__ = [] |
19 | |
20 | import _pythonpath |
21 | -from canonical.launchpad.scripts.garbo import HourlyDatabaseGarbageCollector |
22 | +from lp.scripts.garbo import HourlyDatabaseGarbageCollector |
23 | |
24 | if __name__ == '__main__': |
25 | script = HourlyDatabaseGarbageCollector() |
26 | |
27 | === modified file 'lib/canonical/config/tests/test_database_config.py' |
28 | --- lib/canonical/config/tests/test_database_config.py 2010-01-13 20:06:09 +0000 |
29 | +++ lib/canonical/config/tests/test_database_config.py 2010-10-18 06:17:50 +0000 |
30 | @@ -3,17 +3,20 @@ |
31 | |
32 | __metaclass__ = type |
33 | |
34 | -from lp.testing import TestCase |
35 | - |
36 | from canonical.config import config, dbconfig |
37 | - |
38 | from canonical.launchpad.readonly import read_only_file_exists |
39 | from canonical.launchpad.tests.readonly import ( |
40 | - remove_read_only_file, touch_read_only_file) |
41 | + remove_read_only_file, |
42 | + touch_read_only_file, |
43 | + ) |
44 | +from canonical.testing.layers import DatabaseLayer |
45 | +from lp.testing import TestCase |
46 | |
47 | |
48 | class TestDatabaseConfig(TestCase): |
49 | |
50 | + layer = DatabaseLayer |
51 | + |
52 | def test_overlay(self): |
53 | # The dbconfig option overlays the database configurations of a |
54 | # chosen config section over the base section. |
55 | @@ -25,11 +28,12 @@ |
56 | self.assertEquals('librarian', config.librarian.dbuser) |
57 | |
58 | dbconfig.setConfigSection('librarian') |
59 | - self.assertEquals('dbname=launchpad_ftest', dbconfig.rw_main_master) |
60 | + expected_db = 'dbname=%s' % DatabaseLayer._db_fixture.dbname |
61 | + self.assertEquals(expected_db, dbconfig.rw_main_master) |
62 | self.assertEquals('librarian', dbconfig.dbuser) |
63 | |
64 | dbconfig.setConfigSection('launchpad') |
65 | - self.assertEquals('dbname=launchpad_ftest', dbconfig.rw_main_master) |
66 | + self.assertEquals(expected_db, dbconfig.rw_main_master) |
67 | self.assertEquals('launchpad_main', dbconfig.dbuser) |
68 | |
69 | def test_required_values(self): |
70 | |
71 | === modified file 'lib/canonical/database/ftests/test_postgresql.py' |
72 | --- lib/canonical/database/ftests/test_postgresql.py 2010-07-14 14:11:15 +0000 |
73 | +++ lib/canonical/database/ftests/test_postgresql.py 2010-10-18 06:17:50 +0000 |
74 | @@ -10,8 +10,9 @@ |
75 | def setUp(test): |
76 | |
77 | # Build a fresh, empty database and connect |
78 | - PgTestSetup().setUp() |
79 | - con = PgTestSetup().connect() |
80 | + test._db_fixture = PgTestSetup() |
81 | + test._db_fixture.setUp() |
82 | + con = test._db_fixture.connect() |
83 | |
84 | # Create a test schema demonstrating the edge cases |
85 | cur = con.cursor() |
86 | @@ -53,8 +54,9 @@ |
87 | test.globs['cur'] = cur |
88 | |
89 | def tearDown(test): |
90 | - PgTestSetup().tearDown() |
91 | test.globs['con'].close() |
92 | + test._db_fixture.tearDown() |
93 | + del test._db_fixture |
94 | |
95 | def test_suite(): |
96 | suite = DocTestSuite( |
97 | |
98 | === modified file 'lib/canonical/database/ftests/test_sqlbaseconnect.txt' |
99 | --- lib/canonical/database/ftests/test_sqlbaseconnect.txt 2009-04-17 10:32:16 +0000 |
100 | +++ lib/canonical/database/ftests/test_sqlbaseconnect.txt 2010-10-18 06:17:50 +0000 |
101 | @@ -19,7 +19,7 @@ |
102 | Specifying the user connects as that user. |
103 | |
104 | >>> do_connect(user=config.launchpad_session.dbuser) |
105 | - Connected as session to launchpad_ftest in read committed isolation. |
106 | + Connected as session to ... in read committed isolation. |
107 | |
108 | Specifying the database name connects to that database. |
109 | |
110 | @@ -31,5 +31,5 @@ |
111 | >>> do_connect( |
112 | ... user=config.launchpad.dbuser, |
113 | ... isolation=ISOLATION_LEVEL_SERIALIZABLE) |
114 | - Connected as launchpad_main to launchpad_ftest in serializable isolation. |
115 | + Connected as launchpad_main to ... in serializable isolation. |
116 | |
117 | |
118 | === modified file 'lib/canonical/ftests/pgsql.py' |
119 | --- lib/canonical/ftests/pgsql.py 2009-10-09 04:05:34 +0000 |
120 | +++ lib/canonical/ftests/pgsql.py 2010-10-18 06:17:50 +0000 |
121 | @@ -7,7 +7,7 @@ |
122 | |
123 | __metaclass__ = type |
124 | |
125 | -import unittest |
126 | +import os |
127 | import time |
128 | |
129 | import psycopg2 |
130 | @@ -119,7 +119,6 @@ |
131 | |
132 | _org_connect = None |
133 | def fake_connect(*args, **kw): |
134 | - global _org_connect |
135 | return ConnectionWrapper(_org_connect(*args, **kw)) |
136 | |
137 | def installFakeConnect(): |
138 | @@ -136,9 +135,13 @@ |
139 | |
140 | |
141 | class PgTestSetup: |
142 | + |
143 | connections = [] # Shared |
144 | + # Use a dynamically generated dbname: |
145 | + dynamic = object() |
146 | |
147 | template = 'template1' |
148 | + # Needs to match configs/testrunner*/*: |
149 | dbname = 'launchpad_ftest' |
150 | dbuser = None |
151 | host = None |
152 | @@ -165,8 +168,13 @@ |
153 | ''' |
154 | if template is not None: |
155 | self.template = template |
156 | - if dbname is not None: |
157 | + if dbname is PgTestSetup.dynamic: |
158 | + self.dbname = self.__class__.dbname + "_" + str(os.getpid()) |
159 | + elif dbname is not None: |
160 | self.dbname = dbname |
161 | + else: |
162 | + # Fallback to the class name. |
163 | + self.dbname = self.__class__.dbname |
164 | if dbuser is not None: |
165 | self.dbuser = dbuser |
166 | if host is not None: |
167 | @@ -331,30 +339,3 @@ |
168 | as database changes made from a subprocess. |
169 | """ |
170 | PgTestSetup._reset_db = True |
171 | - |
172 | - |
173 | -class PgTestCase(unittest.TestCase): |
174 | - dbname = None |
175 | - dbuser = None |
176 | - host = None |
177 | - port = None |
178 | - template = None |
179 | - def setUp(self): |
180 | - pg_test_setup = PgTestSetup( |
181 | - self.template, self.dbname, self.dbuser, self.host, self.port |
182 | - ) |
183 | - pg_test_setup.setUp() |
184 | - self.dbname = pg_test_setup.dbname |
185 | - self.dbuser = pg_test_setup.dbuser |
186 | - assert self.dbname, 'self.dbname is not set.' |
187 | - |
188 | - def tearDown(self): |
189 | - PgTestSetup( |
190 | - self.template, self.dbname, self.dbuser, self.host, self.port |
191 | - ).tearDown() |
192 | - |
193 | - def connect(self): |
194 | - return PgTestSetup( |
195 | - self.template, self.dbname, self.dbuser, self.host, self.port |
196 | - ).connect() |
197 | - |
198 | |
199 | === modified file 'lib/canonical/ftests/test_pgsql.py' |
200 | --- lib/canonical/ftests/test_pgsql.py 2009-06-25 05:30:52 +0000 |
201 | +++ lib/canonical/ftests/test_pgsql.py 2010-10-18 06:17:50 +0000 |
202 | @@ -1,77 +1,84 @@ |
203 | -# Copyright 2009 Canonical Ltd. This software is licensed under the |
204 | +# Copyright 2009-2010 Canonical Ltd. This software is licensed under the |
205 | # GNU Affero General Public License version 3 (see the file LICENSE). |
206 | |
207 | -import unittest |
208 | -from canonical.ftests.pgsql import PgTestCase, PgTestSetup, ConnectionWrapper |
209 | - |
210 | - |
211 | -class TestPgTestCase(PgTestCase): |
212 | - |
213 | - def testRollback(self): |
214 | - # This test creates a table. We run the same test twice, |
215 | - # which will fail if database changes are not rolled back |
216 | - con = self.connect() |
217 | - cur = con.cursor() |
218 | - cur.execute('CREATE TABLE foo (x int)') |
219 | - cur.execute('INSERT INTO foo VALUES (1)') |
220 | - cur.execute('SELECT x FROM foo') |
221 | - res = list(cur.fetchall()) |
222 | - self.failUnless(len(res) == 1) |
223 | - self.failUnless(res[0][0] == 1) |
224 | - con.commit() |
225 | - |
226 | - testRollback2 = testRollback |
227 | - |
228 | -class TestOptimization(unittest.TestCase): |
229 | +import os |
230 | + |
231 | +import testtools |
232 | + |
233 | +from canonical.ftests.pgsql import ( |
234 | + ConnectionWrapper, |
235 | + PgTestSetup, |
236 | + ) |
237 | + |
238 | + |
239 | +class TestPgTestSetup(testtools.TestCase): |
240 | + |
241 | + def test_db_naming(self): |
242 | + fixture = PgTestSetup(dbname=PgTestSetup.dynamic) |
243 | + expected_name = "%s_%s" % (PgTestSetup.dbname, os.getpid()) |
244 | + self.assertEqual(expected_name, fixture.dbname) |
245 | + fixture.setUp() |
246 | + self.addCleanup(fixture.dropDb) |
247 | + self.addCleanup(fixture.tearDown) |
248 | + cur = fixture.connect().cursor() |
249 | + cur.execute('SELECT current_database()') |
250 | + where = cur.fetchone()[0] |
251 | + self.assertEqual(expected_name, where) |
252 | + |
253 | def testOptimization(self): |
254 | # Test to ensure that the database is destroyed only when necessary |
255 | |
256 | # Make a change to a database |
257 | - PgTestSetup().setUp() |
258 | + fixture = PgTestSetup() |
259 | + fixture.setUp() |
260 | try: |
261 | - con = PgTestSetup().connect() |
262 | + con = fixture.connect() |
263 | cur = con.cursor() |
264 | cur.execute('CREATE TABLE foo (x int)') |
265 | con.commit() |
266 | # Fake it so the harness doesn't know a change has been made |
267 | ConnectionWrapper.committed = False |
268 | finally: |
269 | - PgTestSetup().tearDown() |
270 | + fixture.tearDown() |
271 | |
272 | - # Now check to ensure that the table we just created is still there |
273 | - PgTestSetup().setUp() |
274 | + # Now check to ensure that the table we just created is still there if |
275 | + # we reuse the fixture. |
276 | + fixture.setUp() |
277 | try: |
278 | - con = PgTestSetup().connect() |
279 | + con = fixture.connect() |
280 | cur = con.cursor() |
281 | # This tests that the table still exists, as well as modifying the |
282 | # db |
283 | cur.execute('INSERT INTO foo VALUES (1)') |
284 | con.commit() |
285 | finally: |
286 | - PgTestSetup().tearDown() |
287 | + fixture.tearDown() |
288 | |
289 | - # Now ensure that the table is gone |
290 | - PgTestSetup().setUp() |
291 | + # Now ensure that the table is gone - the commit must have been rolled |
292 | + # back. |
293 | + fixture.setUp() |
294 | try: |
295 | - con = PgTestSetup().connect() |
296 | + con = fixture.connect() |
297 | cur = con.cursor() |
298 | cur.execute('CREATE TABLE foo (x int)') |
299 | con.commit() |
300 | ConnectionWrapper.committed = False # Leave the table |
301 | finally: |
302 | - PgTestSetup().tearDown() |
303 | + fixture.tearDown() |
304 | |
305 | - # The database should *always* be recreated if the template |
306 | - # changes. |
307 | - PgTestSetup._last_db = ('whatever', 'launchpad_ftest') |
308 | - PgTestSetup().setUp() |
309 | + # The database should *always* be recreated if a new template had been |
310 | + # chosen. |
311 | + PgTestSetup._last_db = ('different-template', fixture.dbname) |
312 | + fixture.setUp() |
313 | try: |
314 | - con = PgTestSetup().connect() |
315 | + con = fixture.connect() |
316 | cur = con.cursor() |
317 | + # If this fails, TABLE foo still existed and the DB wasn't rebuilt |
318 | + # correctly. |
319 | cur.execute('CREATE TABLE foo (x int)') |
320 | con.commit() |
321 | finally: |
322 | - PgTestSetup().tearDown() |
323 | + fixture.tearDown() |
324 | |
325 | def test_sequences(self): |
326 | # Sequences may be affected by connections even if the connection |
327 | @@ -80,9 +87,10 @@ |
328 | # the sequences. |
329 | |
330 | # Setup a table that uses a sequence |
331 | - PgTestSetup().setUp() |
332 | + fixture = PgTestSetup() |
333 | + fixture.setUp() |
334 | try: |
335 | - con = PgTestSetup().connect() |
336 | + con = fixture.connect() |
337 | cur = con.cursor() |
338 | cur.execute('CREATE TABLE foo (x serial, y integer)') |
339 | con.commit() |
340 | @@ -90,15 +98,15 @@ |
341 | # Fake it so the harness doesn't know a change has been made |
342 | ConnectionWrapper.committed = False |
343 | finally: |
344 | - PgTestSetup().tearDown() |
345 | + fixture.tearDown() |
346 | |
347 | sequence_values = [] |
348 | # Insert a row into it and roll back the changes. Each time, we |
349 | # should end up with the same sequence value |
350 | for i in range(3): |
351 | - PgTestSetup().setUp() |
352 | + fixture.setUp() |
353 | try: |
354 | - con = PgTestSetup().connect() |
355 | + con = fixture.connect() |
356 | cur = con.cursor() |
357 | cur.execute('INSERT INTO foo (y) VALUES (1)') |
358 | cur.execute("SELECT currval('foo_x_seq')") |
359 | @@ -106,7 +114,7 @@ |
360 | con.rollback() |
361 | con.close() |
362 | finally: |
363 | - PgTestSetup().tearDown() |
364 | + fixture.tearDown() |
365 | |
366 | # Fail if we got a diffent sequence value at some point |
367 | for v in sequence_values: |
368 | @@ -114,9 +122,9 @@ |
369 | |
370 | # Repeat the test, but this time with some data already in the |
371 | # table |
372 | - PgTestSetup().setUp() |
373 | + fixture.setUp() |
374 | try: |
375 | - con = PgTestSetup().connect() |
376 | + con = fixture.connect() |
377 | cur = con.cursor() |
378 | cur.execute('INSERT INTO foo (y) VALUES (1)') |
379 | con.commit() |
380 | @@ -124,15 +132,15 @@ |
381 | # Fake it so the harness doesn't know a change has been made |
382 | ConnectionWrapper.committed = False |
383 | finally: |
384 | - PgTestSetup().tearDown() |
385 | + fixture.tearDown() |
386 | |
387 | sequence_values = [] |
388 | # Insert a row into it and roll back the changes. Each time, we |
389 | # should end up with the same sequence value |
390 | for i in range(1,3): |
391 | - PgTestSetup().setUp() |
392 | + fixture.setUp() |
393 | try: |
394 | - con = PgTestSetup().connect() |
395 | + con = fixture.connect() |
396 | cur = con.cursor() |
397 | cur.execute('INSERT INTO foo (y) VALUES (1)') |
398 | cur.execute("SELECT currval('foo_x_seq')") |
399 | @@ -140,19 +148,8 @@ |
400 | con.rollback() |
401 | con.close() |
402 | finally: |
403 | - PgTestSetup().tearDown() |
404 | + fixture.tearDown() |
405 | |
406 | # Fail if we got a diffent sequence value at some point |
407 | for v in sequence_values: |
408 | self.failUnlessEqual(v, sequence_values[0]) |
409 | - |
410 | - |
411 | -def test_suite(): |
412 | - suite = unittest.TestSuite() |
413 | - suite.addTest(unittest.makeSuite(TestPgTestCase)) |
414 | - suite.addTest(unittest.makeSuite(TestOptimization)) |
415 | - return suite |
416 | - |
417 | -if __name__ == '__main__': |
418 | - unittest.main() |
419 | - |
420 | |
421 | === modified file 'lib/canonical/launchpad/doc/canonical-config.txt' |
422 | --- lib/canonical/launchpad/doc/canonical-config.txt 2010-01-05 19:09:58 +0000 |
423 | +++ lib/canonical/launchpad/doc/canonical-config.txt 2010-10-18 06:17:50 +0000 |
424 | @@ -14,8 +14,10 @@ |
425 | simple configuration). |
426 | |
427 | >>> from canonical.config import config |
428 | - >>> print config.database.rw_main_master |
429 | - dbname=launchpad_ftest |
430 | + >>> from canonical.testing.layers import DatabaseLayer |
431 | + >>> expected = 'dbname=%s' % DatabaseLayer._db_fixture.dbname |
432 | + >>> expected == config.database.rw_main_master |
433 | + True |
434 | >>> config.database.db_statement_timeout is None |
435 | True |
436 | >>> config.launchpad.dbuser |
437 | @@ -226,7 +228,7 @@ |
438 | # >>> canonical.config.config = config |
439 | # >>> config.filename |
440 | # '.../configs/testrunner/launchpad-lazr.conf' |
441 | -# >>> config.dbname |
442 | -# 'launchpad_ftest' |
443 | +# >>> config.dbname == DatabaseLayer._db_fixture.dbname |
444 | +# True |
445 | # >>> config._cache.testrunner |
446 | # <SectionValue for canonical 'testrunner'> |
447 | |
448 | === modified file 'lib/canonical/launchpad/doc/old-testing.txt' |
449 | --- lib/canonical/launchpad/doc/old-testing.txt 2010-10-03 20:23:37 +0000 |
450 | +++ lib/canonical/launchpad/doc/old-testing.txt 2010-10-18 06:17:50 +0000 |
451 | @@ -18,11 +18,6 @@ |
452 | zope, we should not be testing it with the full Z3 functional test |
453 | harness). |
454 | |
455 | -If you are wondering why we use `PgTestSetup().setUp()` and |
456 | -`PgTestSetup.tearDown()` instead of `pgtestsetup.setUp()` or |
457 | -`pgtestsetup.tearDown()`, it is because I'm mirroring the design used in |
458 | -Zope3's `FunctionalTestSetup`. |
459 | - |
460 | canonical.functional.FunctionalTestCase |
461 | --------------------------------------- |
462 | |
463 | @@ -42,11 +37,12 @@ |
464 | |
465 | The setup procedure builds us a fresh, empty database |
466 | |
467 | ->>> PgTestSetup().setUp() |
468 | +>>> fixture = PgTestSetup() |
469 | +>>> fixture.setUp() |
470 | |
471 | We can get connections to this database |
472 | |
473 | ->>> connection = PgTestSetup().connect() |
474 | +>>> connection = fixture.connect() |
475 | >>> cursor = connection.cursor() |
476 | >>> cursor.execute("""CREATE TABLE Beer ( |
477 | ... id serial PRIMARY KEY, name text, stamp timestamp without time zone |
478 | @@ -68,28 +64,29 @@ |
479 | When we have finished, we need to call the tearDown method which closes |
480 | all outstanding connections and destroys the database |
481 | |
482 | ->>> PgTestSetup().tearDown() |
483 | +>>> fixture.tearDown() |
484 | |
485 | Because the database has been destroyed, further tests will not be |
486 | affected. |
487 | |
488 | ->>> PgTestSetup().setUp() |
489 | ->>> connection = PgTestSetup().connect() |
490 | +>>> fixture.setUp() |
491 | +>>> connection = fixture.connect() |
492 | >>> cursor = connection.cursor() |
493 | >>> cursor.execute("CREATE TABLE Beer (id serial PRIMARY KEY, name text)") |
494 | ->>> PgTestSetup().tearDown() |
495 | +>>> fixture.tearDown() |
496 | |
497 | We can also specify a different template to duplicate than the default |
498 | clean one (template1). For example, if you need a launchpad database |
499 | containing no data, you can use `launchpad_empty` as the template. |
500 | |
501 | ->>> PgTestSetup('launchpad_empty').setUp() |
502 | ->>> connection = PgTestSetup().connect() |
503 | +>>> fixture = PgTestSetup('launchpad_empty') |
504 | +>>> fixture.setUp() |
505 | +>>> connection = fixture.connect() |
506 | >>> cursor = connection.cursor() |
507 | >>> cursor.execute("SELECT COUNT(*) FROM Person") |
508 | >>> int(cursor.fetchone()[0]) |
509 | 0 |
510 | ->>> PgTestSetup().tearDown() |
511 | +>>> fixture.tearDown() |
512 | |
513 | We can also specify the user that we connect as to avoid connecting as the |
514 | PostgreSQL default user. |
515 | @@ -108,14 +105,12 @@ |
516 | ------------------ |
517 | |
518 | LaunchpadTestSetup is identical to PgTestSetup, except that it creates a |
519 | -fresh copy of the Launchpad database filled with our sample data. This |
520 | -class is defined in canonical.launchpad.ftests.harness. |
521 | - |
522 | -Note that at this level, you cannot access any of the SQLBase objects |
523 | - |
524 | ->>> from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
525 | ->>> LaunchpadTestSetup().setUp() |
526 | ->>> connection = LaunchpadTestSetup().connect() |
527 | +fresh copy of the Launchpad database filled with our sample data. |
528 | + |
529 | +>>> from canonical.testing.layers import LaunchpadTestSetup |
530 | +>>> fixture = LaunchpadTestSetup() |
531 | +>>> fixture.setUp() |
532 | +>>> connection = fixture.connect() |
533 | >>> cursor = connection.cursor() |
534 | >>> cursor.execute("SELECT displayname FROM person WHERE name='carlos'") |
535 | >>> cursor.fetchone()[0] |
536 | @@ -127,7 +122,7 @@ |
537 | >>> cursor.fetchone()[0] |
538 | u'launchpad' |
539 | |
540 | ->>> LaunchpadTestSetup().tearDown() |
541 | +>>> fixture.tearDown() |
542 | |
543 | You can connect as a different database user using the same mechanism |
544 | described above for PgTestSetup |
545 | @@ -143,114 +138,13 @@ |
546 | >>> lpsetup.tearDown() |
547 | |
548 | |
549 | -LaunchpadZopelessTestSetup |
550 | --------------------------- |
551 | - |
552 | -LaunchpadZopelessTestSetup builds on LaunchpadTestSetup, calling |
553 | -initZopeless for you so you can access the SQLBase objects without needing |
554 | -the Zope3 infrastructure. |
555 | - |
556 | ->>> from canonical.launchpad.ftests.harness import LaunchpadZopelessTestSetup |
557 | ->>> LaunchpadZopelessTestSetup().setUp() |
558 | ->>> from lp.registry.model.person import Person |
559 | ->>> stub = Person.byName('stub') |
560 | ->>> stub.displayname |
561 | -u'Stuart Bishop' |
562 | ->>> stub.displayname = u'The Walrus' |
563 | ->>> stub.displayname |
564 | -u'The Walrus' |
565 | - |
566 | -You have access to the zopeless transaction |
567 | - |
568 | ->>> LaunchpadZopelessTestSetup().txn.abort() |
569 | ->>> stub.displayname |
570 | -u'Stuart Bishop' |
571 | - |
572 | -And always remember to tearDown or you will victimize other tests! |
573 | - |
574 | ->>> LaunchpadZopelessTestSetup().tearDown() |
575 | - |
576 | - |
577 | -In general, Zopeless tests should never be running as the launchpad user. |
578 | -You can select the user you connect as: |
579 | - |
580 | ->>> setup = LaunchpadZopelessTestSetup(dbuser=config.librarian.dbuser) |
581 | ->>> setup.setUp() |
582 | ->>> from lp.registry.model.sourcepackagename import SourcePackageName |
583 | ->>> SourcePackageName.get(1).name |
584 | -Traceback (most recent call last): |
585 | -... |
586 | -ProgrammingError: permission denied for relation sourcepackagename |
587 | -<BLANKLINE> |
588 | ->>> setup.tearDown() |
589 | - |
590 | - |
591 | -LaunchpadFunctionalTestSetup |
592 | ----------------------------- |
593 | - |
594 | -One with the lot. A LaunchpadTestSetup which also loads in the Zope3 |
595 | -environment. |
596 | - |
597 | ->>> from canonical.launchpad.ftests.harness import LaunchpadFunctionalTestSetup |
598 | ->>> LaunchpadFunctionalTestSetup().setUp() |
599 | - |
600 | -You have full access to the SQLBase objects |
601 | - |
602 | ->>> mark = Person.byName('mark') |
603 | ->>> mark.displayname |
604 | -u'Mark Shuttleworth' |
605 | - |
606 | -You also have access to the Zope3 component architecture, as registered |
607 | -by ftesting.zcml |
608 | - |
609 | ->>> from zope.app import zapi |
610 | ->>> from zope.sendmail.interfaces import IMailer |
611 | ->>> zapi.getUtility(IMailer, 'smtp') is not None |
612 | -True |
613 | - |
614 | ->>> LaunchpadFunctionalTestSetup().tearDown() |
615 | - |
616 | -You can change the user that the tests connect as: |
617 | - |
618 | - XXX 2008-05-29 jamesh: |
619 | - Using LaunchpadFunctionalLayer for non-webapp db users is generally |
620 | - a sign of a bug. These bits of code should generally be using |
621 | - LaunchpadZopelessLayer. |
622 | - |
623 | -##>>> setup = LaunchpadFunctionalTestSetup(dbuser=config.librarian.dbuser) |
624 | -##>>> setup.setUp() |
625 | -##>>> connection = setup.connect() |
626 | -##>>> cursor = connection.cursor() |
627 | -##>>> cursor.execute('SELECT current_user') |
628 | -##>>> cursor.fetchone()[0] |
629 | -##u'librarian' |
630 | -##>>> SourcePackageName.get(1).name |
631 | -##Traceback (most recent call last): |
632 | -##... |
633 | -##ProgrammingError: permission denied ... |
634 | -##>>> setup.tearDown() |
635 | - |
636 | -And the next test will be unaffected: |
637 | - |
638 | ->>> setup = LaunchpadFunctionalTestSetup() |
639 | ->>> setup.setUp() |
640 | ->>> connection = setup.connect() |
641 | ->>> cursor = connection.cursor() |
642 | ->>> cursor.execute('SELECT current_user') |
643 | ->>> cursor.fetchone()[0] |
644 | -u'launchpad' |
645 | ->>> SourcePackageName.get(1).name |
646 | -u'mozilla-firefox' |
647 | ->>> setup.tearDown() |
648 | - |
649 | - |
650 | LibrarianTestSetup |
651 | ------------------ |
652 | |
653 | Code that needs to access the Librarian can do so easily. Note that |
654 | LibrarianTestSetup requires the Launchpad database to be available, and |
655 | thus requires LaunchpadTestSetup or similar to be used in tandam. |
656 | -You probably really want LaunchpadFunctionalTestSetup so you can access |
657 | +You probably really want LaunchpadFunctionalLayer so you can access |
658 | the Librarian as a Utility. |
659 | |
660 | >>> from canonical.librarian.testing.server import LibrarianTestSetup |
661 | @@ -259,7 +153,6 @@ |
662 | >>> from canonical.librarian.interfaces import ILibrarianClient |
663 | >>> from StringIO import StringIO |
664 | |
665 | ->>> LaunchpadFunctionalTestSetup().setUp() |
666 | >>> librarian = LibrarianTestSetup() |
667 | >>> librarian.setUp() |
668 | >>> login(ANONYMOUS) |
669 | @@ -285,7 +178,6 @@ |
670 | True |
671 | |
672 | >>> librarian.tearDown() |
673 | ->>> LaunchpadFunctionalTestSetup().tearDown() |
674 | |
675 | >>> from canonical.testing import reset_logging |
676 | >>> reset_logging() |
677 | |
678 | === modified file 'lib/canonical/launchpad/doc/security-proxies.txt' |
679 | --- lib/canonical/launchpad/doc/security-proxies.txt 2010-10-09 16:36:22 +0000 |
680 | +++ lib/canonical/launchpad/doc/security-proxies.txt 2010-10-18 06:17:50 +0000 |
681 | @@ -6,13 +6,10 @@ |
682 | |
683 | First, some imports and set up:: |
684 | |
685 | - >>> from canonical.launchpad.ftests.harness import LaunchpadFunctionalTestSetup |
686 | >>> from zope.component import getUtility |
687 | >>> from lp.registry.interfaces.person import IPersonSet |
688 | >>> from lp.registry.model.person import Person |
689 | |
690 | - >>> LaunchpadFunctionalTestSetup().setUp() |
691 | - |
692 | Get a proxied and unproxied person object for the same person, and demonstrate |
693 | working comparisons:: |
694 | |
695 | @@ -57,8 +54,3 @@ |
696 | True |
697 | >>> hoary.status is SeriesStatus.DEVELOPMENT |
698 | False |
699 | - |
700 | -Finally, tear down the test: |
701 | - |
702 | - >>> LaunchpadFunctionalTestSetup().tearDown() |
703 | - |
704 | |
705 | === removed file 'lib/canonical/launchpad/ftests/harness.py' |
706 | --- lib/canonical/launchpad/ftests/harness.py 2010-10-04 19:50:45 +0000 |
707 | +++ lib/canonical/launchpad/ftests/harness.py 1970-01-01 00:00:00 +0000 |
708 | @@ -1,84 +0,0 @@ |
709 | -# Copyright 2009 Canonical Ltd. This software is licensed under the |
710 | -# GNU Affero General Public License version 3 (see the file LICENSE). |
711 | - |
712 | -""" |
713 | -Launchpad functional test helpers. |
714 | - |
715 | -This file needs to be refactored, moving its functionality into |
716 | -canonical.testing |
717 | -""" |
718 | - |
719 | -__metaclass__ = type |
720 | - |
721 | - |
722 | -from zope.app.testing.functional import FunctionalTestSetup |
723 | - |
724 | -from canonical.database.sqlbase import ZopelessTransactionManager |
725 | -from canonical.ftests.pgsql import PgTestSetup |
726 | -from canonical.lp import initZopeless |
727 | -from canonical.testing.layers import ( |
728 | - FunctionalLayer, |
729 | - ZopelessLayer, |
730 | - ) |
731 | -from canonical.testing.layers import ( |
732 | - disconnect_stores, |
733 | - reconnect_stores, |
734 | - ) |
735 | - |
736 | - |
737 | -__all__ = [ |
738 | - 'LaunchpadTestSetup', 'LaunchpadZopelessTestSetup', |
739 | - 'LaunchpadFunctionalTestSetup', |
740 | - ] |
741 | - |
742 | - |
743 | -class LaunchpadTestSetup(PgTestSetup): |
744 | - template = 'launchpad_ftest_template' |
745 | - dbname = 'launchpad_ftest' # Needs to match ftesting.zcml |
746 | - dbuser = 'launchpad' |
747 | - |
748 | - |
749 | -class LaunchpadZopelessTestSetup(LaunchpadTestSetup): |
750 | - txn = ZopelessTransactionManager |
751 | - def setUp(self, dbuser=None): |
752 | - assert ZopelessTransactionManager._installed is None, \ |
753 | - 'Last test using Zopeless failed to tearDown correctly' |
754 | - super(LaunchpadZopelessTestSetup, self).setUp() |
755 | - if self.host is not None: |
756 | - raise NotImplementedError('host not supported yet') |
757 | - if self.port is not None: |
758 | - raise NotImplementedError('port not supported yet') |
759 | - if dbuser is not None: |
760 | - self.dbuser = dbuser |
761 | - initZopeless(dbname=self.dbname, dbuser=self.dbuser) |
762 | - |
763 | - def tearDown(self): |
764 | - LaunchpadZopelessTestSetup.txn.uninstall() |
765 | - assert ZopelessTransactionManager._installed is None, \ |
766 | - 'Failed to tearDown Zopeless correctly' |
767 | - |
768 | - |
769 | -class LaunchpadFunctionalTestSetup(LaunchpadTestSetup): |
770 | - def _checkLayerInvariants(self): |
771 | - assert FunctionalLayer.isSetUp or ZopelessLayer.isSetUp, """ |
772 | - FunctionalTestSetup invoked at an inappropriate time. |
773 | - May only be invoked in the FunctionalLayer or ZopelessLayer |
774 | - """ |
775 | - |
776 | - def setUp(self, dbuser=None): |
777 | - self._checkLayerInvariants() |
778 | - if dbuser is not None: |
779 | - self.dbuser = dbuser |
780 | - assert self.dbuser == 'launchpad', ( |
781 | - "Non-default user names should probably be using " |
782 | - "script layer or zopeless layer.") |
783 | - disconnect_stores() |
784 | - super(LaunchpadFunctionalTestSetup, self).setUp() |
785 | - FunctionalTestSetup().setUp() |
786 | - reconnect_stores() |
787 | - |
788 | - def tearDown(self): |
789 | - self._checkLayerInvariants() |
790 | - FunctionalTestSetup().tearDown() |
791 | - disconnect_stores() |
792 | - super(LaunchpadFunctionalTestSetup, self).tearDown() |
793 | |
794 | === modified file 'lib/canonical/launchpad/interfaces/_schema_circular_imports.py' |
795 | --- lib/canonical/launchpad/interfaces/_schema_circular_imports.py 2010-10-05 08:17:29 +0000 |
796 | +++ lib/canonical/launchpad/interfaces/_schema_circular_imports.py 2010-10-18 06:17:50 +0000 |
797 | @@ -371,6 +371,8 @@ |
798 | patch_collection_return_type( |
799 | IDistroSeries, 'getPackageUploads', IPackageUpload) |
800 | patch_reference_property(IDistroSeries, 'parent_series', IDistroSeries) |
801 | +patch_plain_parameter_type( |
802 | + IDistroSeries, 'deriveDistroSeries', 'distribution', IDistroSeries) |
803 | |
804 | # IDistroSeriesDifferenceComment |
805 | IDistroSeriesDifferenceComment['comment_author'].schema = IPerson |
806 | |
807 | === modified file 'lib/canonical/launchpad/pagetests/standalone/xx-dbpolicy.txt' |
808 | --- lib/canonical/launchpad/pagetests/standalone/xx-dbpolicy.txt 2010-01-13 13:50:39 +0000 |
809 | +++ lib/canonical/launchpad/pagetests/standalone/xx-dbpolicy.txt 2010-10-18 06:17:50 +0000 |
810 | @@ -20,9 +20,11 @@ |
811 | >>> from zope.component import getUtility |
812 | >>> from canonical.launchpad.webapp.interfaces import ( |
813 | ... IStoreSelector, MAIN_STORE, MASTER_FLAVOR, SLAVE_FLAVOR) |
814 | + >>> from canonical.testing.layers import DatabaseLayer |
815 | >>> master = getUtility(IStoreSelector).get(MAIN_STORE, MASTER_FLAVOR) |
816 | - >>> master.execute("SELECT current_database()").get_one()[0] |
817 | - u'launchpad_ftest' |
818 | + >>> dbname = DatabaseLayer._db_fixture.dbname |
819 | + >>> dbname == master.execute("SELECT current_database()").get_one()[0] |
820 | + True |
821 | >>> slave = getUtility(IStoreSelector).get(MAIN_STORE, SLAVE_FLAVOR) |
822 | >>> slave.execute("SELECT current_database()").get_one()[0] |
823 | u'launchpad_empty' |
824 | @@ -47,7 +49,7 @@ |
825 | |
826 | >>> def whichdb(browser): |
827 | ... dbname = extract_text(find_tag_by_id(browser.contents, 'dbname')) |
828 | - ... if dbname == 'launchpad_ftest': |
829 | + ... if dbname == DatabaseLayer._db_fixture.dbname: |
830 | ... return 'MASTER' |
831 | ... elif dbname == 'launchpad_empty': |
832 | ... return 'SLAVE' |
833 | |
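A recurring pattern in the hunks above is that tests stop hardcoding the `launchpad_ftest` database name and instead compare `current_database()` against `DatabaseLayer._db_fixture.dbname`, since the fixture may now create a differently named database per run. As a rough illustration only (the `Fake*` classes below are hypothetical stand-ins, not Launchpad code), the check reduces to:

```python
# Hypothetical sketch: a test asks "is this store connected to the database
# the fixture created?" rather than comparing against a fixed name.

class FakeFixture:
    """Stands in for DatabaseLayer._db_fixture; only carries a dbname."""
    def __init__(self, dbname):
        self.dbname = dbname


class FakeStore:
    """Stands in for a Storm store; current_database() mimics the SQL
    SELECT current_database() query used in the doctests."""
    def __init__(self, dbname):
        self._dbname = dbname

    def current_database(self):
        return self._dbname


def connected_to_fixture_db(store, fixture):
    # The doctests in the diff perform exactly this equality check.
    return store.current_database() == fixture.dbname


fixture = FakeFixture('launchpad_ftest_12345')
store = FakeStore('launchpad_ftest_12345')
```

This keeps the tests valid no matter what name the fixture picks, at the cost of reaching into the layer's private `_db_fixture` attribute.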
834 | === modified file 'lib/canonical/launchpad/tests/test_sampledata.py' |
835 | --- lib/canonical/launchpad/tests/test_sampledata.py 2010-09-22 13:26:50 +0000 |
836 | +++ lib/canonical/launchpad/tests/test_sampledata.py 2010-10-18 06:17:50 +0000 |
837 | @@ -12,7 +12,6 @@ |
838 | __all__ = [] |
839 | |
840 | import subprocess |
841 | -import unittest |
842 | |
843 | from canonical.testing.layers import DatabaseLayer |
844 | from lp.testing import TestCase |
845 | @@ -37,14 +36,11 @@ |
846 | cmd = ( |
847 | "pg_dump --format=c --compress=0 --no-privileges --no-owner" |
848 | " --schema=public %s | pg_restore --clean" |
849 | - " --exit-on-error --dbname=launchpad_ftest" % source_dbname) |
850 | + " --exit-on-error --dbname=%s" % ( |
851 | + source_dbname, DatabaseLayer._db_fixture.dbname)) |
852 | proc = subprocess.Popen( |
853 | cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, |
854 | stdin=subprocess.PIPE) |
855 | (stdout, stderr) = proc.communicate() |
856 | rv = proc.wait() |
857 | self.failUnlessEqual(rv, 0, "Dump/Restore failed: %s" % stdout) |
858 | - |
859 | - |
860 | -def test_suite(): |
861 | - return unittest.TestLoader().loadTestsFromName(__name__) |
862 | |
863 | === modified file 'lib/canonical/launchpad/webapp/ftests/test_adapter.txt' |
864 | --- lib/canonical/launchpad/webapp/ftests/test_adapter.txt 2010-09-17 00:53:33 +0000 |
865 | +++ lib/canonical/launchpad/webapp/ftests/test_adapter.txt 2010-10-18 06:17:50 +0000 |
866 | @@ -18,14 +18,18 @@ |
867 | >>> from canonical.launchpad.webapp.adapter import ( |
868 | ... clear_request_started, get_request_statements, |
869 | ... set_request_started) |
870 | + >>> from canonical.testing.layers import DatabaseLayer |
871 | >>> from lp.services.timeline.requesttimeline import get_request_timeline |
872 | |
873 | There are several possible database connections available via the |
874 | IStoreSelector utility. |
875 | |
876 | >>> store = getUtility(IStoreSelector).get(MAIN_STORE, MASTER_FLAVOR) |
877 | - >>> print store.execute("SELECT current_database()").get_one()[0] |
878 | - launchpad_ftest |
879 | + >>> dbname = DatabaseLayer._db_fixture.dbname |
880 | + >>> active_name = store.execute("SELECT current_database()").get_one()[0] |
881 | + >>> if active_name != dbname: print '%s != %s' % (active_name, dbname) |
882 | + >>> active_name == dbname |
883 | + True |
884 | |
885 | |
886 | Statement Logging |
887 | |
888 | === modified file 'lib/canonical/lp/ftests/test_zopeless.py' |
889 | --- lib/canonical/lp/ftests/test_zopeless.py 2010-10-04 19:50:45 +0000 |
890 | +++ lib/canonical/lp/ftests/test_zopeless.py 2010-10-18 06:17:50 +0000 |
891 | @@ -14,9 +14,11 @@ |
892 | from sqlobject import StringCol, IntCol |
893 | |
894 | from canonical.database.sqlbase import SQLBase, alreadyInstalledMsg, cursor |
895 | -from canonical.ftests.pgsql import PgTestSetup |
896 | from canonical.lp import initZopeless |
897 | -from canonical.testing.layers import LaunchpadScriptLayer |
898 | +from canonical.testing.layers import ( |
899 | + DatabaseLayer, |
900 | + LaunchpadScriptLayer, |
901 | + ) |
902 | |
903 | |
904 | class MoreBeer(SQLBase): |
905 | @@ -28,6 +30,7 @@ |
906 | |
907 | |
908 | class TestInitZopeless(unittest.TestCase): |
909 | + |
910 | layer = LaunchpadScriptLayer |
911 | |
912 | def test_initZopelessTwice(self): |
913 | @@ -47,10 +50,11 @@ |
914 | # Calling initZopeless with the same arguments twice should return |
915 | # the exact same object twice, but also emit a warning. |
916 | try: |
917 | - tm1 = initZopeless(dbname=PgTestSetup().dbname, dbhost='', |
918 | - dbuser='launchpad') |
919 | - tm2 = initZopeless(dbname=PgTestSetup().dbname, dbhost='', |
920 | - dbuser='launchpad') |
921 | + dbname = DatabaseLayer._db_fixture.dbname |
922 | + tm1 = initZopeless( |
923 | + dbname=dbname, dbhost='', dbuser='launchpad') |
924 | + tm2 = initZopeless( |
925 | + dbname=dbname, dbhost='', dbuser='launchpad') |
926 | self.failUnless(tm1 is tm2) |
927 | self.failUnless(self.warned) |
928 | finally: |
929 | @@ -65,10 +69,11 @@ |
930 | |
931 | |
932 | class TestZopeless(unittest.TestCase): |
933 | + |
934 | layer = LaunchpadScriptLayer |
935 | |
936 | def setUp(self): |
937 | - self.tm = initZopeless(dbname=PgTestSetup().dbname, |
938 | + self.tm = initZopeless(dbname=DatabaseLayer._db_fixture.dbname, |
939 | dbuser='launchpad') |
940 | |
941 | c = cursor() |
942 | @@ -182,7 +187,7 @@ |
943 | self.tm.commit() |
944 | |
945 | # Make another change from a non-SQLObject connection, and commit that |
946 | - conn = psycopg2.connect('dbname=' + PgTestSetup().dbname) |
947 | + conn = psycopg2.connect('dbname=' + DatabaseLayer._db_fixture.dbname) |
948 | cur = conn.cursor() |
949 | cur.execute("BEGIN TRANSACTION;") |
950 | cur.execute("UPDATE MoreBeer SET rating=4 " |
951 | @@ -202,7 +207,7 @@ |
952 | >>> isZopeless() |
953 | False |
954 | |
955 | - >>> tm = initZopeless(dbname=PgTestSetup().dbname, |
956 | + >>> tm = initZopeless(dbname=DatabaseLayer._db_fixture.dbname, |
957 | ... dbhost='', dbuser='launchpad') |
958 | >>> isZopeless() |
959 | True |
960 | |
961 | === modified file 'lib/canonical/testing/ftests/test_layers.py' |
962 | --- lib/canonical/testing/ftests/test_layers.py 2010-07-26 13:18:18 +0000 |
963 | +++ lib/canonical/testing/ftests/test_layers.py 2010-10-18 06:17:50 +0000 |
964 | @@ -20,16 +20,27 @@ |
965 | from zope.component import getUtility, ComponentLookupError |
966 | |
967 | from canonical.config import config, dbconfig |
968 | -from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
969 | from lazr.config import as_host_port |
970 | from canonical.librarian.client import LibrarianClient, UploadFailed |
971 | from canonical.librarian.interfaces import ILibrarianClient |
972 | from canonical.lazr.pidfile import pidfile_path |
973 | from canonical.testing.layers import ( |
974 | - AppServerLayer, BaseLayer, DatabaseLayer, FunctionalLayer, |
975 | - LaunchpadFunctionalLayer, LaunchpadLayer, LaunchpadScriptLayer, |
976 | - LaunchpadZopelessLayer, LayerInvariantError, LayerIsolationError, |
977 | - LayerProcessController, LibrarianLayer, MemcachedLayer, ZopelessLayer) |
978 | + AppServerLayer, |
979 | + BaseLayer, |
980 | + DatabaseLayer, |
981 | + FunctionalLayer, |
982 | + LaunchpadFunctionalLayer, |
983 | + LaunchpadLayer, |
984 | + LaunchpadScriptLayer, |
985 | + LaunchpadTestSetup, |
986 | + LaunchpadZopelessLayer, |
987 | + LayerInvariantError, |
988 | + LayerIsolationError, |
989 | + LayerProcessController, |
990 | + LibrarianLayer, |
991 | + MemcachedLayer, |
992 | + ZopelessLayer, |
993 | + ) |
994 | from lp.services.memcache.client import memcache_client_factory |
995 | |
996 | class BaseTestCase(unittest.TestCase): |
997 | @@ -123,22 +134,13 @@ |
998 | ) |
999 | |
1000 | def testLaunchpadDbAvailable(self): |
1001 | - try: |
1002 | - con = DatabaseLayer.connect() |
1003 | - cur = con.cursor() |
1004 | - cur.execute("SELECT id FROM Person LIMIT 1") |
1005 | - if cur.fetchone() is not None: |
1006 | - self.failUnless( |
1007 | - self.want_launchpad_database, |
1008 | - 'Launchpad database should not be available.' |
1009 | - ) |
1010 | - return |
1011 | - except psycopg2.Error: |
1012 | - pass |
1013 | - self.failIf( |
1014 | - self.want_launchpad_database, |
1015 | - 'Launchpad database should be available but is not.' |
1016 | - ) |
1017 | + if not self.want_launchpad_database: |
1018 | + self.assertEqual(None, DatabaseLayer._db_fixture) |
1019 | + return |
1020 | + con = DatabaseLayer.connect() |
1021 | + cur = con.cursor() |
1022 | + cur.execute("SELECT id FROM Person LIMIT 1") |
1023 | + self.assertNotEqual(None, cur.fetchone()) |
1024 | |
1025 | def testMemcachedWorking(self): |
1026 | client = MemcachedLayer.client or memcache_client_factory() |
1027 | @@ -424,6 +426,9 @@ |
1028 | # The database should be reset by the test invariants. |
1029 | LayerProcessController.startAppServer() |
1030 | LayerProcessController.postTestInvariants() |
1031 | + # XXX: Robert Collins 2010-10-17 bug=661967 - this isn't a reset, it's |
1032 | + # a flag that it *needs* a reset, which is actually quite different; |
1033 | + # the lack of a teardown will leak databases. |
1034 | self.assertEquals(True, LaunchpadTestSetup()._reset_db) |
1035 | |
1036 | |
1037 | |
1038 | === modified file 'lib/canonical/testing/layers.py' |
1039 | --- lib/canonical/testing/layers.py 2010-10-05 13:25:01 +0000 |
1040 | +++ lib/canonical/testing/layers.py 2010-10-18 06:17:50 +0000 |
1041 | @@ -35,6 +35,7 @@ |
1042 | 'LaunchpadFunctionalLayer', |
1043 | 'LaunchpadLayer', |
1044 | 'LaunchpadScriptLayer', |
1045 | + 'LaunchpadTestSetup', |
1046 | 'LaunchpadZopelessLayer', |
1047 | 'LayerInvariantError', |
1048 | 'LayerIsolationError', |
1049 | @@ -92,6 +93,7 @@ |
1050 | from zope.server.logger.pythonlogger import PythonLogger |
1051 | from zope.testing.testrunner.runner import FakeInputContinueGenerator |
1052 | |
1053 | +from canonical.ftests.pgsql import PgTestSetup |
1054 | from canonical.launchpad.webapp.vhosts import allvhosts |
1055 | from canonical.lazr import pidfile |
1056 | from canonical.config import CanonicalConfig, config, dbconfig |
1057 | @@ -264,12 +266,14 @@ |
1058 | if not BaseLayer.persist_test_services: |
1059 | kill_by_pidfile(MemcachedLayer.getPidFile(), num_polls=0) |
1060 | # Kill any database left lying around from a previous test run. |
1061 | + db_fixture = LaunchpadTestSetup() |
1062 | try: |
1063 | - DatabaseLayer.connect().close() |
1064 | + db_fixture.connect().close() |
1065 | except psycopg2.Error: |
1066 | + # We assume this means 'no test database exists.' |
1067 | pass |
1068 | else: |
1069 | - DatabaseLayer._dropDb() |
1070 | + db_fixture.dropDb() |
1071 | |
1072 | @classmethod |
1073 | @profiled |
1074 | @@ -693,19 +697,19 @@ |
1075 | _reset_between_tests = True |
1076 | |
1077 | _is_setup = False |
1078 | + _db_fixture = None |
1079 | |
1080 | @classmethod |
1081 | @profiled |
1082 | def setUp(cls): |
1083 | cls._is_setup = True |
1084 | - DatabaseLayer.force_dirty_database() |
1085 | - # Imported here to avoid circular import issues. This |
1086 | - # functionality should be migrated into this module at some |
1087 | - # point. -- StuartBishop 20060712 |
1088 | - from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
1089 | - LaunchpadTestSetup().tearDown() |
1090 | - DatabaseLayer._reset_sequences_sql = LaunchpadTestSetup( |
1091 | + # Read the sequences we'll need from the test template database. |
1092 | + reset_sequences_sql = LaunchpadTestSetup( |
1093 | dbname='launchpad_ftest_template').generateResetSequencesSQL() |
1094 | + cls._db_fixture = LaunchpadTestSetup( |
1095 | + reset_sequences_sql=reset_sequences_sql) |
1096 | + cls.force_dirty_database() |
1097 | + cls._db_fixture.tearDown() |
1098 | |
1099 | @classmethod |
1100 | @profiled |
1101 | @@ -716,32 +720,22 @@ |
1102 | # Don't leave the DB lying around or it might break tests |
1103 | # that depend on it not being there on startup, such as found |
1104 | # in test_layers.py |
1105 | - DatabaseLayer.force_dirty_database() |
1106 | - # Imported here to avoid circular import issues. This |
1107 | - # functionality should be migrated into this module at some |
1108 | - # point. -- StuartBishop 20060712 |
1109 | - from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
1110 | - LaunchpadTestSetup().tearDown() |
1111 | - DatabaseLayer._reset_sequences_sql = None |
1112 | + cls.force_dirty_database() |
1113 | + cls._db_fixture.tearDown() |
1114 | + cls._db_fixture = None |
1115 | |
1116 | @classmethod |
1117 | @profiled |
1118 | def testSetUp(cls): |
1119 | - # Imported here to avoid circular import issues. This |
1120 | - # functionality should be migrated into this module at some |
1121 | - # point. -- StuartBishop 20060712 |
1122 | - from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
1123 | - if DatabaseLayer._reset_between_tests: |
1124 | - LaunchpadTestSetup( |
1125 | - reset_sequences_sql=DatabaseLayer._reset_sequences_sql |
1126 | - ).setUp() |
1127 | + if cls._reset_between_tests: |
1128 | + cls._db_fixture.setUp() |
1129 | # Ensure that the database is connectable. Because we might have |
1130 | # just created it, keep trying for a few seconds incase PostgreSQL |
1131 | # is taking its time getting its house in order. |
1132 | attempts = 60 |
1133 | for count in range(0, attempts): |
1134 | try: |
1135 | - DatabaseLayer.connect().close() |
1136 | + cls.connect().close() |
1137 | except psycopg2.Error: |
1138 | if count == attempts - 1: |
1139 | raise |
1140 | @@ -749,24 +743,20 @@ |
1141 | else: |
1142 | break |
1143 | |
1144 | - if DatabaseLayer.use_mockdb is True: |
1145 | - DatabaseLayer.installMockDb() |
1146 | + if cls.use_mockdb is True: |
1147 | + cls.installMockDb() |
1148 | |
1149 | @classmethod |
1150 | @profiled |
1151 | def testTearDown(cls): |
1152 | - if DatabaseLayer.use_mockdb is True: |
1153 | - DatabaseLayer.uninstallMockDb() |
1154 | + if cls.use_mockdb is True: |
1155 | + cls.uninstallMockDb() |
1156 | |
1157 | # Ensure that the database is connectable |
1158 | - DatabaseLayer.connect().close() |
1159 | + cls.connect().close() |
1160 | |
1161 | - # Imported here to avoid circular import issues. This |
1162 | - # functionality should be migrated into this module at some |
1163 | - # point. -- StuartBishop 20060712 |
1164 | - from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
1165 | - if DatabaseLayer._reset_between_tests: |
1166 | - LaunchpadTestSetup().tearDown() |
1167 | + if cls._reset_between_tests: |
1168 | + cls._db_fixture.tearDown() |
1169 | |
1170 | # Fail tests that forget to uninstall their database policies. |
1171 | from canonical.launchpad.webapp.adapter import StoreSelector |
1172 | @@ -781,7 +771,7 @@ |
1173 | @classmethod |
1174 | @profiled |
1175 | def installMockDb(cls): |
1176 | - assert DatabaseLayer.mockdb_mode is None, 'mock db already installed' |
1177 | + assert cls.mockdb_mode is None, 'mock db already installed' |
1178 | |
1179 | from canonical.testing.mockdb import ( |
1180 | script_filename, ScriptRecorder, ScriptPlayer, |
1181 | @@ -795,32 +785,32 @@ |
1182 | # mock db script. |
1183 | filename = script_filename(test_key) |
1184 | if os.path.exists(filename): |
1185 | - DatabaseLayer.mockdb_mode = 'replay' |
1186 | - DatabaseLayer.script = ScriptPlayer(test_key) |
1187 | + cls.mockdb_mode = 'replay' |
1188 | + cls.script = ScriptPlayer(test_key) |
1189 | else: |
1190 | - DatabaseLayer.mockdb_mode = 'record' |
1191 | - DatabaseLayer.script = ScriptRecorder(test_key) |
1192 | + cls.mockdb_mode = 'record' |
1193 | + cls.script = ScriptRecorder(test_key) |
1194 | |
1195 | global _org_connect |
1196 | _org_connect = psycopg2.connect |
1197 | # Proxy real connections with our mockdb. |
1198 | def fake_connect(*args, **kw): |
1199 | - return DatabaseLayer.script.connect(_org_connect, *args, **kw) |
1200 | + return cls.script.connect(_org_connect, *args, **kw) |
1201 | psycopg2.connect = fake_connect |
1202 | |
1203 | @classmethod |
1204 | @profiled |
1205 | def uninstallMockDb(cls): |
1206 | - if DatabaseLayer.mockdb_mode is None: |
1207 | + if cls.mockdb_mode is None: |
1208 | return # Already uninstalled |
1209 | |
1210 | # Store results if we are recording |
1211 | - if DatabaseLayer.mockdb_mode == 'record': |
1212 | - DatabaseLayer.script.store() |
1213 | - assert os.path.exists(DatabaseLayer.script.script_filename), ( |
1214 | + if cls.mockdb_mode == 'record': |
1215 | + cls.script.store() |
1216 | + assert os.path.exists(cls.script.script_filename), ( |
1217 | "Stored results but no script on disk.") |
1218 | |
1219 | - DatabaseLayer.mockdb_mode = None |
1220 | + cls.mockdb_mode = None |
1221 | global _org_connect |
1222 | psycopg2.connect = _org_connect |
1223 | _org_connect = None |
1224 | @@ -828,20 +818,17 @@ |
1225 | @classmethod |
1226 | @profiled |
1227 | def force_dirty_database(cls): |
1228 | - from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
1229 | - LaunchpadTestSetup().force_dirty_database() |
1230 | + cls._db_fixture.force_dirty_database() |
1231 | |
1232 | @classmethod |
1233 | @profiled |
1234 | def connect(cls): |
1235 | - from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
1236 | - return LaunchpadTestSetup().connect() |
1237 | + return cls._db_fixture.connect() |
1238 | |
1239 | @classmethod |
1240 | @profiled |
1241 | def _dropDb(cls): |
1242 | - from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
1243 | - return LaunchpadTestSetup().dropDb() |
1244 | + return cls._db_fixture.dropDb() |
1245 | |
1246 | |
1247 | def test_default_timeout(): |
1248 | @@ -1378,6 +1365,11 @@ |
1249 | reconnect_stores(database_config_section=database_config_section) |
1250 | |
1251 | |
1252 | +class LaunchpadTestSetup(PgTestSetup): |
1253 | + template = 'launchpad_ftest_template' |
1254 | + dbuser = 'launchpad' |
1255 | + |
1256 | + |
1257 | class LaunchpadZopelessLayer(LaunchpadScriptLayer): |
1258 | """Full Zopeless environment including Component Architecture and |
1259 | database connections initialized. |
1260 | @@ -1643,6 +1635,9 @@ |
1261 | # configs/testrunner-appserver/mail-configure.zcml |
1262 | smtp_controller = None |
1263 | |
1264 | + # The DB fixture in use |
1265 | + _db_fixture = None |
1266 | + |
1267 | @classmethod |
1268 | @profiled |
1269 | def startSMTPServer(cls): |
1270 | @@ -1770,9 +1765,12 @@ |
1271 | @classmethod |
1272 | def _runAppServer(cls): |
1273 | """Start the app server using runlaunchpad.py""" |
1274 | - from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
1275 | # The database must be available for the app server to start. |
1276 | - LaunchpadTestSetup().setUp() |
1277 | + cls._db_fixture = LaunchpadTestSetup() |
1278 | + # This is not torn down properly; rather, the singleton nature is abused |
1279 | + # and the fixture is simply marked as being dirty. |
1280 | + # XXX: Robert Collins 2010-10-17 bug=661967 |
1281 | + cls._db_fixture.setUp() |
1282 | # The app server will not start at all if the database hasn't been |
1283 | # correctly patched. The app server will make exactly this check, |
1284 | # doing it here makes the error more obvious. |
1285 | |
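The layers.py changes above replace repeated module-level `LaunchpadTestSetup()` lookups with a single class-level `_db_fixture` that the layer builds in `setUp`, resets around each test, and discards in `tearDown`. A minimal sketch of that lifecycle, using a hypothetical `FakeDbFixture` in place of the real PostgreSQL fixture (this is an illustration of the pattern, not the actual DatabaseLayer code):

```python
class FakeDbFixture:
    """Hypothetical stand-in for LaunchpadTestSetup; records lifecycle calls."""
    def __init__(self):
        self.active = False
        self.resets = 0

    def setUp(self):
        # In the real fixture this (re)creates the test database.
        self.active = True
        self.resets += 1

    def tearDown(self):
        # In the real fixture this drops or marks the database dirty.
        self.active = False


class DatabaseLayerSketch:
    # Class-level fixture, mirroring DatabaseLayer._db_fixture in the diff.
    _db_fixture = None
    _reset_between_tests = True

    @classmethod
    def setUp(cls):
        cls._db_fixture = FakeDbFixture()

    @classmethod
    def testSetUp(cls):
        if cls._reset_between_tests:
            cls._db_fixture.setUp()

    @classmethod
    def testTearDown(cls):
        if cls._reset_between_tests:
            cls._db_fixture.tearDown()

    @classmethod
    def tearDown(cls):
        cls._db_fixture.tearDown()
        cls._db_fixture = None
```

Holding the fixture on the class is what lets `connect()`, `force_dirty_database()`, and `_dropDb()` delegate to `cls._db_fixture` instead of importing the old harness module, which is why the circular-import workarounds could be deleted.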
1286 | === modified file 'lib/lp/bugs/doc/bug-heat.txt' |
1287 | --- lib/lp/bugs/doc/bug-heat.txt 2010-08-23 22:05:26 +0000 |
1288 | +++ lib/lp/bugs/doc/bug-heat.txt 2010-10-18 06:17:50 +0000 |
1289 | @@ -250,7 +250,7 @@ |
1290 | The BugHeatUpdater class is used to create bug heat calculation jobs for |
1291 | bugs with out-of-date heat. |
1292 | |
1293 | - >>> from canonical.launchpad.scripts.garbo import BugHeatUpdater |
1294 | + >>> from lp.scripts.garbo import BugHeatUpdater |
1295 | >>> from canonical.launchpad.scripts import FakeLogger |
1296 | |
1297 | We'll commit the transaction so that the BugHeatUpdater updates the |
1298 | |
1299 | === modified file 'lib/lp/bugs/tests/test_bugwatch.py' |
1300 | --- lib/lp/bugs/tests/test_bugwatch.py 2010-10-04 19:50:45 +0000 |
1301 | +++ lib/lp/bugs/tests/test_bugwatch.py 2010-10-18 06:17:50 +0000 |
1302 | @@ -26,7 +26,7 @@ |
1303 | login, |
1304 | ) |
1305 | from canonical.launchpad.interfaces.launchpad import ILaunchpadCelebrities |
1306 | -from canonical.launchpad.scripts.garbo import BugWatchActivityPruner |
1307 | +from lp.scripts.garbo import BugWatchActivityPruner |
1308 | from canonical.launchpad.scripts.logger import QuietFakeLogger |
1309 | from canonical.launchpad.webapp import urlsplit |
1310 | from canonical.testing.layers import ( |
1311 | |
1312 | === modified file 'lib/lp/code/model/tests/test_revision.py' |
1313 | --- lib/lp/code/model/tests/test_revision.py 2010-10-04 19:50:45 +0000 |
1314 | +++ lib/lp/code/model/tests/test_revision.py 2010-10-18 06:17:50 +0000 |
1315 | @@ -28,7 +28,7 @@ |
1316 | ) |
1317 | from canonical.launchpad.interfaces.account import AccountStatus |
1318 | from canonical.launchpad.interfaces.lpstorm import IMasterObject |
1319 | -from canonical.launchpad.scripts.garbo import RevisionAuthorEmailLinker |
1320 | +from lp.scripts.garbo import RevisionAuthorEmailLinker |
1321 | from canonical.launchpad.webapp.interfaces import ( |
1322 | DEFAULT_FLAVOR, |
1323 | IStoreSelector, |
1324 | |
1325 | === modified file 'lib/lp/code/model/tests/test_revisionauthor.py' |
1326 | --- lib/lp/code/model/tests/test_revisionauthor.py 2010-10-04 19:50:45 +0000 |
1327 | +++ lib/lp/code/model/tests/test_revisionauthor.py 2010-10-18 06:17:50 +0000 |
1328 | @@ -12,7 +12,7 @@ |
1329 | |
1330 | from canonical.config import config |
1331 | from canonical.launchpad.interfaces.emailaddress import EmailAddressStatus |
1332 | -from canonical.launchpad.scripts.garbo import RevisionAuthorEmailLinker |
1333 | +from lp.scripts.garbo import RevisionAuthorEmailLinker |
1334 | from canonical.testing.layers import LaunchpadZopelessLayer |
1335 | from lp.code.model.revision import ( |
1336 | RevisionAuthor, |
1337 | |
1338 | === modified file 'lib/lp/code/scripts/tests/test_revisionkarma.py' |
1339 | --- lib/lp/code/scripts/tests/test_revisionkarma.py 2010-10-04 19:50:45 +0000 |
1340 | +++ lib/lp/code/scripts/tests/test_revisionkarma.py 2010-10-18 06:17:50 +0000 |
1341 | @@ -12,7 +12,7 @@ |
1342 | |
1343 | from canonical.config import config |
1344 | from canonical.launchpad.database.emailaddress import EmailAddressSet |
1345 | -from canonical.launchpad.scripts.garbo import RevisionAuthorEmailLinker |
1346 | +from lp.scripts.garbo import RevisionAuthorEmailLinker |
1347 | from canonical.testing.layers import LaunchpadZopelessLayer |
1348 | from lp.code.model.revision import RevisionSet |
1349 | from lp.code.scripts.revisionkarma import RevisionKarmaAllocator |
1350 | |
1351 | === modified file 'lib/lp/codehosting/tests/test_acceptance.py' |
1352 | --- lib/lp/codehosting/tests/test_acceptance.py 2010-10-04 19:50:45 +0000 |
1353 | +++ lib/lp/codehosting/tests/test_acceptance.py 2010-10-18 06:17:50 +0000 |
1354 | @@ -21,7 +21,6 @@ |
1355 | from zope.component import getUtility |
1356 | |
1357 | from canonical.config import config |
1358 | -from canonical.launchpad.ftests.harness import LaunchpadZopelessTestSetup |
1359 | from canonical.testing.layers import ZopelessAppServerLayer |
1360 | from canonical.testing.profiled import profiled |
1361 | from lp.code.bzr import ( |
1362 | @@ -334,7 +333,7 @@ |
1363 | remote_url = self.getTransportURL('~testuser/+junk/test-branch') |
1364 | self.push(self.local_branch_path, remote_url) |
1365 | self.assertBranchesMatch(self.local_branch_path, remote_url) |
1366 | - LaunchpadZopelessTestSetup().txn.begin() |
1367 | + ZopelessAppServerLayer.txn.begin() |
1368 | db_branch = getUtility(IBranchSet).getByUniqueName( |
1369 | '~testuser/+junk/test-branch') |
1370 | self.assertEqual( |
1371 | @@ -343,7 +342,7 @@ |
1372 | BranchFormat.BZR_BRANCH_7, db_branch.branch_format) |
1373 | self.assertEqual( |
1374 | ControlFormat.BZR_METADIR_1, db_branch.control_format) |
1375 | - LaunchpadZopelessTestSetup().txn.commit() |
1376 | + ZopelessAppServerLayer.txn.commit() |
1377 | |
1378 | def test_push_to_existing_branch(self): |
1379 | """Pushing to an existing branch must work.""" |
1380 | @@ -374,12 +373,12 @@ |
1381 | self.push(self.local_branch_path, remote_url) |
1382 | |
1383 | # Rename owner, product and branch in the database |
1384 | - LaunchpadZopelessTestSetup().txn.begin() |
1385 | + ZopelessAppServerLayer.txn.begin() |
1386 | branch = self.getDatabaseBranch('testuser', None, 'test-branch') |
1387 | branch.owner.name = 'renamed-user' |
1388 | branch.setTarget(user=branch.owner, project=Product.byName('firefox')) |
1389 | branch.name = 'renamed-branch' |
1390 | - LaunchpadZopelessTestSetup().txn.commit() |
1391 | + ZopelessAppServerLayer.txn.commit() |
1392 | |
1393 | # Check that it's not at the old location. |
1394 | self.assertNotBranch( |
1395 | @@ -405,23 +404,23 @@ |
1396 | '~testuser/+junk/totally-new-branch') |
1397 | self.push(self.local_branch_path, remote_url) |
1398 | |
1399 | - LaunchpadZopelessTestSetup().txn.begin() |
1400 | + ZopelessAppServerLayer.txn.begin() |
1401 | branch = self.getDatabaseBranch( |
1402 | 'testuser', None, 'totally-new-branch') |
1403 | |
1404 | self.assertEqual( |
1405 | ['~testuser/+junk/totally-new-branch', self.revid], |
1406 | [branch.unique_name, branch.last_mirrored_id]) |
1407 | - LaunchpadZopelessTestSetup().txn.abort() |
1408 | + ZopelessAppServerLayer.txn.abort() |
1409 | |
1410 | def test_record_default_stacking(self): |
1411 | # If the location being pushed to has a default stacked-on branch, |
1412 | # then branches pushed to that location end up stacked on it by |
1413 | # default. |
1414 | product = self.factory.makeProduct() |
1415 | - LaunchpadZopelessTestSetup().txn.commit() |
1416 | + ZopelessAppServerLayer.txn.commit() |
1417 | |
1418 | - LaunchpadZopelessTestSetup().txn.begin() |
1419 | + ZopelessAppServerLayer.txn.begin() |
1420 | |
1421 | self.make_branch_and_tree('stacked-on') |
1422 | trunk_unique_name = '~testuser/%s/trunk' % product.name |
1423 | @@ -431,7 +430,7 @@ |
1424 | self.factory.enableDefaultStackingForProduct( |
1425 | db_trunk.product, db_trunk) |
1426 | |
1427 | - LaunchpadZopelessTestSetup().txn.commit() |
1428 | + ZopelessAppServerLayer.txn.commit() |
1429 | |
1430 | stacked_unique_name = '~testuser/%s/stacked' % product.name |
1431 | self.push( |
1432 | @@ -447,7 +446,7 @@ |
1433 | # attribute of the database branch, and stacked on location of the new |
1434 | # branch is normalized to be a relative path. |
1435 | product = self.factory.makeProduct() |
1436 | - LaunchpadZopelessTestSetup().txn.commit() |
1437 | + ZopelessAppServerLayer.txn.commit() |
1438 | |
1439 | self.make_branch_and_tree('stacked-on') |
1440 | trunk_unique_name = '~testuser/%s/trunk' % product.name |
1441 | @@ -507,11 +506,11 @@ |
1442 | def test_push_to_new_short_branch_alias(self): |
1443 | # We can also push branches to URLs like /+branch/firefox |
1444 | # Hack 'firefox' so we have permission to do this. |
1445 | - LaunchpadZopelessTestSetup().txn.begin() |
1446 | + ZopelessAppServerLayer.txn.begin() |
1447 | firefox = Product.selectOneBy(name='firefox') |
1448 | testuser = Person.selectOneBy(name='testuser') |
1449 | firefox.development_focus.owner = testuser |
1450 | - LaunchpadZopelessTestSetup().txn.commit() |
1451 | + ZopelessAppServerLayer.txn.commit() |
1452 | remote_url = self.getTransportURL('+branch/firefox') |
1453 | self.push(self.local_branch_path, remote_url) |
1454 | self.assertBranchesMatch(self.local_branch_path, remote_url) |
1455 | @@ -520,10 +519,10 @@ |
1456 | # If a hosted branch exists in the database, but not on the |
1457 | # filesystem, and is writable by the user, then the user is able to |
1458 | # push to it. |
1459 | - LaunchpadZopelessTestSetup().txn.begin() |
1460 | + ZopelessAppServerLayer.txn.begin() |
1461 | branch = self.makeDatabaseBranch('testuser', 'firefox', 'some-branch') |
1462 | remote_url = self.getTransportURL(branch.unique_name) |
1463 | - LaunchpadZopelessTestSetup().txn.commit() |
1464 | + ZopelessAppServerLayer.txn.commit() |
1465 | self.push( |
1466 | self.local_branch_path, remote_url, |
1467 | extra_args=['--use-existing-dir']) |
1468 | @@ -531,21 +530,21 @@ |
1469 | |
1470 | def test_cant_push_to_existing_mirrored_branch(self): |
1471 | # Users cannot push to mirrored branches. |
1472 | - LaunchpadZopelessTestSetup().txn.begin() |
1473 | + ZopelessAppServerLayer.txn.begin() |
1474 | branch = self.makeDatabaseBranch( |
1475 | 'testuser', 'firefox', 'some-branch', BranchType.MIRRORED) |
1476 | remote_url = self.getTransportURL(branch.unique_name) |
1477 | - LaunchpadZopelessTestSetup().txn.commit() |
1478 | + ZopelessAppServerLayer.txn.commit() |
1479 | self.assertCantPush( |
1480 | self.local_branch_path, remote_url, |
1481 | ['Permission denied:', 'Transport operation not possible:']) |
1482 | |
1483 | def test_cant_push_to_existing_unowned_hosted_branch(self): |
1484 | # Users can only push to hosted branches that they own. |
1485 | - LaunchpadZopelessTestSetup().txn.begin() |
1486 | + ZopelessAppServerLayer.txn.begin() |
1487 | branch = self.makeDatabaseBranch('mark', 'firefox', 'some-branch') |
1488 | remote_url = self.getTransportURL(branch.unique_name) |
1489 | - LaunchpadZopelessTestSetup().txn.commit() |
1490 | + ZopelessAppServerLayer.txn.commit() |
1491 | self.assertCantPush( |
1492 | self.local_branch_path, remote_url, |
1493 | ['Permission denied:', 'Transport operation not possible:']) |
1494 | @@ -566,12 +565,12 @@ |
1495 | person_name, product_name, branch_name) |
1496 | |
1497 | # Mark as mirrored. |
1498 | - LaunchpadZopelessTestSetup().txn.begin() |
1499 | + ZopelessAppServerLayer.txn.begin() |
1500 | branch = self.getDatabaseBranch( |
1501 | person_name, product_name, branch_name) |
1502 | branch.branch_type = BranchType.MIRRORED |
1503 | branch.url = "http://example.com/smartservertest/branch" |
1504 | - LaunchpadZopelessTestSetup().txn.commit() |
1505 | + ZopelessAppServerLayer.txn.commit() |
1506 | return ro_branch_url |
1507 | |
1508 | def test_can_read_readonly_branch(self): |
1509 | |
1510 | === modified file 'lib/lp/hardwaredb/doc/hwdb.txt' |
1511 | --- lib/lp/hardwaredb/doc/hwdb.txt 2010-10-09 16:36:22 +0000 |
1512 | +++ lib/lp/hardwaredb/doc/hwdb.txt 2010-10-18 06:17:50 +0000 |
1513 | @@ -375,7 +375,7 @@ |
1514 | ... u'beeblebrox@example.com') |
1515 | >>> user.validateAndEnsurePreferredEmail(email) |
1516 | >>> transaction.commit() |
1517 | - >>> from canonical.launchpad.scripts.garbo import HWSubmissionEmailLinker |
1518 | + >>> from lp.scripts.garbo import HWSubmissionEmailLinker |
1519 | >>> from lp.testing.logger import MockLogger |
1520 | >>> HWSubmissionEmailLinker(log=MockLogger()).run() |
1521 | >>> submission = hw_submission_set.getBySubmissionKey(u'unique-id-2') |
1522 | |
1523 | === modified file 'lib/lp/poppy/tests/test_poppy.py' |
1524 | --- lib/lp/poppy/tests/test_poppy.py 2010-09-28 22:33:42 +0000 |
1525 | +++ lib/lp/poppy/tests/test_poppy.py 2010-10-18 06:17:50 +0000 |
1526 | @@ -49,7 +49,7 @@ |
1527 | self.root_dir, port=self.port, cmd='echo CLOSED') |
1528 | self.poppy.startPoppy() |
1529 | |
1530 | - def tearDown(self): |
1531 | + def cleanUp(self): |
1532 | self.poppy.killPoppy() |
1533 | |
1534 | def getTransport(self): |
1535 | @@ -129,7 +129,7 @@ |
1536 | self._tac = PoppyTac(self.root_dir) |
1537 | self._tac.setUp() |
1538 | |
1539 | - def tearDown(self): |
1540 | + def cleanUp(self): |
1541 | shutil.rmtree(self._home_dir) |
1542 | os.environ['HOME'] = self._current_home |
1543 | self._tac.tearDown() |
1544 | @@ -199,7 +199,7 @@ |
1545 | super(TestPoppy, self).setUp() |
1546 | self.root_dir = self.makeTemporaryDirectory() |
1547 | self.server = self.server_factory(self.root_dir, self.factory) |
1548 | - self.installFixture(self.server) |
1549 | + self.useFixture(self.server) |
1550 | |
1551 | def _uploadPath(self, path): |
1552 | """Return system path of specified path inside an upload. |
1553 | |
1554 | === modified file 'lib/lp/registry/interfaces/distroseries.py' |
1555 | --- lib/lp/registry/interfaces/distroseries.py 2010-10-15 10:54:34 +0000 |
1556 | +++ lib/lp/registry/interfaces/distroseries.py 2010-10-18 06:17:50 +0000 |
1557 | @@ -8,6 +8,7 @@ |
1558 | __metaclass__ = type |
1559 | |
1560 | __all__ = [ |
1561 | + 'DerivationError', |
1562 | 'IDistroSeries', |
1563 | 'IDistroSeriesEditRestricted', |
1564 | 'IDistroSeriesPublic', |
1565 | @@ -16,20 +17,25 @@ |
1566 | |
1567 | from lazr.enum import DBEnumeratedType |
1568 | from lazr.restful.declarations import ( |
1569 | + call_with, |
1570 | export_as_webservice_entry, |
1571 | export_factory_operation, |
1572 | export_read_operation, |
1573 | + export_write_operation, |
1574 | exported, |
1575 | LAZR_WEBSERVICE_EXPORTED, |
1576 | operation_parameters, |
1577 | operation_returns_collection_of, |
1578 | operation_returns_entry, |
1579 | rename_parameters_as, |
1580 | + REQUEST_USER, |
1581 | + webservice_error, |
1582 | ) |
1583 | from lazr.restful.fields import ( |
1584 | Reference, |
1585 | ReferenceChoice, |
1586 | ) |
1587 | +from lazr.restful.interface import copy_field |
1588 | from zope.component import getUtility |
1589 | from zope.interface import ( |
1590 | Attribute, |
1591 | @@ -39,6 +45,7 @@ |
1592 | Bool, |
1593 | Choice, |
1594 | Datetime, |
1595 | + List, |
1596 | Object, |
1597 | TextLine, |
1598 | ) |
1599 | @@ -673,8 +680,8 @@ |
1600 | If sourcename is passed, only packages that are built from |
1601 | source packages by that name will be returned. |
1602 | If archive is passed, restricted the results to the given archive, |
1603 | - if it is suppressed the results will be restricted to the distribtion |
1604 | - 'main_archive'. |
1605 | + if it is suppressed the results will be restricted to the |
1606 | + distribution 'main_archive'. |
1607 | """ |
1608 | |
1609 | def getSourcePackagePublishing(status, pocket, component=None, |
1610 | @@ -683,8 +690,8 @@ |
1611 | |
1612 | According status and pocket. |
1613 | If archive is passed, restricted the results to the given archive, |
1614 | - if it is suppressed the results will be restricted to the distribtion |
1615 | - 'main_archive'. |
1616 | + if it is suppressed the results will be restricted to the |
1617 | + distribution 'main_archive'. |
1618 | """ |
1619 | |
1620 | def getBinaryPackageCaches(archive=None): |
1621 | @@ -789,6 +796,69 @@ |
1622 | :param format: The SourcePackageFormat to check. |
1623 | """ |
1624 | |
1625 | + @operation_parameters( |
1626 | + name=copy_field(name, required=True), |
1627 | + displayname=copy_field(displayname, required=False), |
1628 | + title=copy_field(title, required=False), |
1629 | + summary=TextLine( |
1630 | + title=_("The summary of the distroseries to derive."), |
1631 | + required=False), |
1632 | + description=copy_field(description, required=False), |
1633 | + version=copy_field(version, required=False), |
1634 | + distribution=copy_field(distribution, required=False), |
1635 | + status=copy_field(status, required=False), |
1636 | + architectures=List( |
1637 | + title=_("The list of architectures to copy to the derived " |
1638 | + "distroseries."), |
1639 | + required=False), |
1640 | + packagesets=List( |
1641 | + title=_("The list of packagesets to copy to the derived " |
1642 | + "distroseries"), |
1643 | + required=False), |
1644 | + rebuild=Bool( |
1645 | + title=_("If binaries will be copied to the derived " |
1646 | + "distroseries."), |
1647 | + required=True), |
1648 | + ) |
1649 | + @call_with(user=REQUEST_USER) |
1650 | + @export_write_operation() |
1651 | + def deriveDistroSeries(user, name, displayname, title, summary, |
1652 | + description, version, distribution, status, |
1653 | + architectures, packagesets, rebuild): |
1654 | + """Derive a distroseries from this one. |
1655 | + |
1656 | + This method performs checks, can create the new distroseries if |
1657 | + necessary, and then creates a job to populate the new |
1658 | + distroseries. |
1659 | + |
1660 | + :param name: The name of the new distroseries we will create if it |
1661 | + doesn't exist, or the name of the distroseries we will look |
1662 | + up, and then initialise. |
1663 | + :param displayname: The Display Name for the new distroseries. |
1664 | + If the distroseries already exists this parameter is ignored. |
1665 | + :param title: The Title for the new distroseries. If the |
1666 | + distroseries already exists this parameter is ignored. |
1667 | + :param summary: The Summary for the new distroseries. If the |
1668 | + distroseries already exists this parameter is ignored. |
1669 | + :param description: The Description for the new distroseries. If the |
1670 | + distroseries already exists this parameter is ignored. |
1671 | + :param version: The version for the new distroseries. If the |
1672 | + distroseries already exists this parameter is ignored. |
1673 | + :param distribution: The distribution the derived series will |
1674 | + belong to. If it isn't specified this distroseries' |
1675 | + distribution is used. |
1676 | + :param status: The status the new distroseries will be created |
1677 | + in. If the distroseries already exists, this parameter is |
1678 | + ignored. Defaults to FROZEN. |
1679 | + :param architectures: The architectures to copy to the derived |
1680 | + series. If not specified, all of the architectures are copied. |
1681 | + :param packagesets: The packagesets to copy to the derived series. |
1682 | + If not specified, all of the packagesets are copied. |
1683 | + :param rebuild: Whether to rebuild sources in the derived series |
1684 | + rather than copy binaries. If it's true, binaries are not |
1685 | + copied; if it's false, they are. |
1686 | + """ |
1687 | + |
1688 | |
1689 | class IDistroSeries(IDistroSeriesEditRestricted, IDistroSeriesPublic, |
1690 | IStructuralSubscriptionTarget): |
1691 | @@ -856,5 +926,11 @@ |
1692 | """ |
1693 | |
1694 | |
1695 | +class DerivationError(Exception): |
1696 | + """Raised when there is a problem deriving a distroseries.""" |
1697 | + webservice_error(400) # Bad Request |
1698 | + _message_prefix = "Error deriving distro series" |
1699 | + |
1700 | + |
1701 | # Monkey patch for circular import avoidance done in |
1702 | # _schema_circular_imports.py |
1703 | |
1704 | === modified file 'lib/lp/registry/model/distroseries.py' |
1705 | --- lib/lp/registry/model/distroseries.py 2010-10-15 10:54:34 +0000 |
1706 | +++ lib/lp/registry/model/distroseries.py 2010-10-18 06:17:50 +0000 |
1707 | @@ -35,6 +35,7 @@ |
1708 | ) |
1709 | from zope.component import getUtility |
1710 | from zope.interface import implements |
1711 | +from zope.security.interfaces import Unauthorized |
1712 | |
1713 | from canonical.database.constants import ( |
1714 | DEFAULT, |
1715 | @@ -88,6 +89,7 @@ |
1716 | ) |
1717 | from lp.bugs.model.bugtask import BugTask |
1718 | from lp.registry.interfaces.distroseries import ( |
1719 | + DerivationError, |
1720 | IDistroSeries, |
1721 | IDistroSeriesSet, |
1722 | ) |
1723 | @@ -128,6 +130,9 @@ |
1724 | from lp.soyuz.interfaces.binarypackagebuild import IBinaryPackageBuildSet |
1725 | from lp.soyuz.interfaces.binarypackagename import IBinaryPackageName |
1726 | from lp.soyuz.interfaces.buildrecords import IHasBuildRecords |
1727 | +from lp.soyuz.interfaces.distributionjob import ( |
1728 | + IInitialiseDistroSeriesJobSource, |
1729 | + ) |
1730 | from lp.soyuz.interfaces.publishing import ( |
1731 | active_publishing_status, |
1732 | ICanPublishPackages, |
1733 | @@ -162,6 +167,10 @@ |
1734 | ) |
1735 | from lp.soyuz.model.section import Section |
1736 | from lp.soyuz.model.sourcepackagerelease import SourcePackageRelease |
1737 | +from lp.soyuz.scripts.initialise_distroseries import ( |
1738 | + InitialisationError, |
1739 | + InitialiseDistroSeries, |
1740 | + ) |
1741 | from lp.translations.interfaces.languagepack import LanguagePackType |
1742 | from lp.translations.model.distroseries_translations_copy import ( |
1743 | copy_active_translations, |
1744 | @@ -1847,6 +1856,59 @@ |
1745 | ISourcePackageFormatSelectionSet).getBySeriesAndFormat( |
1746 | self, format) is not None |
1747 | |
1748 | + def deriveDistroSeries(self, user, name, distribution=None, |
1749 | + displayname=None, title=None, summary=None, |
1750 | + description=None, version=None, |
1751 | + status=SeriesStatus.FROZEN, architectures=(), |
1752 | + packagesets=(), rebuild=False): |
1753 | + """See `IDistroSeries`.""" |
1754 | + # XXX StevenK bug=643369 This should be in the security adapter |
1755 | + # This should be allowed if the user is a driver for self.parent |
1756 | + # or the child.parent's drivers. |
1757 | + if not (user.inTeam('soyuz-team') or user.inTeam('admins')): |
1758 | + raise Unauthorized |
1759 | + child = IStore(self).find(DistroSeries, name=name).one() |
1760 | + if child is None: |
1761 | + if distribution is None: |
1762 | + distribution = self.distribution |
1763 | + if not displayname: |
1764 | + raise DerivationError( |
1765 | + "Display Name needs to be set when creating a " |
1766 | + "distroseries.") |
1767 | + if not title: |
1768 | + raise DerivationError( |
1769 | + "Title needs to be set when creating a distroseries.") |
1770 | + if not summary: |
1771 | + raise DerivationError( |
1772 | + "Summary needs to be set when creating a " |
1773 | + "distroseries.") |
1774 | + if not description: |
1775 | + raise DerivationError( |
1776 | + "Description needs to be set when creating a " |
1777 | + "distroseries.") |
1778 | + if not version: |
1779 | + raise DerivationError( |
1780 | + "Version needs to be set when creating a " |
1781 | + "distroseries.") |
1782 | + child = distribution.newSeries( |
1783 | + name=name, displayname=displayname, title=title, |
1784 | + summary=summary, description=description, |
1785 | + version=version, parent_series=self, owner=user) |
1786 | + child.status = status |
1787 | + IStore(self).add(child) |
1788 | + else: |
1789 | + if child.parent_series is not self: |
1790 | + raise DerivationError( |
1791 | + "DistroSeries %s parent series isn't %s" % ( |
1792 | + child.name, self.name)) |
1793 | + initialise_series = InitialiseDistroSeries(child) |
1794 | + try: |
1795 | + initialise_series.check() |
1796 | + except InitialisationError, e: |
1797 | + raise DerivationError(e) |
1798 | + getUtility(IInitialiseDistroSeriesJobSource).create( |
1799 | + child, architectures, packagesets, rebuild) |
1800 | + |
1801 | |
1802 | class DistroSeriesSet: |
1803 | implements(IDistroSeriesSet) |
1804 | |
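The validation flow that `deriveDistroSeries` introduces in the model above can be sketched as follows. This is an illustrative stand-in, not the branch's code: `DerivationError`, `REQUIRED_FIELDS`, and `check_derivation` are simplified assumptions, and the error messages only approximate the diff's wording.

```python
# Illustrative sketch of deriveDistroSeries validation: required fields
# when creating a new child series, and the parent-series check when the
# child already exists. All names here are stand-ins for illustration.
class DerivationError(Exception):
    """Stand-in for the DerivationError added to distroseries.py."""


REQUIRED_FIELDS = (
    'displayname', 'title', 'summary', 'description', 'version')


def check_derivation(child, parent_name, **new_series_fields):
    """Raise DerivationError roughly as the model method does.

    `child` is None (series must be created) or a dict describing an
    existing series; `parent_name` names the series being derived from.
    """
    if child is None:
        # Creating a new series: every descriptive field must be given.
        for field in REQUIRED_FIELDS:
            if not new_series_fields.get(field):
                raise DerivationError(
                    "%s needs to be set when creating a distroseries."
                    % field.capitalize())
    elif child.get('parent_series') != parent_name:
        # Initialising an existing series: it must be our child.
        raise DerivationError(
            "DistroSeries %s parent series isn't %s"
            % (child['name'], parent_name))
```

When both checks pass, the real method hands off to `InitialiseDistroSeries.check()` and then queues an `IInitialiseDistroSeriesJobSource` job, as the diff shows.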
1805 | === added file 'lib/lp/registry/stories/webservice/xx-derivedistroseries.txt' |
1806 | --- lib/lp/registry/stories/webservice/xx-derivedistroseries.txt 1970-01-01 00:00:00 +0000 |
1807 | +++ lib/lp/registry/stories/webservice/xx-derivedistroseries.txt 2010-10-18 06:17:50 +0000 |
1808 | @@ -0,0 +1,68 @@ |
1809 | +Deriving Distribution Series |
1810 | +---------------------------- |
1811 | + |
1812 | +DistroSeries.deriveDistroSeries() is called on the parent distroseries. |
1813 | +The child distroseries may already exist, or the method can create it on |
1814 | +the user's behalf. |
1815 | + |
1816 | +Set Up |
1817 | +------ |
1818 | + |
1819 | + >>> login('admin@canonical.com') |
1820 | + >>> soyuz = factory.makeTeam(name='soyuz-team') |
1821 | + >>> parent = factory.makeDistroSeries() |
1822 | + >>> child = factory.makeDistroSeries(parent_series=parent) |
1823 | + >>> other = factory.makeDistroSeries() |
1824 | + >>> logout() |
1825 | + >>> from canonical.launchpad.testing.pages import webservice_for_person |
1826 | + >>> from canonical.launchpad.webapp.interfaces import OAuthPermission |
1827 | + >>> soyuz_webservice = webservice_for_person( |
1828 | + ... soyuz.teamowner, permission=OAuthPermission.WRITE_PUBLIC) |
1829 | + |
1830 | +Calling |
1831 | +------- |
1832 | + |
1833 | +We can't call .deriveDistroSeries() with a distroseries that isn't the |
1834 | +child's parent. |
1835 | + |
1836 | + >>> series_url = '/%s/%s' % (parent.parent.name, parent.name) |
1837 | + >>> other_series_url = '/%s/%s' % ( |
1838 | + ... other.parent.name, other.name) |
1839 | + >>> child_name = child.name |
1840 | + >>> series = webservice.get(series_url).jsonBody() |
1841 | + >>> other_series = webservice.get(other_series_url).jsonBody() |
1842 | + >>> derived = soyuz_webservice.named_post( |
1843 | + ... other_series['self_link'], 'deriveDistroSeries', {}, |
1844 | + ... name=child_name, rebuild=False) |
1845 | + >>> print derived |
1846 | + HTTP/1.1 400 Bad Request |
1847 | + Status: 400 Bad Request |
1848 | + ... |
1849 | + <BLANKLINE> |
1850 | + DistroSeries ... parent series isn't ... |
1851 | + <BLANKLINE> |
1852 | + ... |
1853 | + |
1854 | +If we call it on the child's actual parent series, it works. |
1855 | + |
1856 | + >>> derived = soyuz_webservice.named_post( |
1857 | + ... series['self_link'], 'deriveDistroSeries', {}, |
1858 | + ... name=child_name, rebuild=False) |
1859 | + >>> print derived |
1860 | + HTTP/1.1 200 Ok |
1861 | + Status: 200 Ok |
1862 | + ... |
1863 | + <BLANKLINE> |
1864 | + ... |
1865 | + |
1866 | +And we can verify the job exists. |
1867 | + |
1868 | + >>> from zope.component import getUtility |
1869 | + >>> from lp.soyuz.interfaces.distributionjob import ( |
1870 | + ... IInitialiseDistroSeriesJobSource) |
1871 | + >>> login('admin@canonical.com') |
1872 | + >>> [job] = list( |
1873 | + ... getUtility(IInitialiseDistroSeriesJobSource).iterReady()) |
1874 | + >>> job.distroseries == child |
1875 | + True |
1876 | + |
1877 | |
1878 | === added file 'lib/lp/registry/tests/test_derivedistroseries.py' |
1879 | --- lib/lp/registry/tests/test_derivedistroseries.py 1970-01-01 00:00:00 +0000 |
1880 | +++ lib/lp/registry/tests/test_derivedistroseries.py 2010-10-18 06:17:50 +0000 |
1881 | @@ -0,0 +1,76 @@ |
1882 | +# Copyright 2010 Canonical Ltd. This software is licensed under the |
1883 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
1884 | + |
1885 | +"""Test initialising a distroseries using |
1886 | +IDistroSeries.deriveDistroSeries.""" |
1887 | + |
1888 | +__metaclass__ = type |
1889 | + |
1890 | +from canonical.testing.layers import LaunchpadFunctionalLayer |
1891 | +from lp.registry.interfaces.distroseries import DerivationError |
1892 | +from lp.soyuz.interfaces.distributionjob import ( |
1893 | + IInitialiseDistroSeriesJobSource, |
1894 | + ) |
1895 | +from lp.testing import ( |
1896 | + login, |
1897 | + logout, |
1898 | + TestCaseWithFactory, |
1899 | + ) |
1900 | +from lp.testing.sampledata import ADMIN_EMAIL |
1901 | +from zope.component import getUtility |
1902 | +from zope.security.interfaces import Unauthorized |
1903 | + |
1904 | + |
1905 | +class TestDeriveDistroSeries(TestCaseWithFactory): |
1906 | + |
1907 | + layer = LaunchpadFunctionalLayer |
1908 | + |
1909 | + def setUp(self): |
1910 | + super(TestDeriveDistroSeries, self).setUp() |
1911 | + self.soyuz = self.factory.makeTeam(name='soyuz-team') |
1912 | + self.parent = self.factory.makeDistroSeries() |
1913 | + self.child = self.factory.makeDistroSeries( |
1914 | + parent_series=self.parent) |
1915 | + |
1916 | + def test_no_permission_to_call(self): |
1917 | + login(ADMIN_EMAIL) |
1918 | + person = self.factory.makePerson() |
1919 | + logout() |
1920 | + self.assertRaises( |
1921 | + Unauthorized, self.parent.deriveDistroSeries, person, |
1922 | + self.child.name) |
1923 | + |
1924 | + def test_no_distroseries_and_no_arguments(self): |
1925 | + """Test that calling deriveDistroSeries() errors when the |
1926 | + distroseries doesn't exist and not enough arguments are |
1927 | + specified.""" |
1928 | + self.assertRaisesWithContent( |
1929 | + DerivationError, |
1930 | + 'Display Name needs to be set when creating a distroseries.', |
1931 | + self.parent.deriveDistroSeries, self.soyuz.teamowner, |
1932 | + 'newdistro') |
1933 | + |
1934 | + def test_parent_is_not_self(self): |
1935 | + other = self.factory.makeDistroSeries() |
1936 | + self.assertRaisesWithContent( |
1937 | + DerivationError, |
1938 | + "DistroSeries %s parent series isn't %s" % ( |
1939 | + self.child.name, other.name), |
1940 | + other.deriveDistroSeries, self.soyuz.teamowner, |
1941 | + self.child.name) |
1942 | + |
1943 | + def test_create_new_distroseries(self): |
1944 | + self.parent.deriveDistroSeries( |
1945 | + self.soyuz.teamowner, self.child.name) |
1946 | + [job] = list( |
1947 | + getUtility(IInitialiseDistroSeriesJobSource).iterReady()) |
1948 | + self.assertEqual(job.distroseries, self.child) |
1949 | + |
1950 | + def test_create_fully_new_distroseries(self): |
1951 | + self.parent.deriveDistroSeries( |
1952 | + self.soyuz.teamowner, 'deribuntu', displayname='Deribuntu', |
1953 | + title='The Deribuntu', summary='Deribuntu', |
1954 | + description='Deribuntu is great', version='11.11') |
1955 | + [job] = list( |
1956 | + getUtility(IInitialiseDistroSeriesJobSource).iterReady()) |
1957 | + self.assertEqual(job.distroseries.name, 'deribuntu') |
1958 | |
1959 | === renamed file 'lib/canonical/launchpad/scripts/garbo.py' => 'lib/lp/scripts/garbo.py' |
1960 | === added directory 'lib/lp/scripts/tests' |
1961 | === added file 'lib/lp/scripts/tests/__init__.py' |
1962 | === renamed file 'lib/canonical/launchpad/scripts/tests/test_garbo.py' => 'lib/lp/scripts/tests/test_garbo.py' |
1963 | --- lib/canonical/launchpad/scripts/tests/test_garbo.py 2010-10-03 15:30:06 +0000 |
1964 | +++ lib/lp/scripts/tests/test_garbo.py 2010-10-18 06:17:50 +0000 |
1965 | @@ -34,7 +34,7 @@ |
1966 | from canonical.launchpad.database.openidconsumer import OpenIDConsumerNonce |
1967 | from canonical.launchpad.interfaces import IMasterStore |
1968 | from canonical.launchpad.interfaces.emailaddress import EmailAddressStatus |
1969 | -from canonical.launchpad.scripts.garbo import ( |
1970 | +from lp.scripts.garbo import ( |
1971 | DailyDatabaseGarbageCollector, |
1972 | HourlyDatabaseGarbageCollector, |
1973 | OpenIDConsumerAssociationPruner, |
1974 | |
1975 | === modified file 'lib/lp/scripts/utilities/importfascist.py' |
1976 | --- lib/lp/scripts/utilities/importfascist.py 2010-09-03 04:14:41 +0000 |
1977 | +++ lib/lp/scripts/utilities/importfascist.py 2010-10-18 06:17:50 +0000 |
1978 | @@ -35,7 +35,7 @@ |
1979 | canonical.launchpad.feed.branch |
1980 | lp.code.feed.branch |
1981 | canonical.launchpad.interfaces.person |
1982 | - canonical.launchpad.scripts.garbo |
1983 | + lp.scripts.garbo |
1984 | canonical.launchpad.vocabularies.dbobjects |
1985 | lp.registry.vocabularies |
1986 | canonical.librarian.client |
1987 | |
1988 | === modified file 'lib/lp/soyuz/configure.zcml' |
1989 | --- lib/lp/soyuz/configure.zcml 2010-10-06 18:53:53 +0000 |
1990 | +++ lib/lp/soyuz/configure.zcml 2010-10-18 06:17:50 +0000 |
1991 | @@ -905,9 +905,12 @@ |
1992 | provides="lp.soyuz.interfaces.distributionjob.IInitialiseDistroSeriesJobSource"> |
1993 | <allow interface="lp.soyuz.interfaces.distributionjob.IInitialiseDistroSeriesJobSource"/> |
1994 | </securedutility> |
1995 | - |
1996 | + <class class="lp.soyuz.model.distributionjob.DistributionJob"> |
1997 | + <allow interface="lp.soyuz.interfaces.distributionjob.IDistributionJob" /> |
1998 | + </class> |
1999 | <class class="lp.soyuz.model.initialisedistroseriesjob.InitialiseDistroSeriesJob"> |
2000 | - <allow interface="lp.services.job.interfaces.job.IRunnableJob" /> |
2001 | + <allow interface="lp.soyuz.interfaces.distributionjob.IInitialiseDistroSeriesJob" /> |
2002 | + <allow interface="lp.soyuz.interfaces.distributionjob.IDistributionJob" /> |
2003 | </class> |
2004 | |
2005 | </configure> |
2006 | |
2007 | === modified file 'lib/lp/soyuz/doc/sampledata-setup.txt' |
2008 | --- lib/lp/soyuz/doc/sampledata-setup.txt 2010-08-13 02:59:14 +0000 |
2009 | +++ lib/lp/soyuz/doc/sampledata-setup.txt 2010-10-18 06:17:50 +0000 |
2010 | @@ -21,5 +21,5 @@ |
2011 | INFO ... |
2012 | INFO Done. |
2013 | |
2014 | - >>> from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
2015 | - >>> LaunchpadTestSetup().force_dirty_database() |
2016 | + >>> from canonical.testing.layers import DatabaseLayer |
2017 | + >>> DatabaseLayer.force_dirty_database() |
2018 | |
2019 | === modified file 'lib/lp/soyuz/scripts/initialise_distroseries.py' |
2020 | --- lib/lp/soyuz/scripts/initialise_distroseries.py 2010-10-14 12:56:31 +0000 |
2021 | +++ lib/lp/soyuz/scripts/initialise_distroseries.py 2010-10-18 06:17:50 +0000 |
2022 | @@ -16,7 +16,6 @@ |
2023 | from canonical.launchpad.interfaces.lpstorm import IMasterStore |
2024 | from lp.buildmaster.enums import BuildStatus |
2025 | from lp.registry.interfaces.pocket import PackagePublishingPocket |
2026 | -from lp.registry.model.distroseries import DistroSeries |
2027 | from lp.soyuz.adapters.packagelocation import PackageLocation |
2028 | from lp.soyuz.enums import ( |
2029 | ArchivePurpose, |
2030 | @@ -61,7 +60,8 @@ |
2031 | |
2032 | def __init__( |
2033 | self, distroseries, arches=(), packagesets=(), rebuild=False): |
2034 | - |
2035 | + # Avoid circular imports |
2036 | + from lp.registry.model.distroseries import DistroSeries |
2037 | self.distroseries = distroseries |
2038 | self.parent = self.distroseries.parent_series |
2039 | self.arches = arches |
2040 | |
2041 | === modified file 'lib/lp/soyuz/scripts/tests/test_buildd_cronscripts.py' |
2042 | --- lib/lp/soyuz/scripts/tests/test_buildd_cronscripts.py 2010-10-04 19:50:45 +0000 |
2043 | +++ lib/lp/soyuz/scripts/tests/test_buildd_cronscripts.py 2010-10-18 06:17:50 +0000 |
2044 | @@ -74,9 +74,8 @@ |
2045 | rc, out, err = runner() |
2046 | self.assertEqual(0, rc, "Err:\n%s" % err) |
2047 | |
2048 | - # 'runners' commit to the launchpad_ftest database in |
2049 | - # subprocesses, so we need to tell the layer to fully |
2050 | - # tear down and restore the database. |
2051 | + # 'runners' commit to the test database in subprocesses, so we need to |
2052 | + # tell the layer to fully tear down and restore the database. |
2053 | DatabaseLayer.force_dirty_database() |
2054 | |
2055 | return rc, out, err |
2056 | |
2057 | === modified file 'lib/lp/testing/__init__.py' |
2058 | --- lib/lp/testing/__init__.py 2010-10-05 01:54:15 +0000 |
2059 | +++ lib/lp/testing/__init__.py 2010-10-18 06:17:50 +0000 |
2060 | @@ -325,20 +325,6 @@ |
2061 | transaction.commit() |
2062 | self.layer.switchDbUser(dbuser) |
2063 | |
2064 | - def installFixture(self, fixture): |
2065 | - """Install 'fixture', an object that has a `setUp` and `tearDown`. |
2066 | - |
2067 | - `installFixture` will run 'fixture.setUp' and schedule |
2068 | - 'fixture.tearDown' to be run during the test's tear down (using |
2069 | - `addCleanup`). |
2070 | - |
2071 | - :param fixture: Any object that has a `setUp` and `tearDown` method. |
2072 | - :return: `fixture`. |
2073 | - """ |
2074 | - fixture.setUp() |
2075 | - self.addCleanup(fixture.tearDown) |
2076 | - return fixture |
2077 | - |
2078 | def __str__(self): |
2079 | """The string representation of a test is its id. |
2080 | |
2081 | @@ -511,7 +497,7 @@ |
2082 | self.factory = ObjectFactory() |
2083 | # Record the oopses generated during the test run. |
2084 | self.oopses = [] |
2085 | - self.installFixture(ZopeEventHandlerFixture(self._recordOops)) |
2086 | + self.useFixture(ZopeEventHandlerFixture(self._recordOops)) |
2087 | self.addCleanup(self.attachOopses) |
2088 | |
2089 | @adapter(ErrorReportEvent) |
2090 | |
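The removed `installFixture` helper above is replaced by `useFixture`, which testtools provides. Its contract can be sketched with a minimal stand-in (this is not testtools code; `CleanupRecorder` and its method names other than `useFixture`/`addCleanup` are assumptions for illustration):

```python
# Stand-in sketch of the useFixture contract that replaces the removed
# installFixture helper: set the fixture up immediately, schedule its
# cleanUp for test tear-down, and return the fixture for convenience.
class CleanupRecorder(object):
    """Minimal test-case stand-in that records scheduled cleanups."""

    def __init__(self):
        self._cleanups = []

    def addCleanup(self, function, *args, **kwargs):
        self._cleanups.append((function, args, kwargs))

    def useFixture(self, fixture):
        fixture.setUp()
        self.addCleanup(fixture.cleanUp)
        return fixture

    def runCleanups(self):
        # Cleanups run in reverse order of registration, as in
        # unittest and testtools.
        while self._cleanups:
            function, args, kwargs = self._cleanups.pop()
            function(*args, **kwargs)
```

Note the rename from `tearDown` to `cleanUp`: that is why the poppy test fixtures in this diff rename their `tearDown` methods to `cleanUp`.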
2091 | === modified file 'lib/lp/testing/fixture.py' |
2092 | --- lib/lp/testing/fixture.py 2010-08-20 20:31:18 +0000 |
2093 | +++ lib/lp/testing/fixture.py 2010-10-18 06:17:50 +0000 |
2094 | @@ -1,121 +1,25 @@ |
2095 | # Copyright 2009, 2010 Canonical Ltd. This software is licensed under the |
2096 | # GNU Affero General Public License version 3 (see the file LICENSE). |
2097 | |
2098 | -# pylint: disable-msg=E0211 |
2099 | - |
2100 | -"""Basic support for 'fixtures'. |
2101 | - |
2102 | -In this case, 'fixture' means an object that has a setUp and a tearDown |
2103 | -method. |
2104 | -""" |
2105 | +"""Launchpad test fixtures that have no better home.""" |
2106 | |
2107 | __metaclass__ = type |
2108 | __all__ = [ |
2109 | - 'Fixtures', |
2110 | - 'FixtureWithCleanup', |
2111 | - 'IFixture', |
2112 | - 'run_with_fixture', |
2113 | - 'ServerFixture', |
2114 | - 'with_fixture', |
2115 | + 'ZopeEventHandlerFixture', |
2116 | ] |
2117 | |
2118 | -from twisted.python.util import mergeFunctionMetadata |
2119 | +from fixtures import Fixture |
2120 | from zope.component import ( |
2121 | getGlobalSiteManager, |
2122 | provideHandler, |
2123 | ) |
2124 | -from zope.interface import ( |
2125 | - implements, |
2126 | - Interface, |
2127 | - ) |
2128 | - |
2129 | - |
2130 | -class IFixture(Interface): |
2131 | - """A fixture has a setUp and a tearDown method.""" |
2132 | - |
2133 | - def setUp(): |
2134 | - """Set up the fixture.""" |
2135 | - |
2136 | - def tearDown(): |
2137 | - """Tear down the fixture.""" |
2138 | - |
2139 | - |
2140 | -class FixtureWithCleanup: |
2141 | - """Fixture that allows arbitrary cleanup methods to be added. |
2142 | - |
2143 | - Subclass this if you'd like to define a fixture that calls 'addCleanup'. |
2144 | - This is most often useful for fixtures that provide a way for users to |
2145 | - acquire resources arbitrarily. |
2146 | - |
2147 | - Cleanups are run during 'tearDown' in reverse order to the order they were |
2148 | - added. If any of the cleanups raise an error, this error will be bubbled |
2149 | - up, causing tearDown to raise an exception, and the rest of the cleanups |
2150 | - will be run in a finally block. |
2151 | - """ |
2152 | - |
2153 | - implements(IFixture) |
2154 | - |
2155 | - def setUp(self): |
2156 | - """See `IFixture`.""" |
2157 | - self._cleanups = [] |
2158 | - |
2159 | - def _runCleanups(self): |
2160 | - if [] == self._cleanups: |
2161 | - return |
2162 | - f, args, kwargs = self._cleanups.pop() |
2163 | - try: |
2164 | - f(*args, **kwargs) |
2165 | - finally: |
2166 | - self._runCleanups() |
2167 | - |
2168 | - def tearDown(self): |
2169 | - """See `IFixture`.""" |
2170 | - self._runCleanups() |
2171 | - |
2172 | - def addCleanup(self, function, *args, **kwargs): |
2173 | - """Run 'function' with arguments during tear down.""" |
2174 | - self._cleanups.append((function, args, kwargs)) |
2175 | - |
2176 | - |
2177 | -class Fixtures(FixtureWithCleanup): |
2178 | - """A collection of `IFixture`s.""" |
2179 | - |
2180 | - def __init__(self, fixtures): |
2181 | - """Construct a fixture that groups many fixtures together. |
2182 | - |
2183 | - :param fixtures: A list of `IFixture` objects. |
2184 | - """ |
2185 | - self._fixtures = fixtures |
2186 | - |
2187 | - def setUp(self): |
2188 | - super(Fixtures, self).setUp() |
2189 | - for fixture in self._fixtures: |
2190 | - fixture.setUp() |
2191 | - self.addCleanup(fixture.tearDown) |
2192 | - |
2193 | - |
2194 | -def with_fixture(fixture): |
2195 | - """Decorate a function to run with a given fixture.""" |
2196 | - def decorator(f): |
2197 | - def decorated(*args, **kwargs): |
2198 | - return run_with_fixture(fixture, f, fixture, *args, **kwargs) |
2199 | - return mergeFunctionMetadata(f, decorated) |
2200 | - return decorator |
2201 | - |
2202 | - |
2203 | -def run_with_fixture(fixture, f, *args, **kwargs): |
2204 | - """Run `f` within the given `fixture`.""" |
2205 | - try: |
2206 | - fixture.setUp() |
2207 | - return f(*args, **kwargs) |
2208 | - finally: |
2209 | - fixture.tearDown() |
2210 | - |
2211 | - |
2212 | -class ZopeEventHandlerFixture(FixtureWithCleanup): |
2213 | + |
2214 | + |
2215 | +class ZopeEventHandlerFixture(Fixture): |
2216 | """A fixture that provides and then unprovides a Zope event handler.""" |
2217 | |
2218 | def __init__(self, handler): |
2219 | + super(ZopeEventHandlerFixture, self).__init__() |
2220 | self._handler = handler |
2221 | |
2222 | def setUp(self): |
2223 | @@ -123,18 +27,3 @@ |
2224 | gsm = getGlobalSiteManager() |
2225 | provideHandler(self._handler) |
2226 | self.addCleanup(gsm.unregisterHandler, self._handler) |
2227 | - |
2228 | - |
2229 | -class ServerFixture: |
2230 | - """Adapt a bzrlib `Server` into an `IFixture`.""" |
2231 | - |
2232 | - implements(IFixture) |
2233 | - |
2234 | - def __init__(self, server): |
2235 | - self.server = server |
2236 | - |
2237 | - def setUp(self): |
2238 | - self.server.start_server() |
2239 | - |
2240 | - def tearDown(self): |
2241 | - self.server.stop_server() |
2242 | |
2243 | === removed file 'lib/lp/testing/tests/test_fixture.py' |
2244 | --- lib/lp/testing/tests/test_fixture.py 2010-08-20 20:31:18 +0000 |
2245 | +++ lib/lp/testing/tests/test_fixture.py 1970-01-01 00:00:00 +0000 |
2246 | @@ -1,138 +0,0 @@ |
2247 | -# Copyright 2009 Canonical Ltd. This software is licensed under the |
2248 | -# GNU Affero General Public License version 3 (see the file LICENSE). |
2249 | - |
2250 | -"""Tests for fixture support.""" |
2251 | - |
2252 | -__metaclass__ = type |
2253 | - |
2254 | -import unittest |
2255 | - |
2256 | -from zope.interface import implements |
2257 | - |
2258 | -from lp.testing import TestCase |
2259 | -from lp.testing.fixture import ( |
2260 | - Fixtures, |
2261 | - FixtureWithCleanup, |
2262 | - IFixture, |
2263 | - run_with_fixture, |
2264 | - with_fixture, |
2265 | - ) |
2266 | - |
2267 | - |
2268 | -class LoggingFixture: |
2269 | - |
2270 | - implements(IFixture) |
2271 | - |
2272 | - def __init__(self, log): |
2273 | - self.log = log |
2274 | - |
2275 | - def setUp(self): |
2276 | - self.log.append('setUp') |
2277 | - |
2278 | - def tearDown(self): |
2279 | - self.log.append('tearDown') |
2280 | - |
2281 | - |
2282 | -class TestFixture(TestCase): |
2283 | - |
2284 | - def test_run_with_fixture(self): |
2285 | - # run_with_fixture runs the setUp method of the fixture, the passed |
2286 | - # function and then the tearDown method of the fixture. |
2287 | - log = [] |
2288 | - fixture = LoggingFixture(log) |
2289 | - run_with_fixture(fixture, log.append, 'hello') |
2290 | - self.assertEqual(['setUp', 'hello', 'tearDown'], log) |
2291 | - |
2292 | - def test_run_tearDown_even_with_exception(self): |
2293 | - # run_with_fixture runs the setUp method of the fixture, the passed |
2294 | - # function and then the tearDown method of the fixture even if the |
2295 | - # function raises an exception. |
2296 | - log = [] |
2297 | - fixture = LoggingFixture(log) |
2298 | - self.assertRaises( |
2299 | - ZeroDivisionError, run_with_fixture, fixture, lambda: 1/0) |
2300 | - self.assertEqual(['setUp', 'tearDown'], log) |
2301 | - |
2302 | - def test_with_fixture(self): |
2303 | - # with_fixture decorates a function so that it gets passed the fixture |
2304 | - # and the fixture is set up and torn down around the function. |
2305 | - log = [] |
2306 | - fixture = LoggingFixture(log) |
2307 | - @with_fixture(fixture) |
2308 | - def function(fixture, **kwargs): |
2309 | - log.append(fixture) |
2310 | - log.append(kwargs) |
2311 | - return 'oi' |
2312 | - result = function(foo='bar') |
2313 | - self.assertEqual('oi', result) |
2314 | - self.assertEqual(['setUp', fixture, {'foo': 'bar'}, 'tearDown'], log) |
2315 | - |
2316 | - |
2317 | -class TestFixtureWithCleanup(TestCase): |
2318 | - """Tests for `FixtureWithCleanup`.""" |
2319 | - |
2320 | - def test_cleanup_called_during_teardown(self): |
2321 | - log = [] |
2322 | - fixture = FixtureWithCleanup() |
2323 | - fixture.setUp() |
2324 | - fixture.addCleanup(log.append, 'foo') |
2325 | - self.assertEqual([], log) |
2326 | - fixture.tearDown() |
2327 | - self.assertEqual(['foo'], log) |
2328 | - |
2329 | - def test_cleanup_called_in_reverse_order(self): |
2330 | - log = [] |
2331 | - fixture = FixtureWithCleanup() |
2332 | - fixture.setUp() |
2333 | - fixture.addCleanup(log.append, 'foo') |
2334 | - fixture.addCleanup(log.append, 'bar') |
2335 | - fixture.tearDown() |
2336 | - self.assertEqual(['bar', 'foo'], log) |
2337 | - |
2338 | - def test_cleanup_run_even_in_failure(self): |
2339 | - log = [] |
2340 | - fixture = FixtureWithCleanup() |
2341 | - fixture.setUp() |
2342 | - fixture.addCleanup(log.append, 'foo') |
2343 | - fixture.addCleanup(lambda: 1/0) |
2344 | - self.assertRaises(ZeroDivisionError, fixture.tearDown) |
2345 | - self.assertEqual(['foo'], log) |
2346 | - |
2347 | - |
2348 | -class TestFixtures(TestCase): |
2349 | - """Tests the `Fixtures` class, which groups multiple `IFixture`s.""" |
2350 | - |
2351 | - class LoggingFixture: |
2352 | - |
2353 | - def __init__(self, log): |
2354 | - self._log = log |
2355 | - |
2356 | - def setUp(self): |
2357 | - self._log.append((self, 'setUp')) |
2358 | - |
2359 | - def tearDown(self): |
2360 | - self._log.append((self, 'tearDown')) |
2361 | - |
2362 | - def test_with_single_fixture(self): |
2363 | - log = [] |
2364 | - a = self.LoggingFixture(log) |
2365 | - fixtures = Fixtures([a]) |
2366 | - fixtures.setUp() |
2367 | - fixtures.tearDown() |
2368 | - self.assertEqual([(a, 'setUp'), (a, 'tearDown')], log) |
2369 | - |
2370 | - def test_with_multiple_fixtures(self): |
2371 | - log = [] |
2372 | - a = self.LoggingFixture(log) |
2373 | - b = self.LoggingFixture(log) |
2374 | - fixtures = Fixtures([a, b]) |
2375 | - fixtures.setUp() |
2376 | - fixtures.tearDown() |
2377 | - self.assertEqual( |
2378 | - [(a, 'setUp'), (b, 'setUp'), (b, 'tearDown'), (a, 'tearDown')], |
2379 | - log) |
2380 | - |
2381 | - |
2382 | -def test_suite(): |
2383 | - return unittest.TestLoader().loadTestsFromName(__name__) |
2384 | - |
2385 | |
2386 | === modified file 'lib/lp/translations/doc/fix_translation_credits.txt' |
2387 | --- lib/lp/translations/doc/fix_translation_credits.txt 2010-04-01 04:05:10 +0000 |
2388 | +++ lib/lp/translations/doc/fix_translation_credits.txt 2010-10-18 06:17:50 +0000 |
2389 | @@ -19,5 +19,5 @@ |
2390 | After altering the database from a separate process, we must tell the |
2391 | test setup that the database is dirty in spite of appearances. |
2392 | |
2393 | - >>> from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
2394 | - >>> LaunchpadTestSetup().force_dirty_database() |
2395 | + >>> from canonical.testing.layers import DatabaseLayer |
2396 | + >>> DatabaseLayer.force_dirty_database() |
2397 | |
2398 | === modified file 'lib/lp/translations/doc/message-sharing-merge-script.txt' |
2399 | --- lib/lp/translations/doc/message-sharing-merge-script.txt 2009-08-04 13:37:57 +0000 |
2400 | +++ lib/lp/translations/doc/message-sharing-merge-script.txt 2010-10-18 06:17:50 +0000 |
2401 | @@ -20,5 +20,5 @@ |
2402 | # The script modified the database, even though the database layer may |
2403 | # not have noticed it. |
2404 | |
2405 | - >>> from canonical.launchpad.ftests.harness import LaunchpadTestSetup |
2406 | - >>> LaunchpadTestSetup().force_dirty_database() |
2407 | + >>> from canonical.testing.layers import DatabaseLayer |
2408 | + >>> DatabaseLayer.force_dirty_database() |
2409 | |
2410 | === modified file 'lib/lp/translations/doc/request_country.txt' |
2411 | --- lib/lp/translations/doc/request_country.txt 2010-02-26 21:58:15 +0000 |
2412 | +++ lib/lp/translations/doc/request_country.txt 2010-10-18 06:17:50 +0000 |
2413 | @@ -4,10 +4,6 @@ |
2414 | |
2415 | Adapting a request to a country allows you to see where the request came from. |
2416 | |
2417 | - >>> from canonical.launchpad.ftests.harness import ( |
2418 | - ... LaunchpadFunctionalTestSetup) |
2419 | - >>> LaunchpadFunctionalTestSetup().setUp() |
2420 | - |
2421 | Here's a dummy request. Zope adds the REMOTE_ADDR CGI environment variable |
2422 | for us. Upstream proxy servers (and tinkering users!) may also add |
2423 | X-Forwarded-For: headers. The X-Forwarded-For: header takes precidence |
2424 | @@ -34,6 +30,3 @@ |
2425 | Traceback (most recent call last): |
2426 | ... |
2427 | TypeError: ('Could not adapt', ... |
2428 | - |
2429 | - >>> LaunchpadFunctionalTestSetup().tearDown() |
2430 | - |
Hi Steve,
Overall I'm happy with this branch, but there are a couple of fixes that
it needs before it's ready to land:
> 195 + def deriveDistroSeries(
> 196 + self, user, name, distribution=None, displayname=None,
> 197 + title=None, summary=None, description=None, version=None,
> 198 + status=SeriesStatus.FROZEN, architectures=(), packagesets=(),
> 199 + rebuild=False):
Minor nitpick: When defining methods (as opposed to calling them) we
wrap their arguments thus so that the difference between method
declaration and code is clearer:
    def deriveDistroSeries(self, user, name, distribution=None,
                           displayname=None, title=None, summary=None,
                           description=None, version=None,
                           status=SeriesStatus.FROZEN, architectures=(),
                           packagesets=(), rebuild=False):
> 210 + for param in (
> 211 + displayname, title, summary, description, version):
> 212 + if param is None or len(param) == 0:
> 213 + raise DerivationError(
> 214 + "Display Name, Title, Summary, Description and"
> 215 + " Version all need to be set when creating a"
> 216 + " distroseries")
Why bother to do this at all? Why not just put all of these things
earlier in the parameter list and make them required rather than
optional? This validation code is quite confusing; it took me more than
5 seconds to work out what was going on, and if a DerivationError gets
raised it doesn't actually tell you which parameter you forgot to pass.
Might as well let Python do the work for you.
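The suggestion above can be illustrated with a minimal sketch (the function and argument names here are hypothetical stand-ins, not the actual Launchpad API): declaring the values as required positional parameters means a missing argument raises a TypeError that names the exact parameter, with no hand-written validation loop.

```python
# Hypothetical sketch of the reviewer's suggestion: make the values
# required parameters instead of optional ones plus manual validation.

def derive_distro_series(user, name, displayname, title, summary,
                         description, version, distribution=None,
                         rebuild=False):
    """Illustrative stand-in; the real deriveDistroSeries differs."""
    return (name, version)

try:
    # Forgetting `version` now produces an error that names the
    # missing argument, so the caller knows exactly what to fix.
    derive_distro_series('a-user', 'natty', 'Natty', 'The Natty Title',
                         'A summary', 'A description')
except TypeError as exc:
    print(exc)
```

Running this prints a message identifying the missing `version` argument, which is more informative than a generic DerivationError listing every field.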
> 351 + login('<email address hidden>')
This (and all subsequent lines that do the same thing) should use the
ADMIN_EMAIL constant from lp.testing rather than having the sampledata
email address hardcoded.