Merge lp:~leonardr/launchpad/grant-permissions-oauth into lp:launchpad/db-devel
Status: | Merged |
---|---|
Approved by: | Edwin Grubbs |
Approved revision: | 11095 |
Merge reported by: | Leonard Richardson |
Merged at revision: | not available |
Proposed branch: | lp:~leonardr/launchpad/grant-permissions-oauth |
Merge into: | lp:launchpad/db-devel |
Diff against target: |
1583 lines (+503/-318) 25 files modified
bootstrap.py (+76/-24) lib/canonical/launchpad/browser/oauth.py (+41/-14) lib/canonical/launchpad/components/decoratedresultset.py (+0/-4) lib/canonical/launchpad/doc/webapp-authorization.txt (+21/-2) lib/canonical/launchpad/pagetests/oauth/authorize-token.txt (+28/-3) lib/canonical/launchpad/webapp/interfaces.py (+9/-0) lib/lp/archivepublisher/utils.py (+2/-1) lib/lp/archiveuploader/dscfile.py (+131/-106) lib/lp/archiveuploader/nascentupload.py (+0/-7) lib/lp/archiveuploader/tests/test_dscfile.py (+58/-41) lib/lp/archiveuploader/tests/test_nascentuploadfile.py (+1/-2) lib/lp/hardwaredb/model/hwdb.py (+1/-10) lib/lp/registry/browser/distribution.py (+1/-12) lib/lp/registry/browser/sourcepackage.py (+1/-1) lib/lp/registry/browser/team.py (+2/-2) lib/lp/registry/model/distroseries.py (+3/-7) lib/lp/registry/model/mailinglist.py (+15/-13) lib/lp/registry/tests/test_mailinglist.py (+29/-29) lib/lp/registry/vocabularies.py (+2/-11) lib/lp/soyuz/doc/package-diff.txt (+1/-6) lib/lp/soyuz/scripts/initialise_distroseries.py (+27/-7) lib/lp/soyuz/scripts/tests/test_initialise_distroseries.py (+28/-10) lib/lp/testing/fakelibrarian.py (+14/-3) lib/lp/testing/tests/test_fakelibrarian.py (+9/-0) versions.cfg (+3/-3) |
To merge this branch: | bzr merge lp:~leonardr/launchpad/grant-permissions-oauth |
Related bugs: |
Reviewer | Review Type | Date Requested | Status |
---|---|---|---|
Edwin Grubbs (community) | Approve | ||
Review via email: mp+29425@code.launchpad.net |
Commit message
Description of the change
This branch introduces a new access level for OAuth tokens: GRANT_PERMISSIONS. GRANT_PERMISSIONS differs from the other access levels (e.g. READ_PUBLIC) in that it's designed to be used by one specific application: the forthcoming desktop credential manager for the Launchpad web service.
Currently GRANT_PERMISSIONS acts exactly like READ_PUBLIC, with the following exceptions:
1. GRANT_PERMISSIONS is not published in the list of access levels -- the client must know its name ahead of time.
2. GRANT_PERMISSIONS does not show up in the list of access levels on +authorize-token unless it is specifically requested and is the _only_ access level the client requests. You can't let the end-user choose between WRITE_PRIVATE and GRANT_PERMISSIONS -- either your program needs GRANT_PERMISSIONS or it doesn't.
Eventually there will be a third exception:
3. GRANT_PERMISSIONS will be the only access level that can access the current user's list of OAuth access tokens, or invoke the named operation to create a new OAuth access token.
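The filtering rules described above (and exercised in the authorize-token.txt pagetest in the diff below) can be sketched as a small standalone helper. This is a minimal illustration of the behaviour, not code from the branch; the permission names are real Launchpad OAuthPermission names, but the `visible_permissions` function itself is hypothetical.

```python
# All OAuthPermission names, in display order. UNAUTHORIZED ("No Access")
# is always offered to the user as an option.
ALL_PERMISSIONS = [
    'UNAUTHORIZED', 'READ_PUBLIC', 'WRITE_PUBLIC',
    'READ_PRIVATE', 'WRITE_PRIVATE', 'GRANT_PERMISSIONS',
]
NORMAL_PERMISSIONS = [
    p for p in ALL_PERMISSIONS if p != 'GRANT_PERMISSIONS']


def visible_permissions(requested):
    """Return the access levels +authorize-token would offer.

    :param requested: the allow_permission values sent by the client.
    """
    # UNAUTHORIZED is offered regardless, so strip an explicit request
    # for it before applying the other rules.
    requested = [p for p in requested if p != 'UNAUTHORIZED']
    # GRANT_PERMISSIONS cannot be one of several options; if it is,
    # drop it and offer only the other requested levels.
    if 'GRANT_PERMISSIONS' in requested and len(requested) > 1:
        requested.remove('GRANT_PERMISSIONS')
    offered = [p for p in requested if p in ALL_PERMISSIONS]
    if not offered:
        # Nothing valid requested: fall back to every normal level
        # (GRANT_PERMISSIONS is still excluded).
        return NORMAL_PERMISSIONS
    return ['UNAUTHORIZED'] + offered
```

So a client that sends only `allow_permission=GRANT_PERMISSIONS` gets a two-way choice (deny, or grant), while mixing it with any other level silently removes it from the options shown.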
- 11096. By Leonard Richardson: Rewording in response to feedback.
- 11097. By Leonard Richardson: Merge from trunk.
- 11098. By Leonard Richardson: Merge from trunk.
Brad Crittenden (bac) wrote:
Leonard, can this branch be landed now? Is there anything blocking you?
Preview Diff
1 | === modified file 'bootstrap.py' |
2 | --- bootstrap.py 2010-03-20 01:13:25 +0000 |
3 | +++ bootstrap.py 2010-08-24 16:45:57 +0000 |
4 | @@ -1,6 +1,6 @@ |
5 | ############################################################################## |
6 | # |
7 | -# Copyright (c) 2006 Zope Corporation and Contributors. |
8 | +# Copyright (c) 2006 Zope Foundation and Contributors. |
9 | # All Rights Reserved. |
10 | # |
11 | # This software is subject to the provisions of the Zope Public License, |
12 | @@ -16,11 +16,9 @@ |
13 | Simply run this script in a directory containing a buildout.cfg. |
14 | The script accepts buildout command-line options, so you can |
15 | use the -c option to specify an alternate configuration file. |
16 | - |
17 | -$Id$ |
18 | """ |
19 | |
20 | -import os, shutil, sys, tempfile, textwrap, urllib, urllib2 |
21 | +import os, shutil, sys, tempfile, textwrap, urllib, urllib2, subprocess |
22 | from optparse import OptionParser |
23 | |
24 | if sys.platform == 'win32': |
25 | @@ -32,11 +30,23 @@ |
26 | else: |
27 | quote = str |
28 | |
29 | +# See zc.buildout.easy_install._has_broken_dash_S for motivation and comments. |
30 | +stdout, stderr = subprocess.Popen( |
31 | + [sys.executable, '-Sc', |
32 | + 'try:\n' |
33 | + ' import ConfigParser\n' |
34 | + 'except ImportError:\n' |
35 | + ' print 1\n' |
36 | + 'else:\n' |
37 | + ' print 0\n'], |
38 | + stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate() |
39 | +has_broken_dash_S = bool(int(stdout.strip())) |
40 | + |
41 | # In order to be more robust in the face of system Pythons, we want to |
42 | # run without site-packages loaded. This is somewhat tricky, in |
43 | # particular because Python 2.6's distutils imports site, so starting |
44 | # with the -S flag is not sufficient. However, we'll start with that: |
45 | -if 'site' in sys.modules: |
46 | +if not has_broken_dash_S and 'site' in sys.modules: |
47 | # We will restart with python -S. |
48 | args = sys.argv[:] |
49 | args[0:0] = [sys.executable, '-S'] |
50 | @@ -109,13 +119,22 @@ |
51 | help=("Specify a directory for storing eggs. Defaults to " |
52 | "a temporary directory that is deleted when the " |
53 | "bootstrap script completes.")) |
54 | +parser.add_option("-t", "--accept-buildout-test-releases", |
55 | + dest='accept_buildout_test_releases', |
56 | + action="store_true", default=False, |
57 | + help=("Normally, if you do not specify a --version, the " |
58 | + "bootstrap script and buildout gets the newest " |
59 | + "*final* versions of zc.buildout and its recipes and " |
60 | + "extensions for you. If you use this flag, " |
61 | + "bootstrap and buildout will get the newest releases " |
62 | + "even if they are alphas or betas.")) |
63 | parser.add_option("-c", None, action="store", dest="config_file", |
64 | help=("Specify the path to the buildout configuration " |
65 | "file to be used.")) |
66 | |
67 | options, args = parser.parse_args() |
68 | |
69 | -# if -c was provided, we push it back into args for buildout' main function |
70 | +# if -c was provided, we push it back into args for buildout's main function |
71 | if options.config_file is not None: |
72 | args += ['-c', options.config_file] |
73 | |
74 | @@ -130,16 +149,15 @@ |
75 | else: |
76 | options.setup_source = setuptools_source |
77 | |
78 | -args = args + ['bootstrap'] |
79 | - |
80 | +if options.accept_buildout_test_releases: |
81 | + args.append('buildout:accept-buildout-test-releases=true') |
82 | +args.append('bootstrap') |
83 | |
84 | try: |
85 | - to_reload = False |
86 | import pkg_resources |
87 | - to_reload = True |
88 | + import setuptools # A flag. Sometimes pkg_resources is installed alone. |
89 | if not hasattr(pkg_resources, '_distribute'): |
90 | raise ImportError |
91 | - import setuptools # A flag. Sometimes pkg_resources is installed alone. |
92 | except ImportError: |
93 | ez_code = urllib2.urlopen( |
94 | options.setup_source).read().replace('\r\n', '\n') |
95 | @@ -151,10 +169,8 @@ |
96 | if options.use_distribute: |
97 | setup_args['no_fake'] = True |
98 | ez['use_setuptools'](**setup_args) |
99 | - if to_reload: |
100 | - reload(pkg_resources) |
101 | - else: |
102 | - import pkg_resources |
103 | + reload(sys.modules['pkg_resources']) |
104 | + import pkg_resources |
105 | # This does not (always?) update the default working set. We will |
106 | # do it. |
107 | for path in sys.path: |
108 | @@ -167,23 +183,59 @@ |
109 | '-mqNxd', |
110 | quote(eggs_dir)] |
111 | |
112 | -if options.download_base: |
113 | - cmd.extend(['-f', quote(options.download_base)]) |
114 | +if not has_broken_dash_S: |
115 | + cmd.insert(1, '-S') |
116 | |
117 | -requirement = 'zc.buildout' |
118 | -if options.version: |
119 | - requirement = '=='.join((requirement, options.version)) |
120 | -cmd.append(requirement) |
121 | +find_links = options.download_base |
122 | +if not find_links: |
123 | + find_links = os.environ.get('bootstrap-testing-find-links') |
124 | +if find_links: |
125 | + cmd.extend(['-f', quote(find_links)]) |
126 | |
127 | if options.use_distribute: |
128 | setup_requirement = 'distribute' |
129 | else: |
130 | setup_requirement = 'setuptools' |
131 | ws = pkg_resources.working_set |
132 | +setup_requirement_path = ws.find( |
133 | + pkg_resources.Requirement.parse(setup_requirement)).location |
134 | env = dict( |
135 | os.environ, |
136 | - PYTHONPATH=ws.find( |
137 | - pkg_resources.Requirement.parse(setup_requirement)).location) |
138 | + PYTHONPATH=setup_requirement_path) |
139 | + |
140 | +requirement = 'zc.buildout' |
141 | +version = options.version |
142 | +if version is None and not options.accept_buildout_test_releases: |
143 | + # Figure out the most recent final version of zc.buildout. |
144 | + import setuptools.package_index |
145 | + _final_parts = '*final-', '*final' |
146 | + def _final_version(parsed_version): |
147 | + for part in parsed_version: |
148 | + if (part[:1] == '*') and (part not in _final_parts): |
149 | + return False |
150 | + return True |
151 | + index = setuptools.package_index.PackageIndex( |
152 | + search_path=[setup_requirement_path]) |
153 | + if find_links: |
154 | + index.add_find_links((find_links,)) |
155 | + req = pkg_resources.Requirement.parse(requirement) |
156 | + if index.obtain(req) is not None: |
157 | + best = [] |
158 | + bestv = None |
159 | + for dist in index[req.project_name]: |
160 | + distv = dist.parsed_version |
161 | + if _final_version(distv): |
162 | + if bestv is None or distv > bestv: |
163 | + best = [dist] |
164 | + bestv = distv |
165 | + elif distv == bestv: |
166 | + best.append(dist) |
167 | + if best: |
168 | + best.sort() |
169 | + version = best[-1].version |
170 | +if version: |
171 | + requirement = '=='.join((requirement, version)) |
172 | +cmd.append(requirement) |
173 | |
174 | if is_jython: |
175 | import subprocess |
176 | @@ -193,7 +245,7 @@ |
177 | if exitcode != 0: |
178 | sys.stdout.flush() |
179 | sys.stderr.flush() |
180 | - print ("An error occured when trying to install zc.buildout. " |
181 | + print ("An error occurred when trying to install zc.buildout. " |
182 | "Look above this message for any errors that " |
183 | "were output by easy_install.") |
184 | sys.exit(exitcode) |
185 | |
186 | === modified file 'lib/canonical/launchpad/browser/oauth.py' |
187 | --- lib/canonical/launchpad/browser/oauth.py 2010-08-20 20:31:18 +0000 |
188 | +++ lib/canonical/launchpad/browser/oauth.py 2010-08-24 16:45:57 +0000 |
189 | @@ -88,8 +88,13 @@ |
190 | |
191 | token = consumer.newRequestToken() |
192 | if self.request.headers.get('Accept') == HTTPResource.JSON_TYPE: |
193 | + # Don't show the client the GRANT_PERMISSIONS access |
194 | + # level. If they have a legitimate need to use it, they'll |
195 | + # already know about it. |
196 | + permissions = [permission for permission in OAuthPermission.items |
197 | + if permission != OAuthPermission.GRANT_PERMISSIONS] |
198 | return self.getJSONRepresentation( |
199 | - OAuthPermission.items, token, include_secret=True) |
200 | + permissions, token, include_secret=True) |
201 | return u'oauth_token=%s&oauth_token_secret=%s' % ( |
202 | token.key, token.secret) |
203 | |
204 | @@ -100,6 +105,7 @@ |
205 | def create_oauth_permission_actions(): |
206 | """Return a list of `Action`s for each possible `OAuthPermission`.""" |
207 | actions = Actions() |
208 | + actions_excluding_grant_permissions = Actions() |
209 | def success(form, action, data): |
210 | form.reviewToken(action.permission) |
211 | for permission in OAuthPermission.items: |
212 | @@ -108,13 +114,15 @@ |
213 | condition=token_exists_and_is_not_reviewed) |
214 | action.permission = permission |
215 | actions.append(action) |
216 | - return actions |
217 | - |
218 | + if permission != OAuthPermission.GRANT_PERMISSIONS: |
219 | + actions_excluding_grant_permissions.append(action) |
220 | + return actions, actions_excluding_grant_permissions |
221 | |
222 | class OAuthAuthorizeTokenView(LaunchpadFormView, JSONTokenMixin): |
223 | """Where users authorize consumers to access Launchpad on their behalf.""" |
224 | |
225 | - actions = create_oauth_permission_actions() |
226 | + actions, actions_excluding_grant_permissions = ( |
227 | + create_oauth_permission_actions()) |
228 | label = "Authorize application to access Launchpad on your behalf" |
229 | schema = IOAuthRequestToken |
230 | field_names = [] |
231 | @@ -132,28 +140,47 @@ |
232 | acceptable subset of OAuthPermission. |
233 | |
234 | The user always has the option to deny the client access |
235 | - altogether, so it makes sense for the client to specify the |
236 | - least restrictions possible. |
237 | + altogether, so it makes sense for the client to ask for the |
238 | + least access possible. |
239 | |
240 | If the client sends nonsensical values for allow_permissions, |
241 | - the end-user will be given an unrestricted choice. |
242 | + the end-user will be given a choice among all the permissions |
243 | + used by normal applications. |
244 | """ |
245 | + |
246 | allowed_permissions = self.request.form_ng.getAll('allow_permission') |
247 | if len(allowed_permissions) == 0: |
248 | - return self.actions |
249 | + return self.actions_excluding_grant_permissions |
250 | actions = Actions() |
251 | + |
252 | + # UNAUTHORIZED is always one of the options. If the client |
253 | + # explicitly requested UNAUTHORIZED, remove it from the list |
254 | + # to simplify the algorithm: we'll add it back later. |
255 | + if OAuthPermission.UNAUTHORIZED.name in allowed_permissions: |
256 | + allowed_permissions.remove(OAuthPermission.UNAUTHORIZED.name) |
257 | + |
258 | + # GRANT_PERMISSIONS cannot be requested as one of several |
259 | + # options--it must be the only option (other than |
260 | + # UNAUTHORIZED). If GRANT_PERMISSIONS is one of several |
261 | + # options, remove it from the list. |
262 | + if (OAuthPermission.GRANT_PERMISSIONS.name in allowed_permissions |
263 | + and len(allowed_permissions) > 1): |
264 | + allowed_permissions.remove(OAuthPermission.GRANT_PERMISSIONS.name) |
265 | + |
266 | for action in self.actions: |
267 | if (action.permission.name in allowed_permissions |
268 | or action.permission is OAuthPermission.UNAUTHORIZED): |
269 | actions.append(action) |
270 | + |
271 | if len(list(actions)) == 1: |
272 | # The only visible action is UNAUTHORIZED. That means the |
273 | - # client tried to restrict the actions but didn't name any |
274 | - # actual actions (except possibly UNAUTHORIZED). Rather |
275 | - # than present the end-user with an impossible situation |
276 | - # where their only option is to deny access, we'll present |
277 | - # the full range of actions. |
278 | - return self.actions |
279 | + # client tried to restrict the permissions but didn't name |
280 | + # any actual permissions (except possibly |
281 | + # UNAUTHORIZED). Rather than present the end-user with an |
282 | + # impossible situation where their only option is to deny |
283 | + # access, we'll present the full range of actions (except |
284 | + # for GRANT_PERMISSIONS). |
285 | + return self.actions_excluding_grant_permissions |
286 | return actions |
287 | |
288 | def initialize(self): |
289 | |
290 | === modified file 'lib/canonical/launchpad/components/decoratedresultset.py' |
291 | --- lib/canonical/launchpad/components/decoratedresultset.py 2010-08-20 20:31:18 +0000 |
292 | +++ lib/canonical/launchpad/components/decoratedresultset.py 2010-08-24 16:45:57 +0000 |
293 | @@ -9,7 +9,6 @@ |
294 | ] |
295 | |
296 | from lazr.delegates import delegates |
297 | -from storm.expr import Column |
298 | from storm.zope.interfaces import IResultSet |
299 | from zope.security.proxy import removeSecurityProxy |
300 | |
301 | @@ -31,9 +30,6 @@ |
302 | |
303 | This behaviour is required for other classes as well (Distribution, |
304 | DistroArchSeries), hence a generalised solution. |
305 | - |
306 | - This class also fixes a bug currently in Storm's ResultSet.count |
307 | - method (see below) |
308 | """ |
309 | delegates(IResultSet, context='result_set') |
310 | |
311 | |
312 | === modified file 'lib/canonical/launchpad/doc/webapp-authorization.txt' |
313 | --- lib/canonical/launchpad/doc/webapp-authorization.txt 2010-04-16 15:06:55 +0000 |
314 | +++ lib/canonical/launchpad/doc/webapp-authorization.txt 2010-08-24 16:45:57 +0000 |
315 | @@ -79,8 +79,27 @@ |
316 | >>> check_permission('launchpad.View', bug_1) |
317 | False |
318 | |
319 | -Users logged in through the web application, though, have full access, |
320 | -which means they can read/change any object they have access to. |
321 | +Now consider a principal authorized to create OAuth tokens. Whenever |
322 | +it's not creating OAuth tokens, it has a level of permission |
323 | +equivalent to READ_PUBLIC. |
324 | + |
325 | + >>> principal.access_level = AccessLevel.GRANT_PERMISSIONS |
326 | + >>> setupInteraction(principal) |
327 | + >>> check_permission('launchpad.View', bug_1) |
328 | + False |
329 | + |
330 | + >>> check_permission('launchpad.Edit', sample_person) |
331 | + False |
332 | + |
333 | +This may seem useless from a security standpoint, since once a |
334 | +malicious client is authorized to create OAuth tokens, it can escalate |
335 | +its privileges at any time by creating a new token for itself. The |
336 | +security benefit is more subtle: by discouraging feature creep in |
337 | +clients that have this super-access level, we reduce the risk that a |
338 | +bug in a _trusted_ client will enable privilege escalation attacks. |
339 | + |
340 | +Users logged in through the web application have full access, which |
341 | +means they can read/change any object they have access to. |
342 | |
343 | >>> mock_participation = Participation() |
344 | >>> login('test@canonical.com', mock_participation) |
345 | |
346 | === modified file 'lib/canonical/launchpad/pagetests/oauth/authorize-token.txt' |
347 | --- lib/canonical/launchpad/pagetests/oauth/authorize-token.txt 2010-02-05 13:25:46 +0000 |
348 | +++ lib/canonical/launchpad/pagetests/oauth/authorize-token.txt 2010-08-24 16:45:57 +0000 |
349 | @@ -44,7 +44,8 @@ |
350 | ... |
351 | See all applications authorized to access Launchpad on your behalf. |
352 | |
353 | -This page contains one submit button for each item of OAuthPermission. |
354 | +This page contains one submit button for each item of OAuthPermission, |
355 | +except for 'Grant Permissions', which must be specifically requested. |
356 | |
357 | >>> browser.getControl('No Access') |
358 | <SubmitControl... |
359 | @@ -57,9 +58,14 @@ |
360 | >>> browser.getControl('Change Anything') |
361 | <SubmitControl... |
362 | |
363 | + >>> browser.getControl('Grant Permissions') |
364 | + Traceback (most recent call last): |
365 | + ... |
366 | + LookupError: label 'Grant Permissions' |
367 | + |
368 | >>> actions = main_content.findAll('input', attrs={'type': 'submit'}) |
369 | >>> from canonical.launchpad.webapp.interfaces import OAuthPermission |
370 | - >>> len(actions) == len(OAuthPermission.items) |
371 | + >>> len(actions) == len(OAuthPermission.items) - 1 |
372 | True |
373 | |
374 | An application, when asking to access Launchpad on a user's behalf, |
375 | @@ -83,9 +89,28 @@ |
376 | Change Non-Private Data |
377 | Change Anything |
378 | |
379 | +The only time the 'Grant Permissions' permission shows up in this list |
380 | +is if the client specifically requests it, and no other |
381 | +permission. (Also requesting UNAUTHORIZED is okay--it will show up |
382 | +anyway.) |
383 | + |
384 | + >>> print_access_levels('allow_permission=GRANT_PERMISSIONS') |
385 | + No Access |
386 | + Grant Permissions |
387 | + |
388 | + >>> print_access_levels( |
389 | + ... 'allow_permission=GRANT_PERMISSIONS&allow_permission=UNAUTHORIZED') |
390 | + No Access |
391 | + Grant Permissions |
392 | + |
393 | + >>> print_access_levels( |
394 | + ... 'allow_permission=WRITE_PUBLIC&allow_permission=GRANT_PERMISSIONS') |
395 | + No Access |
396 | + Change Non-Private Data |
397 | + |
398 | If an application doesn't specify any valid access levels, or only |
399 | specifies the UNAUTHORIZED access level, Launchpad will show all the |
400 | -access levels. |
401 | +access levels, except for GRANT_PERMISSIONS. |
402 | |
403 | >>> print_access_levels('') |
404 | No Access |
405 | |
406 | === modified file 'lib/canonical/launchpad/webapp/interfaces.py' |
407 | --- lib/canonical/launchpad/webapp/interfaces.py 2010-08-20 20:31:18 +0000 |
408 | +++ lib/canonical/launchpad/webapp/interfaces.py 2010-08-24 16:45:57 +0000 |
409 | @@ -527,6 +527,15 @@ |
410 | for reading and changing anything, including private data. |
411 | """) |
412 | |
413 | + GRANT_PERMISSIONS = DBItem(60, """ |
414 | + Grant Permissions |
415 | + |
416 | + The application will be able to grant access to your Launchpad |
417 | + account to any other application. This is a very powerful |
418 | + level of access. You should not grant this level of access to |
419 | + any application except the official Launchpad credential |
420 | + manager. |
421 | + """) |
422 | |
423 | class AccessLevel(DBEnumeratedType): |
424 | """The level of access any given principal has.""" |
425 | |
426 | === modified file 'lib/lp/archivepublisher/utils.py' |
427 | --- lib/lp/archivepublisher/utils.py 2010-08-20 20:31:18 +0000 |
428 | +++ lib/lp/archivepublisher/utils.py 2010-08-24 16:45:57 +0000 |
429 | @@ -109,7 +109,8 @@ |
430 | end = start + chunk_size |
431 | |
432 | # The reason why we listify the sliced ResultSet is because we |
433 | - # cannot very it's size using 'count' (see bug #217644). However, |
434 | + # cannot very it's size using 'count' (see bug #217644 and note |
435 | + # that it was fixed in storm but not SQLObjectResultSet). However, |
436 | # It's not exactly a problem considering non-empty set will be |
437 | # iterated anyway. |
438 | batch = list(self.input[start:end]) |
439 | |
440 | === modified file 'lib/lp/archiveuploader/dscfile.py' |
441 | --- lib/lp/archiveuploader/dscfile.py 2010-08-21 13:54:20 +0000 |
442 | +++ lib/lp/archiveuploader/dscfile.py 2010-08-24 16:45:57 +0000 |
443 | @@ -13,10 +13,11 @@ |
444 | 'SignableTagFile', |
445 | 'DSCFile', |
446 | 'DSCUploadedFile', |
447 | - 'findChangelog', |
448 | - 'findCopyright', |
449 | + 'find_changelog', |
450 | + 'find_copyright', |
451 | ] |
452 | |
453 | +from cStringIO import StringIO |
454 | import errno |
455 | import glob |
456 | import os |
457 | @@ -73,6 +74,70 @@ |
458 | from lp.soyuz.interfaces.sourcepackageformat import SourcePackageFormat |
459 | |
460 | |
461 | +class DpkgSourceError(Exception): |
462 | + |
463 | + _fmt = "Unable to unpack source package (%(result)s): %(output)s" |
464 | + |
465 | + def __init__(self, output, result): |
466 | + Exception.__init__( |
467 | + self, self._fmt % {"output": output, "result": result}) |
468 | + self.output = output |
469 | + self.result = result |
470 | + |
471 | + |
472 | +def unpack_source(dsc_filepath): |
473 | + """Unpack a source package into a temporary directory |
474 | + |
475 | + :param dsc_filepath: Path to the dsc file |
476 | + :return: Path to the temporary directory with the unpacked sources |
477 | + """ |
478 | + # Get a temporary dir together. |
479 | + unpacked_dir = tempfile.mkdtemp() |
480 | + try: |
481 | + # chdir into it |
482 | + cwd = os.getcwd() |
483 | + os.chdir(unpacked_dir) |
484 | + try: |
485 | + args = ["dpkg-source", "-sn", "-x", dsc_filepath] |
486 | + dpkg_source = subprocess.Popen(args, stdout=subprocess.PIPE, |
487 | + stderr=subprocess.PIPE) |
488 | + output, unused = dpkg_source.communicate() |
489 | + result = dpkg_source.wait() |
490 | + finally: |
491 | + # When all is said and done, chdir out again so that we can |
492 | + # clean up the tree with shutil.rmtree without leaving the |
493 | + # process in a directory we're trying to remove. |
494 | + os.chdir(cwd) |
495 | + |
496 | + if result != 0: |
497 | + dpkg_output = prefix_multi_line_string(output, " ") |
498 | + raise DpkgSourceError(result=result, output=dpkg_output) |
499 | + except: |
500 | + shutil.rmtree(unpacked_dir) |
501 | + raise |
502 | + |
503 | + return unpacked_dir |
504 | + |
505 | + |
506 | +def cleanup_unpacked_dir(unpacked_dir): |
507 | + """Remove the directory with an unpacked source package. |
508 | + |
509 | + :param unpacked_dir: Path to the directory. |
510 | + """ |
511 | + try: |
512 | + shutil.rmtree(unpacked_dir) |
513 | + except OSError, error: |
514 | + if errno.errorcode[error.errno] != 'EACCES': |
515 | + raise UploadError( |
516 | + "couldn't remove tmp dir %s: code %s" % ( |
517 | + unpacked_dir, error.errno)) |
518 | + else: |
519 | + result = os.system("chmod -R u+rwx " + unpacked_dir) |
520 | + if result != 0: |
521 | + raise UploadError("chmod failed with %s" % result) |
522 | + shutil.rmtree(unpacked_dir) |
523 | + |
524 | + |
525 | class SignableTagFile: |
526 | """Base class for signed file verification.""" |
527 | |
528 | @@ -160,7 +225,7 @@ |
529 | "rfc2047": rfc2047, |
530 | "name": name, |
531 | "email": email, |
532 | - "person": person |
533 | + "person": person, |
534 | } |
535 | |
536 | |
537 | @@ -187,9 +252,9 @@ |
538 | |
539 | # Note that files is actually only set inside verify(). |
540 | files = None |
541 | - # Copyright and changelog_path are only set inside unpackAndCheckSource(). |
542 | + # Copyright and changelog are only set inside unpackAndCheckSource(). |
543 | copyright = None |
544 | - changelog_path = None |
545 | + changelog = None |
546 | |
547 | def __init__(self, filepath, digest, size, component_and_section, |
548 | priority, package, version, changes, policy, logger): |
549 | @@ -238,12 +303,9 @@ |
550 | else: |
551 | self.processSignature() |
552 | |
553 | - self.unpacked_dir = None |
554 | - |
555 | # |
556 | # Useful properties. |
557 | # |
558 | - |
559 | @property |
560 | def source(self): |
561 | """Return the DSC source name.""" |
562 | @@ -277,12 +339,11 @@ |
563 | # |
564 | # DSC file checks. |
565 | # |
566 | - |
567 | def verify(self): |
568 | """Verify the uploaded .dsc file. |
569 | |
570 | - This method is an error generator, i.e, it returns an iterator over all |
571 | - exceptions that are generated while processing DSC file checks. |
572 | + This method is an error generator, i.e, it returns an iterator over |
573 | + all exceptions that are generated while processing DSC file checks. |
574 | """ |
575 | |
576 | for error in SourceUploadFile.verify(self): |
577 | @@ -518,82 +579,53 @@ |
578 | self.logger.debug( |
579 | "Verifying uploaded source package by unpacking it.") |
580 | |
581 | - # Get a temporary dir together. |
582 | - self.unpacked_dir = tempfile.mkdtemp() |
583 | - |
584 | - # chdir into it |
585 | - cwd = os.getcwd() |
586 | - os.chdir(self.unpacked_dir) |
587 | - dsc_in_tmpdir = os.path.join(self.unpacked_dir, self.filename) |
588 | - |
589 | - package_files = self.files + [self] |
590 | try: |
591 | - for source_file in package_files: |
592 | - os.symlink( |
593 | - source_file.filepath, |
594 | - os.path.join(self.unpacked_dir, source_file.filename)) |
595 | - args = ["dpkg-source", "-sn", "-x", dsc_in_tmpdir] |
596 | - dpkg_source = subprocess.Popen(args, stdout=subprocess.PIPE, |
597 | - stderr=subprocess.PIPE) |
598 | - output, unused = dpkg_source.communicate() |
599 | - result = dpkg_source.wait() |
600 | - finally: |
601 | - # When all is said and done, chdir out again so that we can |
602 | - # clean up the tree with shutil.rmtree without leaving the |
603 | - # process in a directory we're trying to remove. |
604 | - os.chdir(cwd) |
605 | - |
606 | - if result != 0: |
607 | - dpkg_output = prefix_multi_line_string(output, " ") |
608 | + unpacked_dir = unpack_source(self.filepath) |
609 | + except DpkgSourceError, e: |
610 | yield UploadError( |
611 | "dpkg-source failed for %s [return: %s]\n" |
612 | "[dpkg-source output: %s]" |
613 | - % (self.filename, result, dpkg_output)) |
614 | - |
615 | - # Copy debian/copyright file content. It will be stored in the |
616 | - # SourcePackageRelease records. |
617 | - |
618 | - # Check if 'dpkg-source' created only one directory. |
619 | - temp_directories = [ |
620 | - dirname for dirname in os.listdir(self.unpacked_dir) |
621 | - if os.path.isdir(dirname)] |
622 | - if len(temp_directories) > 1: |
623 | - yield UploadError( |
624 | - 'Unpacked source contains more than one directory: %r' |
625 | - % temp_directories) |
626 | - |
627 | - # XXX cprov 20070713: We should access only the expected directory |
628 | - # name (<sourcename>-<no_epoch(no_revision(version))>). |
629 | - |
630 | - # Locate both the copyright and changelog files for later processing. |
631 | - for error in findCopyright(self, self.unpacked_dir, self.logger): |
632 | - yield error |
633 | - |
634 | - for error in findChangelog(self, self.unpacked_dir, self.logger): |
635 | - yield error |
636 | - |
637 | - self.logger.debug("Cleaning up source tree.") |
638 | + % (self.filename, e.result, e.output)) |
639 | + return |
640 | + |
641 | + try: |
642 | + # Copy debian/copyright file content. It will be stored in the |
643 | + # SourcePackageRelease records. |
644 | + |
645 | + # Check if 'dpkg-source' created only one directory. |
646 | + temp_directories = [ |
647 | + dirname for dirname in os.listdir(unpacked_dir) |
648 | + if os.path.isdir(dirname)] |
649 | + if len(temp_directories) > 1: |
650 | + yield UploadError( |
651 | + 'Unpacked source contains more than one directory: %r' |
652 | + % temp_directories) |
653 | + |
654 | + # XXX cprov 20070713: We should access only the expected directory |
655 | + # name (<sourcename>-<no_epoch(no_revision(version))>). |
656 | + |
657 | + # Locate both the copyright and changelog files for later |
658 | + # processing. |
659 | + try: |
660 | + self.copyright = find_copyright(unpacked_dir, self.logger) |
661 | + except UploadError, error: |
662 | + yield error |
663 | + return |
664 | + except UploadWarning, warning: |
665 | + yield warning |
666 | + |
667 | + try: |
668 | + self.changelog = find_changelog(unpacked_dir, self.logger) |
669 | + except UploadError, error: |
670 | + yield error |
671 | + return |
672 | + except UploadWarning, warning: |
673 | + yield warning |
674 | + finally: |
675 | + self.logger.debug("Cleaning up source tree.") |
676 | + cleanup_unpacked_dir(unpacked_dir) |
677 | self.logger.debug("Done") |
678 | |
679 | - def cleanUp(self): |
680 | - if self.unpacked_dir is None: |
681 | - return |
682 | - try: |
683 | - shutil.rmtree(self.unpacked_dir) |
684 | - except OSError, error: |
685 | - # XXX: dsilvers 2006-03-15: We currently lack a test for this. |
686 | - if errno.errorcode[error.errno] != 'EACCES': |
687 | - raise UploadError( |
688 | - "%s: couldn't remove tmp dir %s: code %s" % ( |
689 | - self.filename, self.unpacked_dir, error.errno)) |
690 | - else: |
691 | - result = os.system("chmod -R u+rwx " + self.unpacked_dir) |
692 | - if result != 0: |
693 | - raise UploadError("chmod failed with %s" % result) |
694 | - shutil.rmtree(self.unpacked_dir) |
695 | - self.unpacked_dir = None |
696 | - |
697 | - |
698 | def findBuild(self): |
699 | """Find and return the SourcePackageRecipeBuild, if one is specified. |
700 | |
701 | @@ -651,8 +683,8 @@ |
702 | |
703 | changelog_lfa = self.librarian.create( |
704 | "changelog", |
705 | - os.stat(self.changelog_path).st_size, |
706 | - open(self.changelog_path, "r"), |
707 | + len(self.changelog), |
708 | + StringIO(self.changelog), |
709 | "text/x-debian-source-changelog", |
710 | restricted=self.policy.archive.private) |
711 | |
712 | @@ -716,6 +748,7 @@ |
713 | validation inside DSCFile.verify(); there is no |
714 | store_in_database() method. |
715 | """ |
716 | + |
717 | def __init__(self, filepath, digest, size, policy, logger): |
718 | component_and_section = priority = "--no-value--" |
719 | NascentUploadFile.__init__( |
720 | @@ -735,7 +768,7 @@ |
721 | |
722 | :param source_file: The directory where the source was extracted |
723 | :param source_dir: The directory where the source was extracted. |
724 | - :return fullpath: The full path of the file, else return None if the |
725 | + :return fullpath: The full path of the file, else return None if the |
726 | file is not found. |
727 | """ |
728 | # Instead of trying to predict the unpacked source directory name, |
729 | @@ -758,50 +791,42 @@ |
730 | return fullpath |
731 | return None |
732 | |
733 | -def findCopyright(dsc_file, source_dir, logger): |
734 | + |
735 | +def find_copyright(source_dir, logger): |
736 | """Find and store any debian/copyright. |
737 | |
738 | - :param dsc_file: A DSCFile object where the copyright will be stored. |
739 | :param source_dir: The directory where the source was extracted. |
740 | :param logger: A logger object for debug output. |
741 | + :return: Contents of copyright file |
742 | """ |
743 | - try: |
744 | - copyright_file = findFile(source_dir, 'debian/copyright') |
745 | - except UploadError, error: |
746 | - yield error |
747 | - return |
748 | + copyright_file = findFile(source_dir, 'debian/copyright') |
749 | if copyright_file is None: |
750 | - yield UploadWarning("No copyright file found.") |
751 | - return |
752 | + raise UploadWarning("No copyright file found.") |
753 | |
754 | logger.debug("Copying copyright contents.") |
755 | - dsc_file.copyright = open(copyright_file).read().strip() |
756 | - |
757 | - |
758 | -def findChangelog(dsc_file, source_dir, logger): |
759 | + return open(copyright_file).read().strip() |
760 | + |
761 | + |
762 | +def find_changelog(source_dir, logger): |
763 | """Find and move any debian/changelog. |
764 | |
765 | This function finds the changelog file within the source package. The |
766 | changelog file is later uploaded to the librarian by |
767 | DSCFile.storeInDatabase(). |
768 | |
769 | - :param dsc_file: A DSCFile object where the copyright will be stored. |
770 | :param source_dir: The directory where the source was extracted. |
771 | :param logger: A logger object for debug output. |
772 | + :return: Changelog contents |
773 | """ |
774 | - try: |
775 | - changelog_file = findFile(source_dir, 'debian/changelog') |
776 | - except UploadError, error: |
777 | - yield error |
778 | - return |
779 | + changelog_file = findFile(source_dir, 'debian/changelog') |
780 | if changelog_file is None: |
781 | # Policy requires debian/changelog to always exist. |
782 | - yield UploadError("No changelog file found.") |
783 | - return |
784 | + raise UploadError("No changelog file found.") |
785 | |
786 | # Move the changelog file out of the package directory |
787 | logger.debug("Found changelog") |
788 | - dsc_file.changelog_path = changelog_file |
789 | + return open(changelog_file, 'r').read() |
790 | + |
791 | |
792 | |
793 | def check_format_1_0_files(filename, file_type_counts, component_counts, |
794 | |
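The dscfile.py changes above convert find_copyright/find_changelog from generators that yield UploadError objects into plain functions that raise on failure and return the file contents. A minimal sketch of the new style, using a stand-in UploadError class rather than the real lp.archiveuploader one:

```python
import os
import tempfile


class UploadError(Exception):
    """Stand-in for lp.archiveuploader's UploadError."""


def find_changelog(source_dir):
    """Return the contents of debian/changelog, raising on failure."""
    path = os.path.join(source_dir, "debian", "changelog")
    if not os.path.exists(path):
        # Policy requires debian/changelog to always exist.
        raise UploadError("No changelog file found.")
    with open(path) as changelog:
        return changelog.read()


# Callers now use ordinary exception handling instead of iterating
# over yielded errors and checking their types.
source_dir = tempfile.mkdtemp()
os.makedirs(os.path.join(source_dir, "debian"))
with open(os.path.join(source_dir, "debian", "changelog"), "w") as f:
    f.write("bar (1.0-1) unstable; urgency=low\n")
print(find_changelog(source_dir))
```

Returning contents instead of a path is also what lets storeInDatabase upload via StringIO and len() rather than reopening a file on disk.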
795 | === modified file 'lib/lp/archiveuploader/nascentupload.py' |
796 | --- lib/lp/archiveuploader/nascentupload.py 2010-08-20 20:31:18 +0000 |
797 | +++ lib/lp/archiveuploader/nascentupload.py 2010-08-24 16:45:57 +0000 |
798 | @@ -893,12 +893,6 @@ |
799 | 'Exception while accepting:\n %s' % e, exc_info=True) |
800 | self.do_reject(notify) |
801 | return False |
802 | - else: |
803 | - self.cleanUp() |
804 | - |
805 | - def cleanUp(self): |
806 | - if self.changes.dsc is not None: |
807 | - self.changes.dsc.cleanUp() |
808 | |
809 | def do_reject(self, notify=True): |
810 | """Reject the current upload given the reason provided.""" |
811 | @@ -929,7 +923,6 @@ |
812 | self.queue_root.notify(summary_text=self.rejection_message, |
813 | changes_file_object=changes_file_object, logger=self.logger) |
814 | changes_file_object.close() |
815 | - self.cleanUp() |
816 | |
817 | def _createQueueEntry(self): |
818 | """Return a PackageUpload object.""" |
819 | |
820 | === modified file 'lib/lp/archiveuploader/tests/test_dscfile.py' |
821 | --- lib/lp/archiveuploader/tests/test_dscfile.py 2010-08-20 20:31:18 +0000 |
822 | +++ lib/lp/archiveuploader/tests/test_dscfile.py 2010-08-24 16:45:57 +0000 |
823 | @@ -10,10 +10,12 @@ |
824 | from canonical.launchpad.scripts.logger import QuietFakeLogger |
825 | from canonical.testing.layers import LaunchpadZopelessLayer |
826 | from lp.archiveuploader.dscfile import ( |
827 | + cleanup_unpacked_dir, |
828 | DSCFile, |
829 | - findChangelog, |
830 | - findCopyright, |
831 | + find_changelog, |
832 | + find_copyright, |
833 | format_to_file_checker_map, |
834 | + unpack_source, |
835 | ) |
836 | from lp.archiveuploader.nascentuploadfile import UploadError |
837 | from lp.archiveuploader.tests import ( |
838 | @@ -37,9 +39,6 @@ |
839 | |
840 | class TestDscFile(TestCase): |
841 | |
842 | - class MockDSCFile: |
843 | - copyright = None |
844 | - |
845 | def setUp(self): |
846 | super(TestDscFile, self).setUp() |
847 | self.tmpdir = self.makeTemporaryDirectory() |
848 | @@ -47,7 +46,6 @@ |
849 | os.makedirs(self.dir_path) |
850 | self.copyright_path = os.path.join(self.dir_path, "copyright") |
851 | self.changelog_path = os.path.join(self.dir_path, "changelog") |
852 | - self.dsc_file = self.MockDSCFile() |
853 | |
854 | def testBadDebianCopyright(self): |
855 | """Test that a symlink as debian/copyright will fail. |
856 | @@ -56,14 +54,10 @@ |
857 | dangling symlink in an attempt to try and access files on the system |
858 | processing the source packages.""" |
859 | os.symlink("/etc/passwd", self.copyright_path) |
860 | - errors = list(findCopyright( |
861 | - self.dsc_file, self.tmpdir, mock_logger_quiet)) |
862 | - |
863 | - self.assertEqual(len(errors), 1) |
864 | - self.assertIsInstance(errors[0], UploadError) |
865 | + error = self.assertRaises( |
866 | + UploadError, find_copyright, self.tmpdir, mock_logger_quiet) |
867 | self.assertEqual( |
868 | - errors[0].args[0], |
869 | - "Symbolic link for debian/copyright not allowed") |
870 | + error.args[0], "Symbolic link for debian/copyright not allowed") |
871 | |
872 | def testGoodDebianCopyright(self): |
873 | """Test that a proper copyright file will be accepted""" |
874 | @@ -72,11 +66,8 @@ |
875 | file.write(copyright) |
876 | file.close() |
877 | |
878 | - errors = list(findCopyright( |
879 | - self.dsc_file, self.tmpdir, mock_logger_quiet)) |
880 | - |
881 | - self.assertEqual(len(errors), 0) |
882 | - self.assertEqual(self.dsc_file.copyright, copyright) |
883 | + self.assertEquals( |
884 | + copyright, find_copyright(self.tmpdir, mock_logger_quiet)) |
885 | |
886 | def testBadDebianChangelog(self): |
887 | """Test that a symlink as debian/changelog will fail. |
888 | @@ -85,14 +76,10 @@ |
889 | dangling symlink in an attempt to try and access files on the system |
890 | processing the source packages.""" |
891 | os.symlink("/etc/passwd", self.changelog_path) |
892 | - errors = list(findChangelog( |
893 | - self.dsc_file, self.tmpdir, mock_logger_quiet)) |
894 | - |
895 | - self.assertEqual(len(errors), 1) |
896 | - self.assertIsInstance(errors[0], UploadError) |
897 | + error = self.assertRaises( |
898 | + UploadError, find_changelog, self.tmpdir, mock_logger_quiet) |
899 | self.assertEqual( |
900 | - errors[0].args[0], |
901 | - "Symbolic link for debian/changelog not allowed") |
902 | + error.args[0], "Symbolic link for debian/changelog not allowed") |
903 | |
904 | def testGoodDebianChangelog(self): |
905 | """Test that a proper changelog file will be accepted""" |
906 | @@ -101,12 +88,8 @@ |
907 | file.write(changelog) |
908 | file.close() |
909 | |
910 | - errors = list(findChangelog( |
911 | - self.dsc_file, self.tmpdir, mock_logger_quiet)) |
912 | - |
913 | - self.assertEqual(len(errors), 0) |
914 | - self.assertEqual(self.dsc_file.changelog_path, |
915 | - self.changelog_path) |
916 | + self.assertEquals( |
917 | + changelog, find_changelog(self.tmpdir, mock_logger_quiet)) |
918 | |
919 | def testOversizedFile(self): |
920 | """Test that a file larger than 10MiB will fail. |
921 | @@ -125,13 +108,10 @@ |
922 | file.write(empty_file) |
923 | file.close() |
924 | |
925 | - errors = list(findChangelog( |
926 | - self.dsc_file, self.tmpdir, mock_logger_quiet)) |
927 | - |
928 | - self.assertIsInstance(errors[0], UploadError) |
929 | + error = self.assertRaises( |
930 | + UploadError, find_changelog, self.tmpdir, mock_logger_quiet) |
931 | self.assertEqual( |
932 | - errors[0].args[0], |
933 | - "debian/changelog file too large, 10MiB max") |
934 | + error.args[0], "debian/changelog file too large, 10MiB max") |
935 | |
936 | |
937 | class TestDscFileLibrarian(TestCaseWithFactory): |
938 | @@ -141,6 +121,7 @@ |
939 | |
940 | def getDscFile(self, name): |
941 | dsc_path = datadir(os.path.join('suite', name, name + '.dsc')) |
942 | + |
943 | class Changes: |
944 | architectures = ['source'] |
945 | logger = QuietFakeLogger() |
946 | @@ -157,10 +138,7 @@ |
947 | os.chmod(tempdir, 0555) |
948 | try: |
949 | dsc_file = self.getDscFile('bar_1.0-1') |
950 | - try: |
951 | - list(dsc_file.verify()) |
952 | - finally: |
953 | - dsc_file.cleanUp() |
954 | + list(dsc_file.verify()) |
955 | finally: |
956 | os.chmod(tempdir, 0755) |
957 | |
958 | @@ -292,3 +270,42 @@ |
959 | # A 3.0 (native) source with component tarballs is invalid. |
960 | self.assertErrorsForFiles( |
961 | [self.wrong_files_error], {NATIVE_TARBALL: 1}, {'foo': 1}) |
962 | + |
963 | + |
964 | +class UnpackedDirTests(TestCase): |
965 | + """Tests for unpack_source and cleanup_unpacked_dir.""" |
966 | + |
967 | + def test_unpack_source(self): |
968 | + # unpack_source unpacks in a temporary directory and returns the |
969 | + # path. |
970 | + unpacked_dir = unpack_source( |
971 | + datadir(os.path.join('suite', 'bar_1.0-1', 'bar_1.0-1.dsc'))) |
972 | + try: |
973 | + self.assertEquals(["bar-1.0"], os.listdir(unpacked_dir)) |
974 | + self.assertContentEqual( |
975 | + ["THIS_IS_BAR", "debian"], |
976 | + os.listdir(os.path.join(unpacked_dir, "bar-1.0"))) |
977 | + finally: |
978 | + cleanup_unpacked_dir(unpacked_dir) |
979 | + |
980 | + def test_cleanup(self): |
981 | + # cleanup_unpacked_dir removes the directory and all files under it. |
982 | + temp_dir = self.makeTemporaryDirectory() |
983 | + unpacked_dir = os.path.join(temp_dir, "unpacked") |
984 | + os.mkdir(unpacked_dir) |
985 | + os.mkdir(os.path.join(unpacked_dir, "bar_1.0")) |
986 | + cleanup_unpacked_dir(unpacked_dir) |
987 | + self.assertFalse(os.path.exists(unpacked_dir)) |
988 | + |
989 | + def test_cleanup_invalid_mode(self): |
990 | + # cleanup_unpacked_dir can remove a directory even if the mode does |
991 | + # not allow it. |
992 | + temp_dir = self.makeTemporaryDirectory() |
993 | + unpacked_dir = os.path.join(temp_dir, "unpacked") |
994 | + os.mkdir(unpacked_dir) |
995 | + bar_path = os.path.join(unpacked_dir, "bar_1.0") |
996 | + os.mkdir(bar_path) |
997 | + os.chmod(bar_path, 0600) |
998 | + os.chmod(unpacked_dir, 0600) |
999 | + cleanup_unpacked_dir(unpacked_dir) |
1000 | + self.assertFalse(os.path.exists(unpacked_dir)) |
1001 | |
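The new UnpackedDirTests exercise removing a tree whose directory modes forbid traversal. One way to implement that behaviour (a sketch; the real cleanup_unpacked_dir may differ, and the old cleanUp code shelled out to `chmod -R u+rwx`) is to restore owner permissions top-down before deleting:

```python
import os
import shutil
import stat
import tempfile


def cleanup_unpacked_dir(unpacked_dir):
    """Remove unpacked_dir even if directory modes forbid it (sketch)."""
    # Restore u+rwx on the root, then on each subdirectory as os.walk
    # reaches it; top-down walking chmods a directory before the walk
    # descends into it, so the whole tree becomes traversable.
    os.chmod(unpacked_dir, stat.S_IRWXU)
    for dirpath, dirnames, _filenames in os.walk(unpacked_dir):
        for name in dirnames:
            os.chmod(os.path.join(dirpath, name), stat.S_IRWXU)
    shutil.rmtree(unpacked_dir)


# Mirror test_cleanup_invalid_mode: a nested tree with mode 0600.
temp_dir = tempfile.mkdtemp()
unpacked_dir = os.path.join(temp_dir, "unpacked")
os.mkdir(unpacked_dir)
bar_path = os.path.join(unpacked_dir, "bar_1.0")
os.mkdir(bar_path)
os.chmod(bar_path, 0o600)
os.chmod(unpacked_dir, 0o600)
cleanup_unpacked_dir(unpacked_dir)
```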
1002 | === modified file 'lib/lp/archiveuploader/tests/test_nascentuploadfile.py' |
1003 | --- lib/lp/archiveuploader/tests/test_nascentuploadfile.py 2010-08-21 13:54:20 +0000 |
1004 | +++ lib/lp/archiveuploader/tests/test_nascentuploadfile.py 2010-08-24 16:45:57 +0000 |
1005 | @@ -169,8 +169,7 @@ |
1006 | uploadfile = self.createDSCFile( |
1007 | "foo.dsc", dsc, "main/net", "extra", "dulwich", "0.42", |
1008 | self.createChangesFile("foo.changes", changes)) |
1009 | - (uploadfile.changelog_path, changelog_digest, changelog_size) = ( |
1010 | - self.writeUploadFile("changelog", "DUMMY")) |
1011 | + uploadfile.changelog = "DUMMY" |
1012 | uploadfile.files = [] |
1013 | release = uploadfile.storeInDatabase(None) |
1014 | self.assertEquals("0.42", release.version) |
1015 | |
1016 | === modified file 'lib/lp/hardwaredb/model/hwdb.py' |
1017 | --- lib/lp/hardwaredb/model/hwdb.py 2010-08-20 20:31:18 +0000 |
1018 | +++ lib/lp/hardwaredb/model/hwdb.py 2010-08-24 16:45:57 +0000 |
1019 | @@ -64,9 +64,6 @@ |
1020 | SQLBase, |
1021 | sqlvalues, |
1022 | ) |
1023 | -from canonical.launchpad.components.decoratedresultset import ( |
1024 | - DecoratedResultSet, |
1025 | - ) |
1026 | from canonical.launchpad.interfaces.launchpad import ILaunchpadCelebrities |
1027 | from canonical.launchpad.interfaces.librarian import ILibraryFileAliasSet |
1028 | from canonical.launchpad.validators.name import valid_name |
1029 | @@ -347,13 +344,7 @@ |
1030 | # DISTINCT clause. |
1031 | result_set.config(distinct=True) |
1032 | result_set.order_by(HWSubmission.id) |
1033 | - # The Storm implementation of ResultSet.count() is incorrect if |
1034 | - # the select query uses the distinct directive (see bug #217644). |
1035 | - # DecoratedResultSet solves this problem by modifying the query |
1036 | - # to count only the records appearing in a subquery. |
1037 | - # We don't actually need to transform the results, which is why |
1038 | - # the second argument is a no-op. |
1039 | - return DecoratedResultSet(result_set, lambda result: result) |
1040 | + return result_set |
1041 | |
1042 | def _submissionsSubmitterSelects( |
1043 | self, target_column, bus, vendor_id, product_id, driver_name, |
1044 | |
1045 | === modified file 'lib/lp/registry/browser/distribution.py' |
1046 | --- lib/lp/registry/browser/distribution.py 2010-08-20 20:31:18 +0000 |
1047 | +++ lib/lp/registry/browser/distribution.py 2010-08-24 16:45:57 +0000 |
1048 | @@ -474,18 +474,7 @@ |
1049 | """See `AbstractPackageSearchView`.""" |
1050 | |
1051 | if self.search_by_binary_name: |
1052 | - non_exact_matches = self.context.searchBinaryPackages(self.text) |
1053 | - |
1054 | - # XXX Michael Nelson 20090605 bug=217644 |
1055 | - # We are only using a decorated resultset here to conveniently |
1056 | - # get around the storm bug whereby count returns the count |
1057 | - # of non-distinct results, even though this result set |
1058 | - # is configured for distinct results. |
1059 | - def dummy_func(result): |
1060 | - return result |
1061 | - non_exact_matches = DecoratedResultSet( |
1062 | - non_exact_matches, dummy_func) |
1063 | - |
1064 | + return self.context.searchBinaryPackages(self.text) |
1065 | else: |
1066 | non_exact_matches = self.context.searchSourcePackageCaches( |
1067 | self.text) |
1068 | |
1069 | === modified file 'lib/lp/registry/browser/sourcepackage.py' |
1070 | --- lib/lp/registry/browser/sourcepackage.py 2010-08-20 20:31:18 +0000 |
1071 | +++ lib/lp/registry/browser/sourcepackage.py 2010-08-24 16:45:57 +0000 |
1072 | @@ -542,7 +542,7 @@ |
1073 | self.form_fields = Fields( |
1074 | Choice(__name__='upstream', |
1075 | title=_('Registered upstream project'), |
1076 | - default=None, |
1077 | + default=self.other_upstream, |
1078 | vocabulary=upstream_vocabulary, |
1079 | required=True)) |
1080 | |
1081 | |
1082 | === modified file 'lib/lp/registry/browser/team.py' |
1083 | --- lib/lp/registry/browser/team.py 2010-08-20 20:31:18 +0000 |
1084 | +++ lib/lp/registry/browser/team.py 2010-08-24 16:45:57 +0000 |
1085 | @@ -150,7 +150,7 @@ |
1086 | "name", "visibility", "displayname", "contactemail", |
1087 | "teamdescription", "subscriptionpolicy", |
1088 | "defaultmembershipperiod", "renewal_policy", |
1089 | - "defaultrenewalperiod", "teamowner", |
1090 | + "defaultrenewalperiod", "teamowner", |
1091 | ] |
1092 | private_prefix = PRIVATE_TEAM_PREFIX |
1093 | |
1094 | @@ -767,7 +767,7 @@ |
1095 | |
1096 | def renderTable(self): |
1097 | html = ['<table style="max-width: 80em">'] |
1098 | - items = self.subscribers.currentBatch() |
1099 | + items = list(self.subscribers.currentBatch()) |
1100 | assert len(items) > 0, ( |
1101 | "Don't call this method if there are no subscribers to show.") |
1102 | # When there are more than 10 items, we use multiple columns, but |
1103 | |
1104 | === modified file 'lib/lp/registry/model/distroseries.py' |
1105 | --- lib/lp/registry/model/distroseries.py 2010-08-23 08:12:39 +0000 |
1106 | +++ lib/lp/registry/model/distroseries.py 2010-08-24 16:45:57 +0000 |
1107 | @@ -343,7 +343,7 @@ |
1108 | @cachedproperty |
1109 | def _all_packagings(self): |
1110 | """Get an unordered list of all packagings. |
1111 | - |
1112 | + |
1113 | :return: A ResultSet which can be decorated or tuned further. Use |
1114 | DistroSeries._packaging_row_to_packaging to extract the |
1115 | packaging objects out. |
1116 | @@ -353,7 +353,7 @@ |
1117 | # Packaging object. |
1118 | # NB: precaching objects like this method tries to do has a very poor |
1119 | # hit rate with storm - many queries will still be executed; consider |
1120 | - # ripping this out and instead allowing explicit inclusion of things |
1121 | + # ripping this out and instead allowing explicit inclusion of things |
1122 | # like Person._all_members does - returning a cached object graph. |
1123 | # -- RBC 20100810 |
1124 | # Avoid circular import failures. |
1125 | @@ -1810,11 +1810,7 @@ |
1126 | DistroSeries.hide_all_translations == False, |
1127 | DistroSeries.id == POTemplate.distroseriesID) |
1128 | result_set = result_set.config(distinct=True) |
1129 | - # XXX: henninge 2009-02-11 bug=217644: Convert to sequence right here |
1130 | - # because ResultSet reports a wrong count() when using DISTINCT. Also |
1131 | - # ResultSet does not implement __len__(), which would make it more |
1132 | - # like a sequence. |
1133 | - return list(result_set) |
1134 | + return result_set |
1135 | |
1136 | def findByName(self, name): |
1137 | """See `IDistroSeriesSet`.""" |
1138 | |
1139 | === modified file 'lib/lp/registry/model/mailinglist.py' |
1140 | --- lib/lp/registry/model/mailinglist.py 2010-08-20 20:31:18 +0000 |
1141 | +++ lib/lp/registry/model/mailinglist.py 2010-08-24 16:45:57 +0000 |
1142 | @@ -389,7 +389,7 @@ |
1143 | TeamParticipation.team == self.team, |
1144 | MailingListSubscription.person == Person.id, |
1145 | MailingListSubscription.mailing_list == self) |
1146 | - return results.order_by(Person.displayname) |
1147 | + return results.order_by(Person.displayname, Person.name) |
1148 | |
1149 | def subscribe(self, person, address=None): |
1150 | """See `IMailingList`.""" |
1151 | @@ -451,8 +451,9 @@ |
1152 | MailingListSubscription.personID |
1153 | == EmailAddress.personID), |
1154 | # pylint: disable-msg=C0301 |
1155 | - LeftJoin(MailingList, |
1156 | - MailingList.id == MailingListSubscription.mailing_listID), |
1157 | + LeftJoin( |
1158 | + MailingList, |
1159 | + MailingList.id == MailingListSubscription.mailing_listID), |
1160 | LeftJoin(TeamParticipation, |
1161 | TeamParticipation.personID |
1162 | == MailingListSubscription.personID), |
1163 | @@ -472,8 +473,9 @@ |
1164 | MailingListSubscription.email_addressID |
1165 | == EmailAddress.id), |
1166 | # pylint: disable-msg=C0301 |
1167 | - LeftJoin(MailingList, |
1168 | - MailingList.id == MailingListSubscription.mailing_listID), |
1169 | + LeftJoin( |
1170 | + MailingList, |
1171 | + MailingList.id == MailingListSubscription.mailing_listID), |
1172 | LeftJoin(TeamParticipation, |
1173 | TeamParticipation.personID |
1174 | == MailingListSubscription.personID), |
1175 | @@ -664,8 +666,9 @@ |
1176 | MailingListSubscription.personID |
1177 | == EmailAddress.personID), |
1178 | # pylint: disable-msg=C0301 |
1179 | - LeftJoin(MailingList, |
1180 | - MailingList.id == MailingListSubscription.mailing_listID), |
1181 | + LeftJoin( |
1182 | + MailingList, |
1183 | + MailingList.id == MailingListSubscription.mailing_listID), |
1184 | LeftJoin(TeamParticipation, |
1185 | TeamParticipation.personID |
1186 | == MailingListSubscription.personID), |
1187 | @@ -678,8 +681,7 @@ |
1188 | team.id for team in store.find( |
1189 | Person, |
1190 | And(Person.name.is_in(team_names), |
1191 | - Person.teamowner != None)) |
1192 | - ) |
1193 | + Person.teamowner != None))) |
1194 | list_ids = set( |
1195 | mailing_list.id for mailing_list in store.find( |
1196 | MailingList, |
1197 | @@ -709,8 +711,9 @@ |
1198 | MailingListSubscription.email_addressID |
1199 | == EmailAddress.id), |
1200 | # pylint: disable-msg=C0301 |
1201 | - LeftJoin(MailingList, |
1202 | - MailingList.id == MailingListSubscription.mailing_listID), |
1203 | + LeftJoin( |
1204 | + MailingList, |
1205 | + MailingList.id == MailingListSubscription.mailing_listID), |
1206 | LeftJoin(TeamParticipation, |
1207 | TeamParticipation.personID |
1208 | == MailingListSubscription.personID), |
1209 | @@ -756,8 +759,7 @@ |
1210 | team.id for team in store.find( |
1211 | Person, |
1212 | And(Person.name.is_in(team_names), |
1213 | - Person.teamowner != None)) |
1214 | - ) |
1215 | + Person.teamowner != None))) |
1216 | team_members = store.using(*tables).find( |
1217 | (Team.name, Person.displayname, EmailAddress.email), |
1218 | And(TeamParticipation.teamID.is_in(team_ids), |
1219 | |
1220 | === modified file 'lib/lp/registry/tests/test_mailinglist.py' |
1221 | --- lib/lp/registry/tests/test_mailinglist.py 2010-08-20 20:31:18 +0000 |
1222 | +++ lib/lp/registry/tests/test_mailinglist.py 2010-08-24 16:45:57 +0000 |
1223 | @@ -1,64 +1,64 @@ |
1224 | # Copyright 2009 Canonical Ltd. This software is licensed under the |
1225 | # GNU Affero General Public License version 3 (see the file LICENSE). |
1226 | |
1227 | +from __future__ import with_statement |
1228 | + |
1229 | __metaclass__ = type |
1230 | __all__ = [] |
1231 | |
1232 | - |
1233 | -import unittest |
1234 | - |
1235 | -from canonical.launchpad.ftests import login |
1236 | -from canonical.testing import LaunchpadFunctionalLayer |
1237 | +from canonical.testing import DatabaseFunctionalLayer |
1238 | from lp.registry.interfaces.mailinglistsubscription import ( |
1239 | MailingListAutoSubscribePolicy, |
1240 | ) |
1241 | from lp.registry.interfaces.person import TeamSubscriptionPolicy |
1242 | -from lp.testing import TestCaseWithFactory |
1243 | +from lp.testing import login_celebrity, person_logged_in, TestCaseWithFactory |
1244 | |
1245 | |
1246 | class MailingList_getSubscribers_TestCase(TestCaseWithFactory): |
1247 | """Tests for `IMailingList`.getSubscribers().""" |
1248 | |
1249 | - layer = LaunchpadFunctionalLayer |
1250 | + layer = DatabaseFunctionalLayer |
1251 | |
1252 | def setUp(self): |
1253 | - # Create a team (tied to a mailing list) with one former member, one |
1254 | - # pending member and one active member. |
1255 | TestCaseWithFactory.setUp(self) |
1256 | - login('foo.bar@canonical.com') |
1257 | + self.team, self.mailing_list = self.factory.makeTeamAndMailingList( |
1258 | + 'test-mailinglist', 'team-owner') |
1259 | + |
1260 | + def test_only_active_members_can_be_subscribers(self): |
1261 | former_member = self.factory.makePerson() |
1262 | pending_member = self.factory.makePerson() |
1263 | active_member = self.active_member = self.factory.makePerson() |
1264 | - self.team, self.mailing_list = self.factory.makeTeamAndMailingList( |
1265 | - 'test-mailinglist', 'team-owner') |
1266 | - self.team.subscriptionpolicy = TeamSubscriptionPolicy.MODERATED |
1267 | - |
1268 | # Each of our members want to be subscribed to a team's mailing list |
1269 | # whenever they join the team. |
1270 | + login_celebrity('admin') |
1271 | former_member.mailing_list_auto_subscribe_policy = ( |
1272 | MailingListAutoSubscribePolicy.ALWAYS) |
1273 | active_member.mailing_list_auto_subscribe_policy = ( |
1274 | MailingListAutoSubscribePolicy.ALWAYS) |
1275 | pending_member.mailing_list_auto_subscribe_policy = ( |
1276 | MailingListAutoSubscribePolicy.ALWAYS) |
1277 | - |
1278 | + self.team.subscriptionpolicy = TeamSubscriptionPolicy.MODERATED |
1279 | pending_member.join(self.team) |
1280 | - self.assertEqual(False, pending_member.inTeam(self.team)) |
1281 | - |
1282 | self.team.addMember(former_member, reviewer=self.team.teamowner) |
1283 | former_member.leave(self.team) |
1284 | - self.assertEqual(False, former_member.inTeam(self.team)) |
1285 | - |
1286 | self.team.addMember(active_member, reviewer=self.team.teamowner) |
1287 | - self.assertEqual(True, active_member.inTeam(self.team)) |
1288 | - |
1289 | - def test_only_active_members_can_be_subscribers(self): |
1290 | # Even though our 3 members want to subscribe to the team's mailing |
1291 | # list, only the active member is considered a subscriber. |
1292 | - subscribers = [self.active_member] |
1293 | - self.assertEqual( |
1294 | - subscribers, list(self.mailing_list.getSubscribers())) |
1295 | - |
1296 | - |
1297 | -def test_suite(): |
1298 | - return unittest.TestLoader().loadTestsFromName(__name__) |
1299 | + self.assertEqual( |
1300 | + [active_member], list(self.mailing_list.getSubscribers())) |
1301 | + |
1302 | + def test_getSubscribers_order(self): |
1303 | + person_1 = self.factory.makePerson(name="pb1", displayname="Me") |
1304 | + with person_logged_in(person_1): |
1305 | + person_1.mailing_list_auto_subscribe_policy = ( |
1306 | + MailingListAutoSubscribePolicy.ALWAYS) |
1307 | + person_1.join(self.team) |
1308 | + person_2 = self.factory.makePerson(name="pa2", displayname="Me") |
1309 | + with person_logged_in(person_2): |
1310 | + person_2.mailing_list_auto_subscribe_policy = ( |
1311 | + MailingListAutoSubscribePolicy.ALWAYS) |
1312 | + person_2.join(self.team) |
1313 | + subscribers = self.mailing_list.getSubscribers() |
1314 | + self.assertEqual(2, subscribers.count()) |
1315 | + self.assertEqual( |
1316 | + ['pa2', 'pb1'], [person.name for person in subscribers]) |
1317 | |
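getSubscribers now orders by Person.displayname with the unique Person.name as a tie-breaker, which test_getSubscribers_order covers using two people who both display as "Me". The equivalent sort key in plain Python:

```python
# (displayname, name) pairs; both people display as "Me", as in the test.
people = [("Me", "pb1"), ("Me", "pa2")]

# Order by displayname first, then by the unique name to break ties,
# mirroring results.order_by(Person.displayname, Person.name).
ordered = sorted(people, key=lambda person: (person[0], person[1]))
print([name for _displayname, name in ordered])  # prints ['pa2', 'pb1']
```

Without the secondary key, the relative order of equal display names would be unspecified at the database level, making the test flaky.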
1318 | === modified file 'lib/lp/registry/vocabularies.py' |
1319 | --- lib/lp/registry/vocabularies.py 2010-08-22 03:09:51 +0000 |
1320 | +++ lib/lp/registry/vocabularies.py 2010-08-24 16:45:57 +0000 |
1321 | @@ -95,9 +95,6 @@ |
1322 | SQLBase, |
1323 | sqlvalues, |
1324 | ) |
1325 | -from canonical.launchpad.components.decoratedresultset import ( |
1326 | - DecoratedResultSet, |
1327 | - ) |
1328 | from canonical.launchpad.database.account import Account |
1329 | from canonical.launchpad.database.emailaddress import EmailAddress |
1330 | from canonical.launchpad.database.stormsugar import StartsWith |
1331 | @@ -648,10 +645,7 @@ |
1332 | else: |
1333 | result.order_by(Person.displayname, Person.name) |
1334 | result.config(limit=self.LIMIT) |
1335 | - # XXX: BradCrittenden 2009-04-24 bug=217644: Wrap the results to |
1336 | - # ensure the .count() method works until the Storm bug is fixed and |
1337 | - # integrated. |
1338 | - return DecoratedResultSet(result) |
1339 | + return result |
1340 | |
1341 | def search(self, text): |
1342 | """Return people/teams whose fti or email address match :text:.""" |
1343 | @@ -727,10 +721,7 @@ |
1344 | result.config(distinct=True) |
1345 | result.order_by(Person.displayname, Person.name) |
1346 | result.config(limit=self.LIMIT) |
1347 | - # XXX: BradCrittenden 2009-04-24 bug=217644: Wrap the results to |
1348 | - # ensure the .count() method works until the Storm bug is fixed and |
1349 | - # integrated. |
1350 | - return DecoratedResultSet(result) |
1351 | + return result |
1352 | |
1353 | |
1354 | class ValidPersonVocabulary(ValidPersonOrTeamVocabulary): |
1355 | |
1356 | === modified file 'lib/lp/soyuz/doc/package-diff.txt' |
1357 | --- lib/lp/soyuz/doc/package-diff.txt 2010-05-13 12:04:56 +0000 |
1358 | +++ lib/lp/soyuz/doc/package-diff.txt 2010-08-24 16:45:57 +0000 |
1359 | @@ -451,12 +451,7 @@ |
1360 | >>> packagediff_set.getPendingDiffs().count() |
1361 | 7 |
1362 | |
1363 | -XXX cprov 20070530: storm doesn't go well with limited count()s |
1364 | -See bug #217644. For now we have to listify the results and used |
1365 | -the list length. |
1366 | - |
1367 | - >>> r = packagediff_set.getPendingDiffs(limit=2) |
1368 | - >>> len(list(r)) |
1369 | + >>> packagediff_set.getPendingDiffs(limit=2).count() |
1370 | 2 |
1371 | |
1372 | All package diffs targeting a set of source package releases can also |
1373 | |
1374 | === modified file 'lib/lp/soyuz/scripts/initialise_distroseries.py' |
1375 | --- lib/lp/soyuz/scripts/initialise_distroseries.py 2010-08-20 20:31:18 +0000 |
1376 | +++ lib/lp/soyuz/scripts/initialise_distroseries.py 2010-08-24 16:45:57 +0000 |
1377 | @@ -25,8 +25,10 @@ |
1378 | ArchivePurpose, |
1379 | IArchiveSet, |
1380 | ) |
1381 | +from lp.soyuz.interfaces.packageset import IPackagesetSet |
1382 | from lp.soyuz.interfaces.queue import PackageUploadStatus |
1383 | from lp.soyuz.model.packagecloner import clone_packages |
1384 | +from lp.soyuz.model.packageset import Packageset |
1385 | |
1386 | |
1387 | class InitialisationError(Exception): |
1388 | @@ -270,10 +272,28 @@ |
1389 | |
1390 | def _copy_packagesets(self): |
1391 | """Copy packagesets from the parent distroseries.""" |
1392 | - self._store.execute(""" |
1393 | - INSERT INTO Packageset |
1394 | - (distroseries, owner, name, description, packagesetgroup) |
1395 | - SELECT %s, %s, name, description, packagesetgroup |
1396 | - FROM Packageset WHERE distroseries = %s |
1397 | - """ % sqlvalues( |
1398 | - self.distroseries, self.distroseries.owner, self.parent)) |
1399 | + packagesets = self._store.find(Packageset, distroseries=self.parent) |
1400 | + parent_to_child = {} |
1401 | + # Create the packagesets, and any archivepermissions |
1402 | + for parent_ps in packagesets: |
1403 | + child_ps = getUtility(IPackagesetSet).new( |
1404 | + parent_ps.name, parent_ps.description, |
1405 | + self.distroseries.owner, distroseries=self.distroseries, |
1406 | + related_set=parent_ps) |
1407 | + self._store.execute(""" |
1408 | + INSERT INTO Archivepermission |
1409 | + (person, permission, archive, packageset, explicit) |
1410 | + SELECT person, permission, %s, %s, explicit |
1411 | + FROM Archivepermission WHERE packageset = %s |
1412 | + """ % sqlvalues( |
1413 | + self.distroseries.main_archive, child_ps.id, |
1414 | + parent_ps.id)) |
1415 | + parent_to_child[parent_ps] = child_ps |
1416 | + # Copy the relations between sets, and the contents |
1417 | + for old_series_ps, new_series_ps in parent_to_child.items(): |
1418 | + old_series_sets = old_series_ps.setsIncluded( |
1419 | + direct_inclusion=True) |
1420 | + for old_series_child in old_series_sets: |
1421 | + new_series_ps.add(parent_to_child[old_series_child]) |
1422 | + new_series_ps.add(old_series_ps.sourcesIncluded( |
1423 | + direct_inclusion=True)) |
1424 | |
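_copy_packagesets above works in two passes: create every child packageset while recording a parent-to-child mapping, then replay the inter-set inclusion relations through that mapping so child sets reference other child sets rather than their parents. The same shape with plain dictionaries (the names here are illustrative, not the real Launchpad API):

```python
def copy_packageset_graph(parent_sets):
    """parent_sets maps a set name to the names of sets it includes."""
    # Pass 1: create the child-side sets, remembering the mapping.
    parent_to_child = {}
    for name in parent_sets:
        parent_to_child[name] = "child-" + name
    # Pass 2: replay inclusion relations via the mapping, so the new
    # series' sets include each other rather than the parent series'.
    child_relations = {}
    for name, included in parent_sets.items():
        child_relations[parent_to_child[name]] = {
            parent_to_child[inc] for inc in included}
    return child_relations


result = copy_packageset_graph({"test2": {"test3"}, "test3": set()})
print(result)
```

Doing the relation pass only after all sets exist avoids dangling references when a set includes one created later in the iteration.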
1425 | === modified file 'lib/lp/soyuz/scripts/tests/test_initialise_distroseries.py' |
1426 | --- lib/lp/soyuz/scripts/tests/test_initialise_distroseries.py 2010-08-20 20:31:18 +0000 |
1427 | +++ lib/lp/soyuz/scripts/tests/test_initialise_distroseries.py 2010-08-24 16:45:57 +0000 |
1428 | @@ -23,6 +23,7 @@ |
1429 | from canonical.testing.layers import LaunchpadZopelessLayer |
1430 | from lp.buildmaster.interfaces.buildbase import BuildStatus |
1431 | from lp.registry.interfaces.pocket import PackagePublishingPocket |
1432 | +from lp.soyuz.interfaces.archivepermission import IArchivePermissionSet |
1433 | from lp.soyuz.interfaces.packageset import IPackagesetSet |
1434 | from lp.soyuz.interfaces.sourcepackageformat import SourcePackageFormat |
1435 | from lp.soyuz.model.distroarchseries import DistroArchSeries |
1436 | @@ -87,7 +88,7 @@ |
1437 | self.ubuntu['breezy-autotest']) |
1438 | ids = InitialiseDistroSeries(foobuntu) |
1439 | self.assertRaisesWithContent( |
1440 | - InitialisationError,"Parent series queues are not empty.", |
1441 | + InitialisationError, "Parent series queues are not empty.", |
1442 | ids.check) |
1443 | |
1444 | def assertDistroSeriesInitialisedCorrectly(self, foobuntu): |
1445 | @@ -191,6 +192,7 @@ |
1446 | |
1447 | def test_copying_packagesets(self): |
1448 | # If a parent series has packagesets, we should copy them |
1449 | + uploader = self.factory.makePerson() |
1450 | test1 = getUtility(IPackagesetSet).new( |
1451 | u'test1', u'test 1 packageset', self.hoary.owner, |
1452 | distroseries=self.hoary) |
1453 | @@ -199,13 +201,11 @@ |
1454 | distroseries=self.hoary) |
1455 | test3 = getUtility(IPackagesetSet).new( |
1456 | u'test3', u'test 3 packageset', self.hoary.owner, |
1457 | - distroseries=self.hoary) |
1458 | - foobuntu = self._create_distroseries(self.hoary) |
1459 | - self._set_pending_to_failed(self.hoary) |
1460 | - transaction.commit() |
1461 | - ids = InitialiseDistroSeries(foobuntu) |
1462 | - ids.check() |
1463 | - ids.initialise() |
1464 | + distroseries=self.hoary, related_set=test2) |
1465 | + test1.addSources('pmount') |
1466 | + getUtility(IArchivePermissionSet).newPackagesetUploader( |
1467 | + self.hoary.main_archive, uploader, test1) |
1468 | + foobuntu = self._full_initialise() |
1469 | # We can fetch the copied sets from foobuntu |
1470 | foobuntu_test1 = getUtility(IPackagesetSet).getByName( |
1471 | u'test1', distroseries=foobuntu) |
1472 | @@ -219,8 +219,26 @@ |
1473 | self.assertEqual(test2.description, foobuntu_test2.description) |
1474 | self.assertEqual(test3.description, foobuntu_test3.description) |
1475 | self.assertEqual(foobuntu_test1.relatedSets().one(), test1) |
1476 | - self.assertEqual(foobuntu_test2.relatedSets().one(), test2) |
1477 | - self.assertEqual(foobuntu_test3.relatedSets().one(), test3) |
1478 | + self.assertEqual( |
1479 | + list(foobuntu_test2.relatedSets()), |
1480 | + [test2, test3, foobuntu_test3]) |
1481 | + self.assertEqual( |
1482 | + list(foobuntu_test3.relatedSets()), |
1483 | + [test2, foobuntu_test2, test3]) |
1484 | + # The contents of the packagesets will have been copied. |
1485 | + foobuntu_srcs = foobuntu_test1.getSourcesIncluded( |
1486 | + direct_inclusion=True) |
1487 | + hoary_srcs = test1.getSourcesIncluded(direct_inclusion=True) |
1488 | + self.assertEqual(foobuntu_srcs, hoary_srcs) |
1489 | + # The uploader can also upload to the new distroseries. |
1490 | + self.assertTrue( |
1491 | + getUtility(IArchivePermissionSet).isSourceUploadAllowed( |
1492 | + self.hoary.main_archive, 'pmount', uploader, |
1493 | + distroseries=self.hoary)) |
1494 | + self.assertTrue( |
1495 | + getUtility(IArchivePermissionSet).isSourceUploadAllowed( |
1496 | + foobuntu.main_archive, 'pmount', uploader, |
1497 | + distroseries=foobuntu)) |
1498 | |
1499 | def test_script(self): |
1500 | # Do an end-to-end test using the command-line tool |
1501 | |
1502 | === modified file 'lib/lp/testing/fakelibrarian.py' |
1503 | --- lib/lp/testing/fakelibrarian.py 2010-08-20 20:31:18 +0000 |
1504 | +++ lib/lp/testing/fakelibrarian.py 2010-08-24 16:45:57 +0000 |
1505 | @@ -151,6 +151,19 @@ |
1506 | alias.checkCommitted() |
1507 | return StringIO(alias.content_string) |
1508 | |
1509 | + def pretendCommit(self): |
1510 | + """Pretend that there's been a commit. |
1511 | + |
1512 | + When you add a file to the librarian (real or fake), it is not |
1513 | + fully available until the transaction that added the file has |
1514 | + been committed. Call this method to make the FakeLibrarian act |
1515 | + as if there's been a commit, without actually committing a |
1516 | + database transaction. |
1517 | + """ |
1518 | + # Note that all files have been committed to storage. |
1519 | + for alias in self.aliases.itervalues(): |
1520 | + alias.file_committed = True |
1521 | + |
1522 | def _makeAlias(self, file_id, name, content, content_type): |
1523 | """Create a `LibraryFileAlias`.""" |
1524 | alias = InstrumentedLibraryFileAlias( |
1525 | @@ -195,9 +208,7 @@ |
1526 | |
1527 | def afterCompletion(self, txn): |
1528 | """See `ISynchronizer`.""" |
1529 | - # Note that all files have been committed to storage. |
1530 | - for alias in self.aliases.itervalues(): |
1531 | - alias.file_committed = True |
1532 | + self.pretendCommit() |
1533 | |
1534 | def newTransaction(self, txn): |
1535 | """See `ISynchronizer`.""" |
1536 | |
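The `pretendCommit` docstring describes a useful test pattern: stored files stay unreadable until a "commit" flips a flag on every alias, with no real database transaction involved. A self-contained sketch of that pattern (class names are hypothetical stand-ins, not the FakeLibrarian implementation):

```python
import io


class FakeAlias:
    """A stand-in stored file; readable only after a pretend commit."""

    def __init__(self, content):
        self.content = content
        self.file_committed = False

    def open(self):
        if not self.file_committed:
            raise RuntimeError("file has not been committed yet")
        return io.BytesIO(self.content)


class FakeStore:
    """Minimal sketch of the fake-librarian commit behaviour."""

    def __init__(self):
        self.aliases = {}
        self._next_id = 1

    def addFile(self, content):
        alias_id = self._next_id
        self._next_id += 1
        self.aliases[alias_id] = FakeAlias(content)
        return alias_id

    def pretendCommit(self):
        # Mark every stored file as committed, without touching any
        # real transaction machinery.
        for alias in self.aliases.values():
            alias.file_committed = True
```

Factoring the flag-flipping into `pretendCommit` (as this diff does) lets tests fake a commit directly instead of going through the `afterCompletion` synchronizer hook.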
1537 | === modified file 'lib/lp/testing/tests/test_fakelibrarian.py' |
1538 | --- lib/lp/testing/tests/test_fakelibrarian.py 2010-08-20 20:31:18 +0000 |
1539 | +++ lib/lp/testing/tests/test_fakelibrarian.py 2010-08-24 16:45:57 +0000 |
1540 | @@ -109,6 +109,15 @@ |
1541 | self.assertTrue(verifyObject(ISynchronizer, self.fake_librarian)) |
1542 | self.assertIsInstance(self.fake_librarian, FakeLibrarian) |
1543 | |
1544 | + def test_pretend_commit(self): |
1545 | + name, text, alias_id = self._storeFile() |
1546 | + |
1547 | + self.fake_librarian.pretendCommit() |
1548 | + |
1549 | + retrieved_alias = getUtility(ILibraryFileAliasSet)[alias_id] |
1550 | + retrieved_alias.open() |
1551 | + self.assertEqual(text, retrieved_alias.read()) |
1552 | + |
1553 | |
1554 | class TestRealLibrarian(LibraryAccessScenarioMixin, TestCaseWithFactory): |
1555 | """Test the supported interface subset on the real librarian.""" |
1556 | |
1557 | === modified file 'versions.cfg' |
1558 | --- versions.cfg 2010-08-18 19:41:20 +0000 |
1559 | +++ versions.cfg 2010-08-24 16:45:57 +0000 |
1560 | @@ -101,7 +101,7 @@ |
1561 | z3c.ptcompat = 0.5.3 |
1562 | z3c.recipe.filetemplate = 2.1.0 |
1563 | z3c.recipe.i18n = 0.5.3 |
1564 | -z3c.recipe.scripts = 1.0.0dev-gary-r110068 |
1565 | +z3c.recipe.scripts = 1.0.0 |
1566 | z3c.recipe.tag = 0.2.0 |
1567 | z3c.rml = 0.7.3 |
1568 | z3c.skin.pagelet = 1.0.2 |
1569 | @@ -111,12 +111,12 @@ |
1570 | z3c.viewlet = 1.0.0 |
1571 | z3c.viewtemplate = 0.3.2 |
1572 | z3c.zrtresource = 1.0.1 |
1573 | -zc.buildout = 1.5.0dev-gary-r111190 |
1574 | +zc.buildout = 1.5.0 |
1575 | zc.catalog = 1.2.0 |
1576 | zc.datetimewidget = 0.5.2 |
1577 | zc.i18n = 0.5.2 |
1578 | zc.lockfile = 1.0.0 |
1579 | -zc.recipe.egg = 1.2.3dev-gary-r110068 |
1580 | +zc.recipe.egg = 1.3.0 |
1581 | zc.zservertracelog = 1.1.5 |
1582 | ZConfig = 2.7.1 |
1583 | zdaemon = 2.0.4 |
Hi Leonard,
This looks good. I just have a couple of wording suggestions below.
-Edwin
>=== modified file 'lib/canonical/launchpad/webapp/interfaces.py'
>--- lib/canonical/launchpad/webapp/interfaces.py 2010-05-02 23:43:35 +0000
>+++ lib/canonical/launchpad/webapp/interfaces.py 2010-07-07 20:53:55 +0000
>@@ -547,6 +547,16 @@
> for reading and changing anything, including private data.
> """)
>
>+ GRANT_PERMISSIONS = DBItem(60, """
>+ Grant Permissions
>+
>+ Not only will the application will be able to access Launchpad

s/will the application will be/will the application be/

>+ on your behalf, it will be able to grant access to your
>+ Launchpad account to any other application. This is a very
>+ powerful level of access. You should not grant this level of
>+ access to any application except the official desktop
>+ Launchpad credential manager.

"official Launchpad desktop..." makes more sense than "official desktop
Launchpad ...".

>+ """)
>
> class AccessLevel(DBEnumeratedType):
> """The level of access any given principal has."""
>