Merge lp:~leonardr/launchpad/grant-permissions-oauth into lp:launchpad/db-devel
Status: Merged
Approved by: Edwin Grubbs
Approved revision: 11095
Merge reported by: Leonard Richardson
Merged at revision: not available
Proposed branch: lp:~leonardr/launchpad/grant-permissions-oauth
Merge into: lp:launchpad/db-devel
Diff against target: 1583 lines (+503/-318), 25 files modified:
  bootstrap.py (+76/-24)
  lib/canonical/launchpad/browser/oauth.py (+41/-14)
  lib/canonical/launchpad/components/decoratedresultset.py (+0/-4)
  lib/canonical/launchpad/doc/webapp-authorization.txt (+21/-2)
  lib/canonical/launchpad/pagetests/oauth/authorize-token.txt (+28/-3)
  lib/canonical/launchpad/webapp/interfaces.py (+9/-0)
  lib/lp/archivepublisher/utils.py (+2/-1)
  lib/lp/archiveuploader/dscfile.py (+131/-106)
  lib/lp/archiveuploader/nascentupload.py (+0/-7)
  lib/lp/archiveuploader/tests/test_dscfile.py (+58/-41)
  lib/lp/archiveuploader/tests/test_nascentuploadfile.py (+1/-2)
  lib/lp/hardwaredb/model/hwdb.py (+1/-10)
  lib/lp/registry/browser/distribution.py (+1/-12)
  lib/lp/registry/browser/sourcepackage.py (+1/-1)
  lib/lp/registry/browser/team.py (+2/-2)
  lib/lp/registry/model/distroseries.py (+3/-7)
  lib/lp/registry/model/mailinglist.py (+15/-13)
  lib/lp/registry/tests/test_mailinglist.py (+29/-29)
  lib/lp/registry/vocabularies.py (+2/-11)
  lib/lp/soyuz/doc/package-diff.txt (+1/-6)
  lib/lp/soyuz/scripts/initialise_distroseries.py (+27/-7)
  lib/lp/soyuz/scripts/tests/test_initialise_distroseries.py (+28/-10)
  lib/lp/testing/fakelibrarian.py (+14/-3)
  lib/lp/testing/tests/test_fakelibrarian.py (+9/-0)
  versions.cfg (+3/-3)
To merge this branch: bzr merge lp:~leonardr/launchpad/grant-permissions-oauth
Related bugs:
Reviewer: Edwin Grubbs (community), Approve
Review via email: mp+29425@code.launchpad.net
Commit message
Description of the change
This branch introduces a new access level for OAuth tokens: GRANT_PERMISSIONS. GRANT_PERMISSIONS differs from the other access levels (e.g. READ_PUBLIC) in that it is designed to be used by one specific application: the forthcoming desktop credential manager for the Launchpad web service.
Currently GRANT_PERMISSIONS acts exactly like READ_PUBLIC, with the following exceptions:
1. GRANT_PERMISSIONS is not published in the list of access levels -- the client must know its name ahead of time.
2. GRANT_PERMISSIONS does not show up in the list of access levels in +authorize-token unless it is specifically requested and is the _only_ access level the client requests. You can't let the end-user choose between WRITE_PRIVATE and GRANT_PERMISSIONS -- either your program needs GRANT_PERMISSIONS or it doesn't.
Eventually there will be a third exception:
3. GRANT_PERMISSIONS will be the only access level that can access the current user's list of OAuth access tokens, or invoke the named operation to create a new OAuth access token.
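As a sketch of how a client would use this, the following hypothetical helper builds a +authorize-token URL that requests only GRANT_PERMISSIONS. The `allow_permission` parameter is the one handled in this branch; the helper function itself is illustrative, not Launchpad code.

```python
# Hypothetical helper showing how a desktop credential manager might
# build the +authorize-token URL while requesting GRANT_PERMISSIONS.
# Only the `allow_permission` parameter comes from this branch.

def build_authorize_url(root, request_token, permissions):
    """Build a +authorize-token URL restricted to `permissions`."""
    params = ['oauth_token=' + request_token]
    # Per exception 2 above, GRANT_PERMISSIONS must be the only level
    # requested (UNAUTHORIZED aside), or Launchpad drops it from the
    # choices shown to the user.
    params += ['allow_permission=' + p for p in permissions]
    return root + '/+authorize-token?' + '&'.join(params)

print(build_authorize_url(
    'https://launchpad.net', 'abc123', ['GRANT_PERMISSIONS']))
```

A client needing normal access would instead pass, say, ['WRITE_PUBLIC']; mixing GRANT_PERMISSIONS with other levels is pointless, as the server removes it.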
- 11096. By Leonard Richardson: Rewording in response to feedback.
- 11097. By Leonard Richardson: Merge from trunk.
- 11098. By Leonard Richardson: Merge from trunk.
Brad Crittenden (bac) wrote:
Leonard, can this branch be landed now? Is there anything blocking you?
Preview Diff
=== modified file 'bootstrap.py'
--- bootstrap.py 2010-03-20 01:13:25 +0000
+++ bootstrap.py 2010-08-24 16:45:57 +0000
@@ -1,6 +1,6 @@
 ##############################################################################
 #
-# Copyright (c) 2006 Zope Corporation and Contributors.
+# Copyright (c) 2006 Zope Foundation and Contributors.
 # All Rights Reserved.
 #
 # This software is subject to the provisions of the Zope Public License,
@@ -16,11 +16,9 @@
 Simply run this script in a directory containing a buildout.cfg.
 The script accepts buildout command-line options, so you can
 use the -c option to specify an alternate configuration file.
-
-$Id$
 """
 
-import os, shutil, sys, tempfile, textwrap, urllib, urllib2
+import os, shutil, sys, tempfile, textwrap, urllib, urllib2, subprocess
 from optparse import OptionParser
 
 if sys.platform == 'win32':
@@ -32,11 +30,23 @@
 else:
     quote = str
 
+# See zc.buildout.easy_install._has_broken_dash_S for motivation and comments.
+stdout, stderr = subprocess.Popen(
+    [sys.executable, '-Sc',
+     'try:\n'
+     '    import ConfigParser\n'
+     'except ImportError:\n'
+     '    print 1\n'
+     'else:\n'
+     '    print 0\n'],
+    stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
+has_broken_dash_S = bool(int(stdout.strip()))
+
 # In order to be more robust in the face of system Pythons, we want to
 # run without site-packages loaded. This is somewhat tricky, in
 # particular because Python 2.6's distutils imports site, so starting
 # with the -S flag is not sufficient. However, we'll start with that:
-if 'site' in sys.modules:
+if not has_broken_dash_S and 'site' in sys.modules:
     # We will restart with python -S.
     args = sys.argv[:]
     args[0:0] = [sys.executable, '-S']
@@ -109,13 +119,22 @@
                   help=("Specify a directory for storing eggs. Defaults to "
                         "a temporary directory that is deleted when the "
                         "bootstrap script completes."))
+parser.add_option("-t", "--accept-buildout-test-releases",
+                  dest='accept_buildout_test_releases',
+                  action="store_true", default=False,
+                  help=("Normally, if you do not specify a --version, the "
+                        "bootstrap script and buildout gets the newest "
+                        "*final* versions of zc.buildout and its recipes and "
+                        "extensions for you. If you use this flag, "
+                        "bootstrap and buildout will get the newest releases "
+                        "even if they are alphas or betas."))
 parser.add_option("-c", None, action="store", dest="config_file",
                   help=("Specify the path to the buildout configuration "
                         "file to be used."))
 
 options, args = parser.parse_args()
 
-# if -c was provided, we push it back into args for buildout' main function
+# if -c was provided, we push it back into args for buildout's main function
 if options.config_file is not None:
     args += ['-c', options.config_file]
 
@@ -130,16 +149,15 @@
 else:
     options.setup_source = setuptools_source
 
-args = args + ['bootstrap']
-
+if options.accept_buildout_test_releases:
+    args.append('buildout:accept-buildout-test-releases=true')
+args.append('bootstrap')
 
 try:
-    to_reload = False
     import pkg_resources
-    to_reload = True
+    import setuptools  # A flag. Sometimes pkg_resources is installed alone.
     if not hasattr(pkg_resources, '_distribute'):
         raise ImportError
-    import setuptools  # A flag. Sometimes pkg_resources is installed alone.
 except ImportError:
     ez_code = urllib2.urlopen(
         options.setup_source).read().replace('\r\n', '\n')
@@ -151,10 +169,8 @@
     if options.use_distribute:
         setup_args['no_fake'] = True
     ez['use_setuptools'](**setup_args)
-    if to_reload:
-        reload(pkg_resources)
-    else:
-        import pkg_resources
+    reload(sys.modules['pkg_resources'])
+    import pkg_resources
     # This does not (always?) update the default working set. We will
     # do it.
     for path in sys.path:
@@ -167,23 +183,59 @@
     '-mqNxd',
     quote(eggs_dir)]
 
-if options.download_base:
-    cmd.extend(['-f', quote(options.download_base)])
+if not has_broken_dash_S:
+    cmd.insert(1, '-S')
 
-requirement = 'zc.buildout'
-if options.version:
-    requirement = '=='.join((requirement, options.version))
-cmd.append(requirement)
+find_links = options.download_base
+if not find_links:
+    find_links = os.environ.get('bootstrap-testing-find-links')
+if find_links:
+    cmd.extend(['-f', quote(find_links)])
 
 if options.use_distribute:
     setup_requirement = 'distribute'
 else:
     setup_requirement = 'setuptools'
 ws = pkg_resources.working_set
+setup_requirement_path = ws.find(
+    pkg_resources.Requirement.parse(setup_requirement)).location
 env = dict(
     os.environ,
-    PYTHONPATH=ws.find(
-        pkg_resources.Requirement.parse(setup_requirement)).location)
+    PYTHONPATH=setup_requirement_path)
+
+requirement = 'zc.buildout'
+version = options.version
+if version is None and not options.accept_buildout_test_releases:
+    # Figure out the most recent final version of zc.buildout.
+    import setuptools.package_index
+    _final_parts = '*final-', '*final'
+    def _final_version(parsed_version):
+        for part in parsed_version:
+            if (part[:1] == '*') and (part not in _final_parts):
+                return False
+        return True
+    index = setuptools.package_index.PackageIndex(
+        search_path=[setup_requirement_path])
+    if find_links:
+        index.add_find_links((find_links,))
+    req = pkg_resources.Requirement.parse(requirement)
+    if index.obtain(req) is not None:
+        best = []
+        bestv = None
+        for dist in index[req.project_name]:
+            distv = dist.parsed_version
+            if _final_version(distv):
+                if bestv is None or distv > bestv:
+                    best = [dist]
+                    bestv = distv
+                elif distv == bestv:
+                    best.append(dist)
+        if best:
+            best.sort()
+            version = best[-1].version
+if version:
+    requirement = '=='.join((requirement, version))
+cmd.append(requirement)
 
 if is_jython:
     import subprocess
@@ -193,7 +245,7 @@
 if exitcode != 0:
     sys.stdout.flush()
     sys.stderr.flush()
-    print ("An error occured when trying to install zc.buildout. "
+    print ("An error occurred when trying to install zc.buildout. "
            "Look above this message for any errors that "
            "were output by easy_install.")
     sys.exit(exitcode)
=== modified file 'lib/canonical/launchpad/browser/oauth.py'
--- lib/canonical/launchpad/browser/oauth.py 2010-08-20 20:31:18 +0000
+++ lib/canonical/launchpad/browser/oauth.py 2010-08-24 16:45:57 +0000
@@ -88,8 +88,13 @@
 
         token = consumer.newRequestToken()
         if self.request.headers.get('Accept') == HTTPResource.JSON_TYPE:
+            # Don't show the client the GRANT_PERMISSIONS access
+            # level. If they have a legitimate need to use it, they'll
+            # already know about it.
+            permissions = [permission for permission in OAuthPermission.items
+                           if permission != OAuthPermission.GRANT_PERMISSIONS]
             return self.getJSONRepresentation(
-                OAuthPermission.items, token, include_secret=True)
+                permissions, token, include_secret=True)
         return u'oauth_token=%s&oauth_token_secret=%s' % (
             token.key, token.secret)
 
@@ -100,6 +105,7 @@
 def create_oauth_permission_actions():
     """Return a list of `Action`s for each possible `OAuthPermission`."""
     actions = Actions()
+    actions_excluding_grant_permissions = Actions()
     def success(form, action, data):
         form.reviewToken(action.permission)
     for permission in OAuthPermission.items:
@@ -108,13 +114,15 @@
             condition=token_exists_and_is_not_reviewed)
         action.permission = permission
         actions.append(action)
-    return actions
-
+        if permission != OAuthPermission.GRANT_PERMISSIONS:
+            actions_excluding_grant_permissions.append(action)
+    return actions, actions_excluding_grant_permissions
 
 class OAuthAuthorizeTokenView(LaunchpadFormView, JSONTokenMixin):
     """Where users authorize consumers to access Launchpad on their behalf."""
 
-    actions = create_oauth_permission_actions()
+    actions, actions_excluding_grant_permissions = (
+        create_oauth_permission_actions())
     label = "Authorize application to access Launchpad on your behalf"
     schema = IOAuthRequestToken
     field_names = []
@@ -132,28 +140,47 @@
         acceptable subset of OAuthPermission.
 
         The user always has the option to deny the client access
-        altogether, so it makes sense for the client to specify the
-        least restrictions possible.
+        altogether, so it makes sense for the client to ask for the
+        least access possible.
 
         If the client sends nonsensical values for allow_permissions,
-        the end-user will be given an unrestricted choice.
+        the end-user will be given a choice among all the permissions
+        used by normal applications.
         """
+
         allowed_permissions = self.request.form_ng.getAll('allow_permission')
         if len(allowed_permissions) == 0:
-            return self.actions
+            return self.actions_excluding_grant_permissions
         actions = Actions()
+
+        # UNAUTHORIZED is always one of the options. If the client
+        # explicitly requested UNAUTHORIZED, remove it from the list
+        # to simplify the algorithm: we'll add it back later.
+        if OAuthPermission.UNAUTHORIZED.name in allowed_permissions:
+            allowed_permissions.remove(OAuthPermission.UNAUTHORIZED.name)
+
+        # GRANT_PERMISSIONS cannot be requested as one of several
+        # options--it must be the only option (other than
+        # UNAUTHORIZED). If GRANT_PERMISSIONS is one of several
+        # options, remove it from the list.
+        if (OAuthPermission.GRANT_PERMISSIONS.name in allowed_permissions
+            and len(allowed_permissions) > 1):
+            allowed_permissions.remove(OAuthPermission.GRANT_PERMISSIONS.name)
+
         for action in self.actions:
             if (action.permission.name in allowed_permissions
                 or action.permission is OAuthPermission.UNAUTHORIZED):
                 actions.append(action)
+
         if len(list(actions)) == 1:
             # The only visible action is UNAUTHORIZED. That means the
-            # client tried to restrict the actions but didn't name any
-            # actual actions (except possibly UNAUTHORIZED). Rather
-            # than present the end-user with an impossible situation
-            # where their only option is to deny access, we'll present
-            # the full range of actions.
-            return self.actions
+            # client tried to restrict the permissions but didn't name
+            # any actual permissions (except possibly
+            # UNAUTHORIZED). Rather than present the end-user with an
+            # impossible situation where their only option is to deny
+            # access, we'll present the full range of actions (except
+            # for GRANT_PERMISSIONS).
+            return self.actions_excluding_grant_permissions
         return actions
 
     def initialize(self):
=== modified file 'lib/canonical/launchpad/components/decoratedresultset.py'
--- lib/canonical/launchpad/components/decoratedresultset.py 2010-08-20 20:31:18 +0000
+++ lib/canonical/launchpad/components/decoratedresultset.py 2010-08-24 16:45:57 +0000
@@ -9,7 +9,6 @@
     ]
 
 from lazr.delegates import delegates
-from storm.expr import Column
 from storm.zope.interfaces import IResultSet
 from zope.security.proxy import removeSecurityProxy
 
@@ -31,9 +30,6 @@
 
     This behaviour is required for other classes as well (Distribution,
     DistroArchSeries), hence a generalised solution.
-
-    This class also fixes a bug currently in Storm's ResultSet.count
-    method (see below)
     """
     delegates(IResultSet, context='result_set')
 
=== modified file 'lib/canonical/launchpad/doc/webapp-authorization.txt'
--- lib/canonical/launchpad/doc/webapp-authorization.txt 2010-04-16 15:06:55 +0000
+++ lib/canonical/launchpad/doc/webapp-authorization.txt 2010-08-24 16:45:57 +0000
@@ -79,8 +79,27 @@
     >>> check_permission('launchpad.View', bug_1)
     False
 
-Users logged in through the web application, though, have full access,
-which means they can read/change any object they have access to.
+Now consider a principal authorized to create OAuth tokens. Whenever
+it's not creating OAuth tokens, it has a level of permission
+equivalent to READ_PUBLIC.
+
+    >>> principal.access_level = AccessLevel.GRANT_PERMISSIONS
+    >>> setupInteraction(principal)
+    >>> check_permission('launchpad.View', bug_1)
+    False
+
+    >>> check_permission('launchpad.Edit', sample_person)
+    False
+
+This may seem useless from a security standpoint, since once a
+malicious client is authorized to create OAuth tokens, it can escalate
+its privileges at any time by creating a new token for itself. The
+security benefit is more subtle: by discouraging feature creep in
+clients that have this super-access level, we reduce the risk that a
+bug in a _trusted_ client will enable privilege escalation attacks.
+
+Users logged in through the web application have full access, which
+means they can read/change any object they have access to.
 
     >>> mock_participation = Participation()
     >>> login('test@canonical.com', mock_participation)
 
=== modified file 'lib/canonical/launchpad/pagetests/oauth/authorize-token.txt'
--- lib/canonical/launchpad/pagetests/oauth/authorize-token.txt 2010-02-05 13:25:46 +0000
+++ lib/canonical/launchpad/pagetests/oauth/authorize-token.txt 2010-08-24 16:45:57 +0000
@@ -44,7 +44,8 @@
     ...
     See all applications authorized to access Launchpad on your behalf.
 
-This page contains one submit button for each item of OAuthPermission.
+This page contains one submit button for each item of OAuthPermission,
+except for 'Grant Permissions', which must be specifically requested.
 
     >>> browser.getControl('No Access')
     <SubmitControl...
@@ -57,9 +58,14 @@
     >>> browser.getControl('Change Anything')
     <SubmitControl...
 
+    >>> browser.getControl('Grant Permissions')
+    Traceback (most recent call last):
+    ...
+    LookupError: label 'Grant Permissions'
+
     >>> actions = main_content.findAll('input', attrs={'type': 'submit'})
     >>> from canonical.launchpad.webapp.interfaces import OAuthPermission
-    >>> len(actions) == len(OAuthPermission.items)
+    >>> len(actions) == len(OAuthPermission.items) - 1
     True
 
 An application, when asking to access Launchpad on a user's behalf,
@@ -83,9 +89,28 @@
     Change Non-Private Data
     Change Anything
 
+The only time the 'Grant Permissions' permission shows up in this list
+is if the client specifically requests it, and no other
+permission. (Also requesting UNAUTHORIZED is okay--it will show up
+anyway.)
+
+    >>> print_access_levels('allow_permission=GRANT_PERMISSIONS')
+    No Access
+    Grant Permissions
+
+    >>> print_access_levels(
+    ...     'allow_permission=GRANT_PERMISSIONS&allow_permission=UNAUTHORIZED')
+    No Access
+    Grant Permissions
+
+    >>> print_access_levels(
+    ...     'allow_permission=WRITE_PUBLIC&allow_permission=GRANT_PERMISSIONS')
+    No Access
+    Change Non-Private Data
+
 If an application doesn't specify any valid access levels, or only
 specifies the UNAUTHORIZED access level, Launchpad will show all the
-access levels.
+access levels, except for GRANT_PERMISSIONS.
 
     >>> print_access_levels('')
     No Access
 
=== modified file 'lib/canonical/launchpad/webapp/interfaces.py'
--- lib/canonical/launchpad/webapp/interfaces.py 2010-08-20 20:31:18 +0000
+++ lib/canonical/launchpad/webapp/interfaces.py 2010-08-24 16:45:57 +0000
@@ -527,6 +527,15 @@
         for reading and changing anything, including private data.
         """)
 
+    GRANT_PERMISSIONS = DBItem(60, """
+        Grant Permissions
+
+        The application will be able to grant access to your Launchpad
+        account to any other application. This is a very powerful
+        level of access. You should not grant this level of access to
+        any application except the official Launchpad credential
+        manager.
+        """)
 
 class AccessLevel(DBEnumeratedType):
     """The level of access any given principal has."""
 
=== modified file 'lib/lp/archivepublisher/utils.py'
--- lib/lp/archivepublisher/utils.py 2010-08-20 20:31:18 +0000
+++ lib/lp/archivepublisher/utils.py 2010-08-24 16:45:57 +0000
@@ -109,7 +109,8 @@
         end = start + chunk_size
 
         # The reason why we listify the sliced ResultSet is because we
-        # cannot very it's size using 'count' (see bug #217644). However,
+        # cannot very it's size using 'count' (see bug #217644 and note
+        # that it was fixed in storm but not SQLObjectResultSet). However,
         # It's not exactly a problem considering non-empty set will be
         # iterated anyway.
         batch = list(self.input[start:end])
 
440 | === modified file 'lib/lp/archiveuploader/dscfile.py' | |||
441 | --- lib/lp/archiveuploader/dscfile.py 2010-08-21 13:54:20 +0000 | |||
442 | +++ lib/lp/archiveuploader/dscfile.py 2010-08-24 16:45:57 +0000 | |||
443 | @@ -13,10 +13,11 @@ | |||
444 | 13 | 'SignableTagFile', | 13 | 'SignableTagFile', |
445 | 14 | 'DSCFile', | 14 | 'DSCFile', |
446 | 15 | 'DSCUploadedFile', | 15 | 'DSCUploadedFile', |
449 | 16 | 'findChangelog', | 16 | 'find_changelog', |
450 | 17 | 'findCopyright', | 17 | 'find_copyright', |
451 | 18 | ] | 18 | ] |
452 | 19 | 19 | ||
453 | 20 | from cStringIO import StringIO | ||
454 | 20 | import errno | 21 | import errno |
455 | 21 | import glob | 22 | import glob |
456 | 22 | import os | 23 | import os |
457 | @@ -73,6 +74,70 @@ | |||
458 | 73 | from lp.soyuz.interfaces.sourcepackageformat import SourcePackageFormat | 74 | from lp.soyuz.interfaces.sourcepackageformat import SourcePackageFormat |
459 | 74 | 75 | ||
460 | 75 | 76 | ||
461 | 77 | class DpkgSourceError(Exception): | ||
462 | 78 | |||
463 | 79 | _fmt = "Unable to unpack source package (%(result)s): %(output)s" | ||
464 | 80 | |||
465 | 81 | def __init__(self, output, result): | ||
466 | 82 | Exception.__init__( | ||
467 | 83 | self, self._fmt % {"output": output, "result": result}) | ||
468 | 84 | self.output = output | ||
469 | 85 | self.result = result | ||
470 | 86 | |||
471 | 87 | |||
472 | 88 | def unpack_source(dsc_filepath): | ||
473 | 89 | """Unpack a source package into a temporary directory | ||
474 | 90 | |||
475 | 91 | :param dsc_filepath: Path to the dsc file | ||
476 | 92 | :return: Path to the temporary directory with the unpacked sources | ||
477 | 93 | """ | ||
478 | 94 | # Get a temporary dir together. | ||
479 | 95 | unpacked_dir = tempfile.mkdtemp() | ||
480 | 96 | try: | ||
481 | 97 | # chdir into it | ||
482 | 98 | cwd = os.getcwd() | ||
483 | 99 | os.chdir(unpacked_dir) | ||
484 | 100 | try: | ||
485 | 101 | args = ["dpkg-source", "-sn", "-x", dsc_filepath] | ||
486 | 102 | dpkg_source = subprocess.Popen(args, stdout=subprocess.PIPE, | ||
487 | 103 | stderr=subprocess.PIPE) | ||
488 | 104 | output, unused = dpkg_source.communicate() | ||
489 | 105 | result = dpkg_source.wait() | ||
490 | 106 | finally: | ||
491 | 107 | # When all is said and done, chdir out again so that we can | ||
492 | 108 | # clean up the tree with shutil.rmtree without leaving the | ||
493 | 109 | # process in a directory we're trying to remove. | ||
494 | 110 | os.chdir(cwd) | ||
495 | 111 | |||
496 | 112 | if result != 0: | ||
497 | 113 | dpkg_output = prefix_multi_line_string(output, " ") | ||
498 | 114 | raise DpkgSourceError(result=result, output=dpkg_output) | ||
499 | 115 | except: | ||
500 | 116 | shutil.rmtree(unpacked_dir) | ||
501 | 117 | raise | ||
502 | 118 | |||
503 | 119 | return unpacked_dir | ||
504 | 120 | |||
505 | 121 | |||
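The new `unpack_source` helper captures a pattern worth noting: save the working directory, chdir into a fresh temp dir, run `dpkg-source`, restore the cwd in a `finally` so `shutil.rmtree` never runs from inside the tree it removes, and discard the temp dir if anything fails. A minimal Python 3 sketch of the same shape follows; `unpack_into_tempdir`, `CommandError`, and the stand-in command are hypothetical names, not Launchpad APIs, and `dpkg-source` is replaced by an arbitrary argv since it may not be installed.

```python
import os
import shutil
import subprocess
import tempfile


class CommandError(Exception):
    """Stand-in for DpkgSourceError: the unpack command exited non-zero."""

    def __init__(self, output, result):
        super().__init__("Unable to unpack (%s): %s" % (result, output))
        self.output = output
        self.result = result


def unpack_into_tempdir(argv):
    """Run argv inside a fresh temp dir; return the dir path on success.

    On any failure the temp dir is removed before the exception propagates,
    so the caller only ever owns a directory that unpacked cleanly.
    """
    unpacked_dir = tempfile.mkdtemp()
    try:
        cwd = os.getcwd()
        os.chdir(unpacked_dir)
        try:
            proc = subprocess.Popen(
                argv, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
            output, _ = proc.communicate()
            result = proc.wait()
        finally:
            # chdir out again so shutil.rmtree does not run from inside
            # the directory it is trying to remove.
            os.chdir(cwd)
        if result != 0:
            raise CommandError(output.decode(), result)
    except Exception:
        shutil.rmtree(unpacked_dir)
        raise
    return unpacked_dir
```

On success the caller owns `unpacked_dir` and is responsible for removing it, which is why the branch pairs this helper with `cleanup_unpacked_dir`.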
506 | 122 | def cleanup_unpacked_dir(unpacked_dir): | ||
507 | 123 | """Remove the directory with an unpacked source package. | ||
508 | 124 | |||
509 | 125 | :param unpacked_dir: Path to the directory. | ||
510 | 126 | """ | ||
511 | 127 | try: | ||
512 | 128 | shutil.rmtree(unpacked_dir) | ||
513 | 129 | except OSError, error: | ||
514 | 130 | if errno.errorcode[error.errno] != 'EACCES': | ||
515 | 131 | raise UploadError( | ||
516 | 132 | "couldn't remove tmp dir %s: code %s" % ( | ||
517 | 133 | unpacked_dir, error.errno)) | ||
518 | 134 | else: | ||
519 | 135 | result = os.system("chmod -R u+rwx " + unpacked_dir) | ||
520 | 136 | if result != 0: | ||
521 | 137 | raise UploadError("chmod failed with %s" % result) | ||
522 | 138 | shutil.rmtree(unpacked_dir) | ||
523 | 139 | |||
524 | 140 | |||
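`cleanup_unpacked_dir` handles one awkward case: `dpkg-source` can leave files or directories without owner write permission, so the first `shutil.rmtree` fails with EACCES; the fix is to restore `u+rwx` and retry. The branch shells out to `chmod -R` and looks errors up via `errno.errorcode`; the sketch below is an assumed modernization using `errno.EACCES` and `os.walk` directly, not the branch's exact code.

```python
import errno
import os
import shutil


def cleanup_unpacked_dir(unpacked_dir):
    """Remove unpacked_dir, restoring owner permissions first on EACCES."""
    try:
        shutil.rmtree(unpacked_dir)
    except OSError as error:
        if error.errno != errno.EACCES:
            raise
        # Roughly `chmod -R u+rwx`: make every directory traversable and
        # writable again, then retry the removal.
        for dirpath, _dirnames, _filenames in os.walk(unpacked_dir):
            os.chmod(dirpath, 0o700)
        shutil.rmtree(unpacked_dir)
```

Note that when running as root the first `rmtree` succeeds regardless of mode bits, so the retry branch is only exercised for unprivileged users.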
525 | 76 | class SignableTagFile: | 141 | class SignableTagFile: |
526 | 77 | """Base class for signed file verification.""" | 142 | """Base class for signed file verification.""" |
527 | 78 | 143 | ||
528 | @@ -160,7 +225,7 @@ | |||
529 | 160 | "rfc2047": rfc2047, | 225 | "rfc2047": rfc2047, |
530 | 161 | "name": name, | 226 | "name": name, |
531 | 162 | "email": email, | 227 | "email": email, |
533 | 163 | "person": person | 228 | "person": person, |
534 | 164 | } | 229 | } |
535 | 165 | 230 | ||
536 | 166 | 231 | ||
537 | @@ -187,9 +252,9 @@ | |||
538 | 187 | 252 | ||
539 | 188 | # Note that files is actually only set inside verify(). | 253 | # Note that files is actually only set inside verify(). |
540 | 189 | files = None | 254 | files = None |
542 | 190 | # Copyright and changelog_path are only set inside unpackAndCheckSource(). | 255 | # Copyright and changelog are only set inside unpackAndCheckSource(). |
543 | 191 | copyright = None | 256 | copyright = None |
545 | 192 | changelog_path = None | 257 | changelog = None |
546 | 193 | 258 | ||
547 | 194 | def __init__(self, filepath, digest, size, component_and_section, | 259 | def __init__(self, filepath, digest, size, component_and_section, |
548 | 195 | priority, package, version, changes, policy, logger): | 260 | priority, package, version, changes, policy, logger): |
549 | @@ -238,12 +303,9 @@ | |||
550 | 238 | else: | 303 | else: |
551 | 239 | self.processSignature() | 304 | self.processSignature() |
552 | 240 | 305 | ||
553 | 241 | self.unpacked_dir = None | ||
554 | 242 | |||
555 | 243 | # | 306 | # |
556 | 244 | # Useful properties. | 307 | # Useful properties. |
557 | 245 | # | 308 | # |
558 | 246 | |||
559 | 247 | @property | 309 | @property |
560 | 248 | def source(self): | 310 | def source(self): |
561 | 249 | """Return the DSC source name.""" | 311 | """Return the DSC source name.""" |
562 | @@ -277,12 +339,11 @@ | |||
563 | 277 | # | 339 | # |
564 | 278 | # DSC file checks. | 340 | # DSC file checks. |
565 | 279 | # | 341 | # |
566 | 280 | |||
567 | 281 | def verify(self): | 342 | def verify(self): |
568 | 282 | """Verify the uploaded .dsc file. | 343 | """Verify the uploaded .dsc file. |
569 | 283 | 344 | ||
572 | 284 | This method is an error generator, i.e., it returns an iterator over all | 345 | This method is an error generator, i.e., it returns an iterator over |
573 | 285 | exceptions that are generated while processing DSC file checks. | 346 | all exceptions that are generated while processing DSC file checks. |
574 | 286 | """ | 347 | """ |
575 | 287 | 348 | ||
576 | 288 | for error in SourceUploadFile.verify(self): | 349 | for error in SourceUploadFile.verify(self): |
577 | @@ -518,82 +579,53 @@ | |||
578 | 518 | self.logger.debug( | 579 | self.logger.debug( |
579 | 519 | "Verifying uploaded source package by unpacking it.") | 580 | "Verifying uploaded source package by unpacking it.") |
580 | 520 | 581 | ||
581 | 521 | # Get a temporary dir together. | ||
582 | 522 | self.unpacked_dir = tempfile.mkdtemp() | ||
583 | 523 | |||
584 | 524 | # chdir into it | ||
585 | 525 | cwd = os.getcwd() | ||
586 | 526 | os.chdir(self.unpacked_dir) | ||
587 | 527 | dsc_in_tmpdir = os.path.join(self.unpacked_dir, self.filename) | ||
588 | 528 | |||
589 | 529 | package_files = self.files + [self] | ||
590 | 530 | try: | 582 | try: |
608 | 531 | for source_file in package_files: | 583 | unpacked_dir = unpack_source(self.filepath) |
609 | 532 | os.symlink( | 584 | except DpkgSourceError, e: |
593 | 533 | source_file.filepath, | ||
594 | 534 | os.path.join(self.unpacked_dir, source_file.filename)) | ||
595 | 535 | args = ["dpkg-source", "-sn", "-x", dsc_in_tmpdir] | ||
596 | 536 | dpkg_source = subprocess.Popen(args, stdout=subprocess.PIPE, | ||
597 | 537 | stderr=subprocess.PIPE) | ||
598 | 538 | output, unused = dpkg_source.communicate() | ||
599 | 539 | result = dpkg_source.wait() | ||
600 | 540 | finally: | ||
601 | 541 | # When all is said and done, chdir out again so that we can | ||
602 | 542 | # clean up the tree with shutil.rmtree without leaving the | ||
603 | 543 | # process in a directory we're trying to remove. | ||
604 | 544 | os.chdir(cwd) | ||
605 | 545 | |||
606 | 546 | if result != 0: | ||
607 | 547 | dpkg_output = prefix_multi_line_string(output, " ") | ||
610 | 548 | yield UploadError( | 585 | yield UploadError( |
611 | 549 | "dpkg-source failed for %s [return: %s]\n" | 586 | "dpkg-source failed for %s [return: %s]\n" |
612 | 550 | "[dpkg-source output: %s]" | 587 | "[dpkg-source output: %s]" |
638 | 551 | % (self.filename, result, dpkg_output)) | 588 | % (self.filename, e.result, e.output)) |
639 | 552 | 589 | return | |
640 | 553 | # Copy debian/copyright file content. It will be stored in the | 590 | |
641 | 554 | # SourcePackageRelease records. | 591 | try: |
642 | 555 | 592 | # Copy debian/copyright file content. It will be stored in the | |
643 | 556 | # Check if 'dpkg-source' created only one directory. | 593 | # SourcePackageRelease records. |
644 | 557 | temp_directories = [ | 594 | |
645 | 558 | dirname for dirname in os.listdir(self.unpacked_dir) | 595 | # Check if 'dpkg-source' created only one directory. |
646 | 559 | if os.path.isdir(dirname)] | 596 | temp_directories = [ |
647 | 560 | if len(temp_directories) > 1: | 597 | dirname for dirname in os.listdir(unpacked_dir) |
648 | 561 | yield UploadError( | 598 | if os.path.isdir(dirname)] |
649 | 562 | 'Unpacked source contains more than one directory: %r' | 599 | if len(temp_directories) > 1: |
650 | 563 | % temp_directories) | 600 | yield UploadError( |
651 | 564 | 601 | 'Unpacked source contains more than one directory: %r' | |
652 | 565 | # XXX cprov 20070713: We should access only the expected directory | 602 | % temp_directories) |
653 | 566 | # name (<sourcename>-<no_epoch(no_revision(version))>). | 603 | |
654 | 567 | 604 | # XXX cprov 20070713: We should access only the expected directory | |
655 | 568 | # Locate both the copyright and changelog files for later processing. | 605 | # name (<sourcename>-<no_epoch(no_revision(version))>). |
656 | 569 | for error in findCopyright(self, self.unpacked_dir, self.logger): | 606 | |
657 | 570 | yield error | 607 | # Locate both the copyright and changelog files for later |
658 | 571 | 608 | # processing. | |
659 | 572 | for error in findChangelog(self, self.unpacked_dir, self.logger): | 609 | try: |
660 | 573 | yield error | 610 | self.copyright = find_copyright(unpacked_dir, self.logger) |
661 | 574 | 611 | except UploadError, error: | |
662 | 575 | self.logger.debug("Cleaning up source tree.") | 612 | yield error |
663 | 613 | return | ||
664 | 614 | except UploadWarning, warning: | ||
665 | 615 | yield warning | ||
666 | 616 | |||
667 | 617 | try: | ||
668 | 618 | self.changelog = find_changelog(unpacked_dir, self.logger) | ||
669 | 619 | except UploadError, error: | ||
670 | 620 | yield error | ||
671 | 621 | return | ||
672 | 622 | except UploadWarning, warning: | ||
673 | 623 | yield warning | ||
674 | 624 | finally: | ||
675 | 625 | self.logger.debug("Cleaning up source tree.") | ||
676 | 626 | cleanup_unpacked_dir(unpacked_dir) | ||
677 | 576 | self.logger.debug("Done") | 627 | self.logger.debug("Done") |
678 | 577 | 628 | ||
679 | 578 | def cleanUp(self): | ||
680 | 579 | if self.unpacked_dir is None: | ||
681 | 580 | return | ||
682 | 581 | try: | ||
683 | 582 | shutil.rmtree(self.unpacked_dir) | ||
684 | 583 | except OSError, error: | ||
685 | 584 | # XXX: dsilvers 2006-03-15: We currently lack a test for this. | ||
686 | 585 | if errno.errorcode[error.errno] != 'EACCES': | ||
687 | 586 | raise UploadError( | ||
688 | 587 | "%s: couldn't remove tmp dir %s: code %s" % ( | ||
689 | 588 | self.filename, self.unpacked_dir, error.errno)) | ||
690 | 589 | else: | ||
691 | 590 | result = os.system("chmod -R u+rwx " + self.unpacked_dir) | ||
692 | 591 | if result != 0: | ||
693 | 592 | raise UploadError("chmod failed with %s" % result) | ||
694 | 593 | shutil.rmtree(self.unpacked_dir) | ||
695 | 594 | self.unpacked_dir = None | ||
696 | 595 | |||
697 | 596 | |||
698 | 597 | def findBuild(self): | 629 | def findBuild(self): |
699 | 598 | """Find and return the SourcePackageRecipeBuild, if one is specified. | 630 | """Find and return the SourcePackageRecipeBuild, if one is specified. |
700 | 599 | 631 | ||
701 | @@ -651,8 +683,8 @@ | |||
702 | 651 | 683 | ||
703 | 652 | changelog_lfa = self.librarian.create( | 684 | changelog_lfa = self.librarian.create( |
704 | 653 | "changelog", | 685 | "changelog", |
707 | 654 | os.stat(self.changelog_path).st_size, | 686 | len(self.changelog), |
708 | 655 | open(self.changelog_path, "r"), | 687 | StringIO(self.changelog), |
709 | 656 | "text/x-debian-source-changelog", | 688 | "text/x-debian-source-changelog", |
710 | 657 | restricted=self.policy.archive.private) | 689 | restricted=self.policy.archive.private) |
711 | 658 | 690 | ||
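Because the changelog is now held in memory as a string rather than left on disk, `storeInDatabase` passes the librarian a length plus a file-like wrapper (`cStringIO.StringIO` in the branch's Python 2 code) instead of `os.stat().st_size` plus an open file. A hedged Python 3 sketch of the same call shape; `store_changelog` and `fake_create` are hypothetical, and the real `librarian.create` also takes a `restricted=` keyword not shown here:

```python
import io


def store_changelog(librarian_create, changelog):
    """Upload in-memory changelog text: size from len(), file-like object
    from an in-memory buffer, no temporary file on disk."""
    data = changelog.encode("utf-8")
    return librarian_create(
        "changelog",
        len(data),
        io.BytesIO(data),
        "text/x-debian-source-changelog")


def fake_create(name, size, fileobj, content_type):
    # Recording stand-in for the librarian's create() used above.
    return {"name": name, "size": size, "content": fileobj.read(),
            "content_type": content_type}
```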
712 | @@ -716,6 +748,7 @@ | |||
713 | 716 | validation inside DSCFile.verify(); there is no | 748 | validation inside DSCFile.verify(); there is no |
714 | 717 | store_in_database() method. | 749 | store_in_database() method. |
715 | 718 | """ | 750 | """ |
716 | 751 | |||
717 | 719 | def __init__(self, filepath, digest, size, policy, logger): | 752 | def __init__(self, filepath, digest, size, policy, logger): |
718 | 720 | component_and_section = priority = "--no-value--" | 753 | component_and_section = priority = "--no-value--" |
719 | 721 | NascentUploadFile.__init__( | 754 | NascentUploadFile.__init__( |
720 | @@ -735,7 +768,7 @@ | |||
721 | 735 | 768 | ||
722 | 736 | :param source_file: The directory where the source was extracted | 769 | :param source_file: The directory where the source was extracted |
723 | 737 | :param source_dir: The directory where the source was extracted. | 770 | :param source_dir: The directory where the source was extracted. |
725 | 738 | :return fullpath: The full path of the file, else return None if the | 771 | :return fullpath: The full path of the file, else return None if the |
726 | 739 | file is not found. | 772 | file is not found. |
727 | 740 | """ | 773 | """ |
728 | 741 | # Instead of trying to predict the unpacked source directory name, | 774 | # Instead of trying to predict the unpacked source directory name, |
729 | @@ -758,50 +791,42 @@ | |||
730 | 758 | return fullpath | 791 | return fullpath |
731 | 759 | return None | 792 | return None |
732 | 760 | 793 | ||
734 | 761 | def findCopyright(dsc_file, source_dir, logger): | 794 | |
735 | 795 | def find_copyright(source_dir, logger): | ||
736 | 762 | """Find and store any debian/copyright. | 796 | """Find and store any debian/copyright. |
737 | 763 | 797 | ||
738 | 764 | :param dsc_file: A DSCFile object where the copyright will be stored. | ||
739 | 765 | :param source_dir: The directory where the source was extracted. | 798 | :param source_dir: The directory where the source was extracted. |
740 | 766 | :param logger: A logger object for debug output. | 799 | :param logger: A logger object for debug output. |
741 | 800 | :return: Contents of copyright file | ||
742 | 767 | """ | 801 | """ |
748 | 768 | try: | 802 | copyright_file = findFile(source_dir, 'debian/copyright') |
744 | 769 | copyright_file = findFile(source_dir, 'debian/copyright') | ||
745 | 770 | except UploadError, error: | ||
746 | 771 | yield error | ||
747 | 772 | return | ||
749 | 773 | if copyright_file is None: | 803 | if copyright_file is None: |
752 | 774 | yield UploadWarning("No copyright file found.") | 804 | raise UploadWarning("No copyright file found.") |
751 | 775 | return | ||
753 | 776 | 805 | ||
754 | 777 | logger.debug("Copying copyright contents.") | 806 | logger.debug("Copying copyright contents.") |
759 | 778 | dsc_file.copyright = open(copyright_file).read().strip() | 807 | return open(copyright_file).read().strip() |
760 | 779 | 808 | ||
761 | 780 | 809 | ||
762 | 781 | def findChangelog(dsc_file, source_dir, logger): | 810 | def find_changelog(source_dir, logger): |
763 | 782 | """Find and move any debian/changelog. | 811 | """Find and move any debian/changelog. |
764 | 783 | 812 | ||
765 | 784 | This function finds the changelog file within the source package. The | 813 | This function finds the changelog file within the source package. The |
766 | 785 | changelog file is later uploaded to the librarian by | 814 | changelog file is later uploaded to the librarian by |
767 | 786 | DSCFile.storeInDatabase(). | 815 | DSCFile.storeInDatabase(). |
768 | 787 | 816 | ||
769 | 788 | :param dsc_file: A DSCFile object where the copyright will be stored. | ||
770 | 789 | :param source_dir: The directory where the source was extracted. | 817 | :param source_dir: The directory where the source was extracted. |
771 | 790 | :param logger: A logger object for debug output. | 818 | :param logger: A logger object for debug output. |
772 | 819 | :return: Changelog contents | ||
773 | 791 | """ | 820 | """ |
779 | 792 | try: | 821 | changelog_file = findFile(source_dir, 'debian/changelog') |
775 | 793 | changelog_file = findFile(source_dir, 'debian/changelog') | ||
776 | 794 | except UploadError, error: | ||
777 | 795 | yield error | ||
778 | 796 | return | ||
780 | 797 | if changelog_file is None: | 822 | if changelog_file is None: |
781 | 798 | # Policy requires debian/changelog to always exist. | 823 | # Policy requires debian/changelog to always exist. |
784 | 799 | yield UploadError("No changelog file found.") | 824 | raise UploadError("No changelog file found.") |
783 | 800 | return | ||
785 | 801 | 825 | ||
786 | 802 | # Move the changelog file out of the package directory | 826 | # Move the changelog file out of the package directory |
787 | 803 | logger.debug("Found changelog") | 827 | logger.debug("Found changelog") |
789 | 804 | dsc_file.changelog_path = changelog_file | 828 | return open(changelog_file, 'r').read() |
790 | 829 | |||
791 | 805 | 830 | ||
792 | 806 | 831 | ||
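The refactor of `findCopyright`/`findChangelog` into `find_copyright`/`find_changelog` changes their contract: instead of yielding errors and stashing results on a passed-in `dsc_file`, they return the file contents and raise `UploadError`/`UploadWarning` directly. A simplified sketch of the new shape (the real code locates the file via `findFile`, which searches the unpacked tree; this version assumes a fixed `debian/changelog` path):

```python
import os


class UploadError(Exception):
    """Fatal problem with the upload (stand-in for lp.archiveuploader)."""


class UploadWarning(Exception):
    """Non-fatal problem with the upload."""


def find_changelog(source_dir):
    """Return the contents of debian/changelog, raising on policy violations."""
    path = os.path.join(source_dir, "debian", "changelog")
    if os.path.islink(path):
        # A symlink could point outside the unpacked tree (e.g. /etc/passwd).
        raise UploadError("Symbolic link for debian/changelog not allowed")
    if not os.path.exists(path):
        # Policy requires debian/changelog to always exist.
        raise UploadError("No changelog file found.")
    with open(path) as f:
        return f.read()
```

The caller (`unpackAndCheckSource`) wraps each call in try/except, yielding the caught `UploadError` or `UploadWarning` onward, so `verify()` remains an error generator while the helpers become ordinary functions that are much easier to test.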
793 | 807 | def check_format_1_0_files(filename, file_type_counts, component_counts, | 832 | def check_format_1_0_files(filename, file_type_counts, component_counts, |
794 | 808 | 833 | ||
795 | === modified file 'lib/lp/archiveuploader/nascentupload.py' | |||
796 | --- lib/lp/archiveuploader/nascentupload.py 2010-08-20 20:31:18 +0000 | |||
797 | +++ lib/lp/archiveuploader/nascentupload.py 2010-08-24 16:45:57 +0000 | |||
798 | @@ -893,12 +893,6 @@ | |||
799 | 893 | 'Exception while accepting:\n %s' % e, exc_info=True) | 893 | 'Exception while accepting:\n %s' % e, exc_info=True) |
800 | 894 | self.do_reject(notify) | 894 | self.do_reject(notify) |
801 | 895 | return False | 895 | return False |
802 | 896 | else: | ||
803 | 897 | self.cleanUp() | ||
804 | 898 | |||
805 | 899 | def cleanUp(self): | ||
806 | 900 | if self.changes.dsc is not None: | ||
807 | 901 | self.changes.dsc.cleanUp() | ||
808 | 902 | 896 | ||
809 | 903 | def do_reject(self, notify=True): | 897 | def do_reject(self, notify=True): |
810 | 904 | """Reject the current upload given the reason provided.""" | 898 | """Reject the current upload given the reason provided.""" |
811 | @@ -929,7 +923,6 @@ | |||
812 | 929 | self.queue_root.notify(summary_text=self.rejection_message, | 923 | self.queue_root.notify(summary_text=self.rejection_message, |
813 | 930 | changes_file_object=changes_file_object, logger=self.logger) | 924 | changes_file_object=changes_file_object, logger=self.logger) |
814 | 931 | changes_file_object.close() | 925 | changes_file_object.close() |
815 | 932 | self.cleanUp() | ||
816 | 933 | 926 | ||
817 | 934 | def _createQueueEntry(self): | 927 | def _createQueueEntry(self): |
818 | 935 | """Return a PackageUpload object.""" | 928 | """Return a PackageUpload object.""" |
819 | 936 | 929 | ||
820 | === modified file 'lib/lp/archiveuploader/tests/test_dscfile.py' | |||
821 | --- lib/lp/archiveuploader/tests/test_dscfile.py 2010-08-20 20:31:18 +0000 | |||
822 | +++ lib/lp/archiveuploader/tests/test_dscfile.py 2010-08-24 16:45:57 +0000 | |||
823 | @@ -10,10 +10,12 @@ | |||
824 | 10 | from canonical.launchpad.scripts.logger import QuietFakeLogger | 10 | from canonical.launchpad.scripts.logger import QuietFakeLogger |
825 | 11 | from canonical.testing.layers import LaunchpadZopelessLayer | 11 | from canonical.testing.layers import LaunchpadZopelessLayer |
826 | 12 | from lp.archiveuploader.dscfile import ( | 12 | from lp.archiveuploader.dscfile import ( |
827 | 13 | cleanup_unpacked_dir, | ||
828 | 13 | DSCFile, | 14 | DSCFile, |
831 | 14 | findChangelog, | 15 | find_changelog, |
832 | 15 | findCopyright, | 16 | find_copyright, |
833 | 16 | format_to_file_checker_map, | 17 | format_to_file_checker_map, |
834 | 18 | unpack_source, | ||
835 | 17 | ) | 19 | ) |
836 | 18 | from lp.archiveuploader.nascentuploadfile import UploadError | 20 | from lp.archiveuploader.nascentuploadfile import UploadError |
837 | 19 | from lp.archiveuploader.tests import ( | 21 | from lp.archiveuploader.tests import ( |
838 | @@ -37,9 +39,6 @@ | |||
839 | 37 | 39 | ||
840 | 38 | class TestDscFile(TestCase): | 40 | class TestDscFile(TestCase): |
841 | 39 | 41 | ||
842 | 40 | class MockDSCFile: | ||
843 | 41 | copyright = None | ||
844 | 42 | |||
845 | 43 | def setUp(self): | 42 | def setUp(self): |
846 | 44 | super(TestDscFile, self).setUp() | 43 | super(TestDscFile, self).setUp() |
847 | 45 | self.tmpdir = self.makeTemporaryDirectory() | 44 | self.tmpdir = self.makeTemporaryDirectory() |
848 | @@ -47,7 +46,6 @@ | |||
849 | 47 | os.makedirs(self.dir_path) | 46 | os.makedirs(self.dir_path) |
850 | 48 | self.copyright_path = os.path.join(self.dir_path, "copyright") | 47 | self.copyright_path = os.path.join(self.dir_path, "copyright") |
851 | 49 | self.changelog_path = os.path.join(self.dir_path, "changelog") | 48 | self.changelog_path = os.path.join(self.dir_path, "changelog") |
852 | 50 | self.dsc_file = self.MockDSCFile() | ||
853 | 51 | 49 | ||
854 | 52 | def testBadDebianCopyright(self): | 50 | def testBadDebianCopyright(self): |
855 | 53 | """Test that a symlink as debian/copyright will fail. | 51 | """Test that a symlink as debian/copyright will fail. |
856 | @@ -56,14 +54,10 @@ | |||
857 | 56 | dangling symlink in an attempt to try and access files on the system | 54 | dangling symlink in an attempt to try and access files on the system |
858 | 57 | processing the source packages.""" | 55 | processing the source packages.""" |
859 | 58 | os.symlink("/etc/passwd", self.copyright_path) | 56 | os.symlink("/etc/passwd", self.copyright_path) |
865 | 59 | errors = list(findCopyright( | 57 | error = self.assertRaises( |
866 | 60 | self.dsc_file, self.tmpdir, mock_logger_quiet)) | 58 | UploadError, find_copyright, self.tmpdir, mock_logger_quiet) |
862 | 61 | |||
863 | 62 | self.assertEqual(len(errors), 1) | ||
864 | 63 | self.assertIsInstance(errors[0], UploadError) | ||
867 | 64 | self.assertEqual( | 59 | self.assertEqual( |
870 | 65 | errors[0].args[0], | 60 | error.args[0], "Symbolic link for debian/copyright not allowed") |
869 | 66 | "Symbolic link for debian/copyright not allowed") | ||
871 | 67 | 61 | ||
872 | 68 | def testGoodDebianCopyright(self): | 62 | def testGoodDebianCopyright(self): |
873 | 69 | """Test that a proper copyright file will be accepted""" | 63 | """Test that a proper copyright file will be accepted""" |
874 | @@ -72,11 +66,8 @@ | |||
875 | 72 | file.write(copyright) | 66 | file.write(copyright) |
876 | 73 | file.close() | 67 | file.close() |
877 | 74 | 68 | ||
883 | 75 | errors = list(findCopyright( | 69 | self.assertEquals( |
884 | 76 | self.dsc_file, self.tmpdir, mock_logger_quiet)) | 70 | copyright, find_copyright(self.tmpdir, mock_logger_quiet)) |
880 | 77 | |||
881 | 78 | self.assertEqual(len(errors), 0) | ||
882 | 79 | self.assertEqual(self.dsc_file.copyright, copyright) | ||
885 | 80 | 71 | ||
886 | 81 | def testBadDebianChangelog(self): | 72 | def testBadDebianChangelog(self): |
887 | 82 | """Test that a symlink as debian/changelog will fail. | 73 | """Test that a symlink as debian/changelog will fail. |
888 | @@ -85,14 +76,10 @@ | |||
889 | 85 | dangling symlink in an attempt to try and access files on the system | 76 | dangling symlink in an attempt to try and access files on the system |
890 | 86 | processing the source packages.""" | 77 | processing the source packages.""" |
891 | 87 | os.symlink("/etc/passwd", self.changelog_path) | 78 | os.symlink("/etc/passwd", self.changelog_path) |
897 | 88 | errors = list(findChangelog( | 79 | error = self.assertRaises( |
898 | 89 | self.dsc_file, self.tmpdir, mock_logger_quiet)) | 80 | UploadError, find_changelog, self.tmpdir, mock_logger_quiet) |
894 | 90 | |||
895 | 91 | self.assertEqual(len(errors), 1) | ||
896 | 92 | self.assertIsInstance(errors[0], UploadError) | ||
899 | 93 | self.assertEqual( | 81 | self.assertEqual( |
902 | 94 | errors[0].args[0], | 82 | error.args[0], "Symbolic link for debian/changelog not allowed") |
901 | 95 | "Symbolic link for debian/changelog not allowed") | ||
903 | 96 | 83 | ||
904 | 97 | def testGoodDebianChangelog(self): | 84 | def testGoodDebianChangelog(self): |
905 | 98 | """Test that a proper changelog file will be accepted""" | 85 | """Test that a proper changelog file will be accepted""" |
906 | @@ -101,12 +88,8 @@ | |||
907 | 101 | file.write(changelog) | 88 | file.write(changelog) |
908 | 102 | file.close() | 89 | file.close() |
909 | 103 | 90 | ||
916 | 104 | errors = list(findChangelog( | 91 | self.assertEquals( |
917 | 105 | self.dsc_file, self.tmpdir, mock_logger_quiet)) | 92 | changelog, find_changelog(self.tmpdir, mock_logger_quiet)) |
912 | 106 | |||
913 | 107 | self.assertEqual(len(errors), 0) | ||
914 | 108 | self.assertEqual(self.dsc_file.changelog_path, | ||
915 | 109 | self.changelog_path) | ||
918 | 110 | 93 | ||
919 | 111 | def testOversizedFile(self): | 94 | def testOversizedFile(self): |
920 | 112 | """Test that a file larger than 10MiB will fail. | 95 | """Test that a file larger than 10MiB will fail. |
921 | @@ -125,13 +108,10 @@ | |||
922 | 125 | file.write(empty_file) | 108 | file.write(empty_file) |
923 | 126 | file.close() | 109 | file.close() |
924 | 127 | 110 | ||
929 | 128 | errors = list(findChangelog( | 111 | error = self.assertRaises( |
930 | 129 | self.dsc_file, self.tmpdir, mock_logger_quiet)) | 112 | UploadError, find_changelog, self.tmpdir, mock_logger_quiet) |
927 | 130 | |||
928 | 131 | self.assertIsInstance(errors[0], UploadError) | ||
931 | 132 | self.assertEqual( | 113 | self.assertEqual( |
934 | 133 | errors[0].args[0], | 114 | error.args[0], "debian/changelog file too large, 10MiB max") |
933 | 134 | "debian/changelog file too large, 10MiB max") | ||
935 | 135 | 115 | ||
936 | 136 | 116 | ||
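The test updates above rely on `assertRaises` *returning* the raised exception so its `args` can be inspected in one line; that is the behaviour of the `assertRaises` provided by Launchpad's test infrastructure (via testtools), not of stdlib `unittest`'s non-context form. The stdlib equivalent captures the exception through the context manager, sketched here with hypothetical stand-in names:

```python
import unittest


class FindChangelogError(Exception):
    """Hypothetical stand-in for UploadError."""


def find_changelog_stub(source_dir):
    # Always fails, mimicking the symlink check in find_changelog.
    raise FindChangelogError("Symbolic link for debian/changelog not allowed")


class AssertRaisesCaptureTest(unittest.TestCase):

    def test_message(self):
        # testtools-style assertRaises returns the exception; the stdlib
        # equivalent captures it via the context manager instead.
        with self.assertRaises(FindChangelogError) as ctx:
            find_changelog_stub("/nonexistent")
        self.assertEqual(
            ctx.exception.args[0],
            "Symbolic link for debian/changelog not allowed")
```

Either way the win over the old pattern is the same: no more building a list of yielded errors and asserting on `errors[0]`.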
937 | 137 | class TestDscFileLibrarian(TestCaseWithFactory): | 117 | class TestDscFileLibrarian(TestCaseWithFactory): |
938 | @@ -141,6 +121,7 @@ | |||
939 | 141 | 121 | ||
940 | 142 | def getDscFile(self, name): | 122 | def getDscFile(self, name): |
941 | 143 | dsc_path = datadir(os.path.join('suite', name, name + '.dsc')) | 123 | dsc_path = datadir(os.path.join('suite', name, name + '.dsc')) |
942 | 124 | |||
943 | 144 | class Changes: | 125 | class Changes: |
944 | 145 | architectures = ['source'] | 126 | architectures = ['source'] |
945 | 146 | logger = QuietFakeLogger() | 127 | logger = QuietFakeLogger() |
946 | @@ -157,10 +138,7 @@ | |||
947 | 157 | os.chmod(tempdir, 0555) | 138 | os.chmod(tempdir, 0555) |
948 | 158 | try: | 139 | try: |
949 | 159 | dsc_file = self.getDscFile('bar_1.0-1') | 140 | dsc_file = self.getDscFile('bar_1.0-1') |
954 | 160 | try: | 141 | list(dsc_file.verify()) |
951 | 161 | list(dsc_file.verify()) | ||
952 | 162 | finally: | ||
953 | 163 | dsc_file.cleanUp() | ||
955 | 164 | finally: | 142 | finally: |
956 | 165 | os.chmod(tempdir, 0755) | 143 | os.chmod(tempdir, 0755) |
957 | 166 | 144 | ||
958 | @@ -292,3 +270,42 @@ | |||
959 | 292 | # A 3.0 (native) source with component tarballs is invalid. | 270 | # A 3.0 (native) source with component tarballs is invalid. |
960 | 293 | self.assertErrorsForFiles( | 271 | self.assertErrorsForFiles( |
961 | 294 | [self.wrong_files_error], {NATIVE_TARBALL: 1}, {'foo': 1}) | 272 | [self.wrong_files_error], {NATIVE_TARBALL: 1}, {'foo': 1}) |
962 | 273 | |||
963 | 274 | |||
964 | 275 | class UnpackedDirTests(TestCase): | ||
965 | 276 | """Tests for unpack_source and cleanup_unpacked_dir.""" | ||
966 | 277 | |||
967 | 278 | def test_unpack_source(self): | ||
968 | 279 | # unpack_source unpacks in a temporary directory and returns the | ||
969 | 280 | # path. | ||
970 | 281 | unpacked_dir = unpack_source( | ||
971 | 282 | datadir(os.path.join('suite', 'bar_1.0-1', 'bar_1.0-1.dsc'))) | ||
972 | 283 | try: | ||
973 | 284 | self.assertEquals(["bar-1.0"], os.listdir(unpacked_dir)) | ||
974 | 285 | self.assertContentEqual( | ||
975 | 286 | ["THIS_IS_BAR", "debian"], | ||
976 | 287 | os.listdir(os.path.join(unpacked_dir, "bar-1.0"))) | ||
977 | 288 | finally: | ||
978 | 289 | cleanup_unpacked_dir(unpacked_dir) | ||
979 | 290 | |||
980 | 291 | def test_cleanup(self): | ||
981 | 292 | # cleanup_unpacked_dir removes the temporary directory and all | ||
982 | 293 | temp_dir = self.makeTemporaryDirectory() | ||
983 | 294 | unpacked_dir = os.path.join(temp_dir, "unpacked") | ||
984 | 295 | os.mkdir(unpacked_dir) | ||
985 | 296 | os.mkdir(os.path.join(unpacked_dir, "bar_1.0")) | ||
986 | 297 | cleanup_unpacked_dir(unpacked_dir) | ||
987 | 298 | self.assertFalse(os.path.exists(unpacked_dir)) | ||
988 | 299 | |||
989 | 300 | def test_cleanup_invalid_mode(self): | ||
990 | 301 | # cleanup_dir can remove a directory even if the mode does | ||
991 | 302 | # not allow it. | ||
992 | 303 | temp_dir = self.makeTemporaryDirectory() | ||
993 | 304 | unpacked_dir = os.path.join(temp_dir, "unpacked") | ||
994 | 305 | os.mkdir(unpacked_dir) | ||
995 | 306 | bar_path = os.path.join(unpacked_dir, "bar_1.0") | ||
996 | 307 | os.mkdir(bar_path) | ||
997 | 308 | os.chmod(bar_path, 0600) | ||
998 | 309 | os.chmod(unpacked_dir, 0600) | ||
999 | 310 | cleanup_unpacked_dir(unpacked_dir) | ||
1000 | 311 | self.assertFalse(os.path.exists(unpacked_dir)) | ||
1001 | 295 | 312 | ||
1002 | === modified file 'lib/lp/archiveuploader/tests/test_nascentuploadfile.py' | |||
1003 | --- lib/lp/archiveuploader/tests/test_nascentuploadfile.py 2010-08-21 13:54:20 +0000 | |||
1004 | +++ lib/lp/archiveuploader/tests/test_nascentuploadfile.py 2010-08-24 16:45:57 +0000 | |||
1005 | @@ -169,8 +169,7 @@ | |||
1006 | 169 | uploadfile = self.createDSCFile( | 169 | uploadfile = self.createDSCFile( |
1007 | 170 | "foo.dsc", dsc, "main/net", "extra", "dulwich", "0.42", | 170 | "foo.dsc", dsc, "main/net", "extra", "dulwich", "0.42", |
1008 | 171 | self.createChangesFile("foo.changes", changes)) | 171 | self.createChangesFile("foo.changes", changes)) |
1011 | 172 | (uploadfile.changelog_path, changelog_digest, changelog_size) = ( | 172 | uploadfile.changelog = "DUMMY" |
1010 | 173 | self.writeUploadFile("changelog", "DUMMY")) | ||
1012 | 174 | uploadfile.files = [] | 173 | uploadfile.files = [] |
1013 | 175 | release = uploadfile.storeInDatabase(None) | 174 | release = uploadfile.storeInDatabase(None) |
1014 | 176 | self.assertEquals("0.42", release.version) | 175 | self.assertEquals("0.42", release.version) |
1015 | 177 | 176 | ||
1016 | === modified file 'lib/lp/hardwaredb/model/hwdb.py' | |||
1017 | --- lib/lp/hardwaredb/model/hwdb.py 2010-08-20 20:31:18 +0000 | |||
1018 | +++ lib/lp/hardwaredb/model/hwdb.py 2010-08-24 16:45:57 +0000 | |||
1019 | @@ -64,9 +64,6 @@ | |||
1020 | 64 | SQLBase, | 64 | SQLBase, |
1021 | 65 | sqlvalues, | 65 | sqlvalues, |
1022 | 66 | ) | 66 | ) |
1023 | 67 | from canonical.launchpad.components.decoratedresultset import ( | ||
1024 | 68 | DecoratedResultSet, | ||
1025 | 69 | ) | ||
1026 | 70 | from canonical.launchpad.interfaces.launchpad import ILaunchpadCelebrities | 67 | from canonical.launchpad.interfaces.launchpad import ILaunchpadCelebrities |
1027 | 71 | from canonical.launchpad.interfaces.librarian import ILibraryFileAliasSet | 68 | from canonical.launchpad.interfaces.librarian import ILibraryFileAliasSet |
1028 | 72 | from canonical.launchpad.validators.name import valid_name | 69 | from canonical.launchpad.validators.name import valid_name |
1029 | @@ -347,13 +344,7 @@ | |||
1030 | 347 | # DISTINCT clause. | 344 | # DISTINCT clause. |
1031 | 348 | result_set.config(distinct=True) | 345 | result_set.config(distinct=True) |
1032 | 349 | result_set.order_by(HWSubmission.id) | 346 | result_set.order_by(HWSubmission.id) |
1040 | 350 | # The Storm implementation of ResultSet.count() is incorrect if | 347 | return result_set |
1034 | 351 | # the select query uses the distinct directive (see bug #217644). | ||
1035 | 352 | # DecoratedResultSet solves this problem by modifying the query | ||
1036 | 353 | # to count only the records appearing in a subquery. | ||
1037 | 354 | # We don't actually need to transform the results, which is why | ||
1038 | 355 | # the second argument is a no-op. | ||
1039 | 356 | return DecoratedResultSet(result_set, lambda result: result) | ||
1041 | 357 | 348 | ||
1042 | 358 | def _submissionsSubmitterSelects( | 349 | def _submissionsSubmitterSelects( |
1043 | 359 | self, target_column, bus, vendor_id, product_id, driver_name, | 350 | self, target_column, bus, vendor_id, product_id, driver_name, |
1044 | 360 | 351 | ||
=== modified file 'lib/lp/registry/browser/distribution.py'
--- lib/lp/registry/browser/distribution.py	2010-08-20 20:31:18 +0000
+++ lib/lp/registry/browser/distribution.py	2010-08-24 16:45:57 +0000
@@ -474,18 +474,7 @@
         """See `AbstractPackageSearchView`."""

         if self.search_by_binary_name:
-            non_exact_matches = self.context.searchBinaryPackages(self.text)
-
-            # XXX Michael Nelson 20090605 bug=217644
-            # We are only using a decorated resultset here to conveniently
-            # get around the storm bug whereby count returns the count
-            # of non-distinct results, even though this result set
-            # is configured for distinct results.
-            def dummy_func(result):
-                return result
-            non_exact_matches = DecoratedResultSet(
-                non_exact_matches, dummy_func)
-
+            return self.context.searchBinaryPackages(self.text)
         else:
             non_exact_matches = self.context.searchSourcePackageCaches(
                 self.text)
=== modified file 'lib/lp/registry/browser/sourcepackage.py'
--- lib/lp/registry/browser/sourcepackage.py	2010-08-20 20:31:18 +0000
+++ lib/lp/registry/browser/sourcepackage.py	2010-08-24 16:45:57 +0000
@@ -542,7 +542,7 @@
         self.form_fields = Fields(
             Choice(__name__='upstream',
                    title=_('Registered upstream project'),
-                   default=None,
+                   default=self.other_upstream,
                    vocabulary=upstream_vocabulary,
                    required=True))

=== modified file 'lib/lp/registry/browser/team.py'
--- lib/lp/registry/browser/team.py	2010-08-20 20:31:18 +0000
+++ lib/lp/registry/browser/team.py	2010-08-24 16:45:57 +0000
@@ -150,7 +150,7 @@
         "name", "visibility", "displayname", "contactemail",
         "teamdescription", "subscriptionpolicy",
         "defaultmembershiperiod", "renewal_policy",
-        "defaultrenewalperiod", "teamowner",
+        "defaultrenewalperiod", "teamowner",
         ]
     private_prefix = PRIVATE_TEAM_PREFIX

@@ -767,7 +767,7 @@

     def renderTable(self):
         html = ['<table style="max-width: 80em">']
-        items = self.subscribers.currentBatch()
+        items = list(self.subscribers.currentBatch())
         assert len(items) > 0, (
             "Don't call this method if there are no subscribers to show.")
         # When there are more than 10 items, we use multiple columns, but
=== modified file 'lib/lp/registry/model/distroseries.py'
--- lib/lp/registry/model/distroseries.py	2010-08-23 08:12:39 +0000
+++ lib/lp/registry/model/distroseries.py	2010-08-24 16:45:57 +0000
@@ -343,7 +343,7 @@
     @cachedproperty
     def _all_packagings(self):
         """Get an unordered list of all packagings.
-
+
         :return: A ResultSet which can be decorated or tuned further. Use
             DistroSeries._packaging_row_to_packaging to extract the
             packaging objects out.
@@ -353,7 +353,7 @@
         # Packaging object.
         # NB: precaching objects like this method tries to do has a very poor
         # hit rate with storm - many queries will still be executed; consider
-        # ripping this out and instead allowing explicit inclusion of things
+        # ripping this out and instead allowing explicit inclusion of things
         # like Person._all_members does - returning a cached object graph.
         # -- RBC 20100810
         # Avoid circular import failures.
@@ -1810,11 +1810,7 @@
             DistroSeries.hide_all_translations == False,
             DistroSeries.id == POTemplate.distroseriesID)
         result_set = result_set.config(distinct=True)
-        # XXX: henninge 2009-02-11 bug=217644: Convert to sequence right here
-        # because ResultSet reports a wrong count() when using DISTINCT. Also
-        # ResultSet does not implement __len__(), which would make it more
-        # like a sequence.
-        return list(result_set)
+        return result_set

     def findByName(self, name):
         """See `IDistroSeriesSet`."""
=== modified file 'lib/lp/registry/model/mailinglist.py'
--- lib/lp/registry/model/mailinglist.py	2010-08-20 20:31:18 +0000
+++ lib/lp/registry/model/mailinglist.py	2010-08-24 16:45:57 +0000
@@ -389,7 +389,7 @@
             TeamParticipation.team == self.team,
             MailingListSubscription.person == Person.id,
             MailingListSubscription.mailing_list == self)
-        return results.order_by(Person.displayname)
+        return results.order_by(Person.displayname, Person.name)

     def subscribe(self, person, address=None):
         """See `IMailingList`."""
@@ -451,8 +451,9 @@
                 MailingListSubscription.personID
                     == EmailAddress.personID),
             # pylint: disable-msg=C0301
-            LeftJoin(MailingList,
-                MailingList.id == MailingListSubscription.mailing_listID),
+            LeftJoin(
+                MailingList,
+                MailingList.id == MailingListSubscription.mailing_listID),
             LeftJoin(TeamParticipation,
                 TeamParticipation.personID
                     == MailingListSubscription.personID),
@@ -472,8 +473,9 @@
                 MailingListSubscription.email_addressID
                     == EmailAddress.id),
             # pylint: disable-msg=C0301
-            LeftJoin(MailingList,
-                MailingList.id == MailingListSubscription.mailing_listID),
+            LeftJoin(
+                MailingList,
+                MailingList.id == MailingListSubscription.mailing_listID),
             LeftJoin(TeamParticipation,
                 TeamParticipation.personID
                     == MailingListSubscription.personID),
@@ -664,8 +666,9 @@
                 MailingListSubscription.personID
                     == EmailAddress.personID),
             # pylint: disable-msg=C0301
-            LeftJoin(MailingList,
-                MailingList.id == MailingListSubscription.mailing_listID),
+            LeftJoin(
+                MailingList,
+                MailingList.id == MailingListSubscription.mailing_listID),
             LeftJoin(TeamParticipation,
                 TeamParticipation.personID
                     == MailingListSubscription.personID),
@@ -678,8 +681,7 @@
             team.id for team in store.find(
                 Person,
                 And(Person.name.is_in(team_names),
-                    Person.teamowner != None))
-            )
+                    Person.teamowner != None)))
         list_ids = set(
             mailing_list.id for mailing_list in store.find(
                 MailingList,
@@ -709,8 +711,9 @@
                 MailingListSubscription.email_addressID
                     == EmailAddress.id),
             # pylint: disable-msg=C0301
-            LeftJoin(MailingList,
-                MailingList.id == MailingListSubscription.mailing_listID),
+            LeftJoin(
+                MailingList,
+                MailingList.id == MailingListSubscription.mailing_listID),
             LeftJoin(TeamParticipation,
                 TeamParticipation.personID
                     == MailingListSubscription.personID),
@@ -756,8 +759,7 @@
             team.id for team in store.find(
                 Person,
                 And(Person.name.is_in(team_names),
-                    Person.teamowner != None))
-            )
+                    Person.teamowner != None)))
         team_members = store.using(*tables).find(
             (Team.name, Person.displayname, EmailAddress.email),
             And(TeamParticipation.teamID.is_in(team_ids),
=== modified file 'lib/lp/registry/tests/test_mailinglist.py'
--- lib/lp/registry/tests/test_mailinglist.py	2010-08-20 20:31:18 +0000
+++ lib/lp/registry/tests/test_mailinglist.py	2010-08-24 16:45:57 +0000
@@ -1,64 +1,64 @@
 # Copyright 2009 Canonical Ltd. This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).

+from __future__ import with_statement
+
 __metaclass__ = type
 __all__ = []

-
-import unittest
-
-from canonical.launchpad.ftests import login
-from canonical.testing import LaunchpadFunctionalLayer
+from canonical.testing import DatabaseFunctionalLayer
 from lp.registry.interfaces.mailinglistsubscription import (
     MailingListAutoSubscribePolicy,
     )
 from lp.registry.interfaces.person import TeamSubscriptionPolicy
-from lp.testing import TestCaseWithFactory
+from lp.testing import login_celebrity, person_logged_in, TestCaseWithFactory


 class MailingList_getSubscribers_TestCase(TestCaseWithFactory):
     """Tests for `IMailingList`.getSubscribers()."""

-    layer = LaunchpadFunctionalLayer
+    layer = DatabaseFunctionalLayer

     def setUp(self):
-        # Create a team (tied to a mailing list) with one former member, one
-        # pending member and one active member.
         TestCaseWithFactory.setUp(self)
-        login('foo.bar@canonical.com')
+        self.team, self.mailing_list = self.factory.makeTeamAndMailingList(
+            'test-mailinglist', 'team-owner')
+
+    def test_only_active_members_can_be_subscribers(self):
         former_member = self.factory.makePerson()
         pending_member = self.factory.makePerson()
         active_member = self.active_member = self.factory.makePerson()
-        self.team, self.mailing_list = self.factory.makeTeamAndMailingList(
-            'test-mailinglist', 'team-owner')
-        self.team.subscriptionpolicy = TeamSubscriptionPolicy.MODERATED
-
         # Each of our members want to be subscribed to a team's mailing list
         # whenever they join the team.
+        login_celebrity('admin')
         former_member.mailing_list_auto_subscribe_policy = (
             MailingListAutoSubscribePolicy.ALWAYS)
         active_member.mailing_list_auto_subscribe_policy = (
             MailingListAutoSubscribePolicy.ALWAYS)
         pending_member.mailing_list_auto_subscribe_policy = (
             MailingListAutoSubscribePolicy.ALWAYS)
-
+        self.team.subscriptionpolicy = TeamSubscriptionPolicy.MODERATED
         pending_member.join(self.team)
-        self.assertEqual(False, pending_member.inTeam(self.team))
-
         self.team.addMember(former_member, reviewer=self.team.teamowner)
         former_member.leave(self.team)
-        self.assertEqual(False, former_member.inTeam(self.team))
-
         self.team.addMember(active_member, reviewer=self.team.teamowner)
-        self.assertEqual(True, active_member.inTeam(self.team))
-
-    def test_only_active_members_can_be_subscribers(self):
         # Even though our 3 members want to subscribe to the team's mailing
         # list, only the active member is considered a subscriber.
-        subscribers = [self.active_member]
-        self.assertEqual(
-            subscribers, list(self.mailing_list.getSubscribers()))
-
-
-def test_suite():
-    return unittest.TestLoader().loadTestsFromName(__name__)
+        self.assertEqual(
+            [active_member], list(self.mailing_list.getSubscribers()))
+
+    def test_getSubscribers_order(self):
+        person_1 = self.factory.makePerson(name="pb1", displayname="Me")
+        with person_logged_in(person_1):
+            person_1.mailing_list_auto_subscribe_policy = (
+                MailingListAutoSubscribePolicy.ALWAYS)
+            person_1.join(self.team)
+        person_2 = self.factory.makePerson(name="pa2", displayname="Me")
+        with person_logged_in(person_2):
+            person_2.mailing_list_auto_subscribe_policy = (
+                MailingListAutoSubscribePolicy.ALWAYS)
+            person_2.join(self.team)
+        subscribers = self.mailing_list.getSubscribers()
+        self.assertEqual(2, subscribers.count())
+        self.assertEqual(
+            ['pa2', 'pb1'], [person.name for person in subscribers])
=== modified file 'lib/lp/registry/vocabularies.py'
--- lib/lp/registry/vocabularies.py	2010-08-22 03:09:51 +0000
+++ lib/lp/registry/vocabularies.py	2010-08-24 16:45:57 +0000
@@ -95,9 +95,6 @@
     SQLBase,
     sqlvalues,
     )
-from canonical.launchpad.components.decoratedresultset import (
-    DecoratedResultSet,
-    )
 from canonical.launchpad.database.account import Account
 from canonical.launchpad.database.emailaddress import EmailAddress
 from canonical.launchpad.database.stormsugar import StartsWith
@@ -648,10 +645,7 @@
         else:
             result.order_by(Person.displayname, Person.name)
         result.config(limit=self.LIMIT)
-        # XXX: BradCrittenden 2009-04-24 bug=217644: Wrap the results to
-        # ensure the .count() method works until the Storm bug is fixed and
-        # integrated.
-        return DecoratedResultSet(result)
+        return result

     def search(self, text):
         """Return people/teams whose fti or email address match :text:."""
@@ -727,10 +721,7 @@
         result.config(distinct=True)
         result.order_by(Person.displayname, Person.name)
         result.config(limit=self.LIMIT)
-        # XXX: BradCrittenden 2009-04-24 bug=217644: Wrap the results to
-        # ensure the .count() method works until the Storm bug is fixed and
-        # integrated.
-        return DecoratedResultSet(result)
+        return result


 class ValidPersonVocabulary(ValidPersonOrTeamVocabulary):
=== modified file 'lib/lp/soyuz/doc/package-diff.txt'
--- lib/lp/soyuz/doc/package-diff.txt	2010-05-13 12:04:56 +0000
+++ lib/lp/soyuz/doc/package-diff.txt	2010-08-24 16:45:57 +0000
@@ -451,12 +451,7 @@
     >>> packagediff_set.getPendingDiffs().count()
     7

-XXX cprov 20070530: storm doesn't go well with limited count()s
-See bug #217644. For now we have to listify the results and used
-the list length.
-
-    >>> r = packagediff_set.getPendingDiffs(limit=2)
-    >>> len(list(r))
+    >>> packagediff_set.getPendingDiffs(limit=2).count()
     2

 All package diffs targeting a set of source package releases can also
=== modified file 'lib/lp/soyuz/scripts/initialise_distroseries.py'
--- lib/lp/soyuz/scripts/initialise_distroseries.py	2010-08-20 20:31:18 +0000
+++ lib/lp/soyuz/scripts/initialise_distroseries.py	2010-08-24 16:45:57 +0000
@@ -25,8 +25,10 @@
     ArchivePurpose,
     IArchiveSet,
     )
+from lp.soyuz.interfaces.packageset import IPackagesetSet
 from lp.soyuz.interfaces.queue import PackageUploadStatus
 from lp.soyuz.model.packagecloner import clone_packages
+from lp.soyuz.model.packageset import Packageset


 class InitialisationError(Exception):
@@ -270,10 +272,28 @@

     def _copy_packagesets(self):
         """Copy packagesets from the parent distroseries."""
-        self._store.execute("""
-            INSERT INTO Packageset
-            (distroseries, owner, name, description, packagesetgroup)
-            SELECT %s, %s, name, description, packagesetgroup
-            FROM Packageset WHERE distroseries = %s
-            """ % sqlvalues(
-                self.distroseries, self.distroseries.owner, self.parent))
+        packagesets = self._store.find(Packageset, distroseries=self.parent)
+        parent_to_child = {}
+        # Create the packagesets, and any archivepermissions
+        for parent_ps in packagesets:
+            child_ps = getUtility(IPackagesetSet).new(
+                parent_ps.name, parent_ps.description,
+                self.distroseries.owner, distroseries=self.distroseries,
+                related_set=parent_ps)
+            self._store.execute("""
+                INSERT INTO Archivepermission
+                (person, permission, archive, packageset, explicit)
+                SELECT person, permission, %s, %s, explicit
+                FROM Archivepermission WHERE packageset = %s
+                """ % sqlvalues(
+                    self.distroseries.main_archive, child_ps.id,
+                    parent_ps.id))
+            parent_to_child[parent_ps] = child_ps
+        # Copy the relations between sets, and the contents
+        for old_series_ps, new_series_ps in parent_to_child.items():
+            old_series_sets = old_series_ps.setsIncluded(
+                direct_inclusion=True)
+            for old_series_child in old_series_sets:
+                new_series_ps.add(parent_to_child[old_series_child])
+            new_series_ps.add(old_series_ps.sourcesIncluded(
+                direct_inclusion=True))
=== modified file 'lib/lp/soyuz/scripts/tests/test_initialise_distroseries.py'
--- lib/lp/soyuz/scripts/tests/test_initialise_distroseries.py	2010-08-20 20:31:18 +0000
+++ lib/lp/soyuz/scripts/tests/test_initialise_distroseries.py	2010-08-24 16:45:57 +0000
@@ -23,6 +23,7 @@
 from canonical.testing.layers import LaunchpadZopelessLayer
 from lp.buildmaster.interfaces.buildbase import BuildStatus
 from lp.registry.interfaces.pocket import PackagePublishingPocket
+from lp.soyuz.interfaces.archivepermission import IArchivePermissionSet
 from lp.soyuz.interfaces.packageset import IPackagesetSet
 from lp.soyuz.interfaces.sourcepackageformat import SourcePackageFormat
 from lp.soyuz.model.distroarchseries import DistroArchSeries
@@ -87,7 +88,7 @@
             self.ubuntu['breezy-autotest'])
         ids = InitialiseDistroSeries(foobuntu)
         self.assertRaisesWithContent(
-            InitialisationError,"Parent series queues are not empty.",
+            InitialisationError, "Parent series queues are not empty.",
             ids.check)

     def assertDistroSeriesInitialisedCorrectly(self, foobuntu):
@@ -191,6 +192,7 @@

     def test_copying_packagesets(self):
         # If a parent series has packagesets, we should copy them
+        uploader = self.factory.makePerson()
         test1 = getUtility(IPackagesetSet).new(
             u'test1', u'test 1 packageset', self.hoary.owner,
             distroseries=self.hoary)
@@ -199,13 +201,11 @@
             distroseries=self.hoary)
         test3 = getUtility(IPackagesetSet).new(
             u'test3', u'test 3 packageset', self.hoary.owner,
-            distroseries=self.hoary)
-        foobuntu = self._create_distroseries(self.hoary)
-        self._set_pending_to_failed(self.hoary)
-        transaction.commit()
-        ids = InitialiseDistroSeries(foobuntu)
-        ids.check()
-        ids.initialise()
+            distroseries=self.hoary, related_set=test2)
+        test1.addSources('pmount')
+        getUtility(IArchivePermissionSet).newPackagesetUploader(
+            self.hoary.main_archive, uploader, test1)
+        foobuntu = self._full_initialise()
         # We can fetch the copied sets from foobuntu
         foobuntu_test1 = getUtility(IPackagesetSet).getByName(
             u'test1', distroseries=foobuntu)
@@ -219,8 +219,26 @@
         self.assertEqual(test2.description, foobuntu_test2.description)
         self.assertEqual(test3.description, foobuntu_test3.description)
         self.assertEqual(foobuntu_test1.relatedSets().one(), test1)
-        self.assertEqual(foobuntu_test2.relatedSets().one(), test2)
-        self.assertEqual(foobuntu_test3.relatedSets().one(), test3)
+        self.assertEqual(
+            list(foobuntu_test2.relatedSets()),
+            [test2, test3, foobuntu_test3])
+        self.assertEqual(
+            list(foobuntu_test3.relatedSets()),
+            [test2, foobuntu_test2, test3])
+        # The contents of the packagesets will have been copied.
+        foobuntu_srcs = foobuntu_test1.getSourcesIncluded(
+            direct_inclusion=True)
+        hoary_srcs = test1.getSourcesIncluded(direct_inclusion=True)
+        self.assertEqual(foobuntu_srcs, hoary_srcs)
+        # The uploader can also upload to the new distroseries.
+        self.assertTrue(
+            getUtility(IArchivePermissionSet).isSourceUploadAllowed(
+                self.hoary.main_archive, 'pmount', uploader,
+                distroseries=self.hoary))
+        self.assertTrue(
+            getUtility(IArchivePermissionSet).isSourceUploadAllowed(
+                foobuntu.main_archive, 'pmount', uploader,
+                distroseries=foobuntu))

     def test_script(self):
         # Do an end-to-end test using the command-line tool
=== modified file 'lib/lp/testing/fakelibrarian.py'
--- lib/lp/testing/fakelibrarian.py	2010-08-20 20:31:18 +0000
+++ lib/lp/testing/fakelibrarian.py	2010-08-24 16:45:57 +0000
@@ -151,6 +151,19 @@
         alias.checkCommitted()
         return StringIO(alias.content_string)
 
+    def pretendCommit(self):
+        """Pretend that there's been a commit.
+
+        When you add a file to the librarian (real or fake), it is not
+        fully available until the transaction that added the file has
+        been committed.  Call this method to make the FakeLibrarian act
+        as if there's been a commit, without actually committing a
+        database transaction.
+        """
+        # Note that all files have been committed to storage.
+        for alias in self.aliases.itervalues():
+            alias.file_committed = True
+
     def _makeAlias(self, file_id, name, content, content_type):
         """Create a `LibraryFileAlias`."""
         alias = InstrumentedLibraryFileAlias(
@@ -195,9 +208,7 @@
 
     def afterCompletion(self, txn):
         """See `ISynchronizer`."""
-        # Note that all files have been committed to storage.
-        for alias in self.aliases.itervalues():
-            alias.file_committed = True
+        self.pretendCommit()
 
     def newTransaction(self, txn):
         """See `ISynchronizer`."""
 
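The pattern extracted into pretendCommit above can be illustrated in isolation: files added to the (fake) librarian stay unreadable until their aliases are flagged as committed, and pretendCommit flips that flag for every alias without touching a database transaction. This is a minimal self-contained sketch, not Launchpad's actual FakeLibrarian; the class and method names other than pretendCommit are simplified stand-ins.

```python
from io import StringIO


class FakeAlias:
    """Simplified stand-in for a librarian file alias."""

    def __init__(self, content):
        self.content_string = content
        # A freshly added file is not readable until "committed".
        self.file_committed = False


class FakeLibrarianSketch:
    """Sketch of the commit-pretending pattern from the diff above."""

    def __init__(self):
        self.aliases = {}

    def addFile(self, name, content):
        self.aliases[name] = FakeAlias(content)

    def pretendCommit(self):
        # Mark every stored file as committed, without actually
        # committing a database transaction.
        for alias in self.aliases.values():
            alias.file_committed = True

    def getFileByAlias(self, name):
        alias = self.aliases[name]
        if not alias.file_committed:
            raise LookupError("file not available before commit")
        return StringIO(alias.content_string)


librarian = FakeLibrarianSketch()
librarian.addFile("hello.txt", "hello world")
librarian.pretendCommit()
print(librarian.getFileByAlias("hello.txt").read())  # hello world
```

A test can then exercise post-commit behaviour (as test_pretend_commit does below) without the cost or side effects of a real commit.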
=== modified file 'lib/lp/testing/tests/test_fakelibrarian.py'
--- lib/lp/testing/tests/test_fakelibrarian.py	2010-08-20 20:31:18 +0000
+++ lib/lp/testing/tests/test_fakelibrarian.py	2010-08-24 16:45:57 +0000
@@ -109,6 +109,15 @@
         self.assertTrue(verifyObject(ISynchronizer, self.fake_librarian))
         self.assertIsInstance(self.fake_librarian, FakeLibrarian)
 
+    def test_pretend_commit(self):
+        name, text, alias_id = self._storeFile()
+
+        self.fake_librarian.pretendCommit()
+
+        retrieved_alias = getUtility(ILibraryFileAliasSet)[alias_id]
+        retrieved_alias.open()
+        self.assertEqual(text, retrieved_alias.read())
+
 
 class TestRealLibrarian(LibraryAccessScenarioMixin, TestCaseWithFactory):
     """Test the supported interface subset on the real librarian."""
 
=== modified file 'versions.cfg'
--- versions.cfg	2010-08-18 19:41:20 +0000
+++ versions.cfg	2010-08-24 16:45:57 +0000
@@ -101,7 +101,7 @@
 z3c.ptcompat = 0.5.3
 z3c.recipe.filetemplate = 2.1.0
 z3c.recipe.i18n = 0.5.3
-z3c.recipe.scripts = 1.0.0dev-gary-r110068
+z3c.recipe.scripts = 1.0.0
 z3c.recipe.tag = 0.2.0
 z3c.rml = 0.7.3
 z3c.skin.pagelet = 1.0.2
@@ -111,12 +111,12 @@
 z3c.viewlet = 1.0.0
 z3c.viewtemplate = 0.3.2
 z3c.zrtresource = 1.0.1
-zc.buildout = 1.5.0dev-gary-r111190
+zc.buildout = 1.5.0
 zc.catalog = 1.2.0
 zc.datetimewidget = 0.5.2
 zc.i18n = 0.5.2
 zc.lockfile = 1.0.0
-zc.recipe.egg = 1.2.3dev-gary-r110068
+zc.recipe.egg = 1.3.0
 zc.zservertracelog = 1.1.5
 ZConfig = 2.7.1
 zdaemon = 2.0.4
Hi Leonard,
This looks good. I just have a couple of wording suggestions below.
-Edwin
> === modified file 'lib/canonical/launchpad/webapp/interfaces.py'
> --- lib/canonical/launchpad/webapp/interfaces.py	2010-05-02 23:43:35 +0000
> +++ lib/canonical/launchpad/webapp/interfaces.py	2010-07-07 20:53:55 +0000
> @@ -547,6 +547,16 @@
>          for reading and changing anything, including private data.
>          """)
>
> +    GRANT_PERMISSIONS = DBItem(60, """
> +        Grant Permissions
> +
> +        Not only will the application will be able to access Launchpad

s/will the application will be/will the application be/

> +        on your behalf, it will be able to grant access to your
> +        Launchpad account to any other application. This is a very
> +        powerful level of access. You should not grant this level of
> +        access to any application except the official desktop
> +        Launchpad credential manager.

"official Launchpad desktop..." makes more sense than "official desktop
Launchpad...".

> +        """)
>
>  class AccessLevel(DBEnumeratedType):
>      """The level of access any given principal has."""
>
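For readers unfamiliar with the DBEnumeratedType/DBItem style quoted above, the access levels behave like a plain value enumeration: each level pairs a database value with a human-readable title and description. A rough stand-in using the standard library enum module (not the actual lazr.enum classes Launchpad uses; the value 60 comes from the diff, the others are purely illustrative) would be:

```python
from enum import IntEnum


class AccessLevel(IntEnum):
    """Rough stand-in for Launchpad's DBEnumeratedType access levels."""

    # Illustrative values, except GRANT_PERMISSIONS, whose DB value
    # (60) appears in the quoted diff.
    READ_PUBLIC = 10
    WRITE_PUBLIC = 20
    READ_PRIVATE = 30
    WRITE_PRIVATE = 40
    GRANT_PERMISSIONS = 60  # the new level introduced by this branch


token_level = AccessLevel.GRANT_PERMISSIONS
print(token_level.name, token_level.value)  # GRANT_PERMISSIONS 60
```

Because the levels are distinct enumeration members rather than a linear scale, code that authorizes a request can check for GRANT_PERMISSIONS explicitly instead of comparing magnitudes.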