Merge lp:~bjornt/launchpad/acl-adapter into lp:launchpad
Proposed by Björn Tillenius
Status: Rejected
Rejected by: Björn Tillenius
Proposed branch: lp:~bjornt/launchpad/acl-adapter
Merge into: lp:launchpad
Diff against target: 2399 lines (+820/-641) (has conflicts), 54 files modified:
  Makefile (+1/-1)
  cronscripts/calculate-bug-heat.py (+0/-33)
  cronscripts/publishing/maintenance-check.py (+1/-1)
  database/replication/helpers.py (+10/-4)
  database/replication/new-slave.py (+5/-0)
  database/schema/comments.sql (+3/-0)
  database/schema/fti.py (+10/-13)
  database/schema/patch-2207-60-1.sql (+10/-0)
  database/schema/patch-2207-60-2.sql (+7/-0)
  database/schema/patch-2207-61-0.sql (+13/-0)
  database/schema/patch-2207-62-0.sql (+14/-0)
  database/schema/patch-2207-63-0.sql (+7/-0)
  database/schema/security.cfg (+1/-0)
  database/schema/trusted.sql (+8/-1)
  lib/canonical/config/schema-lazr.conf (+11/-0)
  lib/canonical/configure.zcml (+1/-0)
  lib/canonical/launchpad/interfaces/_schema_circular_imports.py (+2/-0)
  lib/canonical/launchpad/scripts/garbo.py (+0/-1)
  lib/lp/bugs/browser/bugtask.py (+1/-1)
  lib/lp/bugs/browser/tests/test_bugtask.py (+9/-8)
  lib/lp/bugs/configure.zcml (+0/-12)
  lib/lp/bugs/doc/bugtask-status-workflow.txt (+11/-1)
  lib/lp/bugs/interfaces/bugjob.py (+1/-11)
  lib/lp/bugs/interfaces/bugtask.py (+11/-2)
  lib/lp/bugs/model/bug.py (+0/-1)
  lib/lp/bugs/model/bugheat.py (+0/-54)
  lib/lp/bugs/scripts/bugheat.py (+0/-108)
  lib/lp/bugs/scripts/tests/test_bugheat.py (+0/-256)
  lib/lp/bugs/tests/bugs-emailinterface.txt (+1/-1)
  lib/lp/bugs/tests/bugtarget-bugcount.txt (+2/-0)
  lib/lp/bugs/tests/test_bugheat.py (+1/-102)
  lib/lp/code/browser/sourcepackagerecipe.py (+13/-2)
  lib/lp/code/browser/tests/test_sourcepackagerecipe.py (+71/-6)
  lib/lp/code/configure.zcml (+1/-0)
  lib/lp/code/interfaces/sourcepackagerecipe.py (+4/-4)
  lib/lp/code/model/tests/test_sourcepackagerecipe.py (+6/-1)
  lib/lp/code/templates/sourcepackagerecipe-index.pt (+13/-0)
  lib/lp/registry/interfaces/person.py (+7/-1)
  lib/lp/registry/model/person.py (+3/-2)
  lib/lp/services/acl/configure.zcml (+14/-0)
  lib/lp/services/acl/interfaces.py (+59/-0)
  lib/lp/services/acl/model.py (+106/-0)
  lib/lp/services/doc/acl.txt (+205/-0)
  lib/lp/soyuz/browser/archive.py (+6/-1)
  lib/lp/soyuz/configure.zcml (+5/-0)
  lib/lp/soyuz/interfaces/archive.py (+9/-0)
  lib/lp/soyuz/model/archive.py (+39/-0)
  lib/lp/soyuz/stories/ppa/xx-ppa-workflow.txt (+10/-3)
  lib/lp/soyuz/stories/webservice/xx-archive.txt (+4/-0)
  lib/lp/soyuz/tests/test_archive.py (+32/-1)
  lib/lp/translations/doc/translations-export-to-branch.txt (+9/-1)
  lib/lp/translations/scripts/tests/test_translations_to_branch.py (+29/-0)
  lib/lp/translations/scripts/translations_to_branch.py (+12/-0)
  utilities/report-database-stats.py (+22/-8)
Text conflict in lib/lp/code/browser/tests/test_sourcepackagerecipe.py
Text conflict in lib/lp/soyuz/configure.zcml
Text conflict in lib/lp/soyuz/model/archive.py
To merge this branch: bzr merge lp:~bjornt/launchpad/acl-adapter
Related bugs: none
Reviewer: Canonical Launchpad Engineering (status: Pending)
Review via email: mp+28891@code.launchpad.net
Commit message
Description of the change
Revision history for this message
Björn Tillenius (bjornt) wrote:
Unmerged revisions
- 9494. By Björn Tillenius: Remove parent columns, since they aren't used yet.
- 9493. By Björn Tillenius: Add IACL to __all__.
- 9492. By Björn Tillenius: Add note about creating an index.
- 9491. By Björn Tillenius: Update copyright.
- 9490. By Björn Tillenius: No need for _object_id attribute.
- 9489. By Björn Tillenius: Add/improve docstrings.
- 9488. By Björn Tillenius: Granting a permission to EVERYONE removes all other ACL rows for that permission.
- 9487. By Björn Tillenius: Ensure all ACL rows can't be deleted.
- 9486. By Björn Tillenius: No need to filter on the permission in the list expansion.
- 9485. By Björn Tillenius: Require a permission to getACLItems(), since that seems to be the common case.
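Revisions 9485 and 9488 describe the two behavioural rules of the new ACL service: getACLItems() takes a required permission argument, and granting a permission to EVERYONE supersedes the individual grants for that permission. A minimal in-memory sketch of those two rules, assuming the semantics stated in the revision log (the ACL class and EVERYONE sentinel here are illustrative stand-ins, not the actual lp.services.acl implementation from the branch):

```python
# Illustrative sketch only: names mirror the branch (ACL, EVERYONE,
# getACLItems) but the storage and details are invented for clarity.

EVERYONE = object()  # sentinel principal meaning "any user"


class ACL:
    """A minimal ACL keyed by (permission, principal) pairs."""

    def __init__(self):
        self._grants = set()

    def grant(self, permission, principal):
        if principal is EVERYONE:
            # Per r9488: granting to EVERYONE removes all other
            # ACL rows for that permission.
            self._grants = set(
                (perm, prin) for perm, prin in self._grants
                if perm != permission)
        self._grants.add((permission, principal))

    def getACLItems(self, permission):
        """Return principals granted `permission`.

        Per r9485, a permission is required, since querying by
        permission is the common case.
        """
        return [prin for perm, prin in self._grants
                if perm == permission]
```

For example, granting 'launchpad.View' to two people and then to EVERYONE leaves a single EVERYONE row for that permission, while grants for other permissions are untouched.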
Preview Diff
=== modified file 'Makefile'
--- Makefile 2010-06-24 16:28:02 +0000
+++ Makefile 2010-06-30 14:22:34 +0000
@@ -249,7 +249,7 @@
bin/run -r librarian,sftp,codebrowse -i $(LPCONFIG)


-start_librarian: build
+start_librarian: compile
bin/start_librarian

stop_librarian:

=== removed file 'cronscripts/calculate-bug-heat.py'
--- cronscripts/calculate-bug-heat.py 2010-04-27 19:48:39 +0000
+++ cronscripts/calculate-bug-heat.py 1970-01-01 00:00:00 +0000
@@ -1,33 +0,0 @@
-#!/usr/bin/python -S
-#
-# Copyright 2010 Canonical Ltd. This software is licensed under the
-# GNU Affero General Public License version 3 (see the file LICENSE).
-
-# pylint: disable-msg=W0403
-
-"""Calculate bug heat."""
-
-__metaclass__ = type
-
-import _pythonpath
-
-from canonical.launchpad.webapp import errorlog
-
-from lp.services.job.runner import JobCronScript
-from lp.bugs.interfaces.bugjob import ICalculateBugHeatJobSource
-
-
-class RunCalculateBugHeat(JobCronScript):
- """Run BranchScanJob jobs."""
-
- config_name = 'calculate_bug_heat'
- source_interface = ICalculateBugHeatJobSource
-
- def main(self):
- errorlog.globalErrorUtility.configure(self.config_name)
- return super(RunCalculateBugHeat, self).main()
-
-
-if __name__ == '__main__':
- script = RunCalculateBugHeat()
- script.lock_and_run()

=== modified file 'cronscripts/publishing/maintenance-check.py'
--- cronscripts/publishing/maintenance-check.py 2010-04-23 13:43:19 +0000
+++ cronscripts/publishing/maintenance-check.py 2010-06-30 14:22:34 +0000
@@ -350,7 +350,7 @@
except:
logging.exception("can not parse line '%s'" % line)
except urllib2.HTTPError, e:
- if e.getcode() != 404:
+ if e.code != 404:
raise
sys.stderr.write("hints-file: %s gave 404 error\n" % hints_file)

=== modified file 'database/replication/helpers.py'
--- database/replication/helpers.py 2010-04-29 12:38:05 +0000
+++ database/replication/helpers.py 2010-06-30 14:22:34 +0000
@@ -44,7 +44,6 @@
('public', 'nameblacklist'),
('public', 'openidconsumerassociation'),
('public', 'openidconsumernonce'),
- ('public', 'oauthnonce'),
('public', 'codeimportmachine'),
('public', 'scriptactivity'),
('public', 'standardshipitrequest'),
@@ -71,6 +70,8 @@
# Database statistics
'public.databasetablestats',
'public.databasecpustats',
+ # Don't replicate OAuthNonce - too busy and no real gain.
+ 'public.oauthnonce',
# Ubuntu SSO database. These tables where created manually by ISD
# and the Launchpad scripts should not mess with them. Eventually
# these tables will be in a totally separate database.
@@ -353,6 +354,9 @@

A replication set must contain all tables linked by foreign key
reference to the given table, and sequences used to generate keys.
+ Tables and sequences can be added to the IGNORED_TABLES and
+ IGNORED_SEQUENCES lists for cases where we known can safely ignore
+ this restriction.

:param seeds: [(namespace, tablename), ...]

@@ -420,7 +424,8 @@
""" % sqlvalues(namespace, tablename))
for namespace, tablename in cur.fetchall():
key = (namespace, tablename)
- if key not in tables and key not in pending_tables:
+ if (key not in tables and key not in pending_tables
+ and '%s.%s' % (namespace, tablename) not in IGNORED_TABLES):
pending_tables.add(key)

# Generate the set of sequences that are linked to any of our set of
@@ -441,8 +446,9 @@
) AS whatever
WHERE seq IS NOT NULL;
""" % sqlvalues(fqn(namespace, tablename), namespace, tablename))
- for row in cur.fetchall():
- sequences.add(row[0])
+ for sequence, in cur.fetchall():
+ if sequence not in IGNORED_SEQUENCES:
+ sequences.add(sequence)

# We can't easily convert the sequence name to (namespace, name) tuples,
# so we might as well convert the tables to dot notation for consistancy.

=== modified file 'database/replication/new-slave.py'
--- database/replication/new-slave.py 2010-05-19 18:07:56 +0000
+++ database/replication/new-slave.py 2010-06-30 14:22:34 +0000
@@ -188,6 +188,9 @@

script += dedent("""\
} on error { echo 'Failed.'; exit 1; }
+
+ echo 'You may need to restart the Slony daemons now. If the first';
+ echo 'of the following syncs passes then there is no need.';
""")

full_sync = []
@@ -200,6 +203,7 @@
wait for event (
origin = @%(nickname)s, confirmed=ALL,
wait on = @%(nickname)s, timeout=0);
+ echo 'Ok. Replication syncing fine with new node.';
""" % {'nickname': nickname}))
full_sync = '\n'.join(full_sync)
script += full_sync
@@ -210,6 +214,7 @@
subscribe set (
id=%d, provider=@master_node, receiver=@new_node, forward=yes);
echo 'Waiting for subscribe to start processing.';
+ echo 'This will block on long running transactions.';
sync (id = @master_node);
wait for event (
origin = @master_node, confirmed = ALL,

=== modified file 'database/schema/comments.sql'
--- database/schema/comments.sql 2010-05-27 22:18:16 +0000
+++ database/schema/comments.sql 2010-06-30 14:22:34 +0000
@@ -1350,6 +1350,7 @@
COMMENT ON COLUMN SourcePackageRecipe.owner IS 'The person or team who can edit this recipe.';
COMMENT ON COLUMN SourcePackageRecipe.name IS 'The name of the recipe in the web/URL.';
COMMENT ON COLUMN SourcePackageRecipe.build_daily IS 'If true, this recipe should be built daily.';
+COMMENT ON COLUMN SourcePackageRecipe.is_stale IS 'True if this recipe has not been built since a branch was updated.';

COMMENT ON COLUMN SourcePackageREcipe.daily_build_archive IS 'The archive to build into for daily builds.';

@@ -1371,6 +1372,7 @@
COMMENT ON COLUMN SourcePackageRecipeBuild.date_first_dispatched IS 'The instant the build was dispatched the first time. This value will not get overridden if the build is retried.';
COMMENT ON COLUMN SourcePackageRecipeBuild.requester IS 'Who requested the build.';
COMMENT ON COLUMN SourcePackageRecipeBuild.recipe IS 'The recipe being processed.';
+COMMENT ON COLUMN SourcePackageRecipeBuild.manifest IS 'The evaluated recipe that was built.';
COMMENT ON COLUMN SourcePackageRecipeBuild.archive IS 'The archive the source package will be built in and uploaded to.';
COMMENT ON COLUMN SourcePackageRecipeBuild.pocket IS 'The pocket the source package will be built in and uploaded to.';
COMMENT ON COLUMN SourcePackageRecipeBuild.dependencies IS 'The missing build dependencies, if any.';
@@ -1965,6 +1967,7 @@
COMMENT ON COLUMN Archive.num_old_versions_published IS 'The number of versions of a package to keep published before older versions are superseded.';
COMMENT ON COLUMN Archive.relative_build_score IS 'A delta to the build score that is applied to all builds in this archive.';
COMMENT ON COLUMN Archive.external_dependencies IS 'Newline-separated list of repositories to be used to retrieve any external build dependencies when building packages in this archive, in the format: deb http[s]://[user:pass@]<host>[/path] %(series)s[-pocket] [components] The series variable is replaced with the series name of the context build. This column is specifically and only intended for OEM migration to Launchpad and should be re-examined in October 2010 to see if it is still relevant.';
+COMMENT ON COLUMN Archive.commercial IS 'Whether this archive is a commercial Archive and should appear in the Software Center.';

-- ArchiveAuthToken


=== modified file 'database/schema/fti.py'
--- database/schema/fti.py 2010-05-19 18:07:56 +0000
+++ database/schema/fti.py 2010-06-30 14:22:34 +0000
@@ -14,10 +14,10 @@
import _pythonpath

from distutils.version import LooseVersion
-import sys
import os.path
from optparse import OptionParser
-import popen2
+import subprocess
+import sys
from tempfile import NamedTemporaryFile
from textwrap import dedent
import time
@@ -319,18 +319,15 @@
cmd += ' -h %s' % lp.dbhost
if options.dbuser:
cmd += ' -U %s' % options.dbuser
- p = popen2.Popen4(cmd)
- c = p.tochild
- print >> c, "SET client_min_messages=ERROR;"
- print >> c, "CREATE SCHEMA ts2;"
- print >> c, open(tsearch2_sql_path).read().replace(
- 'public;','ts2, public;'
- )
- p.tochild.close()
- rv = p.wait()
- if rv != 0:
+ p = subprocess.Popen(
+ cmd.split(' '), stdin=subprocess.PIPE,
+ stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+ out, err = p.communicate(
+ "SET client_min_messages=ERROR; CREATE SCHEMA ts2;"
+ + open(tsearch2_sql_path).read().replace('public;','ts2, public;'))
+ if p.returncode != 0:
log.fatal('Error executing %s:', cmd)
- log.debug(p.fromchild.read())
+ log.debug(out)
sys.exit(rv)

# Create ftq helper and its sibling _ftq.

=== added file 'database/schema/patch-2207-60-1.sql'
--- database/schema/patch-2207-60-1.sql 1970-01-01 00:00:00 +0000
+++ database/schema/patch-2207-60-1.sql 2010-06-30 14:22:34 +0000
@@ -0,0 +1,10 @@
+SET client_min_messages=ERROR;
+
+CREATE INDEX archive__require_virtualized__idx
+ON Archive(require_virtualized);
+
+CREATE INDEX buildfarmjob__status__idx
+ON BuildFarmJob(status);
+
+INSERT INTO LaunchpadDatabaseRevision VALUES (2207, 60, 1);
+

=== added file 'database/schema/patch-2207-60-2.sql'
--- database/schema/patch-2207-60-2.sql 1970-01-01 00:00:00 +0000
+++ database/schema/patch-2207-60-2.sql 2010-06-30 14:22:34 +0000
@@ -0,0 +1,7 @@
+SET client_min_messages=ERROR;
+
+CREATE INDEX job__date_finished__idx ON Job(date_finished)
+WHERE date_finished IS NOT NULL;
+
+INSERT INTO LaunchpadDatabaseRevision VALUES (2207, 60, 2);
+

=== added file 'database/schema/patch-2207-61-0.sql'
--- database/schema/patch-2207-61-0.sql 1970-01-01 00:00:00 +0000
+++ database/schema/patch-2207-61-0.sql 2010-06-30 14:22:34 +0000
@@ -0,0 +1,13 @@
+-- Copyright 2010 Canonical Ltd. This software is licensed under the
+-- GNU Affero General Public License version 3 (see the file LICENSE).
+
+SET client_min_messages=ERROR;
+ALTER TABLE SourcePackageRecipe ADD COLUMN is_stale BOOLEAN NOT NULL DEFAULT TRUE;
+ALTER TABLE SourcePackageRecipeBuild ADD COLUMN manifest INTEGER REFERENCES SourcePackageRecipeData;
+
+CREATE INDEX sourcepackagerecipe__is_stale__build_daily__idx
+ON SourcepackageRecipe(is_stale, build_daily);
+
+CREATE INDEX sourcepackagerecipebuild__manifest__idx ON SourcepackageRecipeBuild(manifest);
+
+INSERT INTO LaunchpadDatabaseRevision VALUES (2207, 61, 0);

=== added file 'database/schema/patch-2207-62-0.sql'
--- database/schema/patch-2207-62-0.sql 1970-01-01 00:00:00 +0000
+++ database/schema/patch-2207-62-0.sql 2010-06-30 14:22:34 +0000
@@ -0,0 +1,14 @@
+SET client_min_messages=ERROR;
+
+-- Bug #49717
+ALTER TABLE SourcePackageRelease ALTER component SET NOT NULL;
+
+-- We are taking OAuthNonce out of replication, so we make the foreign
+-- key reference ON DELETE CASCADE so things don't explode when we
+-- shuffle the lpmain master around.
+ALTER TABLE OAuthNonce DROP CONSTRAINT oauthnonce__access_token__fk;
+ALTER TABLE OAuthNonce ADD CONSTRAINT oauthnonce__access_token__fk
+ FOREIGN KEY (access_token) REFERENCES OAuthAccessToken
+ ON DELETE CASCADE;
+
+INSERT INTO LaunchpadDatabaseRevision VALUES (2207, 62, 0);

=== added file 'database/schema/patch-2207-63-0.sql'
--- database/schema/patch-2207-63-0.sql 1970-01-01 00:00:00 +0000
+++ database/schema/patch-2207-63-0.sql 2010-06-30 14:22:34 +0000
@@ -0,0 +1,7 @@
+SET client_min_messages=ERROR;
+
+ALTER TABLE Archive
+ ADD COLUMN commercial BOOLEAN NOT NULL DEFAULT FALSE;
+CREATE INDEX archive__commercial__idx ON Archive(commercial);
+
+INSERT INTO LaunchpadDatabaseRevision VALUES (2207, 63, 0);

=== modified file 'database/schema/security.cfg'
--- database/schema/security.cfg 2010-06-18 09:49:31 +0000
+++ database/schema/security.cfg 2010-06-30 14:22:34 +0000
@@ -1848,6 +1848,7 @@
type=user
public.archive = SELECT
public.buildfarmjob = SELECT
+public.databasereplicationlag = SELECT
public.packagebuild = SELECT
public.binarypackagebuild = SELECT
public.buildqueue = SELECT

=== modified file 'database/schema/trusted.sql'
--- database/schema/trusted.sql 2010-05-28 10:36:08 +0000
+++ database/schema/trusted.sql 2010-06-30 14:22:34 +0000
@@ -144,6 +144,12 @@
LIMIT 1
""", 1).nrows() > 0
if stats_reset:
+ # The database stats have been reset. We cannot calculate
+ # deltas because we do not know when this happened. So we trash
+ # our records as they are now useless to us. We could be more
+ # sophisticated about this, but this should only happen
+ # when an admin explicitly resets the statistics or if the
+ # database is rebuilt.
plpy.notice("Stats wraparound. Purging DatabaseTableStats")
plpy.execute("DELETE FROM DatabaseTableStats")
else:
@@ -158,7 +164,8 @@
SELECT
CURRENT_TIMESTAMP AT TIME ZONE 'UTC',
schemaname, relname, seq_scan, seq_tup_read,
- idx_scan, idx_tup_fetch, n_tup_ins, n_tup_upd, n_tup_del,
+ coalesce(idx_scan, 0), coalesce(idx_tup_fetch, 0),
+ n_tup_ins, n_tup_upd, n_tup_del,
n_tup_hot_upd, n_live_tup, n_dead_tup, last_vacuum,
last_autovacuum, last_analyze, last_autoanalyze
FROM pg_catalog.pg_stat_user_tables;

=== modified file 'lib/canonical/config/schema-lazr.conf'
--- lib/canonical/config/schema-lazr.conf 2010-06-18 10:58:11 +0000
+++ lib/canonical/config/schema-lazr.conf 2010-06-30 14:22:34 +0000
@@ -1680,6 +1680,17 @@
# datatype: boolean
global_suggestions_enabled: True

+# Send out notifications for branches that have been created in the
+# database but never pushed to bzr?
+# Be careful: setting this to True on staging will trigger lots of
+# notifications because the branches it copies from production all
+# appear as unpushed. And the notifications happen on codehosting,
+# which bypasses the staging mailbox and sends out email for real.
+# XXX JeroenVermeulen 2010-06-14 bug=593522: This is needed
+# because the staging codehosting server's email isn't being
+# captured like it should.
+notify_unpushed_branches: False
+
# A different batch size for POFile:+translate pages to keep them from
# timing out.
# datatype: integer

=== modified file 'lib/canonical/configure.zcml'
--- lib/canonical/configure.zcml 2010-03-24 15:22:26 +0000
+++ lib/canonical/configure.zcml 2010-06-30 14:22:34 +0000
@@ -15,6 +15,7 @@
<include package="canonical.launchpad" file="permissions.zcml" />
<include package="canonical.launchpad.webapp" file="meta.zcml" />
<include package="lazr.restful" file="meta.zcml" />
+ <include package="lp.services.acl" />
<include package="lp.services.database" />
<include package="lp.services.inlinehelp" file="meta.zcml" />
<include package="lp.services.openid" />

=== modified file 'lib/canonical/launchpad/interfaces/_schema_circular_imports.py'
--- lib/canonical/launchpad/interfaces/_schema_circular_imports.py 2010-06-15 07:40:57 +0000
+++ lib/canonical/launchpad/interfaces/_schema_circular_imports.py 2010-06-30 14:22:34 +0000
@@ -198,6 +198,8 @@
patch_entry_return_type(IPerson, 'createRecipe', ISourcePackageRecipe)
patch_list_parameter_type(IPerson, 'createRecipe', 'distroseries',
Reference(schema=IDistroSeries))
+patch_plain_parameter_type(IPerson, 'createRecipe', 'daily_build_archive',
+ IArchive)

patch_entry_return_type(IPerson, 'getRecipe', ISourcePackageRecipe)


=== modified file 'lib/canonical/launchpad/scripts/garbo.py'
--- lib/canonical/launchpad/scripts/garbo.py 2010-06-11 07:26:03 +0000
+++ lib/canonical/launchpad/scripts/garbo.py 2010-06-30 14:22:34 +0000
@@ -33,7 +33,6 @@
from lp.bugs.interfaces.bug import IBugSet
from lp.bugs.model.bug import Bug
from lp.bugs.model.bugattachment import BugAttachment
-from lp.bugs.interfaces.bugjob import ICalculateBugHeatJobSource
from lp.bugs.model.bugnotification import BugNotification
from lp.bugs.model.bugwatch import BugWatch
from lp.bugs.scripts.checkwatches.scheduler import (

=== modified file 'lib/lp/bugs/browser/bugtask.py'
--- lib/lp/bugs/browser/bugtask.py 2010-05-25 16:45:26 +0000
+++ lib/lp/bugs/browser/bugtask.py 2010-06-30 14:22:34 +0000
@@ -2646,7 +2646,7 @@
dict(
value=term.token, title=term.title or term.token,
checked=term.value in default_values))
- return helpers.shortlist(widget_values, longest_expected=11)
+ return helpers.shortlist(widget_values, longest_expected=12)

def getStatusWidgetValues(self):
"""Return data used to render the status checkboxes."""

=== modified file 'lib/lp/bugs/browser/tests/test_bugtask.py'
--- lib/lp/bugs/browser/tests/test_bugtask.py 2010-05-25 14:50:42 +0000
+++ lib/lp/bugs/browser/tests/test_bugtask.py 2010-06-30 14:22:34 +0000
@@ -245,8 +245,8 @@
self.bug.default_bugtask, LaunchpadTestRequest())
view.initialize()
self.assertEqual(
- ['New', 'Incomplete', 'Invalid', 'Confirmed', 'In Progress',
- 'Fix Committed', 'Fix Released'],
+ ['New', 'Incomplete', 'Opinion', 'Invalid', 'Confirmed',
+ 'In Progress', 'Fix Committed', 'Fix Released'],
self.getWidgetOptionTitles(view.form_fields['status']))

def test_status_field_privileged_persons(self):
@@ -260,8 +260,9 @@
self.bug.default_bugtask, LaunchpadTestRequest())
view.initialize()
self.assertEqual(
- ['New', 'Incomplete', 'Invalid', "Won't Fix", 'Confirmed',
- 'Triaged', 'In Progress', 'Fix Committed', 'Fix Released'],
+ ['New', 'Incomplete', 'Opinion', 'Invalid', "Won't Fix",
+ 'Confirmed', 'Triaged', 'In Progress', 'Fix Committed',
+ 'Fix Released'],
self.getWidgetOptionTitles(view.form_fields['status']),
'Unexpected set of settable status options for %s'
% user.name)
@@ -278,8 +279,8 @@
self.bug.default_bugtask, LaunchpadTestRequest())
view.initialize()
self.assertEqual(
- ['New', 'Incomplete', 'Invalid', 'Confirmed', 'In Progress',
- 'Fix Committed', 'Fix Released', 'Unknown'],
+ ['New', 'Incomplete', 'Opinion', 'Invalid', 'Confirmed',
+ 'In Progress', 'Fix Committed', 'Fix Released', 'Unknown'],
self.getWidgetOptionTitles(view.form_fields['status']))

def test_status_field_bug_task_in_status_expired(self):
@@ -292,8 +293,8 @@
self.bug.default_bugtask, LaunchpadTestRequest())
view.initialize()
self.assertEqual(
- ['New', 'Incomplete', 'Invalid', 'Expired', 'Confirmed',
- 'In Progress', 'Fix Committed', 'Fix Released'],
+ ['New', 'Incomplete', 'Opinion', 'Invalid', 'Expired',
+ 'Confirmed', 'In Progress', 'Fix Committed', 'Fix Released'],
self.getWidgetOptionTitles(view.form_fields['status']))



=== modified file 'lib/lp/bugs/configure.zcml'
--- lib/lp/bugs/configure.zcml 2010-06-16 08:27:19 +0000
+++ lib/lp/bugs/configure.zcml 2010-06-30 14:22:34 +0000
@@ -969,18 +969,6 @@
factory="lp.bugs.browser.bugtarget.BugsVHostBreadcrumb"
permission="zope.Public"/>

- <!-- CalculateBugHeatJobs -->
- <class class="lp.bugs.model.bugheat.CalculateBugHeatJob">
- <allow interface="lp.bugs.interfaces.bugjob.IBugJob" />
- <allow interface="lp.bugs.interfaces.bugjob.ICalculateBugHeatJob"/>
- </class>
- <securedutility
- component="lp.bugs.model.bugheat.CalculateBugHeatJob"
- provides="lp.bugs.interfaces.bugjob.ICalculateBugHeatJobSource">
- <allow
- interface="lp.bugs.interfaces.bugjob.ICalculateBugHeatJobSource"/>
- </securedutility>
-
<!-- ProcessApportBlobJobs -->
<class class="lp.bugs.model.apportjob.ProcessApportBlobJob">
<allow interface="lp.bugs.interfaces.apportjob.IApportJob" />

=== modified file 'lib/lp/bugs/doc/bugtask-status-workflow.txt'
--- lib/lp/bugs/doc/bugtask-status-workflow.txt 2010-04-15 15:28:22 +0000
+++ lib/lp/bugs/doc/bugtask-status-workflow.txt 2010-06-30 14:22:34 +0000
@@ -145,7 +145,7 @@
>>> ubuntu_firefox_task.date_inprogress is None
True

-Marking the bug Triaged sets `date_triged`.
+Marking the bug Triaged sets `date_triaged`.

>>> print ubuntu_firefox_task.date_triaged
None
@@ -188,6 +188,16 @@

>>> ubuntu_firefox_task.transitionToStatus(
... BugTaskStatus.CONFIRMED, getUtility(ILaunchBag).user)
+ >>> ubuntu_firefox_task.date_closed is None
+ True
+
+ >>> ubuntu_firefox_task.transitionToStatus(
+ ... BugTaskStatus.OPINION, getUtility(ILaunchBag).user)
+ >>> ubuntu_firefox_task.date_closed
+ datetime.datetime...
+
+ >>> ubuntu_firefox_task.transitionToStatus(
+ ... BugTaskStatus.CONFIRMED, getUtility(ILaunchBag).user)
>>> ubuntu_firefox_task.date_inprogress is None
True
>>> ubuntu_firefox_task.transitionToStatus(

=== modified file 'lib/lp/bugs/interfaces/bugjob.py'
--- lib/lp/bugs/interfaces/bugjob.py 2010-01-22 21:44:19 +0000
+++ lib/lp/bugs/interfaces/bugjob.py 2010-06-30 14:22:34 +0000
@@ -8,8 +8,6 @@
'BugJobType',
'IBugJob',
'IBugJobSource',
- 'ICalculateBugHeatJob',
- 'ICalculateBugHeatJobSource',
]

from zope.interface import Attribute, Interface
@@ -19,7 +17,7 @@

from lazr.enum import DBEnumeratedType, DBItem
from lp.bugs.interfaces.bug import IBug
-from lp.services.job.interfaces.job import IJob, IJobSource, IRunnableJob
+from lp.services.job.interfaces.job import IJob, IJobSource


class BugJobType(DBEnumeratedType):
@@ -57,11 +55,3 @@


def create(bug):
"""Create a new IBugJob for a bug."""
-
-
-class ICalculateBugHeatJob(IRunnableJob):
- """A Job to calculate bug heat."""
-
-
-class ICalculateBugHeatJobSource(IBugJobSource):
- """Interface for acquiring CalculateBugHeatJobs."""
=== modified file 'lib/lp/bugs/interfaces/bugtask.py'
--- lib/lp/bugs/interfaces/bugtask.py 2010-06-24 17:09:14 +0000
+++ lib/lp/bugs/interfaces/bugtask.py 2010-06-30 14:22:34 +0000
@@ -159,6 +159,14 @@
the user was visiting when the bug occurred, etc.
""")

+ OPINION = DBItem(16, """
+ Opinion
+
+ The bug remains open for discussion only. This status is usually
+ used where there is disagreement over whether the bug is relevant
+ to the current target and whether it should be fixed.
+ """)
+
INVALID = DBItem(17, """
Invalid

@@ -235,8 +243,8 @@

sort_order = (
'NEW', 'INCOMPLETE_WITH_RESPONSE', 'INCOMPLETE_WITHOUT_RESPONSE',
- 'INCOMPLETE', 'INVALID', 'WONTFIX', 'EXPIRED', 'CONFIRMED', 'TRIAGED',
- 'INPROGRESS', 'FIXCOMMITTED', 'FIXRELEASED')
+ 'INCOMPLETE', 'OPINION', 'INVALID', 'WONTFIX', 'EXPIRED',
+ 'CONFIRMED', 'TRIAGED', 'INPROGRESS', 'FIXCOMMITTED', 'FIXRELEASED')

INCOMPLETE_WITH_RESPONSE = DBItem(35, """
Incomplete (with response)
@@ -312,6 +320,7 @@

RESOLVED_BUGTASK_STATUSES = (
BugTaskStatus.FIXRELEASED,
+ BugTaskStatus.OPINION,
BugTaskStatus.INVALID,
BugTaskStatus.WONTFIX,
BugTaskStatus.EXPIRED)

=== modified file 'lib/lp/bugs/model/bug.py'
--- lib/lp/bugs/model/bug.py 2010-06-16 08:20:23 +0000
+++ lib/lp/bugs/model/bug.py 2010-06-30 14:22:34 +0000
@@ -83,7 +83,6 @@
from lp.bugs.interfaces.bugtracker import BugTrackerType
from lp.bugs.interfaces.bugwatch import IBugWatchSet
from lp.bugs.interfaces.cve import ICveSet
-from lp.bugs.scripts.bugheat import BugHeatConstants
from lp.bugs.model.bugattachment import BugAttachment
from lp.bugs.model.bugbranch import BugBranch
from lp.bugs.model.bugcve import BugCve
593 | === removed file 'lib/lp/bugs/model/bugheat.py' |
594 | --- lib/lp/bugs/model/bugheat.py 2010-01-21 20:46:03 +0000 |
595 | +++ lib/lp/bugs/model/bugheat.py 1970-01-01 00:00:00 +0000 |
596 | @@ -1,54 +0,0 @@ |
597 | -# Copyright 2010 Canonical Ltd. This software is licensed under the |
598 | -# GNU Affero General Public License version 3 (see the file LICENSE). |
599 | - |
600 | -"""Job classes related to BugJobs are in here.""" |
601 | - |
602 | -__metaclass__ = type |
603 | -__all__ = [ |
604 | - 'CalculateBugHeatJob', |
605 | - ] |
606 | - |
607 | -from zope.component import getUtility |
608 | -from zope.interface import classProvides, implements |
609 | - |
610 | -from canonical.launchpad.webapp.interfaces import ( |
611 | - DEFAULT_FLAVOR, IStoreSelector, MAIN_STORE) |
612 | - |
613 | -from lp.bugs.interfaces.bugjob import ( |
614 | - BugJobType, ICalculateBugHeatJob, ICalculateBugHeatJobSource) |
615 | -from lp.bugs.model.bugjob import BugJob, BugJobDerived |
616 | -from lp.bugs.scripts.bugheat import BugHeatCalculator |
617 | -from lp.services.job.model.job import Job |
618 | - |
619 | - |
620 | -class CalculateBugHeatJob(BugJobDerived): |
621 | - """A Job to calculate bug heat.""" |
622 | - implements(ICalculateBugHeatJob) |
623 | - |
624 | - class_job_type = BugJobType.UPDATE_HEAT |
625 | - classProvides(ICalculateBugHeatJobSource) |
626 | - |
627 | - def run(self): |
628 | - """See `IRunnableJob`.""" |
629 | - calculator = BugHeatCalculator(self.bug) |
630 | - calculated_heat = calculator.getBugHeat() |
631 | - self.bug.setHeat(calculated_heat) |
632 | - |
633 | - @classmethod |
634 | - def create(cls, bug): |
635 | - """See `ICalculateBugHeatJobSource`.""" |
636 | - # If there's already a job for the bug, don't create a new one. |
637 | - store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
638 | - job_for_bug = store.find( |
639 | - BugJob, |
640 | - BugJob.bug == bug, |
641 | - BugJob.job_type == cls.class_job_type, |
642 | - BugJob.job == Job.id, |
643 | - Job.id.is_in(Job.ready_jobs) |
644 | - ).any() |
645 | - |
646 | - if job_for_bug is not None: |
647 | - return cls(job_for_bug) |
648 | - else: |
649 | - return super(CalculateBugHeatJob, cls).create(bug) |
650 | - |
651 | |
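The removed `CalculateBugHeatJob.create()` above avoids queueing duplicate work: before creating a job it looks for an existing ready job for the same bug and job type and returns that instead. A minimal standalone sketch of that create-or-reuse pattern, using a hypothetical in-memory queue rather than the Storm store query shown in the diff:

```python
class JobQueue:
    """Toy queue illustrating the create-or-reuse guard above."""

    def __init__(self):
        self._ready = {}  # (bug_id, job_type) -> job dict

    def create(self, bug_id, job_type):
        # If a ready job already exists for this bug and type,
        # return it instead of enqueuing a duplicate.
        key = (bug_id, job_type)
        existing = self._ready.get(key)
        if existing is not None:
            return existing
        job = {'bug_id': bug_id, 'job_type': job_type}
        self._ready[key] = job
        return job


queue = JobQueue()
first = queue.create(1, 'UPDATE_HEAT')
second = queue.create(1, 'UPDATE_HEAT')
assert first is second  # the second create() reuses the pending job
```

The real implementation does the same lookup with a `store.find()` restricted to `Job.ready_jobs`, so completed jobs do not block new ones.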
652 | === removed file 'lib/lp/bugs/scripts/bugheat.py' |
653 | --- lib/lp/bugs/scripts/bugheat.py 2010-04-29 11:31:49 +0000 |
654 | +++ lib/lp/bugs/scripts/bugheat.py 1970-01-01 00:00:00 +0000 |
655 | @@ -1,108 +0,0 @@ |
656 | -# Copyright 2010 Canonical Ltd. This software is licensed under the |
657 | -# GNU Affero General Public License version 3 (see the file LICENSE). |
658 | - |
659 | -"""The innards of the Bug Heat cronscript.""" |
660 | - |
661 | -__metaclass__ = type |
662 | -__all__ = [ |
663 | - 'BugHeatCalculator', |
664 | - 'BugHeatConstants', |
665 | - ] |
666 | - |
667 | -from datetime import datetime |
668 | - |
669 | -from lp.bugs.interfaces.bugtask import RESOLVED_BUGTASK_STATUSES |
670 | - |
671 | -class BugHeatConstants: |
672 | - |
673 | - PRIVACY = 150 |
674 | - SECURITY = 250 |
675 | - DUPLICATE = 6 |
676 | - AFFECTED_USER = 4 |
677 | - SUBSCRIBER = 2 |
678 | - |
679 | - |
680 | -class BugHeatCalculator: |
681 | - """A class to calculate the heat for a bug.""" |
682 | - # If you change the way that bug heat is calculated, remember to update |
683 | - # the description of how it is calculated at |
684 | - # /lib/lp/bugs/help/bug-heat.html and |
685 | - # https://help.launchpad.net/Bugs/BugHeat |
686 | - |
687 | - def __init__(self, bug): |
688 | - self.bug = bug |
689 | - |
690 | - def _getHeatFromPrivacy(self): |
691 | - """Return the heat generated by the bug's `private` attribute.""" |
692 | - if self.bug.private: |
693 | - return BugHeatConstants.PRIVACY |
694 | - else: |
695 | - return 0 |
696 | - |
697 | - def _getHeatFromSecurity(self): |
698 | - """Return the heat generated if the bug is security related.""" |
699 | - if self.bug.security_related: |
700 | - return BugHeatConstants.SECURITY |
701 | - else: |
702 | - return 0 |
703 | - |
704 | - def _getHeatFromDuplicates(self): |
705 | - """Return the heat generated by the bug's duplicates.""" |
706 | - return self.bug.duplicates.count() * BugHeatConstants.DUPLICATE |
707 | - |
708 | - def _getHeatFromAffectedUsers(self): |
709 | - """Return the heat generated by the bug's affected users.""" |
710 | - return ( |
711 | - self.bug.users_affected_count_with_dupes * |
712 | - BugHeatConstants.AFFECTED_USER) |
713 | - |
714 | - def _getHeatFromSubscribers(self): |
715 | - """Return the heat generated by the bug's subscribers.""" |
716 | - direct_subscribers = self.bug.getDirectSubscribers() |
717 | - subscribers_from_dupes = self.bug.getSubscribersFromDuplicates() |
718 | - |
719 | - subscriber_count = ( |
720 | - len(direct_subscribers) + len(subscribers_from_dupes)) |
721 | - return subscriber_count * BugHeatConstants.SUBSCRIBER |
722 | - |
723 | - def _bugIsComplete(self): |
724 | - """Are all the tasks for this bug resolved?""" |
725 | - return all([(task.status in RESOLVED_BUGTASK_STATUSES) |
726 | - for task in self.bug.bugtasks]) |
727 | - |
728 | - def getBugHeat(self): |
729 | - """Return the total heat for the current bug.""" |
730 | - if self._bugIsComplete(): |
731 | - return 0 |
732 | - |
733 | - total_heat = sum([ |
734 | - self._getHeatFromAffectedUsers(), |
735 | - self._getHeatFromDuplicates(), |
736 | - self._getHeatFromPrivacy(), |
737 | - self._getHeatFromSecurity(), |
738 | - self._getHeatFromSubscribers(), |
739 | - ]) |
740 | - |
741 | - # Bugs decay over time. Every day the bug isn't touched its heat |
742 | - # decreases by 1%. |
743 | - days = ( |
744 | - datetime.utcnow() - |
745 | - self.bug.date_last_updated.replace(tzinfo=None)).days |
746 | - total_heat = int(total_heat * (0.99 ** days)) |
747 | - |
748 | - if days > 0: |
749 | - # Bug heat increases by a quarter of the maximum bug heat divided |
750 | - # by the number of days since the bug's creation date. |
751 | - days_since_last_activity = ( |
752 | - datetime.utcnow() - |
753 | - max(self.bug.date_last_updated.replace(tzinfo=None), |
754 | - self.bug.date_last_message.replace(tzinfo=None))).days |
755 | - days_since_created = ( |
756 | - datetime.utcnow() - self.bug.datecreated.replace(tzinfo=None)).days |
757 | - max_heat = max( |
758 | - task.target.max_bug_heat for task in self.bug.bugtasks) |
759 | - if max_heat is not None and days_since_created > 0: |
760 | - total_heat = total_heat + (max_heat * 0.25 / days_since_created) |
761 | - |
762 | - return int(total_heat) |
763 | - |
764 | |
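The calculator deleted above combines fixed per-signal weights and then decays the result by 1% per untouched day. A compressed sketch of that arithmetic, with the weights copied from the removed `BugHeatConstants` (the activity bonus based on the target's `max_bug_heat` is omitted here for brevity):

```python
# Assumed weights, taken from the BugHeatConstants class removed above.
PRIVACY, SECURITY, DUPLICATE, AFFECTED_USER, SUBSCRIBER = 150, 250, 6, 4, 2


def bug_heat(private, security_related, n_dupes, n_affected, n_subscribers,
             days_since_update):
    """Sum the weighted signals, then apply the 1%-per-day decay."""
    total = (
        (PRIVACY if private else 0) +
        (SECURITY if security_related else 0) +
        n_dupes * DUPLICATE +
        n_affected * AFFECTED_USER +
        n_subscribers * SUBSCRIBER)
    # Untouched bugs lose 1% of their heat for every day of inactivity.
    return int(total * 0.99 ** days_since_update)


# A fresh public bug with one affected user and one subscriber (the filer):
print(bug_heat(False, False, 0, 1, 1, 0))  # 6
# The same bug made private and security-related, with one duplicate:
print(bug_heat(True, True, 1, 1, 1, 0))    # 412
```

This matches the baseline the removed tests assert: `AFFECTED_USER + SUBSCRIBER` for a newly filed bug, and zero for bugs whose tasks are all in a resolved status (a case the sketch does not model).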
765 | === removed file 'lib/lp/bugs/scripts/tests/test_bugheat.py' |
766 | --- lib/lp/bugs/scripts/tests/test_bugheat.py 2010-04-29 11:31:49 +0000 |
767 | +++ lib/lp/bugs/scripts/tests/test_bugheat.py 1970-01-01 00:00:00 +0000 |
768 | @@ -1,256 +0,0 @@ |
769 | -# Copyright 2010 Canonical Ltd. This software is licensed under the |
770 | -# GNU Affero General Public License version 3 (see the file LICENSE). |
771 | - |
772 | -"""Module docstring goes here.""" |
773 | - |
774 | -__metaclass__ = type |
775 | - |
776 | -import unittest |
777 | - |
778 | -from datetime import datetime, timedelta |
779 | - |
780 | -from canonical.testing import LaunchpadZopelessLayer |
781 | - |
782 | -from lp.bugs.interfaces.bugtask import BugTaskStatus |
783 | -from lp.bugs.scripts.bugheat import BugHeatCalculator, BugHeatConstants |
784 | -from lp.testing import TestCaseWithFactory |
785 | - |
786 | -from zope.security.proxy import removeSecurityProxy |
787 | - |
788 | - |
789 | -class TestBugHeatCalculator(TestCaseWithFactory): |
790 | - """Tests for the BugHeatCalculator class.""" |
791 | - # If you change the way that bug heat is calculated, remember to update |
792 | - # the description of how it is calculated at |
793 | - # /lib/lp/bugs/help/bug-heat.html and |
794 | - # https://help.launchpad.net/Bugs/BugHeat |
795 | - |
796 | - layer = LaunchpadZopelessLayer |
797 | - |
798 | - def setUp(self): |
799 | - super(TestBugHeatCalculator, self).setUp() |
800 | - self.bug = self.factory.makeBug() |
801 | - self.calculator = BugHeatCalculator(self.bug) |
802 | - |
803 | - def test__getHeatFromDuplicates(self): |
804 | - # BugHeatCalculator._getHeatFromDuplicates() returns the bug |
805 | - # heat generated by duplicates of a bug. |
806 | - # By default, the bug has no heat from dupes |
807 | - self.assertEqual(0, self.calculator._getHeatFromDuplicates()) |
808 | - |
809 | - # If adding duplicates, the heat generated by them will be n * |
810 | - # BugHeatConstants.DUPLICATE, where n is the number of |
811 | - # duplicates. |
812 | - for i in range(5): |
813 | - dupe = self.factory.makeBug() |
814 | - dupe.duplicateof = self.bug |
815 | - |
816 | - expected_heat = BugHeatConstants.DUPLICATE * 5 |
817 | - actual_heat = self.calculator._getHeatFromDuplicates() |
818 | - self.assertEqual( |
819 | - expected_heat, actual_heat, |
820 | - "Heat from duplicates does not match expected heat. " |
821 | - "Expected %s, got %s" % (expected_heat, actual_heat)) |
822 | - |
823 | - def test__getHeatFromAffectedUsers(self): |
824 | - # BugHeatCalculator._getHeatFromAffectedUsers() returns the bug |
825 | - # heat generated by users affected by the bug and by duplicate bugs. |
826 | - # By default, the heat will be BugHeatConstants.AFFECTED_USER, since |
827 | - # there will be one affected user (the user who filed the bug). |
828 | - self.assertEqual( |
829 | - BugHeatConstants.AFFECTED_USER, |
830 | - self.calculator._getHeatFromAffectedUsers()) |
831 | - |
832 | - # As the number of affected users increases, the heat generated |
833 | - # will be n * BugHeatConstants.AFFECTED_USER, where n is the number |
834 | - # of affected users. |
835 | - for i in range(5): |
836 | - person = self.factory.makePerson() |
837 | - self.bug.markUserAffected(person) |
838 | - |
839 | - expected_heat = BugHeatConstants.AFFECTED_USER * 6 |
840 | - actual_heat = self.calculator._getHeatFromAffectedUsers() |
841 | - self.assertEqual( |
842 | - expected_heat, actual_heat, |
843 | - "Heat from affected users does not match expected heat. " |
844 | - "Expected %s, got %s" % (expected_heat, actual_heat)) |
845 | - |
846 | - # When our bug has duplicates, users affected by these duplicates |
847 | - # are included in _getHeatFromAffectedUsers() of the main bug. |
848 | - for i in range(3): |
849 | - dupe = self.factory.makeBug() |
850 | - dupe.duplicateof = self.bug |
851 | - # Each bug reporter is by default also marked as being affected |
852 | - # by the bug, so we have three additional affected users. |
853 | - expected_heat += BugHeatConstants.AFFECTED_USER * 3 |
854 | - |
855 | - person = self.factory.makePerson() |
856 | - dupe.markUserAffected(person) |
857 | - expected_heat += BugHeatConstants.AFFECTED_USER |
858 | - actual_heat = self.calculator._getHeatFromAffectedUsers() |
859 | - self.assertEqual( |
860 | - expected_heat, actual_heat, |
861 | - "Heat from users affected by duplicate bugs does not match " |
862 | - "expected heat. Expected %s, got %s" |
863 | - % (expected_heat, actual_heat)) |
864 | - |
865 | - def test__getHeatFromSubscribers(self): |
866 | - # BugHeatCalculator._getHeatFromSubscribers() returns the bug |
867 | - # heat generated by users subscribed tothe bug. |
868 | - # By default, the heat will be BugHeatConstants.SUBSCRIBER, |
869 | - # since there will be one direct subscriber (the user who filed |
870 | - # the bug). |
871 | - self.assertEqual( |
872 | - BugHeatConstants.SUBSCRIBER, |
873 | - self.calculator._getHeatFromSubscribers()) |
874 | - |
875 | - # As the number of subscribers increases, the heat generated |
876 | - # will be n * BugHeatConstants.SUBSCRIBER, where n is the number |
877 | - # of subscribers. |
878 | - for i in range(5): |
879 | - person = self.factory.makePerson() |
880 | - self.bug.subscribe(person, person) |
881 | - |
882 | - expected_heat = BugHeatConstants.SUBSCRIBER * 6 |
883 | - actual_heat = self.calculator._getHeatFromSubscribers() |
884 | - self.assertEqual( |
885 | - expected_heat, actual_heat, |
886 | - "Heat from subscribers does not match expected heat. " |
887 | - "Expected %s, got %s" % (expected_heat, actual_heat)) |
888 | - |
889 | - # Subscribers from duplicates are included in the heat returned |
890 | - # by _getHeatFromSubscribers() |
891 | - dupe = self.factory.makeBug() |
892 | - dupe.duplicateof = self.bug |
893 | - expected_heat = BugHeatConstants.SUBSCRIBER * 7 |
894 | - actual_heat = self.calculator._getHeatFromSubscribers() |
895 | - self.assertEqual( |
896 | - expected_heat, actual_heat, |
897 | - "Heat from subscribers (including duplicate-subscribers) " |
898 | - "does not match expected heat. Expected %s, got %s" % |
899 | - (expected_heat, actual_heat)) |
900 | - |
901 | - # Seting the bug to private will increase its heat from |
902 | - # subscribers by 1 * BugHeatConstants.SUBSCRIBER, as the project |
903 | - # owner will now be directly subscribed to it. |
904 | - self.bug.setPrivate(True, self.bug.owner) |
905 | - expected_heat = BugHeatConstants.SUBSCRIBER * 8 |
906 | - actual_heat = self.calculator._getHeatFromSubscribers() |
907 | - self.assertEqual( |
908 | - expected_heat, actual_heat, |
909 | - "Heat from subscribers to private bug does not match expected " |
910 | - "heat. Expected %s, got %s" % (expected_heat, actual_heat)) |
911 | - |
912 | - def test__getHeatFromPrivacy(self): |
913 | - # BugHeatCalculator._getHeatFromPrivacy() returns the heat |
914 | - # generated by the bug's private attribute. If the bug is |
915 | - # public, this will be 0. |
916 | - self.assertEqual(0, self.calculator._getHeatFromPrivacy()) |
917 | - |
918 | - # However, if the bug is private, _getHeatFromPrivacy() will |
919 | - # return BugHeatConstants.PRIVACY. |
920 | - self.bug.setPrivate(True, self.bug.owner) |
921 | - self.assertEqual( |
922 | - BugHeatConstants.PRIVACY, self.calculator._getHeatFromPrivacy()) |
923 | - |
924 | - def test__getHeatFromSecurity(self): |
925 | - # BugHeatCalculator._getHeatFromSecurity() returns the heat |
926 | - # generated by the bug's security_related attribute. If the bug |
927 | - # is not security related, _getHeatFromSecurity() will return 0. |
928 | - self.assertEqual(0, self.calculator._getHeatFromPrivacy()) |
929 | - |
930 | - |
931 | - # If, on the other hand, the bug is security_related, |
932 | - # _getHeatFromSecurity() will return BugHeatConstants.SECURITY |
933 | - self.bug.setSecurityRelated(True) |
934 | - self.assertEqual( |
935 | - BugHeatConstants.SECURITY, self.calculator._getHeatFromSecurity()) |
936 | - |
937 | - def test_getBugHeat(self): |
938 | - # BugHeatCalculator.getBugHeat() returns the total heat for a |
939 | - # given bug as the sum of the results of all _getHeatFrom*() |
940 | - # methods. |
941 | - # By default this will be (BugHeatConstants.AFFECTED_USER + |
942 | - # BugHeatConstants.SUBSCRIBER) since there will be one |
943 | - # subscriber and one affected user only. |
944 | - expected_heat = ( |
945 | - BugHeatConstants.AFFECTED_USER + BugHeatConstants.SUBSCRIBER) |
946 | - actual_heat = self.calculator.getBugHeat() |
947 | - self.assertEqual( |
948 | - expected_heat, actual_heat, |
949 | - "Expected bug heat did not match actual bug heat. " |
950 | - "Expected %s, got %s" % (expected_heat, actual_heat)) |
951 | - |
952 | - # Adding a duplicate and making the bug private and security |
953 | - # related will increase its heat. |
954 | - dupe = self.factory.makeBug() |
955 | - dupe.duplicateof = self.bug |
956 | - self.bug.setPrivate(True, self.bug.owner) |
957 | - self.bug.setSecurityRelated(True) |
958 | - |
959 | - expected_heat += ( |
960 | - BugHeatConstants.DUPLICATE + |
961 | - BugHeatConstants.PRIVACY + |
962 | - BugHeatConstants.SECURITY + |
963 | - BugHeatConstants.AFFECTED_USER |
964 | - ) |
965 | - |
966 | - # Adding the duplicate and making the bug private means it gets |
967 | - # two new subscribers, the project owner and the duplicate's |
968 | - # direct subscriber. |
969 | - expected_heat += BugHeatConstants.SUBSCRIBER * 2 |
970 | - actual_heat = self.calculator.getBugHeat() |
971 | - self.assertEqual( |
972 | - expected_heat, actual_heat, |
973 | - "Expected bug heat did not match actual bug heat. " |
974 | - "Expected %s, got %s" % (expected_heat, actual_heat)) |
975 | - |
976 | - def test_getBugHeat_complete_bugs(self): |
977 | - # Bug which are in a resolved status don't have heat at all. |
978 | - complete_bug = self.factory.makeBug() |
979 | - heat = BugHeatCalculator(complete_bug).getBugHeat() |
980 | - self.assertNotEqual( |
981 | - 0, heat, |
982 | - "Expected bug heat did not match actual bug heat. " |
983 | - "Expected a positive value, got 0") |
984 | - complete_bug.bugtasks[0].transitionToStatus( |
985 | - BugTaskStatus.INVALID, complete_bug.owner) |
986 | - heat = BugHeatCalculator(complete_bug).getBugHeat() |
987 | - self.assertEqual( |
988 | - 0, heat, |
989 | - "Expected bug heat did not match actual bug heat. " |
990 | - "Expected %s, got %s" % (0, heat)) |
991 | - |
992 | - def test_getBugHeat_decay(self): |
993 | - # Every day, a bug that wasn't touched has its heat reduced by 1%. |
994 | - aging_bug = self.factory.makeBug() |
995 | - fresh_heat = BugHeatCalculator(aging_bug).getBugHeat() |
996 | - aging_bug.date_last_updated = ( |
997 | - aging_bug.date_last_updated - timedelta(days=1)) |
998 | - expected = int(fresh_heat * 0.99) |
999 | - heat = BugHeatCalculator(aging_bug).getBugHeat() |
1000 | - self.assertEqual( |
1001 | - expected, heat, |
1002 | - "Expected bug heat did not match actual bug heat. " |
1003 | - "Expected %s, got %s" % (expected, heat)) |
1004 | - |
1005 | - def test_getBugHeat_activity(self): |
1006 | - # Bug heat increases by a quarter of the maximum bug heat divided by |
1007 | - # the number of days between the bug's creating and its last activity. |
1008 | - active_bug = removeSecurityProxy(self.factory.makeBug()) |
1009 | - fresh_heat = BugHeatCalculator(active_bug).getBugHeat() |
1010 | - active_bug.date_last_updated = ( |
1011 | - active_bug.date_last_updated - timedelta(days=10)) |
1012 | - active_bug.datecreated = (active_bug.datecreated - timedelta(days=20)) |
1013 | - active_bug.default_bugtask.target.setMaxBugHeat(100) |
1014 | - expected = int((fresh_heat * (0.99 ** 20)) + (100 * 0.25 / 20)) |
1015 | - heat = BugHeatCalculator(active_bug).getBugHeat() |
1016 | - self.assertEqual( |
1017 | - expected, heat, |
1018 | - "Expected bug heat did not match actual bug heat. " |
1019 | - "Expected %s, got %s" % (expected, heat)) |
1020 | - |
1021 | - |
1022 | - |
1023 | -def test_suite(): |
1024 | - return unittest.TestLoader().loadTestsFromName(__name__) |
1025 | |
1026 | === modified file 'lib/lp/bugs/tests/bugs-emailinterface.txt' |
1027 | --- lib/lp/bugs/tests/bugs-emailinterface.txt 2010-05-27 13:51:06 +0000 |
1028 | +++ lib/lp/bugs/tests/bugs-emailinterface.txt 2010-06-30 14:22:34 +0000 |
1029 | @@ -1435,7 +1435,7 @@ |
1030 | status foo |
1031 | ... |
1032 | The 'status' command expects any of the following arguments: |
1033 | - new, incomplete, invalid, wontfix, expired, confirmed, triaged, inprogress, fixcommitted, fixreleased |
1034 | + new, incomplete, opinion, invalid, wontfix, expired, confirmed, triaged, inprogress, fixcommitted, fixreleased |
1035 | <BLANKLINE> |
1036 | For example: |
1037 | <BLANKLINE> |
1038 | |
1039 | === modified file 'lib/lp/bugs/tests/bugtarget-bugcount.txt' |
1040 | --- lib/lp/bugs/tests/bugtarget-bugcount.txt 2010-04-15 13:26:33 +0000 |
1041 | +++ lib/lp/bugs/tests/bugtarget-bugcount.txt 2010-06-30 14:22:34 +0000 |
1042 | @@ -11,6 +11,7 @@ |
1043 | ... print status.name |
1044 | NEW |
1045 | INCOMPLETE |
1046 | + OPINION |
1047 | INVALID |
1048 | WONTFIX |
1049 | EXPIRED |
1050 | @@ -54,6 +55,7 @@ |
1051 | ... print_count_difference(new_bug_counts, old_counts, status) |
1052 | NEW: 5 bug(s) more |
1053 | INCOMPLETE: 5 bug(s) more |
1054 | + OPINION: 5 bug(s) more |
1055 | INVALID: 5 bug(s) more |
1056 | WONTFIX: 5 bug(s) more |
1057 | EXPIRED: 5 bug(s) more |
1058 | |
1059 | === modified file 'lib/lp/bugs/tests/test_bugheat.py' |
1060 | --- lib/lp/bugs/tests/test_bugheat.py 2010-05-27 13:56:03 +0000 |
1061 | +++ lib/lp/bugs/tests/test_bugheat.py 2010-06-30 14:22:34 +0000 |
1062 | @@ -5,114 +5,13 @@ |
1063 | |
1064 | __metaclass__ = type |
1065 | |
1066 | -import pytz |
1067 | -import transaction |
1068 | import unittest |
1069 | -from datetime import datetime |
1070 | - |
1071 | -from zope.component import getUtility |
1072 | - |
1073 | -from canonical.launchpad.scripts.tests import run_script |
1074 | + |
1075 | from canonical.testing import LaunchpadZopelessLayer |
1076 | |
1077 | -from lp.bugs.adapters.bugchange import BugDescriptionChange |
1078 | -from lp.bugs.interfaces.bugjob import ICalculateBugHeatJobSource |
1079 | -from lp.bugs.model.bugheat import CalculateBugHeatJob |
1080 | -from lp.bugs.scripts.bugheat import BugHeatCalculator |
1081 | -from lp.testing import TestCaseWithFactory |
1082 | from lp.testing.factory import LaunchpadObjectFactory |
1083 | |
1084 | |
1085 | -class CalculateBugHeatJobTestCase(TestCaseWithFactory): |
1086 | - """Test case for CalculateBugHeatJob.""" |
1087 | - |
1088 | - layer = LaunchpadZopelessLayer |
1089 | - |
1090 | - def setUp(self): |
1091 | - super(CalculateBugHeatJobTestCase, self).setUp() |
1092 | - self.bug = self.factory.makeBug() |
1093 | - |
1094 | - # NB: This looks like it should go in the teardown, however |
1095 | - # creating the bug causes a job to be added for it. We clear |
1096 | - # this out so that our tests are consistent. |
1097 | - self._completeJobsAndAssertQueueEmpty() |
1098 | - |
1099 | - def _completeJobsAndAssertQueueEmpty(self): |
1100 | - """Make sure that all the CalculateBugHeatJobs are completed.""" |
1101 | - for bug_job in getUtility(ICalculateBugHeatJobSource).iterReady(): |
1102 | - bug_job.job.start() |
1103 | - bug_job.job.complete() |
1104 | - self.assertEqual(0, self._getJobCount()) |
1105 | - |
1106 | - def _getJobCount(self): |
1107 | - """Return the number of CalculateBugHeatJobs in the queue.""" |
1108 | - return len(self._getJobs()) |
1109 | - |
1110 | - def _getJobs(self): |
1111 | - """Return the pending CalculateBugHeatJobs as a list.""" |
1112 | - return list(CalculateBugHeatJob.iterReady()) |
1113 | - |
1114 | - def test_run(self): |
1115 | - # CalculateBugHeatJob.run() sets calculates and sets the heat |
1116 | - # for a bug. |
1117 | - job = CalculateBugHeatJob.create(self.bug) |
1118 | - bug_heat_calculator = BugHeatCalculator(self.bug) |
1119 | - |
1120 | - job.run() |
1121 | - self.assertEqual( |
1122 | - bug_heat_calculator.getBugHeat(), self.bug.heat) |
1123 | - |
1124 | - def test_utility(self): |
1125 | - # CalculateBugHeatJobSource is a utility for acquiring |
1126 | - # CalculateBugHeatJobs. |
1127 | - utility = getUtility(ICalculateBugHeatJobSource) |
1128 | - self.assertTrue( |
1129 | - ICalculateBugHeatJobSource.providedBy(utility)) |
1130 | - |
1131 | - def test_create_only_creates_one(self): |
1132 | - # If there's already a CalculateBugHeatJob for a bug, |
1133 | - # CalculateBugHeatJob.create() won't create a new one. |
1134 | - job = CalculateBugHeatJob.create(self.bug) |
1135 | - |
1136 | - # There will now be one job in the queue. |
1137 | - self.assertEqual(1, self._getJobCount()) |
1138 | - |
1139 | - new_job = CalculateBugHeatJob.create(self.bug) |
1140 | - |
1141 | - # The two jobs will in fact be the same job. |
1142 | - self.assertEqual(job, new_job) |
1143 | - |
1144 | - # And the queue will still have a length of 1. |
1145 | - self.assertEqual(1, self._getJobCount()) |
1146 | - |
1147 | - def test_cronscript_succeeds(self): |
1148 | - # The calculate-bug-heat cronscript will run all pending |
1149 | - # CalculateBugHeatJobs. |
1150 | - CalculateBugHeatJob.create(self.bug) |
1151 | - transaction.commit() |
1152 | - |
1153 | - retcode, stdout, stderr = run_script( |
1154 | - 'cronscripts/calculate-bug-heat.py', [], |
1155 | - expect_returncode=0) |
1156 | - self.assertEqual('', stdout) |
1157 | - self.assertIn( |
1158 | - 'INFO Ran 1 CalculateBugHeatJob jobs.\n', stderr) |
1159 | - |
1160 | - def test_getOopsVars(self): |
1161 | - # BugJobDerived.getOopsVars() returns the variables to be used |
1162 | - # when logging an OOPS for a bug job. We test this using |
1163 | - # CalculateBugHeatJob because BugJobDerived doesn't let us |
1164 | - # create() jobs. |
1165 | - job = CalculateBugHeatJob.create(self.bug) |
1166 | - vars = job.getOopsVars() |
1167 | - |
1168 | - # The Bug ID, BugJob ID and BugJob type will be returned by |
1169 | - # getOopsVars(). |
1170 | - self.assertIn(('bug_id', self.bug.id), vars) |
1171 | - self.assertIn(('bug_job_id', job.context.id), vars) |
1172 | - self.assertIn(('bug_job_type', job.context.job_type.title), vars) |
1173 | - |
1174 | - |
1175 | class MaxHeatByTargetBase: |
1176 | """Base class for testing a bug target's max_bug_heat attribute.""" |
1177 | |
1178 | |
1179 | === modified file 'lib/lp/code/browser/sourcepackagerecipe.py' |
1180 | --- lib/lp/code/browser/sourcepackagerecipe.py 2010-06-16 13:04:12 +0000 |
1181 | +++ lib/lp/code/browser/sourcepackagerecipe.py 2010-06-30 14:22:34 +0000 |
1182 | @@ -314,7 +314,10 @@ |
1183 | 'name', |
1184 | 'description', |
1185 | 'owner', |
1186 | + 'build_daily' |
1187 | ]) |
1188 | + daily_build_archive = Choice(vocabulary='TargetPPAs', |
1189 | + title=u'Daily build archive') |
1190 | distros = List( |
1191 | Choice(vocabulary='BuildableDistroSeries'), |
1192 | title=u'Default Distribution series') |
1193 | @@ -323,10 +326,16 @@ |
1194 | description=u'The text of the recipe.') |
1195 | |
1196 | |
1197 | + |
1198 | class RecipeTextValidatorMixin: |
1199 | """Class to validate that the Source Package Recipe text is valid.""" |
1200 | |
1201 | def validate(self, data): |
1202 | + if data['build_daily']: |
1203 | + if len(data['distros']) == 0: |
1204 | + self.setFieldError( |
1205 | + 'distros', |
1206 | + 'You must specify at least one series for daily builds.') |
1207 | try: |
1208 | parser = RecipeParser(data['recipe_text']) |
1209 | parser.parse() |
1210 | @@ -348,7 +357,8 @@ |
1211 | def initial_values(self): |
1212 | return { |
1213 | 'recipe_text': MINIMAL_RECIPE_TEXT % self.context.bzr_identity, |
1214 | - 'owner': self.user} |
1215 | + 'owner': self.user, |
1216 | + 'build_daily': False} |
1217 | |
1218 | @property |
1219 | def cancel_url(self): |
1220 | @@ -362,7 +372,8 @@ |
1221 | source_package_recipe = getUtility( |
1222 | ISourcePackageRecipeSource).new( |
1223 | self.user, self.user, data['name'], recipe, |
1224 | - data['description'], data['distros']) |
1225 | + data['description'], data['distros'], |
1226 | + data['daily_build_archive'], data['build_daily']) |
1227 | except ForbiddenInstruction: |
1228 | # XXX: bug=592513 We shouldn't be hardcoding "run" here. |
1229 | self.setFieldError( |
1230 | |
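The new `validate()` hook above rejects a daily-build recipe that names no distribution series before the recipe text is even parsed. A minimal standalone sketch of that guard, using a hypothetical form-data dict and error map rather than the real `LaunchpadFormView`/`setFieldError` API:

```python
DAILY_BUILD_ERROR = 'You must specify at least one series for daily builds.'


def validate_recipe_form(data, errors):
    """Flag daily builds that select no distribution series."""
    if data.get('build_daily') and not data.get('distros'):
        errors['distros'] = DAILY_BUILD_ERROR
    return errors


# A daily build with an empty series list is rejected:
errors = validate_recipe_form({'build_daily': True, 'distros': []}, {})
print(errors['distros'])
```

Running the check before `RecipeParser` keeps field-level errors (bad series selection) separate from whole-form errors (unparseable recipe text).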
1231 | === modified file 'lib/lp/code/browser/tests/test_sourcepackagerecipe.py' |
1232 | --- lib/lp/code/browser/tests/test_sourcepackagerecipe.py 2010-06-17 14:12:33 +0000 |
1233 | +++ lib/lp/code/browser/tests/test_sourcepackagerecipe.py 2010-06-30 14:22:34 +0000 |
1234 | @@ -70,6 +70,14 @@ |
1235 | |
1236 | layer = DatabaseFunctionalLayer |
1237 | |
1238 | + def makeBranch(self): |
1239 | + product = self.factory.makeProduct( |
1240 | + name='ratatouille', displayname='Ratatouille') |
1241 | + branch = self.factory.makeBranch( |
1242 | + owner=self.chef, product=product, name='veggies') |
1243 | + self.factory.makeSourcePackage(sourcepackagename='ratatouille') |
1244 | + return branch |
1245 | + |
1246 | def test_create_new_recipe_not_logged_in(self): |
1247 | from canonical.launchpad.testing.pages import setupBrowser |
1248 | product = self.factory.makeProduct( |
1249 | @@ -86,11 +94,7 @@ |
1250 | Unauthorized, browser.getLink('Create packaging recipe').click) |
1251 | |
1252 | def test_create_new_recipe(self): |
1253 | - product = self.factory.makeProduct( |
1254 | - name='ratatouille', displayname='Ratatouille') |
1255 | - branch = self.factory.makeBranch( |
1256 | - owner=self.chef, product=product, name='veggies') |
1257 | - |
1258 | + branch = self.makeBranch() |
1259 | # A new recipe can be created from the branch page. |
1260 | browser = self.getUserBrowser(canonical_url(branch), user=self.chef) |
1261 | browser.getLink('Create packaging recipe').click() |
1262 | @@ -98,6 +102,7 @@ |
1263 | browser.getControl(name='field.name').value = 'daily' |
1264 | browser.getControl('Description').value = 'Make some food!' |
1265 | browser.getControl('Secret Squirrel').click() |
1266 | + browser.getControl('Build daily').click() |
1267 | browser.getControl('Create Recipe').click() |
1268 | |
1269 | pattern = """\ |
1270 | @@ -108,9 +113,15 @@ |
1271 | Make some food! |
1272 | |
1273 | Recipe information |
1274 | + Build daily: True |
1275 | Owner: Master Chef |
1276 | Base branch: lp://dev/~chef/ratatouille/veggies |
1277 | +<<<<<<< TREE |
1278 | Debian version: 0\+\{revno\} |
1279 | +======= |
1280 | + Debian version: 1.0 |
1281 | + Daily build archive: Secret PPA |
1282 | +>>>>>>> MERGE-SOURCE |
1283 | Distribution series: Secret Squirrel |
1284 | .* |
1285 | |
1286 | @@ -185,11 +196,26 @@ |
1287 | def test_create_recipe_bad_text(self): |
1288 | # If a user tries to create source package recipe with bad text, they |
1289 | # should get an error. |
1290 | +<<<<<<< TREE |
1291 | browser = self.createRecipe('Foo bar baz') |
1292 | +======= |
1293 | + branch = self.makeBranch() |
1294 | + |
1295 | + # A new recipe can be created from the branch page. |
1296 | + browser = self.getUserBrowser(canonical_url(branch), user=self.chef) |
1297 | + browser.getLink('Create packaging recipe').click() |
1298 | + |
1299 | + browser.getControl(name='field.name').value = 'daily' |
1300 | + browser.getControl('Description').value = 'Make some food!' |
1301 | + browser.getControl('Recipe text').value = 'Foo bar baz' |
1302 | + browser.getControl('Create Recipe').click() |
1303 | + |
1304 | +>>>>>>> MERGE-SOURCE |
1305 | self.assertEqual( |
1306 | get_message_text(browser, 1), |
1307 | 'The recipe text is not a valid bzr-builder recipe.') |
1308 | |
1309 | +<<<<<<< TREE |
1310 | def test_create_recipe_bad_base_branch(self): |
1311 | # If a user tries to create source package recipe with a bad base |
1312 | # branch location, they should get an error. |
1313 | @@ -210,6 +236,19 @@ |
1314 | self.assertEqual( |
1315 | get_message_text(browser, 1), 'foo is not a branch on Launchpad.') |
1316 | |
1317 | +======= |
1318 | + def test_create_recipe_no_distroseries(self): |
1319 | + browser = self.getViewBrowser(self.makeBranch(), '+new-recipe') |
1320 | + browser.getControl(name='field.name').value = 'daily' |
1321 | + browser.getControl('Description').value = 'Make some food!' |
1322 | + |
1323 | + browser.getControl('Build daily').click() |
1324 | + browser.getControl('Create Recipe').click() |
1325 | + self.assertEqual( |
1326 | + extract_text(find_tags_by_class(browser.contents, 'message')[1]), |
1327 | + 'You must specify at least one series for daily builds.') |
1328 | + |
1329 | +>>>>>>> MERGE-SOURCE |
1330 | def test_create_dupe_recipe(self): |
1331 | # You shouldn't be able to create a duplicate recipe owned by the same |
1332 | # person with the same name. |
1333 | @@ -252,7 +291,11 @@ |
1334 | recipe = self.factory.makeSourcePackageRecipe( |
1335 | owner=self.chef, registrant=self.chef, |
1336 | name=u'things', description=u'This is a recipe', |
1337 | - distroseries=self.squirrel, branches=[veggie_branch]) |
1338 | + distroseries=self.squirrel, branches=[veggie_branch], |
1339 | + daily_build_archive=self.ppa) |
1340 | + self.factory.makeArchive( |
1341 | + distribution=self.ppa.distribution, name='ppa2', |
1342 | + displayname="PPA 2", owner=self.chef) |
1343 | |
1344 | meat_path = meat_branch.bzr_identity |
1345 | |
1346 | @@ -264,6 +307,7 @@ |
1347 | MINIMAL_RECIPE_TEXT % meat_path) |
1348 | browser.getControl('Secret Squirrel').click() |
1349 | browser.getControl('Mumbly Midget').click() |
1350 | + browser.getControl('PPA 2').click() |
1351 | browser.getControl('Update Recipe').click() |
1352 | |
1353 | pattern = """\ |
1354 | @@ -274,9 +318,16 @@ |
1355 | This is stuff |
1356 | |
1357 | Recipe information |
1358 | + Build daily: False |
1359 | Owner: Master Chef |
1360 | Base branch: lp://dev/~chef/ratatouille/meat |
1361 | +<<<<<<< TREE |
1362 | Debian version: 0\+\{revno\} |
1363 | +======= |
1364 | + Debian version: 1.0 |
1365 | + Daily build archive: |
1366 | + PPA 2 |
1367 | +>>>>>>> MERGE-SOURCE |
1368 | Distribution series: Mumbly Midget |
1369 | .* |
1370 | |
1371 | @@ -381,9 +432,17 @@ |
1372 | This is stuff |
1373 | |
1374 | Recipe information |
1375 | + Build daily: |
1376 | + False |
1377 | Owner: Master Chef |
1378 | Base branch: lp://dev/~chef/ratatouille/meat |
1379 | +<<<<<<< TREE |
1380 | Debian version: 0\+\{revno\} |
1381 | +======= |
1382 | + Debian version: 1.0 |
1383 | + Daily build archive: |
1384 | + Secret PPA |
1385 | +>>>>>>> MERGE-SOURCE |
1386 | Distribution series: Mumbly Midget |
1387 | .* |
1388 | |
1389 | @@ -413,9 +472,15 @@ |
1390 | This recipe .*changes. |
1391 | |
1392 | Recipe information |
1393 | + Build daily: False |
1394 | Owner: Master Chef |
1395 | Base branch: lp://dev/~chef/chocolate/cake |
1396 | +<<<<<<< TREE |
1397 | Debian version: 0\+\{revno\} |
1398 | +======= |
1399 | + Debian version: 1.0 |
1400 | + Daily build archive: Secret PPA |
1401 | +>>>>>>> MERGE-SOURCE |
1402 | Distribution series: Secret Squirrel |
1403 | |
1404 | Build records |
1405 | |
1406 | === modified file 'lib/lp/code/configure.zcml' |
1407 | --- lib/lp/code/configure.zcml 2010-06-10 07:55:54 +0000 |
1408 | +++ lib/lp/code/configure.zcml 2010-06-30 14:22:34 +0000 |
1409 | @@ -1052,6 +1052,7 @@ |
1410 | set_attributes=" |
1411 | build_daily |
1412 | builder_recipe |
1413 | + daily_build_archive |
1414 | date_last_modified |
1415 | description |
1416 | distroseries |
1417 | |
1418 | === modified file 'lib/lp/code/interfaces/sourcepackagerecipe.py' |
1419 | --- lib/lp/code/interfaces/sourcepackagerecipe.py 2010-06-15 16:09:04 +0000 |
1420 | +++ lib/lp/code/interfaces/sourcepackagerecipe.py 2010-06-30 14:22:34 +0000 |
1421 | @@ -71,8 +71,8 @@ |
1422 | |
1423 | id = Int() |
1424 | |
1425 | - daily_build_archive = Reference( |
1426 | - IArchive, title=_("The archive to use for daily builds.")) |
1427 | + daily_build_archive = exported(Reference( |
1428 | + IArchive, title=_("The archive to use for daily builds."))) |
1429 | |
1430 | date_created = Datetime(required=True, readonly=True) |
1431 | date_last_modified = Datetime(required=True, readonly=True) |
1432 | @@ -94,8 +94,8 @@ |
1433 | Reference(IDistroSeries), title=_("The distroseries this recipe will" |
1434 | " build a source package for"), |
1435 | readonly=False) |
1436 | - build_daily = Bool( |
1437 | - title=_("If true, the recipe should be built daily.")) |
1438 | + build_daily = exported(Bool( |
1439 | + title=_("Build daily"))) |
1440 | |
1441 | name = exported(TextLine( |
1442 | title=_("Name"), required=True, |
1443 | |
1444 | === modified file 'lib/lp/code/model/tests/test_sourcepackagerecipe.py' |
1445 | --- lib/lp/code/model/tests/test_sourcepackagerecipe.py 2010-06-12 13:34:11 +0000 |
1446 | +++ lib/lp/code/model/tests/test_sourcepackagerecipe.py 2010-06-30 14:22:34 +0000 |
1447 | @@ -629,14 +629,17 @@ |
1448 | db_distroseries = self.factory.makeDistroSeries() |
1449 | if recipe_text is None: |
1450 | recipe_text = self.makeRecipeText() |
1451 | + db_archive = self.factory.makeArchive(owner=owner, name="recipe-ppa") |
1452 | launchpad = launchpadlib_for('test', user, |
1453 | service_root="http://api.launchpad.dev:8085") |
1454 | login(ANONYMOUS) |
1455 | distroseries = ws_object(launchpad, db_distroseries) |
1456 | ws_owner = ws_object(launchpad, owner) |
1457 | + ws_archive = ws_object(launchpad, db_archive) |
1458 | recipe = ws_owner.createRecipe( |
1459 | name='toaster-1', description='a recipe', recipe_text=recipe_text, |
1460 | - distroseries=[distroseries.self_link]) |
1461 | + distroseries=[distroseries.self_link], build_daily=True, |
1462 | + daily_build_archive=ws_archive) |
1463 | # at the moment, distroseries is not exposed in the API. |
1464 | transaction.commit() |
1465 | db_recipe = owner.getRecipe(name=u'toaster-1') |
1466 | @@ -653,6 +656,8 @@ |
1467 | self.assertEqual(team.teamowner.name, recipe.registrant.name) |
1468 | self.assertEqual('toaster-1', recipe.name) |
1469 | self.assertEqual(recipe_text, recipe.recipe_text) |
1470 | + self.assertTrue(recipe.build_daily) |
1471 | + self.assertEqual('recipe-ppa', recipe.daily_build_archive.name) |
1472 | |
1473 | def test_recipe_text(self): |
1474 | recipe_text2 = self.makeRecipeText() |
1475 | |
1476 | === modified file 'lib/lp/code/templates/sourcepackagerecipe-index.pt' |
1477 | --- lib/lp/code/templates/sourcepackagerecipe-index.pt 2010-04-28 21:18:13 +0000 |
1478 | +++ lib/lp/code/templates/sourcepackagerecipe-index.pt 2010-06-30 14:22:34 +0000 |
1479 | @@ -36,6 +36,11 @@ |
1480 | <div class="portlet"> |
1481 | <h2>Recipe information</h2> |
1482 | <div class="two-column-list"> |
1483 | + <dl id="build_daily"> |
1484 | + <dt>Build daily:</dt> |
1485 | + <dd tal:content="context/build_daily" /> |
1486 | + </dl> |
1487 | + |
1488 | <dl id="owner"> |
1489 | <dt>Owner:</dt> |
1490 | <dd tal:content="structure context/owner/fmt:link" /> |
1491 | @@ -48,6 +53,14 @@ |
1492 | <dt>Debian version:</dt> |
1493 | <dd tal:content="context/deb_version_template" /> |
1494 | </dl> |
1495 | + <dl id="daily_build_archive"> |
1496 | + <dt>Daily build archive:</dt> |
1497 | + <dd tal:content="structure context/daily_build_archive/fmt:link" |
1498 | + tal:condition="context/daily_build_archive"> |
1499 | + </dd> |
1500 | + <dd tal:condition="not: context/daily_build_archive">None</dd> |
1501 | + </dl> |
1502 | + |
1503 | <dl id="distros"> |
1504 | <dt>Distribution series:</dt> |
1505 | <dd> |
1506 | |
1507 | === modified file 'lib/lp/registry/interfaces/person.py' |
1508 | --- lib/lp/registry/interfaces/person.py 2010-05-22 01:42:59 +0000 |
1509 | +++ lib/lp/registry/interfaces/person.py 2010-06-30 14:22:34 +0000 |
1510 | @@ -850,15 +850,21 @@ |
1511 | distroseries=List(value_type=Reference(schema=Interface)), |
1512 | name=TextLine(), |
1513 | recipe_text=Text(), |
1514 | + daily_build_archive=Reference(schema=Interface), |
1515 | + build_daily=Bool(), |
1516 | ) |
1517 | @export_factory_operation(Interface, []) |
1518 | - def createRecipe(name, description, recipe_text, distroseries, registrant): |
1519 | + def createRecipe(name, description, recipe_text, distroseries, |
1520 | + registrant, daily_build_archive=None, build_daily=False): |
1521 | """Create a SourcePackageRecipe owned by this person. |
1522 | |
1523 | :param name: the name to use for referring to the recipe. |
1524 | :param description: A description of the recipe. |
1525 | :param recipe_text: The text of the recipe. |
1526 | :param distroseries: The distroseries to use. |
1527 | + :param registrant: The person who created this recipe. |
1528 | + :param daily_build_archive: The archive to use for daily builds. |
1529 | + :param build_daily: If True, build this recipe daily (if changed). |
1530 | :return: a SourcePackageRecipe. |
1531 | """ |
1532 | |
1533 | |
1534 | === modified file 'lib/lp/registry/model/person.py' |
1535 | --- lib/lp/registry/model/person.py 2010-06-21 21:01:01 +0000 |
1536 | +++ lib/lp/registry/model/person.py 2010-06-30 14:22:34 +0000 |
1537 | @@ -2264,13 +2264,14 @@ |
1538 | return rset |
1539 | |
1540 | def createRecipe(self, name, description, recipe_text, distroseries, |
1541 | - registrant): |
1542 | + registrant, daily_build_archive=None, build_daily=False): |
1543 | """See `IPerson`.""" |
1544 | from lp.code.model.sourcepackagerecipe import SourcePackageRecipe |
1545 | builder_recipe = RecipeParser(recipe_text).parse() |
1546 | spnset = getUtility(ISourcePackageNameSet) |
1547 | return SourcePackageRecipe.new( |
1548 | - registrant, self, name, builder_recipe, description, distroseries) |
1549 | + registrant, self, name, builder_recipe, description, distroseries, |
1550 | + daily_build_archive, build_daily) |
1551 | |
1552 | def getRecipe(self, name): |
1553 | from lp.code.model.sourcepackagerecipe import SourcePackageRecipe |
1554 | |
1555 | === added directory 'lib/lp/services/acl' |
1556 | === added file 'lib/lp/services/acl/__init__.py' |
1557 | === added file 'lib/lp/services/acl/configure.zcml' |
1558 | --- lib/lp/services/acl/configure.zcml 1970-01-01 00:00:00 +0000 |
1559 | +++ lib/lp/services/acl/configure.zcml 2010-06-30 14:22:34 +0000 |
1560 | @@ -0,0 +1,14 @@ |
1561 | + |
1562 | +<!-- Copyright 2009 Canonical Ltd. This software is licensed under the |
1563 | + GNU Affero General Public License version 3 (see the file LICENSE). |
1564 | +--> |
1565 | + |
1566 | +<configure |
1567 | + xmlns="http://namespaces.zope.org/zope" |
1568 | + xmlns:browser="http://namespaces.zope.org/browser" |
1569 | + xmlns:i18n="http://namespaces.zope.org/i18n" |
1570 | + xmlns:xmlrpc="http://namespaces.zope.org/xmlrpc" |
1571 | + xmlns:lp="http://namespaces.canonical.com/lp" |
1572 | + i18n_domain="launchpad"> |
1573 | + <adapter factory="lp.services.acl.model.adapt_to_acl" /> |
1574 | +</configure> |
1575 | |
1576 | === added file 'lib/lp/services/acl/interfaces.py' |
1577 | --- lib/lp/services/acl/interfaces.py 1970-01-01 00:00:00 +0000 |
1578 | +++ lib/lp/services/acl/interfaces.py 2010-06-30 14:22:34 +0000 |
1579 | @@ -0,0 +1,59 @@ |
1580 | +# Copyright 2010 Canonical Ltd. This software is licensed under the |
1581 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
1582 | + |
1583 | +"""Interfaces and enums for ACLs.""" |
1584 | + |
1585 | +__metaclass__ = type |
1586 | +__all__ = [ |
1587 | + 'ACLObjectType', |
1588 | + 'ACLPermission', |
1589 | + 'EVERYONE', |
1590 | + ] |
1591 | + |
1592 | +from zope.interface import Interface |
1593 | + |
1594 | +from lazr.enum import DBEnumeratedType, DBItem |
1595 | + |
1596 | + |
1597 | +EVERYONE = None |
1598 | + |
1599 | + |
1600 | +class ACLPermission(DBEnumeratedType): |
1601 | + """Permissions for ACLs.""" |
1602 | + |
1603 | + VIEW = DBItem(0, """ |
1604 | + View |
1605 | + |
1606 | + Allows you to view an object. |
1607 | + """) |
1608 | + |
1609 | + MODIFY_ACL = DBItem(1, """ |
1610 | + Modify ACL |
1611 | + |
1612 | + Allows you to grant and revoke permissions for an object. |
1613 | + """) |
1614 | + |
1615 | + |
1616 | +class ACLObjectType(DBEnumeratedType): |
1617 | + |
1618 | + BUGTASK = DBItem(0, """ |
1619 | + BugTask |
1620 | + |
1621 | + ACL for a BugTask. |
1622 | + """) |
1623 | + |
1624 | + |
1625 | +class IACL(Interface): |
1626 | + """Manage ACL for an object.""" |
1627 | + |
1628 | + def grant(permission, user): |
1629 | + """Grant the given permission to the user.""" |
1630 | + |
1631 | + def revoke(permission, user): |
1632 | + """Revoke the given permission for the user.""" |
1633 | + |
1634 | + def has(permission, user): |
1635 | + """Check whether a user has the given permission.""" |
1636 | + |
1637 | + def getACLItems(permission): |
1638 | + """Return all the ACL items for the object.""" |
1639 | |
1640 | === added file 'lib/lp/services/acl/model.py' |
1641 | --- lib/lp/services/acl/model.py 1970-01-01 00:00:00 +0000 |
1642 | +++ lib/lp/services/acl/model.py 2010-06-30 14:22:34 +0000 |
1643 | @@ -0,0 +1,106 @@ |
1644 | +# Copyright 2010 Canonical Ltd. This software is licensed under the |
1645 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
1646 | + |
1647 | +"""ACL model classes.""" |
1648 | + |
1649 | +__metaclass__ = type |
1650 | +__all__ = [ |
1651 | + 'ACLAdapter', |
1652 | + ] |
1653 | + |
1654 | + |
1655 | +from zope.component import adapter |
1656 | +from zope.interface import implementer, implements, Interface |
1657 | +from zope.security.proxy import removeSecurityProxy |
1658 | + |
1659 | +from canonical.launchpad.interfaces.lpstorm import IStore |
1660 | +from lp.registry.model.teammembership import TeamParticipation |
1661 | +from lp.services.acl.interfaces import EVERYONE, IACL |
1662 | + |
1663 | + |
1664 | +class ACLAdapter: |
1665 | + """Generic ACL adapter. |
1666 | + |
1667 | + This one should be used to manage the ACL for any object. It |
1668 | + shouldn't be necessary to create more specific ones. |
1669 | + """ |
1670 | + |
1671 | + implements(IACL) |
1672 | + |
1673 | + def __init__(self, ob, ACL_class): |
1674 | + self._object = ob |
1675 | + self._ACL_class = ACL_class |
1676 | + |
1677 | + def grant(self, permission, user): |
1678 | + """See `IACL`.""" |
1679 | + store = IStore(self._ACL_class) |
1680 | + existing_acls = [ |
1681 | + acl for acl in self.getACLItems(permission)] |
1682 | + if len(existing_acls) == 1 and existing_acls[0].person is EVERYONE: |
1683 | + public_acl = existing_acls.pop(0) |
1684 | + store.remove(public_acl) |
1685 | + if user == EVERYONE: |
1686 | + for existing_acl in existing_acls: |
1687 | + store.remove(existing_acl) |
1688 | + object_acl = self._ACL_class( |
1689 | + ob=self._object, person=user, |
1690 | + permission=permission) |
1691 | + store.add(object_acl) |
1692 | + |
1693 | + def revoke(self, permission, user): |
1694 | + """See `IACL`.""" |
1695 | + assert self.getACLItems(permission).count() > 1, ( |
1696 | + "Can't remove last ACL row for %s." % permission.name) |
1697 | + store = IStore(self._ACL_class) |
1698 | + existing_acl = store.find( |
1699 | + self._ACL_class, |
1700 | + self._ACL_class.object_id == self._object.id, |
1701 | + self._ACL_class.permission == permission, |
1702 | + self._ACL_class.person == user).one() |
1703 | + store.remove(existing_acl) |
1704 | + |
1705 | + def has(self, permission, user): |
1706 | + """See `IACL`.""" |
1707 | + store = IStore(self._ACL_class) |
1708 | + public_acl = store.find( |
1709 | + self._ACL_class, |
1710 | + self._ACL_class.object_id == self._object.id, |
1711 | + self._ACL_class.person == EVERYONE, |
1712 | + self._ACL_class.permission == permission).one() |
1713 | + if public_acl is not None: |
1714 | + return True |
1715 | + acls = store.find( |
1716 | + self._ACL_class, |
1717 | + self._ACL_class.object_id == self._object.id, |
1718 | + self._ACL_class.person == TeamParticipation.teamID, |
1719 | + self._ACL_class.permission == permission, |
1720 | + TeamParticipation.person == user) |
1721 | + return acls.any() is not None |
1722 | + |
1723 | + |
1724 | + def getACLItems(self, permission): |
1725 | + """See `IACL`.""" |
1726 | + store = IStore(self._ACL_class) |
1727 | + return store.find( |
1728 | + self._ACL_class, |
1729 | + self._ACL_class.permission == permission, |
1730 | + self._ACL_class.object_id == self._object.id) |
1731 | + |
1732 | + |
1733 | +@adapter(Interface) |
1734 | +@implementer(IACL) |
1735 | +def adapt_to_acl(context): |
1736 | + """Adapt any object that has __acl_class__ set. |
1737 | + |
1738 | + It's registered against Interface, making it easier to hook up |
1739 | + the ACL adapter: no separate ZCML registration is needed. |
1740 | + |
1741 | + If the object doesn't have an __acl_class__ attribute, None is |
1742 | + returned, which means that the object couldn't be adapted. |
1743 | + """ |
1744 | + missing = object() |
1745 | + acl_class = getattr( |
1746 | + removeSecurityProxy(context.__class__), '__acl_class__', missing) |
1747 | + if acl_class is missing: |
1748 | + return None |
1749 | + return ACLAdapter(context, acl_class) |
1750 | |
1751 | === added file 'lib/lp/services/doc/acl.txt' |
1752 | --- lib/lp/services/doc/acl.txt 1970-01-01 00:00:00 +0000 |
1753 | +++ lib/lp/services/doc/acl.txt 2010-06-30 14:22:34 +0000 |
1754 | @@ -0,0 +1,205 @@ |
1755 | +==== |
1756 | +ACLs |
1757 | +==== |
1758 | + |
1754 | +This document aims to explain how the ACL system works, and how a |
1755 | +programmer can integrate it into their code. It starts with an |
1756 | +overview, which explains the goal of the ACL system. For the more |
1757 | +interested parties, it then explains the ACL API and how a programmer |
1758 | +would use it. |
1764 | + |
1765 | + |
1766 | +Overview |
1767 | +======== |
1768 | + |
1769 | +The ACL system is used to check whether a user has certain permissions |
1770 | +to do various things with an object. To have something more concrete to |
1771 | +talk about, let's say that we have a project, which has bugs. By |
1772 | +default, the project and its bugs are accessible to anyone. A Launchpad |
1773 | +Commercial Admin can grant the MODIFY_ACL permission to someone in the |
1774 | +project team. That person (or team) can now limit who has access to |
1775 | +their project and its bugs. They can give people and teams permission |
1776 | +project-wide, to let someone see the whole project and its bugs, or let |
1777 | +someone see only part of the project, for example only a single bug. |
1778 | +(Giving permission to a single bug means that the user will |
1779 | +automatically get permission to see the name of the project, though.) |
1780 | + |
1781 | +Teams can be used to manage permissions across multiple projects. By |
1782 | +giving permission to a team, you can give others the same permission by |
1783 | +adding them to the team. This way you can easily give someone access to |
1784 | +multiple projects. |
1785 | + |
1786 | +In addition to controlling permission project-wide, or for a single |
1787 | +bug, it's also possible to control it for a larger part of the project. |
1788 | +For example, there can be specific permissions for the Bugs part of the |
1789 | +project, e.g. giving someone permission to change the importance of |
1790 | +all bugs, but not to edit the project details. |
1791 | + |
1792 | + |
1793 | +Defining an ACL for an object |
1794 | +============================= |
1795 | + |
1796 | +Each object type needs its own ACL table. The first thing you need to do |
1797 | +is to create the DB patch. |
1798 | + |
1799 | + >>> from canonical.launchpad.webapp.interfaces import ( |
1800 | + ... IStoreSelector, MAIN_STORE, MASTER_FLAVOR) |
1801 | + >>> store = getUtility(IStoreSelector).get(MAIN_STORE, MASTER_FLAVOR) |
1802 | + >>> store.execute(""" |
1803 | + ... CREATE TABLE MyObjectACL( |
1804 | + ... id serial NOT NULL PRIMARY KEY, |
1805 | + ... object_id integer NOT NULL, --REFERENCES Product(id), |
1806 | + ... person integer NULL, -- REFERENCES Person(id), |
1807 | + ... permission INTEGER NOT NULL, |
1808 | + ... parent_id INTEGER, |
1809 | + ... parent_type INTEGER |
1810 | + ... ); |
1811 | + ... """, noresult=True) |
1812 | + |
1813 | + |
1814 | +The next thing is, of course, to add the model class. |
1815 | + |
1816 | + >>> from storm.base import Storm |
1817 | + >>> from storm.locals import Bool, Int, Reference |
1818 | + >>> from canonical.database.enumcol import EnumCol |
1819 | + >>> from lp.services.acl.interfaces import ACLObjectType, ACLPermission |
1820 | + >>> class MyObjectACL(Storm): |
1821 | + ... __storm_table__ = "MyObjectACL" |
1822 | + ... def __init__(self, ob, person, permission): |
1823 | + ... self.object_id = ob.id |
1824 | + ... self.person = person |
1825 | + ... self.permission = permission |
1826 | + ... |
1827 | + ... id = Int(primary=True) |
1828 | + ... object_id = Int(name="object_id") |
1829 | + ... parent_id = Int(name='parent_id') |
1830 | + ... parent_type = EnumCol(enum=ACLObjectType) |
1831 | + ... person_id = Int(name='person') |
1832 | + ... person = Reference(person_id, "Person.id") |
1833 | + ... permission = EnumCol(enum=ACLPermission) |
1834 | + |
1835 | + |
1836 | +By defining a __acl_class__ class attribute, we can connect the ACL |
1837 | +class with the object type we want to create an ACL for. This should of |
1838 | +course be a DB class, but we'll use a non-DB class as an example. |
1839 | + |
1840 | + >>> class MyObject: |
1841 | + ... __acl_class__ = MyObjectACL |
1842 | + ... def __init__(self, id): |
1843 | + ... self.id = id |
1844 | + |
1845 | +That's everything that is needed to hook things up. We can now adapt an |
1846 | +instance of our object to IACL to manage the permissions for the object. |
1847 | + |
1848 | + >>> from lp.services.acl.interfaces import IACL |
1849 | + >>> acl = IACL(MyObject(1)) |
1850 | + |
1851 | + |
1852 | +Managing ACLs |
1853 | +============= |
1854 | + |
1855 | +The ACL of an object tells you who has certain permissions for an |
1856 | +object. You have to hook into the places where the object is created |
1857 | +and changed, and make sure that the ACL is correct. |
1858 | + |
1859 | +When the object is created, you have to add the appropriate permissions. |
1860 | +Usually this means granting VIEW to EVERYONE, if the object is public. |
1861 | +EVERYONE gets mapped to NULL in the database. |
1862 | + |
1863 | +Grant |
1864 | +----- |
1865 | + |
1866 | + >>> from lp.services.acl.interfaces import EVERYONE |
1867 | + >>> my_object = MyObject(2) |
1868 | + >>> IACL(my_object).grant(ACLPermission.VIEW, EVERYONE) |
1869 | + >>> for acl_item in IACL(my_object).getACLItems(ACLPermission.VIEW): |
1870 | + ... print "User: %s, Permission: %s" % ( |
1871 | + ... acl_item.person, acl_item.permission.name) |
1872 | + User: None, Permission: VIEW |
1873 | + |
1874 | +The EVERYONE ACL is special. There has to be at least one item in the |
1875 | +ACL, so the EVERYONE ACL is basically the same as an empty ACL. This |
1876 | +means that if you want to limit access to an object, all you have to do |
1877 | +is grant the permission to the people you want to have access. The |
1878 | +permission for EVERYONE will automatically be revoked by the ACL |
1879 | +adapter. |
1880 | + |
1881 | + >>> bob = factory.makePerson(name='bob') |
1882 | + >>> IACL(my_object).grant(ACLPermission.VIEW, bob) |
1883 | + >>> for acl_item in IACL(my_object).getACLItems(ACLPermission.VIEW): |
1884 | + ... print "User: %s, Permission: %s" % ( |
1885 | + ... acl_item.person, acl_item.permission.name) |
1886 | + User: <Person at ... bob (Bob)>, Permission: VIEW |
1887 | + |
1888 | +Has |
1889 | +--- |
1890 | + |
1891 | +To check whether someone has permissions on an object, the has() method |
1892 | +is used. |
1893 | + |
1894 | + >>> alice = factory.makePerson(name='alice') |
1895 | + >>> my_object = MyObject(3) |
1896 | + >>> IACL(my_object).grant(ACLPermission.VIEW, EVERYONE) |
1897 | + >>> IACL(my_object).has(ACLPermission.VIEW, bob) |
1898 | + True |
1899 | + >>> IACL(my_object).has(ACLPermission.VIEW, alice) |
1900 | + True |
1901 | + |
1902 | + >>> IACL(my_object).grant(ACLPermission.VIEW, bob) |
1903 | + >>> IACL(my_object).has(ACLPermission.VIEW, bob) |
1904 | + True |
1905 | + >>> IACL(my_object).has(ACLPermission.VIEW, alice) |
1906 | + False |
1907 | + |
1908 | +The user doesn't have to have the permission directly. They can also |
1909 | +inherit it from a team. |
1910 | + |
1911 | + >>> from zope.security.proxy import removeSecurityProxy |
1912 | + >>> team = factory.makeTeam() |
1913 | + >>> IACL(my_object).grant(ACLPermission.VIEW, team) |
1914 | + >>> IACL(my_object).has(ACLPermission.VIEW, alice) |
1915 | + False |
1916 | + >>> removeSecurityProxy(alice).join(team) |
1917 | + >>> IACL(my_object).has(ACLPermission.VIEW, alice) |
1918 | + True |
1919 | + |
1920 | + |
1921 | +Revoke |
1922 | +------ |
1923 | + |
1924 | +To revoke a permission, the revoke() method is used. |
1925 | + |
1926 | + >>> my_object = MyObject(4) |
1927 | + >>> IACL(my_object).grant(ACLPermission.VIEW, bob) |
1928 | + >>> IACL(my_object).grant(ACLPermission.VIEW, alice) |
1929 | + |
1930 | + >>> IACL(my_object).revoke(ACLPermission.VIEW, alice) |
1931 | + >>> IACL(my_object).has(ACLPermission.VIEW, bob) |
1932 | + True |
1933 | + >>> IACL(my_object).has(ACLPermission.VIEW, alice) |
1934 | + False |
1935 | + |
1936 | +As with grant(), the EVERYONE permission is special. There has to be at |
1937 | +least one ACL row in the DB. To avoid making something public by |
1938 | +accident, EVERYONE isn't given the permission automatically if the last |
1939 | +row is removed. The last row should never be removed; if it is, it's a |
1940 | +programming error, so an assertion is there to prevent the db from |
1941 | +getting into a bad state. |
1942 | + |
1943 | + >>> for acl_item in IACL(my_object).getACLItems(ACLPermission.VIEW): |
1944 | + ... print "User: %s, Permission: %s" % ( |
1945 | + ... acl_item.person, acl_item.permission.name) |
1946 | + User: <Person at ... bob (Bob)>, Permission: VIEW |
1947 | + >>> IACL(my_object).revoke(ACLPermission.VIEW, bob) |
1948 | + Traceback (most recent call last): |
1949 | + ... |
1950 | + AssertionError: Can't remove last ACL row for VIEW. |
1951 | + |
1952 | +However, if EVERYONE is granted a permission, all other ACL rows will be |
1953 | +deleted, since everyone will have access anyway. |
1954 | + |
1955 | + >>> IACL(my_object).grant(ACLPermission.VIEW, EVERYONE) |
1956 | + >>> for acl_item in IACL(my_object).getACLItems(ACLPermission.VIEW): |
1957 | + ... print "User: %s, Permission: %s" % ( |
1958 | + ... acl_item.person, acl_item.permission.name) |
1959 | + User: None, Permission: VIEW |
1960 | |
1961 | === modified file 'lib/lp/soyuz/browser/archive.py' |
1962 | --- lib/lp/soyuz/browser/archive.py 2010-06-29 15:11:44 +0000 |
1963 | +++ lib/lp/soyuz/browser/archive.py 2010-06-30 14:22:34 +0000 |
1964 | @@ -1873,7 +1873,7 @@ |
1965 | |
1966 | class ArchiveAdminView(BaseArchiveEditView): |
1967 | |
1968 | - field_names = ['enabled', 'private', 'require_virtualized', |
1969 | + field_names = ['enabled', 'private', 'commercial', 'require_virtualized', |
1970 | 'buildd_secret', 'authorized_size', 'relative_build_score', |
1971 | 'external_dependencies'] |
1972 | |
1973 | @@ -1919,6 +1919,11 @@ |
1974 | error_text = "\n".join(errors) |
1975 | self.setFieldError('external_dependencies', error_text) |
1976 | |
1977 | + if data.get('commercial') is True and not data['private']: |
1978 | + self.setFieldError( |
1979 | + 'commercial', |
1980 | + 'Can only set commercial for private archives.') |
1981 | + |
1982 | def validate_external_dependencies(self, ext_deps): |
1983 | """Validate the external_dependencies field. |
1984 | |
1985 | |
1986 | === modified file 'lib/lp/soyuz/configure.zcml' |
1987 | --- lib/lp/soyuz/configure.zcml 2010-06-29 13:20:28 +0000 |
1988 | +++ lib/lp/soyuz/configure.zcml 2010-06-30 14:22:34 +0000 |
1989 | @@ -405,9 +405,14 @@ |
1990 | set_attributes="description displayname publish status"/> |
1991 | <require |
1992 | permission="launchpad.Commercial" |
1993 | +<<<<<<< TREE |
1994 | set_attributes="authorized_size buildd_secret |
1995 | enabled_restricted_families |
1996 | external_dependencies private |
1997 | +======= |
1998 | + set_attributes="authorized_size buildd_secret arm_builds_allowed |
1999 | + commercial external_dependencies private |
2000 | +>>>>>>> MERGE-SOURCE |
2001 | require_virtualized relative_build_score "/> |
2002 | <require |
2003 | permission="launchpad.Admin" |
2004 | |
2005 | === modified file 'lib/lp/soyuz/interfaces/archive.py' |
2006 | --- lib/lp/soyuz/interfaces/archive.py 2010-06-29 15:00:35 +0000 |
2007 | +++ lib/lp/soyuz/interfaces/archive.py 2010-06-30 14:22:34 +0000 |
2008 | @@ -375,6 +375,15 @@ |
2009 | value_type=Reference(schema=IProcessorFamily), |
2010 | readonly=False) |
2011 | |
2012 | + commercial = exported( |
2013 | + Bool( |
2014 | + title=_("Commercial Archive"), |
2015 | + required=True, |
2016 | + description=_( |
2017 | + "Set if this archive is used for commercial purposes and " |
2018 | + "should appear in the Software Center listings. The archive " |
2019 | + "must also be private if this is set."))) |
2020 | + |
2021 | def getSourcesForDeletion(name=None, status=None, distroseries=None): |
2022 | """All `ISourcePackagePublishingHistory` available for deletion. |
2023 | |
2024 | |
2025 | === modified file 'lib/lp/soyuz/model/archive.py' |
2026 | --- lib/lp/soyuz/model/archive.py 2010-06-29 13:49:39 +0000 |
2027 | +++ lib/lp/soyuz/model/archive.py 2010-06-30 14:22:34 +0000 |
2028 | @@ -211,6 +211,45 @@ |
2029 | external_dependencies = StringCol( |
2030 | dbName='external_dependencies', notNull=False, default=None) |
2031 | |
2032 | +<<<<<<< TREE |
2033 | +======= |
2034 | + commercial = BoolCol( |
2035 | + dbName='commercial', notNull=True, default=False) |
2036 | + |
2037 | + def _get_arm_builds_enabled(self): |
2038 | + """Check whether ARM builds are allowed for this archive.""" |
2039 | + archive_arch_set = getUtility(IArchiveArchSet) |
2040 | + restricted_families = archive_arch_set.getRestrictedfamilies(self) |
2041 | + arm = getUtility(IProcessorFamilySet).getByName('arm') |
2042 | + for (family, archive_arch) in restricted_families: |
2043 | + if family == arm: |
2044 | + return (archive_arch is not None) |
2045 | + # ARM doesn't exist or isn't restricted. Either way, there is no |
2046 | + # need for an explicit association. |
2047 | + return False |
2048 | + |
2049 | + def _set_arm_builds_enabled(self, value): |
2050 | + """Set whether ARM builds are enabled for this archive.""" |
2051 | + archive_arch_set = getUtility(IArchiveArchSet) |
2052 | + restricted_families = archive_arch_set.getRestrictedfamilies(self) |
2053 | + arm = getUtility(IProcessorFamilySet).getByName('arm') |
2054 | + for (family, archive_arch) in restricted_families: |
2055 | + if family == arm: |
2056 | + if value: |
2057 | + if archive_arch is not None: |
2058 | + # ARM builds are already enabled |
2059 | + return |
2060 | + else: |
2061 | + archive_arch_set.new(self, family) |
2062 | + else: |
2063 | + if archive_arch is not None: |
2064 | + Store.of(self).remove(archive_arch) |
2065 | + else: |
2066 | + pass # ARM builds are already disabled |
2067 | + arm_builds_allowed = property(_get_arm_builds_enabled, |
2068 | + _set_arm_builds_enabled) |
2069 | + |
2070 | +>>>>>>> MERGE-SOURCE |
2071 | def _init(self, *args, **kw): |
2072 | """Provide the right interface for URL traversal.""" |
2073 | SQLBase._init(self, *args, **kw) |
2074 | |
2075 | === modified file 'lib/lp/soyuz/stories/ppa/xx-ppa-workflow.txt' |
2076 | --- lib/lp/soyuz/stories/ppa/xx-ppa-workflow.txt 2010-06-12 05:05:11 +0000 |
2077 | +++ lib/lp/soyuz/stories/ppa/xx-ppa-workflow.txt 2010-06-30 14:22:34 +0000 |
2078 | @@ -347,6 +347,8 @@ |
2079 | True |
2080 | >>> admin_browser.getControl(name="field.private").value |
2081 | False |
2082 | + >>> admin_browser.getControl(name="field.commercial").value |
2083 | + False |
2084 | >>> admin_browser.getControl(name="field.require_virtualized").value |
2085 | True |
2086 | >>> admin_browser.getControl(name="field.relative_build_score").value |
2087 | @@ -356,6 +358,7 @@ |
2088 | |
2089 | >>> admin_browser.getControl(name="field.enabled").value = False |
2090 | >>> admin_browser.getControl(name="field.private").value = True |
2091 | + >>> admin_browser.getControl(name="field.commercial").value = True |
2092 | >>> admin_browser.getControl(name="field.buildd_secret").value = "secret" |
2093 | >>> admin_browser.getControl( |
2094 | ... name="field.require_virtualized").value = False |
2095 | @@ -413,8 +416,10 @@ |
2096 | There is 1 error. |
2097 | Required for private archives. |
2098 | |
2099 | -Conversely, setting the buildd secret for non-private archives also |
2100 | -generates an error: |
2101 | +Conversely, setting the buildd secret for non-private archives also generates |
2102 | +an error. Because the "commercial" flag is currently set, removing |
2103 | +privacy triggers a second validation error: the commercial flag can |
2104 | +only be set on private archives: |
2105 | |
2106 | >>> admin_browser.getControl(name="field.private").value = False |
2107 | >>> admin_browser.getControl(name="field.buildd_secret").value = "secret" |
2108 | @@ -422,9 +427,11 @@ |
2109 | |
2110 | >>> for error in get_feedback_messages(admin_browser.contents): |
2111 | ... print error |
2112 | - There is 1 error. |
2113 | + There are 2 errors. |
2114 | + Can only set commercial for private archives. |
2115 | Do not specify for non-private archives |
2116 | |
2117 | + |
2118 | There is a maximum value allowed for `IArchive.authorized_size`, it is |
2119 | currently 2147483647 and the unit used in code is MiB, so in practice |
2120 | the size limit is 2 PiB. |
2121 | |
2122 | === modified file 'lib/lp/soyuz/stories/webservice/xx-archive.txt' |
2123 | --- lib/lp/soyuz/stories/webservice/xx-archive.txt 2010-06-29 21:57:15 +0000 |
2124 | +++ lib/lp/soyuz/stories/webservice/xx-archive.txt 2010-06-30 14:22:34 +0000 |
2125 | @@ -16,6 +16,7 @@ |
2126 | |
2127 | >>> from lazr.restful.testing.webservice import pprint_entry |
2128 | >>> pprint_entry(cprov_archive) |
2129 | + commercial: False |
2130 | dependencies_collection_link: u'http://.../~cprov/+archive/ppa/dependencies' |
2131 | description: u'packages to help my friends.' |
2132 | displayname: u'PPA for Celso Providelo' |
2133 | @@ -80,6 +81,7 @@ |
2134 | >>> ubuntu_main_archive = webservice.get( |
2135 | ... ubuntutest['main_archive_link']).jsonBody() |
2136 | >>> pprint_entry(ubuntu_main_archive) |
2137 | + commercial: False |
2138 | dependencies_collection_link: u'http://.../ubuntutest/+archive/primary/dependencies' |
2139 | description: None |
2140 | displayname: u'Primary Archive for Ubuntu Test' |
2141 | @@ -819,6 +821,7 @@ |
2142 | the IArchive context, in this case only Celso has it. |
2143 | |
2144 | >>> pprint_entry(user_webservice.get("/~cprov/+archive/p3a").jsonBody()) |
2145 | + commercial: False |
2146 | dependencies_collection_link: u'http://.../~cprov/+archive/p3a/dependencies' |
2147 | description: u'tag:launchpad.net:2008:redacted' |
2148 | displayname: u'PPA named p3a for Celso Providelo' |
2149 | @@ -832,6 +835,7 @@ |
2150 | signing_key_fingerprint: u'tag:launchpad.net:2008:redacted' |
2151 | |
2152 | >>> pprint_entry(cprov_webservice.get("/~cprov/+archive/p3a").jsonBody()) |
2153 | + commercial: False |
2154 | dependencies_collection_link: u'http://.../~cprov/+archive/p3a/dependencies' |
2155 | description: u'packages to help my friends.' |
2156 | displayname: u'PPA named p3a for Celso Providelo' |
2157 | |
2158 | === modified file 'lib/lp/soyuz/tests/test_archive.py' |
2159 | --- lib/lp/soyuz/tests/test_archive.py 2010-06-29 13:20:28 +0000 |
2160 | +++ lib/lp/soyuz/tests/test_archive.py 2010-06-30 14:22:34 +0000 |
2161 | @@ -8,6 +8,7 @@ |
2162 | import unittest |
2163 | |
2164 | from zope.component import getUtility |
2165 | +from zope.security.interfaces import Unauthorized |
2166 | from zope.security.proxy import removeSecurityProxy |
2167 | |
2168 | from canonical.database.sqlbase import sqlvalues |
2169 | @@ -34,7 +35,7 @@ |
2170 | from lp.soyuz.model.binarypackagerelease import ( |
2171 | BinaryPackageReleaseDownloadCount) |
2172 | from lp.soyuz.tests.test_publishing import SoyuzTestPublisher |
2173 | -from lp.testing import login_person, TestCaseWithFactory |
2174 | +from lp.testing import login, login_person, TestCaseWithFactory |
2175 | |
2176 | |
2177 | class TestGetPublicationsInArchive(TestCaseWithFactory): |
2178 | @@ -1055,5 +1056,35 @@ |
2179 | self.failUnlessEqual(ArchiveStatus.DELETING, self.archive.status) |
2180 | |
2181 | |
2182 | +class TestCommercialArchive(TestCaseWithFactory): |
2183 | + """Tests relating to commercial archives.""" |
2184 | + |
2185 | + layer = DatabaseFunctionalLayer |
2186 | + |
2187 | + def setUp(self): |
2188 | + super(TestCommercialArchive, self).setUp() |
2189 | + self.archive = self.factory.makeArchive() |
2190 | + |
2191 | + def setCommercial(self, archive, commercial): |
2192 | + """Helper function.""" |
2193 | + archive.commercial = commercial |
2194 | + |
2195 | + def test_set_and_get_commercial(self): |
2196 | + # Basic set and get of the commercial property. Anyone can read |
2197 | + # it and it defaults to False. |
2198 | + login_person(self.archive.owner) |
2199 | + self.assertFalse(self.archive.commercial) |
2200 | + |
2201 | + # The archive owner can't change the value. |
2202 | + self.assertRaises( |
2203 | + Unauthorized, self.setCommercial, self.archive, True) |
2204 | + |
2205 | + # Commercial admins can change it. |
2206 | + login("commercial-member@canonical.com") |
2207 | + self.setCommercial(self.archive, True) |
2208 | + self.assertTrue(self.archive.commercial) |
2209 | + |
2210 | + |
2211 | + |
2212 | def test_suite(): |
2213 | return unittest.TestLoader().loadTestsFromName(__name__) |
2214 | |
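`TestCommercialArchive` relies on zope.security proxies: anyone may read `commercial`, the owner's write attempt raises `Unauthorized`, and only commercial admins may set it. A stripped-down sketch of the same guarded-setter idea in plain Python, without zope's proxy machinery (the `Unauthorized` class, role names, and `current_roles` attribute here are illustrative):

```python
class Unauthorized(Exception):
    """Raised when the current principal may not perform a write."""


class Archive(object):
    """Toy stand-in for IArchive with a guarded 'commercial' flag."""

    def __init__(self):
        self._commercial = False
        self.current_roles = set()  # roles held by the "logged-in" user

    @property
    def commercial(self):
        # Reads are unrestricted, matching the test above.
        return self._commercial

    @commercial.setter
    def commercial(self, value):
        # Writes require a special role, analogous to the
        # launchpad.Commercial permission.
        if "commercial-admin" not in self.current_roles:
            raise Unauthorized("commercial-admin role required")
        self._commercial = value


archive = Archive()
assert archive.commercial is False  # defaults to False; anyone can read

try:
    archive.commercial = True  # a plain owner: rejected
except Unauthorized:
    pass

archive.current_roles.add("commercial-admin")
archive.commercial = True  # a commercial admin: allowed
assert archive.commercial
```

The real enforcement happens in Launchpad's security adapters and ZCML, not on the model class; this sketch only shows the shape of the behaviour the test asserts.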
2215 | === modified file 'lib/lp/translations/doc/translations-export-to-branch.txt' |
2216 | --- lib/lp/translations/doc/translations-export-to-branch.txt 2010-05-27 14:20:42 +0000 |
2217 | +++ lib/lp/translations/doc/translations-export-to-branch.txt 2010-06-30 14:22:34 +0000 |
2218 | @@ -12,7 +12,8 @@ |
2219 | >>> stdout |
2220 | '' |
2221 | >>> print stderr |
2222 | - INFO Creating lockfile: /var/lock/launchpad-translations-export-to-branch.lock |
2223 | + INFO Creating lockfile: |
2224 | + /var/lock/launchpad-translations-export-to-branch.lock |
2225 | INFO Exporting to translations branches. |
2226 | INFO Processed 0 item(s); 0 failure(s), 0 unpushed branch(es). |
2227 | |
2228 | @@ -218,8 +219,13 @@ |
2229 | won't work, so we email a notification to the branch owner. |
2230 | |
2231 | >>> from email import message_from_string |
2232 | + >>> from canonical.config import config |
2233 | >>> from lp.services.mail import stub |
2234 | >>> from lp.codehosting.vfs import get_rw_server |
2235 | + >>> config.push('enable_emails', """ |
2236 | + ... [rosetta] |
2237 | + ... notify_unpushed_branches: True |
2238 | + ... """) |
2239 | >>> productseries = factory.makeProductSeries() |
2240 | >>> productseries.translations_branch = factory.makeBranch() |
2241 | >>> template = factory.makePOTemplate(productseries=productseries) |
2242 | @@ -256,6 +262,8 @@ |
2243 | Branch synchronization for this release series has been set up to |
2244 | commit translations snapshots to the bzr branch at lp://... |
2245 | |
2246 | + >>> extra_config = config.pop('enable_emails') |
2247 | + |
2248 | For the full message text, see emailtemplates/unpushed-branch.txt. |
2249 | |
2250 | |
2251 | |
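The doctest above temporarily enables `rosetta.notify_unpushed_branches` with `config.push('enable_emails', ...)` and restores the previous state with `config.pop('enable_emails')`. The same named stack-of-overlays pattern can be sketched independently of lazr.config (all names below are illustrative, not the real `canonical.config` API):

```python
class ConfigStack(object):
    """Minimal named-layer config overlay, in the spirit of the
    config.push()/config.pop() calls used in the doctest above."""

    def __init__(self, defaults):
        self._layers = [("defaults", dict(defaults))]

    def push(self, name, overrides):
        """Push a named layer of overrides onto the stack."""
        self._layers.append((name, dict(overrides)))

    def pop(self, name):
        """Pop layers up to and including the named one."""
        while len(self._layers) > 1:
            layer_name, layer = self._layers.pop()
            if layer_name == name:
                return layer
        raise KeyError(name)

    def get(self, key):
        """Look the key up in the topmost layer that defines it."""
        for _, layer in reversed(self._layers):
            if key in layer:
                return layer[key]
        raise KeyError(key)


config = ConfigStack({"notify_unpushed_branches": False})
config.push("enable_emails", {"notify_unpushed_branches": True})
assert config.get("notify_unpushed_branches") is True
config.pop("enable_emails")
assert config.get("notify_unpushed_branches") is False
```

Pushing and popping by name keeps a doctest's temporary settings from leaking into later tests, which is exactly why the doctest pops `enable_emails` once the notification has been sent.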
2252 | === modified file 'lib/lp/translations/scripts/tests/test_translations_to_branch.py' |
2253 | --- lib/lp/translations/scripts/tests/test_translations_to_branch.py 2010-06-11 18:18:03 +0000 |
2254 | +++ lib/lp/translations/scripts/tests/test_translations_to_branch.py 2010-06-30 14:22:34 +0000 |
2255 | @@ -190,6 +190,8 @@ |
2256 | self.assertEqual(0, exporter._handleUnpushedBranch.call_count) |
2257 | |
2258 | def test_handleUnpushedBranch_mails_branch_owner(self): |
2259 | + # If configured to do so, _handleUnpushedBranch sends out |
2260 | + # notification emails. |
2261 | exporter = ExportTranslationsToBranch(test_args=[]) |
2262 | exporter.logger = QuietFakeLogger() |
2263 | productseries = self.factory.makeProductSeries() |
2264 | @@ -202,6 +204,10 @@ |
2265 | |
2266 | self.becomeDbUser('translationstobranch') |
2267 | |
2268 | + # XXX JeroenVermeulen 2010-06-14 bug=593522: This is needed |
2269 | + # because the staging codehosting server's email isn't being |
2270 | + # captured like it should. |
2271 | + self.pushConfig('rosetta', notify_unpushed_branches=True) |
2272 | exporter._exportToBranches([productseries]) |
2273 | |
2274 | self.assertEqual(1, exporter._sendMail.call_count) |
2275 | @@ -218,6 +224,29 @@ |
2276 | self.assertIn(productseries.translations_branch.bzr_identity, text) |
2277 | self.assertIn('bzr push lp://', text) |
2278 | |
2279 | + def test_handleUnpushedBranch_email_suppressed_by_default(self): |
2280 | + # The default configuration suppresses notification emails, so |
2281 | + # we don't accidentally send out emails from staging |
2282 | + # codehosting. |
2283 | + # XXX JeroenVermeulen 2010-06-14 bug=593522: This is needed |
2284 | + # because the staging codehosting server's email isn't being |
2285 | + # captured like it should. |
2286 | + exporter = ExportTranslationsToBranch(test_args=[]) |
2287 | + exporter.logger = QuietFakeLogger() |
2288 | + productseries = self.factory.makeProductSeries() |
2289 | + email = self.factory.getUniqueEmailAddress() |
2290 | + branch_owner = self.factory.makePerson(email=email) |
2291 | + productseries.translations_branch = self.factory.makeBranch( |
2292 | + owner=branch_owner) |
2293 | + exporter._exportToBranch = FakeMethod(failure=NotBranchError("Ow")) |
2294 | + exporter._sendMail = FakeMethod() |
2295 | + |
2296 | + self.becomeDbUser('translationstobranch') |
2297 | + |
2298 | + exporter._exportToBranches([productseries]) |
2299 | + |
2300 | + self.assertEqual(0, exporter._sendMail.call_count) |
2301 | + |
2302 | def test_handleUnpushedBranch_has_required_privileges(self): |
2303 | # Dealing with an unpushed branch is a special code path that |
2304 | # was not exercised by the full-script test. Ensure that it has |
2305 | |
2306 | === modified file 'lib/lp/translations/scripts/translations_to_branch.py' |
2307 | --- lib/lp/translations/scripts/translations_to_branch.py 2010-05-27 14:32:32 +0000 |
2308 | +++ lib/lp/translations/scripts/translations_to_branch.py 2010-06-30 14:22:34 +0000 |
2309 | @@ -271,6 +271,18 @@ |
2310 | This means that as far as the Launchpad database knows, there is |
2311 | no actual bzr branch behind this `IBranch` yet. |
2312 | """ |
2313 | + if not config.rosetta.notify_unpushed_branches: |
2314 | + # Configurable since it's perfectly reasonable for staging |
2315 | + # codehosting to contain unpushed branches. If the emails |
2316 | + # were sent out from the staging appserver this wouldn't |
2317 | + # matter; outgoing email there is walled off. But this |
2318 | + # script runs on the codehosting server. The staging |
2319 | + # codehosting server sends out real email. |
2320 | + # XXX JeroenVermeulen 2010-06-14 bug=593522: This is needed |
2321 | + # because the staging codehosting server's email isn't being |
2322 | + # captured like it should. |
2323 | + return |
2324 | + |
2325 | branch = productseries.translations_branch |
2326 | self.logger.info("Notifying %s of unpushed branch %s." % ( |
2327 | branch.owner.name, branch.bzr_identity)) |
2328 | |
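The guard added to `_handleUnpushedBranch` is a config-driven early return: when notification is disabled, the method does nothing before any mail is composed, which is what the "suppressed by default" test checks. A hedged sketch of the same shape, with the collaborators injected so the guard is easy to test (the function and parameter names are illustrative, not the Launchpad API):

```python
def handle_unpushed_branch(branch_owner, notify_enabled, send_mail, log):
    """Notify the owner of an unpushed branch, unless notification is
    disabled (e.g. on staging, where outgoing mail would be real).

    Illustrative stand-in for _handleUnpushedBranch. Returns True if a
    notification was sent, False if it was suppressed.
    """
    if not notify_enabled:
        # Mirrors the config.rosetta.notify_unpushed_branches check:
        # bail out before composing or sending anything.
        return False
    log("Notifying %s of unpushed branch." % branch_owner)
    send_mail(branch_owner, "Please push your translations branch.")
    return True


sent = []
handle_unpushed_branch(
    "alice", False, lambda owner, msg: sent.append(msg), lambda msg: None)
assert sent == []  # suppressed: nothing composed or sent

handle_unpushed_branch(
    "alice", True, lambda owner, msg: sent.append(msg), lambda msg: None)
assert len(sent) == 1
```

Placing the check first keeps the disabled path side-effect free, so the default-off configuration is safe even if later steps (logging, mail composition) change.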
2329 | === modified file 'utilities/report-database-stats.py' |
2330 | --- utilities/report-database-stats.py 2010-04-29 12:38:05 +0000 |
2331 | +++ utilities/report-database-stats.py 2010-06-30 14:22:34 +0000 |
2332 | @@ -72,12 +72,22 @@ |
2333 | |
2334 | |
2335 | def get_cpu_stats(cur, options): |
2336 | + # This query calculates the average CPU utilization from the |
2337 | + # samples. It assumes samples are taken at regular intervals over |
2338 | + # the period. |
2339 | query = """ |
2340 | - SELECT avg(cpu), username FROM DatabaseCpuStats |
2341 | + SELECT ( |
2342 | + CAST(SUM(cpu) AS float) / ( |
2343 | + SELECT COUNT(DISTINCT date_created) FROM DatabaseCpuStats |
2344 | + WHERE |
2345 | + date_created >= (CURRENT_TIMESTAMP AT TIME ZONE 'UTC') |
2346 | + - CAST (%s AS interval)) |
2347 | + ) AS avg_cpu, username |
2348 | + FROM DatabaseCpuStats |
2349 | WHERE date_created >= (CURRENT_TIMESTAMP AT TIME ZONE 'UTC' |
2350 | - CAST(%s AS interval)) |
2351 | GROUP BY username |
2352 | - """ % sqlvalues(options.since_interval) |
2353 | + """ % sqlvalues(options.since_interval, options.since_interval) |
2354 | |
2355 | cur.execute(query) |
2356 | |
2357 | @@ -107,16 +117,20 @@ |
2358 | tables = get_table_stats(cur, options) |
2359 | arbitrary_table = list(tables)[0] |
2360 | interval = arbitrary_table.date_end - arbitrary_table.date_start |
2361 | - per_minute = interval.days * 24 * 60 + interval.seconds / 60.0 |
2362 | + per_second = float(interval.days * 24 * 60 * 60 + interval.seconds) |
2363 | |
2364 | print "== Most Read Tables ==" |
2365 | |
2366 | + # These match the pg_user_table_stats view. schemaname is the |
2367 | + # namespace (normally 'public'), relname is the table (relation) |
2368 | + # name. total_tup_read is the total number of rows read. |
2369 | + # idx_tup_fetch is the number of rows looked up using an index. |
2370 | tables_sort = ['total_tup_read', 'idx_tup_fetch', 'schemaname', 'relname'] |
2371 | most_read_tables = sorted( |
2372 | tables, key=attrgetter(*tables_sort), reverse=True) |
2373 | for table in most_read_tables[:options.limit]: |
2374 | - print "%40s || %10.2f tuples/min" % ( |
2375 | - table.relname, table.total_tup_read / per_minute) |
2376 | + print "%40s || %10.2f tuples/sec" % ( |
2377 | + table.relname, table.total_tup_read / per_second) |
2378 | |
2379 | |
2380 | print "== Most Written Tables ==" |
2381 | @@ -126,15 +140,15 @@ |
2382 | most_written_tables = sorted( |
2383 | tables, key=attrgetter(*tables_sort), reverse=True) |
2384 | for table in most_written_tables[:options.limit]: |
2385 | - print "%40s || %10.2f tuples/min" % ( |
2386 | - table.relname, table.total_tup_written / per_minute) |
2387 | + print "%40s || %10.2f tuples/sec" % ( |
2388 | + table.relname, table.total_tup_written / per_second) |
2389 | |
2390 | |
2391 | user_cpu = get_cpu_stats(cur, options) |
2392 | print "== Most Active Users ==" |
2393 | |
2394 | for cpu, username in sorted(user_cpu, reverse=True)[:options.limit]: |
2395 | - print "%40s || %6.2f%% CPU" % (username, float(cpu) / 100) |
2396 | + print "%40s || %10.2f%% CPU" % (username, float(cpu) / 10) |
2397 | |
2398 | |
2399 | if __name__ == '__main__': |
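The SQL change above replaces `avg(cpu)` with `SUM(cpu)` divided by the count of distinct sample timestamps, so a user absent from some samples is averaged over the whole period rather than only over the rows it appears in; the report also switches from tuples/min to tuples/sec using the interval's full length in seconds. Both calculations can be checked with a small sketch (the sample data is made up):

```python
from datetime import timedelta

# Samples: (timestamp, username, cpu). Note 'bob' is missing from t2.
samples = [
    (1, "alice", 40), (1, "bob", 10),
    (2, "alice", 60),
]


def avg_cpu_per_user(samples):
    """Average each user's CPU over ALL sample points, as the new
    query does: SUM(cpu) / COUNT(DISTINCT date_created)."""
    n_samples = len({ts for ts, _, _ in samples})
    totals = {}
    for _, user, cpu in samples:
        totals[user] = totals.get(user, 0) + cpu
    return dict(
        (user, total / float(n_samples)) for user, total in totals.items())


averages = avg_cpu_per_user(samples)
# alice: (40 + 60) / 2 = 50.0; bob: 10 / 2 = 5.0, not avg(10) = 10.0,
# which is the bug the rewritten query fixes.
assert averages == {"alice": 50.0, "bob": 5.0}

# tuples/sec: total seconds in the interval, matching the diff's
# interval.days * 24 * 60 * 60 + interval.seconds expression.
interval = timedelta(days=1, seconds=30)
per_second = float(interval.days * 24 * 60 * 60 + interval.seconds)
assert per_second == 86430.0
```

Note the diff's expression drops `interval.microseconds`; for multi-hour reporting windows that sub-second truncation is negligible.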
Wrong target branch.