Merge lp:~intellectronica/launchpad/bug-heat-days-active into lp:launchpad/db-devel
Proposed by: Eleanor Berger
Status: Merged
Approved by: Eleanor Berger
Approved revision: no longer in the source branch.
Merged at revision: not available
Proposed branch: lp:~intellectronica/launchpad/bug-heat-days-active
Merge into: lp:launchpad/db-devel
Prerequisite: lp:~intellectronica/launchpad/bug-heat-degrade
Diff against target: 1026 lines (+959/-2), 8 files modified
  lib/launchpad_loggerhead/__init__.py (+1/-0)
  lib/launchpad_loggerhead/app.py (+215/-0)
  lib/launchpad_loggerhead/debug.py (+120/-0)
  lib/launchpad_loggerhead/session.py (+73/-0)
  lib/launchpad_loggerhead/static/robots.txt (+2/-0)
  lib/lp/bugs/scripts/bugheat.py (+16/-1)
  lib/lp/bugs/scripts/tests/test_bugheat.py (+21/-1)
  lib/lp/code/model/tests/test_sourcepackagerecipe.py (+511/-0)
To merge this branch: bzr merge lp:~intellectronica/launchpad/bug-heat-days-active
Related bugs:
Reviewer: Michael Nelson (community), review type: code, status: Approve
Review via email: mp+23921@code.launchpad.net
Commit message
Add a proportion of the maximum bug heat to a bug's heat for every day since the bug was created.
Description of the change
This branch is the last in a series of branches that make bug heat more sensitive to bug activity. It changes the formula so that a proportion of a bug's maximum heat is added for every day since the bug was created (offset by the decrease in bug heat for time since last activity, which is already in place).
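For illustration, the combined effect of the decay already in place and the age-based increase added in this branch can be sketched as below. The function and parameter names are illustrative only, not the actual `BugHeatCalculator` API; the constants (0.99 decay per day, a quarter of the maximum heat spread over the bug's age) are taken from the diff.

```python
from datetime import datetime

def adjusted_heat(base_heat, max_heat, date_created, date_last_activity,
                  now=None):
    """Sketch of the proposed heat formula (illustrative names).

    base_heat: the heat computed from the bug's attributes.
    max_heat: the maximum bug heat across the bug's targets (may be None).
    """
    if now is None:
        now = datetime.utcnow()
    # Existing behaviour: heat decays by 1% per day of inactivity.
    days_inactive = (now - date_last_activity).days
    heat = base_heat * (0.99 ** days_inactive)
    # New in this branch: add a quarter of the maximum heat, divided by
    # the number of days since the bug was created.
    days_since_created = (now - date_created).days
    if max_heat is not None and days_since_created > 0:
        heat += max_heat * 0.25 / days_since_created
    return int(heat)
```

Note that, as in the diff, the increase shrinks as the bug ages: an old, inactive bug gains almost nothing, while a young bug gets a noticeable boost.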
Preview Diff
1 | === removed directory 'lib/canonical/launchpad/apidoc' |
2 | === added directory 'lib/launchpad_loggerhead' |
3 | === added file 'lib/launchpad_loggerhead/__init__.py' |
4 | --- lib/launchpad_loggerhead/__init__.py 1970-01-01 00:00:00 +0000 |
5 | +++ lib/launchpad_loggerhead/__init__.py 2010-04-27 01:35:56 +0000 |
6 | @@ -0,0 +1,1 @@ |
7 | + |
8 | |
9 | === added file 'lib/launchpad_loggerhead/app.py' |
10 | --- lib/launchpad_loggerhead/app.py 1970-01-01 00:00:00 +0000 |
11 | +++ lib/launchpad_loggerhead/app.py 2010-04-27 01:39:55 +0000 |
12 | @@ -0,0 +1,215 @@ |
13 | +# Copyright 2009 Canonical Ltd. This software is licensed under the |
14 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
15 | + |
16 | +import logging |
17 | +import os |
18 | +import threading |
19 | +import urllib |
20 | +import urlparse |
21 | +import xmlrpclib |
22 | + |
23 | +from bzrlib import errors, lru_cache, urlutils |
24 | + |
25 | +from loggerhead.apps import favicon_app, static_app |
26 | +from loggerhead.apps.branch import BranchWSGIApp |
27 | + |
28 | +from openid.extensions.sreg import SRegRequest, SRegResponse |
29 | +from openid.consumer.consumer import CANCEL, Consumer, FAILURE, SUCCESS |
30 | +from openid.store.memstore import MemoryStore |
31 | + |
32 | +from paste.fileapp import DataApp |
33 | +from paste.request import construct_url, parse_querystring, path_info_pop |
34 | +from paste.httpexceptions import ( |
35 | + HTTPMovedPermanently, HTTPNotFound, HTTPUnauthorized) |
36 | + |
37 | +from canonical.config import config |
38 | +from canonical.launchpad.xmlrpc import faults |
39 | +from canonical.launchpad.webapp.vhosts import allvhosts |
40 | +from lp.code.interfaces.codehosting import ( |
41 | + BRANCH_TRANSPORT, LAUNCHPAD_ANONYMOUS) |
42 | +from lp.codehosting.vfs import get_lp_server |
43 | +from lp.codehosting.bzrutils import safe_open |
44 | + |
45 | +robots_txt = '''\ |
46 | +User-agent: * |
47 | +Disallow: / |
48 | +''' |
49 | + |
50 | +robots_app = DataApp(robots_txt, content_type='text/plain') |
51 | + |
52 | + |
53 | +thread_transports = threading.local() |
54 | + |
55 | + |
56 | +def check_fault(fault, *fault_classes): |
57 | + """Check if 'fault's faultCode matches any of 'fault_classes'. |
58 | + |
59 | + :param fault: An instance of `xmlrpclib.Fault`. |
60 | + :param fault_classes: Any number of `LaunchpadFault` subclasses. |
61 | + """ |
62 | + for cls in fault_classes: |
63 | + if fault.faultCode == cls.error_code: |
64 | + return True |
65 | + return False |
66 | + |
67 | + |
68 | +class RootApp: |
69 | + |
70 | + def __init__(self, session_var): |
71 | + self.graph_cache = lru_cache.LRUCache(10) |
72 | + self.branchfs = xmlrpclib.ServerProxy( |
73 | + config.codehosting.codehosting_endpoint) |
74 | + self.session_var = session_var |
75 | + self.store = MemoryStore() |
76 | + self.log = logging.getLogger('lp-loggerhead') |
77 | + |
78 | + def get_transports(self): |
79 | + t = getattr(thread_transports, 'transports', None) |
80 | + if t is None: |
81 | + thread_transports.transports = [] |
82 | + return thread_transports.transports |
83 | + |
84 | + def _make_consumer(self, environ): |
85 | + """Build an OpenID `Consumer` object with standard arguments.""" |
86 | + return Consumer(environ[self.session_var], self.store) |
87 | + |
88 | + def _begin_login(self, environ, start_response): |
89 | + """Start the process of authenticating with OpenID. |
90 | + |
91 | + We redirect the user to Launchpad to identify themselves, asking to be |
92 | + sent their nickname. Launchpad will then redirect them to our +login |
93 | + page with enough information that we can then redirect them again to |
94 | + the page they were looking at, with a cookie that gives us the |
95 | + username. |
96 | + """ |
97 | + openid_vhost = config.launchpad.openid_provider_vhost |
98 | + openid_request = self._make_consumer(environ).begin( |
99 | + allvhosts.configs[openid_vhost].rooturl) |
100 | + openid_request.addExtension( |
101 | + SRegRequest(required=['nickname'])) |
102 | + back_to = construct_url(environ) |
103 | + raise HTTPMovedPermanently(openid_request.redirectURL( |
104 | + config.codehosting.secure_codebrowse_root, |
105 | + config.codehosting.secure_codebrowse_root + '+login/?' |
106 | + + urllib.urlencode({'back_to':back_to}))) |
107 | + |
108 | + def _complete_login(self, environ, start_response): |
109 | + """Complete the OpenID authentication process. |
110 | + |
111 | + Here we handle the result of the OpenID process. If the process |
112 | + succeeded, we record the username in the session and redirect the user |
113 | + to the page they were trying to view that triggered the login attempt. |
114 | + In the various failure cases we return a 401 Unauthorized response |
115 | + with a brief explanation of what went wrong. |
116 | + """ |
117 | + query = dict(parse_querystring(environ)) |
118 | + # Passing query['openid.return_to'] here is massive cheating, but |
119 | + # given we control the endpoint who cares. |
120 | + response = self._make_consumer(environ).complete( |
121 | + query, query['openid.return_to']) |
122 | + if response.status == SUCCESS: |
123 | + self.log.error('open id response: SUCCESS') |
124 | + sreg_info = SRegResponse.fromSuccessResponse(response) |
125 | + print sreg_info |
126 | + environ[self.session_var]['user'] = sreg_info['nickname'] |
127 | + raise HTTPMovedPermanently(query['back_to']) |
128 | + elif response.status == FAILURE: |
129 | + self.log.error('open id response: FAILURE: %s', response.message) |
130 | + exc = HTTPUnauthorized() |
131 | + exc.explanation = response.message |
132 | + raise exc |
133 | + elif response.status == CANCEL: |
134 | + self.log.error('open id response: CANCEL') |
135 | + exc = HTTPUnauthorized() |
136 | + exc.explanation = "Authentication cancelled." |
137 | + raise exc |
138 | + else: |
139 | + self.log.error('open id response: UNKNOWN') |
140 | + exc = HTTPUnauthorized() |
141 | + exc.explanation = "Unknown OpenID response." |
142 | + raise exc |
143 | + |
144 | + def __call__(self, environ, start_response): |
145 | + environ['loggerhead.static.url'] = environ['SCRIPT_NAME'] |
146 | + if environ['PATH_INFO'].startswith('/static/'): |
147 | + path_info_pop(environ) |
148 | + return static_app(environ, start_response) |
149 | + elif environ['PATH_INFO'] == '/favicon.ico': |
150 | + return favicon_app(environ, start_response) |
151 | + elif environ['PATH_INFO'] == '/robots.txt': |
152 | + return robots_app(environ, start_response) |
153 | + elif environ['PATH_INFO'].startswith('/+login'): |
154 | + return self._complete_login(environ, start_response) |
155 | + path = environ['PATH_INFO'] |
156 | + trailingSlashCount = len(path) - len(path.rstrip('/')) |
157 | + user = environ[self.session_var].get('user', LAUNCHPAD_ANONYMOUS) |
158 | + lp_server = get_lp_server( |
159 | + user, branch_url=config.codehosting.internal_branch_by_id_root) |
160 | + lp_server.start_server() |
161 | + try: |
162 | + try: |
163 | + transport_type, info, trail = self.branchfs.translatePath( |
164 | + user, urlutils.escape(path)) |
165 | + except xmlrpclib.Fault, f: |
166 | + if check_fault(f, faults.PathTranslationError): |
167 | + raise HTTPNotFound() |
168 | + elif check_fault(f, faults.PermissionDenied): |
169 | + # If we're not allowed to see the branch... |
170 | + if environ['wsgi.url_scheme'] != 'https': |
171 | + # ... the request shouldn't have come in over http, as |
172 | + # requests for private branches over http should be |
173 | + # redirected to https by the dynamic rewrite script we |
174 | + # use (which runs before this code is reached), but |
175 | + # just in case... |
176 | + env_copy = environ.copy() |
177 | + env_copy['wsgi.url_scheme'] = 'https' |
178 | + raise HTTPMovedPermanently(construct_url(env_copy)) |
179 | + elif user != LAUNCHPAD_ANONYMOUS: |
180 | + # ... if the user is already logged in and still can't |
181 | + # see the branch, they lose. |
182 | + exc = HTTPUnauthorized() |
183 | + exc.explanation = "You are logged in as %s." % user |
184 | + raise exc |
185 | + else: |
186 | + # ... otherwise, let's give them a chance to log in |
187 | + # with OpenID. |
188 | + return self._begin_login(environ, start_response) |
189 | + else: |
190 | + raise |
191 | + if transport_type != BRANCH_TRANSPORT: |
192 | + raise HTTPNotFound() |
193 | + trail = urlutils.unescape(trail).encode('utf-8') |
194 | + trail += trailingSlashCount * '/' |
195 | + amount_consumed = len(path) - len(trail) |
196 | + consumed = path[:amount_consumed] |
197 | + branch_name = consumed.strip('/') |
198 | + self.log.info('Using branch: %s', branch_name) |
199 | + if trail and not trail.startswith('/'): |
200 | + trail = '/' + trail |
201 | + environ['PATH_INFO'] = trail |
202 | + environ['SCRIPT_NAME'] += consumed.rstrip('/') |
203 | + branch_url = lp_server.get_url() + branch_name |
204 | + branch_link = urlparse.urljoin( |
205 | + config.codebrowse.launchpad_root, branch_name) |
206 | + cachepath = os.path.join( |
207 | + config.codebrowse.cachepath, branch_name[1:]) |
208 | + if not os.path.isdir(cachepath): |
209 | + os.makedirs(cachepath) |
210 | + self.log.info('branch_url: %s', branch_url) |
211 | + try: |
212 | + bzr_branch = safe_open( |
213 | + lp_server.get_url().strip(':/'), branch_url, |
214 | + possible_transports=self.get_transports()) |
215 | + except errors.NotBranchError, err: |
216 | + self.log.warning('Not a branch: %s', err) |
217 | + raise HTTPNotFound() |
218 | + bzr_branch.lock_read() |
219 | + try: |
220 | + view = BranchWSGIApp( |
221 | + bzr_branch, branch_name, {'cachepath': cachepath}, |
222 | + self.graph_cache, branch_link=branch_link, served_url=None) |
223 | + return view.app(environ, start_response) |
224 | + finally: |
225 | + bzr_branch.unlock() |
226 | + finally: |
227 | + lp_server.stop_server() |
228 | |
229 | === added file 'lib/launchpad_loggerhead/debug.py' |
230 | --- lib/launchpad_loggerhead/debug.py 1970-01-01 00:00:00 +0000 |
231 | +++ lib/launchpad_loggerhead/debug.py 2010-04-27 01:35:56 +0000 |
232 | @@ -0,0 +1,120 @@ |
233 | +# Copyright 2009 Canonical Ltd. This software is licensed under the |
234 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
235 | + |
236 | +import thread |
237 | +import time |
238 | + |
239 | +from paste.request import construct_url |
240 | + |
241 | + |
242 | +def tabulate(cells): |
243 | + """Format a list of lists of strings in a table. |
244 | + |
245 | + The 'cells' are centered. |
246 | + |
247 | + >>> print ''.join(tabulate( |
248 | + ... [['title 1', 'title 2'], |
249 | + ... ['short', 'rather longer']])) |
250 | + title 1 title 2 |
251 | + short rather longer |
252 | + """ |
253 | + widths = {} |
254 | + for row in cells: |
255 | + for col_index, cell in enumerate(row): |
256 | + widths[col_index] = max(len(cell), widths.get(col_index, 0)) |
257 | + result = [] |
258 | + for row in cells: |
259 | + result_row = '' |
260 | + for col_index, cell in enumerate(row): |
261 | + result_row += cell.center(widths[col_index] + 2) |
262 | + result.append(result_row.rstrip() + '\n') |
263 | + return result |
264 | + |
265 | + |
266 | +def threadpool_debug(app): |
267 | + """Wrap `app` to provide debugging information about the threadpool state. |
268 | + |
269 | + The returned application will serve debugging information about the state |
270 | + of the threadpool at '/thread-debug' -- but only when accessed directly, |
271 | + not when accessed through Apache. |
272 | + """ |
273 | + def wrapped(environ, start_response): |
274 | + if ('HTTP_X_FORWARDED_SERVER' in environ |
275 | + or environ['PATH_INFO'] != '/thread-debug'): |
276 | + environ['lp.timestarted'] = time.time() |
277 | + return app(environ, start_response) |
278 | + threadpool = environ['paste.httpserver.thread_pool'] |
279 | + start_response("200 Ok", []) |
280 | + output = [("url", "time running", "time since last activity")] |
281 | + now = time.time() |
282 | + # Because we're accessing mutable structures without locks here, |
283 | + # we're a bit cautious about things looking like we expect -- if a |
284 | + # worker doesn't seem fully set up, we just ignore it. |
285 | + for worker in threadpool.workers: |
286 | + if not hasattr(worker, 'thread_id'): |
287 | + continue |
288 | + time_started, info = threadpool.worker_tracker.get( |
289 | + worker.thread_id, (None, None)) |
290 | + if time_started is not None and info is not None: |
291 | + real_time_started = info.get( |
292 | + 'lp.timestarted', time_started) |
293 | + output.append( |
294 | + map(str, |
295 | + (construct_url(info), |
296 | + now - real_time_started, |
297 | + now - time_started,))) |
298 | + return tabulate(output) |
299 | + return wrapped |
300 | + |
301 | + |
302 | +def change_kill_thread_criteria(application): |
303 | + """Interfere with threadpool so that threads are killed for inactivity. |
304 | + |
305 | + The usual rule with paste's threadpool is that a thread that takes longer |
306 | + than 'hung_thread_limit' seconds to process a request is considered hung |
307 | + and one that takes more than 'kill_thread_limit' seconds is killed. |
308 | + |
309 | + Because loggerhead streams its output, how long the entire request takes |
310 | + to process depends on things like how fast the user's internet connection |
311 | + is. What we'd like to do is kill threads that don't _start_ to produce |
312 | + output for 'kill_thread_limit' seconds. |
313 | + |
314 | + What this class actually does is arrange things so that threads that |
315 | + produce no output for 'kill_thread_limit' seconds are killed, because that's |
316 | + rule Apache uses when interpreting ProxyTimeout. |
317 | + """ |
318 | + def wrapped_application(environ, start_response): |
319 | + threadpool = environ['paste.httpserver.thread_pool'] |
320 | + def reset_timer(): |
321 | + """Make this thread safe for another 'kill_thread_limit' seconds. |
322 | + |
323 | + We do this by hacking the threadpool's record of when this thread |
324 | + started to pretend that it started right now. Hacky, but it's |
325 | + enough to fool paste.httpserver.ThreadPool.kill_hung_threads and |
326 | + that's what matters. |
327 | + """ |
328 | + threadpool.worker_tracker[thread.get_ident()][0] = time.time() |
329 | + def response_hook(status, response_headers, exc_info=None): |
330 | + # We reset the timer when the HTTP headers are sent... |
331 | + reset_timer() |
332 | + writer = start_response(status, response_headers, exc_info) |
333 | + def wrapped_writer(arg): |
334 | + # ... and whenever more output has been generated. |
335 | + reset_timer() |
336 | + return writer(arg) |
337 | + return wrapped_writer |
338 | + result = application(environ, response_hook) |
339 | + # WSGI allows the application to return an iterable, which could be a |
340 | + # generator that does significant processing between successive items, |
341 | + # so we should reset the timer between each item. |
342 | + # |
343 | + # This isn't really necessary as loggerhead doesn't return any |
344 | + # non-trivial iterables to the WSGI server. But it's probably better |
345 | + to cope with this case to avoid nasty surprises if loggerhead |
346 | + # changes. |
347 | + def reset_timer_between_items(iterable): |
348 | + for item in iterable: |
349 | + reset_timer() |
350 | + yield item |
351 | + return reset_timer_between_items(result) |
352 | + return wrapped_application |
353 | |
354 | === added file 'lib/launchpad_loggerhead/session.py' |
355 | --- lib/launchpad_loggerhead/session.py 1970-01-01 00:00:00 +0000 |
356 | +++ lib/launchpad_loggerhead/session.py 2010-04-27 01:35:56 +0000 |
357 | @@ -0,0 +1,73 @@ |
358 | +# Copyright 2009 Canonical Ltd. This software is licensed under the |
359 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
360 | + |
361 | +"""Simple paste-y session manager tuned for the needs of launchpad-loggerhead. |
362 | +""" |
363 | + |
364 | +import pickle |
365 | + |
366 | +from paste.auth.cookie import AuthCookieHandler, AuthCookieSigner |
367 | + |
368 | + |
369 | +class MyAuthCookieSigner(AuthCookieSigner): |
370 | + """Fix a bug in AuthCookieSigner.""" |
371 | + |
372 | + def sign(self, content): |
373 | + # XXX 2008-01-13 Michael Hudson: paste.auth.cookie generates bogus |
374 | + # cookies when the value is long: |
375 | + # http://trac.pythonpaste.org/pythonpaste/ticket/257. This is fixed |
376 | + # now, so when a new version is released and packaged we can remove |
377 | + # this class. |
378 | + r = AuthCookieSigner.sign(self, content) |
379 | + return r.replace('\n', '') |
380 | + |
381 | + |
382 | +class SessionHandler(object): |
383 | + """Middleware that provides a cookie-based session. |
384 | + |
385 | + The session dict is stored, pickled (and HMACed), in a cookie, so don't |
386 | + store very much in the session! |
387 | + """ |
388 | + |
389 | + def __init__(self, application, session_var, secret=None): |
390 | + """Initialize a SessionHandler instance. |
391 | + |
392 | + :param application: This is the wrapped application which will have |
393 | + access to the ``environ[session_var]`` dictionary managed by this |
394 | + middleware. |
395 | + :param session_var: The key under which to store the session |
396 | + dictionary in the environment. |
397 | + :param secret: A secret value used for signing the cookie. If not |
398 | + supplied, a new secret will be used for each instantiation of the |
399 | + SessionHandler. |
400 | + """ |
401 | + self.application = application |
402 | + self.cookie_handler = AuthCookieHandler( |
403 | + self._process, scanlist=[session_var], |
404 | + signer=MyAuthCookieSigner(secret)) |
405 | + self.session_var = session_var |
406 | + |
407 | + def __call__(self, environ, start_response): |
408 | + # We need to put the request through the cookie handler first, so we |
409 | + # can access the validated string in the environ in `_process` below. |
410 | + return self.cookie_handler(environ, start_response) |
411 | + |
412 | + def _process(self, environ, start_response): |
413 | + """Process a request. |
414 | + |
415 | + AuthCookieHandler takes care of getting the text value of the session |
416 | + in and out of the cookie (and validating the text using HMAC) so we |
417 | + just need to convert that string to and from a real dictionary using |
418 | + pickle. |
419 | + """ |
420 | + if self.session_var in environ: |
421 | + session = pickle.loads(environ[self.session_var]) |
422 | + else: |
423 | + session = {} |
424 | + environ[self.session_var] = session |
425 | + def response_hook(status, response_headers, exc_info=None): |
426 | + session = environ.pop(self.session_var) |
427 | + if session: |
428 | + environ[self.session_var] = pickle.dumps(session) |
429 | + return start_response(status, response_headers, exc_info) |
430 | + return self.application(environ, response_hook) |
431 | |
432 | === added directory 'lib/launchpad_loggerhead/static' |
433 | === added file 'lib/launchpad_loggerhead/static/robots.txt' |
434 | --- lib/launchpad_loggerhead/static/robots.txt 1970-01-01 00:00:00 +0000 |
435 | +++ lib/launchpad_loggerhead/static/robots.txt 2010-04-27 01:35:56 +0000 |
436 | @@ -0,0 +1,2 @@ |
437 | +User-agent: * |
438 | +Disallow: / |
439 | |
440 | === modified file 'lib/lp/bugs/scripts/bugheat.py' |
441 | --- lib/lp/bugs/scripts/bugheat.py 2010-04-28 21:40:12 +0000 |
442 | +++ lib/lp/bugs/scripts/bugheat.py 2010-04-29 14:53:44 +0000 |
443 | @@ -90,4 +90,19 @@ |
444 | self.bug.date_last_updated.replace(tzinfo=None)).days |
445 | total_heat = int(total_heat * (0.99 ** days)) |
446 | |
447 | - return total_heat |
448 | + if days > 0: |
449 | + # Bug heat increases by a quarter of the maximum bug heat divided |
450 | + # by the number of days since the bug's creation date. |
451 | + days_since_last_activity = ( |
452 | + datetime.utcnow() - |
453 | + max(self.bug.date_last_updated.replace(tzinfo=None), |
454 | + self.bug.date_last_message.replace(tzinfo=None))).days |
455 | + days_since_created = ( |
456 | + datetime.utcnow() - self.bug.datecreated.replace(tzinfo=None)).days |
457 | + max_heat = max( |
458 | + task.target.max_bug_heat for task in self.bug.bugtasks) |
459 | + if max_heat is not None and days_since_created > 0: |
460 | + total_heat = total_heat + (max_heat * 0.25 / days_since_created) |
461 | + |
462 | + return int(total_heat) |
463 | + |
464 | |
465 | === modified file 'lib/lp/bugs/scripts/tests/test_bugheat.py' |
466 | --- lib/lp/bugs/scripts/tests/test_bugheat.py 2010-04-28 21:40:12 +0000 |
467 | +++ lib/lp/bugs/scripts/tests/test_bugheat.py 2010-04-29 14:53:44 +0000 |
468 | @@ -7,7 +7,7 @@ |
469 | |
470 | import unittest |
471 | |
472 | -from datetime import timedelta |
473 | +from datetime import datetime, timedelta |
474 | |
475 | from canonical.testing import LaunchpadZopelessLayer |
476 | |
477 | @@ -15,6 +15,9 @@ |
478 | from lp.bugs.scripts.bugheat import BugHeatCalculator, BugHeatConstants |
479 | from lp.testing import TestCaseWithFactory |
480 | |
481 | +from zope.security.proxy import removeSecurityProxy |
482 | + |
483 | + |
484 | class TestBugHeatCalculator(TestCaseWithFactory): |
485 | """Tests for the BugHeatCalculator class.""" |
486 | # If you change the way that bug heat is calculated, remember to update |
487 | @@ -231,6 +234,23 @@ |
488 | "Expected bug heat did not match actual bug heat. " |
489 | "Expected %s, got %s" % (expected, heat)) |
490 | |
491 | + def test_getBugHeat_activity(self): |
492 | + # Bug heat increases by a quarter of the maximum bug heat divided by |
493 | + the number of days since the bug was created. |
494 | + active_bug = removeSecurityProxy(self.factory.makeBug()) |
495 | + fresh_heat = BugHeatCalculator(active_bug).getBugHeat() |
496 | + active_bug.date_last_updated = ( |
497 | + active_bug.date_last_updated - timedelta(days=10)) |
498 | + active_bug.datecreated = (active_bug.datecreated - timedelta(days=20)) |
499 | + active_bug.default_bugtask.target.setMaxBugHeat(100) |
500 | + expected = int((fresh_heat * (0.99 ** 20)) + (100 * 0.25 / 20)) |
501 | + heat = BugHeatCalculator(active_bug).getBugHeat() |
502 | + self.assertEqual( |
503 | + expected, heat, |
504 | + "Expected bug heat did not match actual bug heat. " |
505 | + "Expected %s, got %s" % (expected, heat)) |
506 | + |
507 | + |
508 | |
509 | def test_suite(): |
510 | return unittest.TestLoader().loadTestsFromName(__name__) |
511 | |
512 | === added file 'lib/lp/code/model/tests/test_sourcepackagerecipe.py' |
513 | --- lib/lp/code/model/tests/test_sourcepackagerecipe.py 1970-01-01 00:00:00 +0000 |
514 | +++ lib/lp/code/model/tests/test_sourcepackagerecipe.py 2010-04-20 03:27:10 +0000 |
515 | @@ -0,0 +1,511 @@ |
516 | +# Copyright 2009, 2010 Canonical Ltd. This software is licensed under the |
517 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
518 | + |
519 | +"""Tests for the SourcePackageRecipe content type.""" |
520 | + |
521 | +from __future__ import with_statement |
522 | + |
523 | +__metaclass__ = type |
524 | + |
525 | +from datetime import datetime |
526 | +import textwrap |
527 | +import unittest |
528 | + |
529 | +from bzrlib.plugins.builder.recipe import RecipeParser |
530 | + |
531 | +from pytz import UTC |
532 | +from storm.locals import Store |
533 | + |
534 | +from zope.component import getUtility |
535 | +from zope.security.interfaces import Unauthorized |
536 | +from zope.security.proxy import removeSecurityProxy |
537 | + |
538 | +from canonical.testing.layers import DatabaseFunctionalLayer |
539 | + |
540 | +from canonical.launchpad.webapp.authorization import check_permission |
541 | +from lp.archiveuploader.permission import ( |
542 | + ArchiveDisabled, CannotUploadToArchive, InvalidPocketForPPA) |
543 | +from lp.buildmaster.interfaces.buildqueue import IBuildQueue |
544 | +from lp.buildmaster.model.buildqueue import BuildQueue |
545 | +from lp.code.interfaces.sourcepackagerecipe import ( |
546 | + ForbiddenInstruction, ISourcePackageRecipe, ISourcePackageRecipeSource, |
547 | + TooNewRecipeFormat) |
548 | +from lp.code.interfaces.sourcepackagerecipebuild import ( |
549 | + ISourcePackageRecipeBuild, ISourcePackageRecipeBuildJob) |
550 | +from lp.code.model.sourcepackagerecipebuild import ( |
551 | + SourcePackageRecipeBuildJob) |
552 | +from lp.code.model.sourcepackagerecipe import ( |
553 | + NonPPABuildRequest) |
554 | +from lp.registry.interfaces.pocket import PackagePublishingPocket |
555 | +from lp.services.job.interfaces.job import ( |
556 | + IJob, JobStatus) |
557 | +from lp.soyuz.interfaces.archive import ArchivePurpose |
558 | +from lp.testing import ( |
559 | + login_person, person_logged_in, TestCaseWithFactory) |
560 | + |
561 | +class TestSourcePackageRecipe(TestCaseWithFactory): |
562 | + """Tests for `SourcePackageRecipe` objects.""" |
563 | + |
564 | + layer = DatabaseFunctionalLayer |
565 | + |
566 | + def makeSourcePackageRecipeFromBuilderRecipe(self, builder_recipe): |
567 | + """Make a SourcePackageRecipe from a recipe with arbitrary other data. |
568 | + """ |
569 | + registrant = self.factory.makePerson() |
570 | + owner = self.factory.makeTeam(owner=registrant) |
571 | + distroseries = self.factory.makeDistroSeries() |
572 | + sourcepackagename = self.factory.makeSourcePackageName() |
573 | + name = self.factory.getUniqueString(u'recipe-name') |
574 | + description = self.factory.getUniqueString(u'recipe-description') |
575 | + return getUtility(ISourcePackageRecipeSource).new( |
576 | + registrant=registrant, owner=owner, distroseries=[distroseries], |
577 | + sourcepackagename=sourcepackagename, name=name, |
578 | + description=description, builder_recipe=builder_recipe) |
579 | + |
580 | + def test_creation(self): |
581 | + # The metadata supplied when a SourcePackageRecipe is created is |
582 | + # present on the new object. |
583 | + registrant = self.factory.makePerson() |
584 | + owner = self.factory.makeTeam(owner=registrant) |
585 | + distroseries = self.factory.makeDistroSeries() |
586 | + sourcepackagename = self.factory.makeSourcePackageName() |
587 | + name = self.factory.getUniqueString(u'recipe-name') |
588 | + description = self.factory.getUniqueString(u'recipe-description') |
589 | + builder_recipe = self.factory.makeRecipe() |
590 | + recipe = getUtility(ISourcePackageRecipeSource).new( |
591 | + registrant=registrant, owner=owner, distroseries=[distroseries], |
592 | + sourcepackagename=sourcepackagename, name=name, |
593 | + description=description, builder_recipe=builder_recipe) |
594 | + self.assertEquals( |
595 | + (registrant, owner, set([distroseries]), sourcepackagename, name), |
596 | + (recipe.registrant, recipe.owner, set(recipe.distroseries), |
597 | + recipe.sourcepackagename, recipe.name)) |
598 | + |
599 | + def test_source_implements_interface(self): |
600 | + # The SourcePackageRecipe class implements ISourcePackageRecipeSource. |
601 | + self.assertProvides( |
602 | + getUtility(ISourcePackageRecipeSource), |
603 | + ISourcePackageRecipeSource) |
604 | + |
605 | + def test_recipe_implements_interface(self): |
606 | + # SourcePackageRecipe objects implement ISourcePackageRecipe. |
607 | + recipe = self.makeSourcePackageRecipeFromBuilderRecipe( |
608 | + self.factory.makeRecipe()) |
609 | + self.assertProvides(recipe, ISourcePackageRecipe) |
610 | + |
611 | + def test_base_branch(self): |
612 | + # When a recipe is created, we can access its base branch. |
613 | + branch = self.factory.makeAnyBranch() |
614 | + builder_recipe = self.factory.makeRecipe(branch) |
615 | + sp_recipe = self.makeSourcePackageRecipeFromBuilderRecipe( |
616 | + builder_recipe) |
617 | + self.assertEquals(branch, sp_recipe.base_branch) |
618 | + |
619 | + def test_branch_links_created(self): |
620 | + # When a recipe is created, we can query it for links to the branch |
621 | + # it references. |
622 | + branch = self.factory.makeAnyBranch() |
623 | + builder_recipe = self.factory.makeRecipe(branch) |
624 | + sp_recipe = self.makeSourcePackageRecipeFromBuilderRecipe( |
625 | + builder_recipe) |
626 | + self.assertEquals([branch], list(sp_recipe.getReferencedBranches())) |
627 | + |
628 | + def test_multiple_branch_links_created(self): |
629 | + # If a recipe links to more than one branch, getReferencedBranches() |
630 | + # returns all of them. |
631 | + branch1 = self.factory.makeAnyBranch() |
632 | + branch2 = self.factory.makeAnyBranch() |
633 | + builder_recipe = self.factory.makeRecipe(branch1, branch2) |
634 | + sp_recipe = self.makeSourcePackageRecipeFromBuilderRecipe( |
635 | + builder_recipe) |
636 | + self.assertEquals( |
637 | + sorted([branch1, branch2]), |
638 | + sorted(sp_recipe.getReferencedBranches())) |
639 | + |
640 | + def test_random_user_cant_edit(self): |
641 | + # An arbitrary user can't set attributes. |
642 | + branch1 = self.factory.makeAnyBranch() |
643 | + builder_recipe1 = self.factory.makeRecipe(branch1) |
644 | + sp_recipe = self.makeSourcePackageRecipeFromBuilderRecipe( |
645 | + builder_recipe1) |
646 | + branch2 = self.factory.makeAnyBranch() |
647 | + builder_recipe2 = self.factory.makeRecipe(branch2) |
648 | + login_person(self.factory.makePerson()) |
649 | + self.assertRaises( |
650 | + Unauthorized, setattr, sp_recipe, 'builder_recipe', |
651 | + builder_recipe2) |
652 | + |
653 | + def test_set_recipe_text_resets_branch_references(self): |
654 | + # When the recipe_text is replaced, getReferencedBranches returns |
655 | + # (only) the branches referenced by the new recipe. |
656 | + branch1 = self.factory.makeAnyBranch() |
657 | + builder_recipe1 = self.factory.makeRecipe(branch1) |
658 | + sp_recipe = self.makeSourcePackageRecipeFromBuilderRecipe( |
659 | + builder_recipe1) |
660 | + branch2 = self.factory.makeAnyBranch() |
661 | + builder_recipe2 = self.factory.makeRecipe(branch2) |
662 | + login_person(sp_recipe.owner.teamowner) |
663 | + #import pdb; pdb.set_trace() |
664 | + sp_recipe.builder_recipe = builder_recipe2 |
665 | + self.assertEquals([branch2], list(sp_recipe.getReferencedBranches())) |
666 | + |
667 | + def test_rejects_run_command(self): |
668 | + recipe_text = '''\ |
669 | + # bzr-builder format 0.2 deb-version 0.1-{revno} |
670 | + %(base)s |
671 | + run touch test |
672 | + ''' % dict(base=self.factory.makeAnyBranch().bzr_identity) |
673 | + parser = RecipeParser(textwrap.dedent(recipe_text)) |
674 | + builder_recipe = parser.parse() |
675 | + self.assertRaises( |
676 | + ForbiddenInstruction, |
677 | + self.makeSourcePackageRecipeFromBuilderRecipe, builder_recipe) |
678 | + |
679 | + def test_run_rejected_without_mangling_recipe(self): |
680 | + branch1 = self.factory.makeAnyBranch() |
681 | + builder_recipe1 = self.factory.makeRecipe(branch1) |
682 | + sp_recipe = self.makeSourcePackageRecipeFromBuilderRecipe( |
683 | + builder_recipe1) |
684 | + recipe_text = '''\ |
685 | + # bzr-builder format 0.2 deb-version 0.1-{revno} |
686 | + %(base)s |
687 | + run touch test |
688 | + ''' % dict(base=self.factory.makeAnyBranch().bzr_identity) |
689 | + parser = RecipeParser(textwrap.dedent(recipe_text)) |
690 | + builder_recipe2 = parser.parse() |
691 | + login_person(sp_recipe.owner.teamowner) |
692 | + self.assertRaises( |
693 | + ForbiddenInstruction, setattr, sp_recipe, 'builder_recipe', |
694 | + builder_recipe2) |
695 | + self.assertEquals([branch1], list(sp_recipe.getReferencedBranches())) |
696 | + |
697 | + def test_reject_newer_formats(self): |
698 | + builder_recipe = self.factory.makeRecipe() |
699 | + builder_recipe.format = 0.3 |
700 | + self.assertRaises( |
701 | + TooNewRecipeFormat, |
702 | + self.makeSourcePackageRecipeFromBuilderRecipe, builder_recipe) |
703 | + |
704 | + def test_requestBuild(self): |
705 | + recipe = self.factory.makeSourcePackageRecipe() |
706 | + (distroseries,) = list(recipe.distroseries) |
707 | + ppa = self.factory.makeArchive() |
708 | + build = recipe.requestBuild(ppa, ppa.owner, distroseries, |
709 | + PackagePublishingPocket.RELEASE) |
710 | + self.assertProvides(build, ISourcePackageRecipeBuild) |
711 | + self.assertEqual(build.archive, ppa) |
712 | + self.assertEqual(build.distroseries, distroseries) |
713 | + self.assertEqual(build.requester, ppa.owner) |
714 | + store = Store.of(build) |
715 | + store.flush() |
716 | + build_job = store.find(SourcePackageRecipeBuildJob, |
717 | + SourcePackageRecipeBuildJob.build_id==build.id).one() |
718 | + self.assertProvides(build_job, ISourcePackageRecipeBuildJob) |
719 | + self.assertTrue(build_job.virtualized) |
720 | + job = build_job.job |
721 | + self.assertProvides(job, IJob) |
722 | + self.assertEquals(job.status, JobStatus.WAITING) |
723 | + build_queue = store.find(BuildQueue, BuildQueue.job==job.id).one() |
724 | + self.assertProvides(build_queue, IBuildQueue) |
725 | + self.assertTrue(build_queue.virtualized) |
726 | + |
727 | + def test_requestBuildRejectsNotPPA(self): |
728 | + recipe = self.factory.makeSourcePackageRecipe() |
729 | + not_ppa = self.factory.makeArchive(purpose=ArchivePurpose.PRIMARY) |
730 | + (distroseries,) = list(recipe.distroseries) |
731 | + self.assertRaises(NonPPABuildRequest, recipe.requestBuild, not_ppa, |
732 | + not_ppa.owner, distroseries, PackagePublishingPocket.RELEASE) |
733 | + |
734 | + def test_requestBuildRejectsNoPermission(self): |
735 | + recipe = self.factory.makeSourcePackageRecipe() |
736 | + ppa = self.factory.makeArchive() |
737 | + requester = self.factory.makePerson() |
738 | + (distroseries,) = list(recipe.distroseries) |
739 | + self.assertRaises(CannotUploadToArchive, recipe.requestBuild, ppa, |
740 | + requester, distroseries, PackagePublishingPocket.RELEASE) |
741 | + |
742 | + def test_requestBuildRejectsInvalidPocket(self): |
743 | + recipe = self.factory.makeSourcePackageRecipe() |
744 | + ppa = self.factory.makeArchive() |
745 | + (distroseries,) = list(recipe.distroseries) |
746 | + self.assertRaises(InvalidPocketForPPA, recipe.requestBuild, ppa, |
747 | + ppa.owner, distroseries, PackagePublishingPocket.BACKPORTS) |
748 | + |
749 | + def test_requestBuildRejectsDisabledArchive(self): |
750 | + recipe = self.factory.makeSourcePackageRecipe() |
751 | + ppa = self.factory.makeArchive() |
752 | + removeSecurityProxy(ppa).disable() |
753 | + (distroseries,) = list(recipe.distroseries) |
754 | + self.assertRaises(ArchiveDisabled, recipe.requestBuild, ppa, |
755 | + ppa.owner, distroseries, PackagePublishingPocket.RELEASE) |
756 | + |
757 | + def test_sourcepackagerecipe_description(self): |
758 | + """Ensure that the SourcePackageRecipe has a proper description.""" |
759 | + description = u'The whoozits and whatzits.' |
760 | + source_package_recipe = self.factory.makeSourcePackageRecipe( |
761 | + description=description) |
762 | + self.assertEqual(description, source_package_recipe.description) |
763 | + |
764 | + def test_distroseries(self): |
765 | + """Test that the distroseries behaves as a set.""" |
766 | + recipe = self.factory.makeSourcePackageRecipe() |
767 | + distroseries = self.factory.makeDistroSeries() |
768 | + (old_distroseries,) = recipe.distroseries |
769 | + recipe.distroseries.add(distroseries) |
770 | + self.assertEqual( |
771 | + set([distroseries, old_distroseries]), set(recipe.distroseries)) |
772 | + recipe.distroseries.remove(distroseries) |
773 | + self.assertEqual([old_distroseries], list(recipe.distroseries)) |
774 | + recipe.distroseries.clear() |
775 | + self.assertEqual([], list(recipe.distroseries)) |
776 | + |
777 | + def test_build_daily(self): |
778 | + """Test that build_daily behaves as a bool.""" |
779 | + recipe = self.factory.makeSourcePackageRecipe() |
780 | + self.assertFalse(recipe.build_daily) |
781 | + login_person(recipe.owner) |
782 | + recipe.build_daily = True |
783 | + self.assertTrue(recipe.build_daily) |
784 | + |
785 | + def test_view_public(self): |
786 | + """Anyone can view a recipe with public branches.""" |
787 | + owner = self.factory.makePerson() |
788 | + branch = self.factory.makeAnyBranch(owner=owner) |
789 | + with person_logged_in(owner): |
790 | + recipe = self.factory.makeSourcePackageRecipe(branches=[branch]) |
791 | + self.assertTrue(check_permission('launchpad.View', recipe)) |
792 | + with person_logged_in(self.factory.makePerson()): |
793 | + self.assertTrue(check_permission('launchpad.View', recipe)) |
794 | + self.assertTrue(check_permission('launchpad.View', recipe)) |
795 | + |
796 | + def test_view_private(self): |
797 | + """Recipes with private branches are restricted.""" |
798 | + owner = self.factory.makePerson() |
799 | + branch = self.factory.makeAnyBranch(owner=owner, private=True) |
800 | + with person_logged_in(owner): |
801 | + recipe = self.factory.makeSourcePackageRecipe(branches=[branch]) |
802 | + self.assertTrue(check_permission('launchpad.View', recipe)) |
803 | + with person_logged_in(self.factory.makePerson()): |
804 | + self.assertFalse(check_permission('launchpad.View', recipe)) |
805 | + self.assertFalse(check_permission('launchpad.View', recipe)) |
806 | + |
807 | + def test_edit(self): |
808 | + """Only the owner can edit a sourcepackagerecipe.""" |
809 | + recipe = self.factory.makeSourcePackageRecipe() |
810 | + self.assertFalse(check_permission('launchpad.Edit', recipe)) |
811 | + with person_logged_in(self.factory.makePerson()): |
812 | + self.assertFalse(check_permission('launchpad.Edit', recipe)) |
813 | + with person_logged_in(recipe.owner): |
814 | + self.assertTrue(check_permission('launchpad.Edit', recipe)) |
815 | + |
816 | + def test_destroySelf(self): |
817 | + """Should destroy associated builds, distroseries, etc.""" |
818 | + # Recipe should have at least one datainstruction. |
819 | + branches = [self.factory.makeBranch() for count in range(2)] |
820 | + recipe = self.factory.makeSourcePackageRecipe(branches=branches) |
821 | + pending_build = self.factory.makeSourcePackageRecipeBuild( |
822 | + recipe=recipe) |
823 | + self.factory.makeSourcePackageRecipeBuildJob( |
824 | + recipe_build=pending_build) |
825 | + past_build = self.factory.makeSourcePackageRecipeBuild( |
826 | + recipe=recipe) |
827 | + self.factory.makeSourcePackageRecipeBuildJob( |
828 | + recipe_build=past_build) |
829 | + removeSecurityProxy(past_build).datebuilt = datetime.now(UTC) |
830 | + recipe.destroySelf() |
831 | + # Show no database constraints were violated |
832 | + Store.of(recipe).flush() |
833 | + |
834 | + |
835 | +class TestRecipeBranchRoundTripping(TestCaseWithFactory): |
836 | + |
837 | + layer = DatabaseFunctionalLayer |
838 | + |
839 | + def setUp(self): |
840 | + super(TestRecipeBranchRoundTripping, self).setUp() |
841 | + self.base_branch = self.factory.makeAnyBranch() |
842 | + self.nested_branch = self.factory.makeAnyBranch() |
843 | + self.merged_branch = self.factory.makeAnyBranch() |
844 | + self.branch_identities = { |
845 | + 'base': self.base_branch.bzr_identity, |
846 | + 'nested': self.nested_branch.bzr_identity, |
847 | + 'merged': self.merged_branch.bzr_identity, |
848 | + } |
849 | + |
850 | + def get_recipe(self, recipe_text): |
851 | + builder_recipe = RecipeParser(textwrap.dedent(recipe_text)).parse() |
852 | + registrant = self.factory.makePerson() |
853 | + owner = self.factory.makeTeam(owner=registrant) |
854 | + distroseries = self.factory.makeDistroSeries() |
855 | + sourcepackagename = self.factory.makeSourcePackageName() |
856 | + name = self.factory.getUniqueString(u'recipe-name') |
857 | + description = self.factory.getUniqueString(u'recipe-description') |
858 | + recipe = getUtility(ISourcePackageRecipeSource).new( |
859 | + registrant=registrant, owner=owner, distroseries=[distroseries], |
860 | + sourcepackagename=sourcepackagename, name=name, |
861 | + description=description, builder_recipe=builder_recipe) |
862 | + return recipe.builder_recipe |
863 | + |
864 | + def check_base_recipe_branch(self, branch, url, revspec=None, |
865 | + num_child_branches=0, revid=None, deb_version=None): |
866 | + self.check_recipe_branch(branch, None, url, revspec=revspec, |
867 | + num_child_branches=num_child_branches, revid=revid) |
868 | + self.assertEqual(deb_version, branch.deb_version) |
869 | + |
870 | + def check_recipe_branch(self, branch, name, url, revspec=None, |
871 | + num_child_branches=0, revid=None): |
872 | + self.assertEqual(name, branch.name) |
873 | + self.assertEqual(url, branch.url) |
874 | + self.assertEqual(revspec, branch.revspec) |
875 | + self.assertEqual(revid, branch.revid) |
876 | + self.assertEqual(num_child_branches, len(branch.child_branches)) |
877 | + |
878 | + def test_builds_simplest_recipe(self): |
879 | + recipe_text = '''\ |
880 | + # bzr-builder format 0.2 deb-version 0.1-{revno} |
881 | + %(base)s |
882 | + ''' % self.branch_identities |
883 | + base_branch = self.get_recipe(recipe_text) |
884 | + self.check_base_recipe_branch( |
885 | + base_branch, self.base_branch.bzr_identity, |
886 | + deb_version='0.1-{revno}') |
887 | + |
888 | + def test_builds_recipe_with_merge(self): |
889 | + recipe_text = '''\ |
890 | + # bzr-builder format 0.2 deb-version 0.1-{revno} |
891 | + %(base)s |
892 | + merge bar %(merged)s |
893 | + ''' % self.branch_identities |
894 | + base_branch = self.get_recipe(recipe_text) |
895 | + self.check_base_recipe_branch( |
896 | + base_branch, self.base_branch.bzr_identity, num_child_branches=1, |
897 | + deb_version='0.1-{revno}') |
898 | + child_branch, location = base_branch.child_branches[0].as_tuple() |
899 | + self.assertEqual(None, location) |
900 | + self.check_recipe_branch( |
901 | + child_branch, "bar", self.merged_branch.bzr_identity) |
902 | + |
903 | + def test_builds_recipe_with_nest(self): |
904 | + recipe_text = '''\ |
905 | + # bzr-builder format 0.2 deb-version 0.1-{revno} |
906 | + %(base)s |
907 | + nest bar %(nested)s baz |
908 | + ''' % self.branch_identities |
909 | + base_branch = self.get_recipe(recipe_text) |
910 | + self.check_base_recipe_branch( |
911 | + base_branch, self.base_branch.bzr_identity, num_child_branches=1, |
912 | + deb_version='0.1-{revno}') |
913 | + child_branch, location = base_branch.child_branches[0].as_tuple() |
914 | + self.assertEqual("baz", location) |
915 | + self.check_recipe_branch( |
916 | + child_branch, "bar", self.nested_branch.bzr_identity) |
917 | + |
918 | + def test_builds_recipe_with_nest_then_merge(self): |
919 | + recipe_text = '''\ |
920 | + # bzr-builder format 0.2 deb-version 0.1-{revno} |
921 | + %(base)s |
922 | + nest bar %(nested)s baz |
923 | + merge zam %(merged)s |
924 | + ''' % self.branch_identities |
925 | + base_branch = self.get_recipe(recipe_text) |
926 | + self.check_base_recipe_branch( |
927 | + base_branch, self.base_branch.bzr_identity, num_child_branches=2, |
928 | + deb_version='0.1-{revno}') |
929 | + child_branch, location = base_branch.child_branches[0].as_tuple() |
930 | + self.assertEqual("baz", location) |
931 | + self.check_recipe_branch( |
932 | + child_branch, "bar", self.nested_branch.bzr_identity) |
933 | + child_branch, location = base_branch.child_branches[1].as_tuple() |
934 | + self.assertEqual(None, location) |
935 | + self.check_recipe_branch( |
936 | + child_branch, "zam", self.merged_branch.bzr_identity) |
937 | + |
938 | + def test_builds_recipe_with_merge_then_nest(self): |
939 | + recipe_text = '''\ |
940 | + # bzr-builder format 0.2 deb-version 0.1-{revno} |
941 | + %(base)s |
942 | + merge zam %(merged)s |
943 | + nest bar %(nested)s baz |
944 | + ''' % self.branch_identities |
945 | + base_branch = self.get_recipe(recipe_text) |
946 | + self.check_base_recipe_branch( |
947 | + base_branch, self.base_branch.bzr_identity, num_child_branches=2, |
948 | + deb_version='0.1-{revno}') |
949 | + child_branch, location = base_branch.child_branches[0].as_tuple() |
950 | + self.assertEqual(None, location) |
951 | + self.check_recipe_branch( |
952 | + child_branch, "zam", self.merged_branch.bzr_identity) |
953 | + child_branch, location = base_branch.child_branches[1].as_tuple() |
954 | + self.assertEqual("baz", location) |
955 | + self.check_recipe_branch( |
956 | + child_branch, "bar", self.nested_branch.bzr_identity) |
957 | + |
958 | + def test_builds_a_merge_in_to_a_nest(self): |
959 | + recipe_text = '''\ |
960 | + # bzr-builder format 0.2 deb-version 0.1-{revno} |
961 | + %(base)s |
962 | + nest bar %(nested)s baz |
963 | + merge zam %(merged)s |
964 | + ''' % self.branch_identities |
965 | + base_branch = self.get_recipe(recipe_text) |
966 | + self.check_base_recipe_branch( |
967 | + base_branch, self.base_branch.bzr_identity, num_child_branches=1, |
968 | + deb_version='0.1-{revno}') |
969 | + child_branch, location = base_branch.child_branches[0].as_tuple() |
970 | + self.assertEqual("baz", location) |
971 | + self.check_recipe_branch( |
972 | + child_branch, "bar", self.nested_branch.bzr_identity, |
973 | + num_child_branches=1) |
974 | + child_branch, location = child_branch.child_branches[0].as_tuple() |
975 | + self.assertEqual(None, location) |
976 | + self.check_recipe_branch( |
977 | + child_branch, "zam", self.merged_branch.bzr_identity) |
978 | + |
979 | + def tests_builds_nest_into_a_nest(self): |
980 | + nested2 = self.factory.makeAnyBranch() |
981 | + self.branch_identities['nested2'] = nested2.bzr_identity |
982 | + recipe_text = '''\ |
983 | + # bzr-builder format 0.2 deb-version 0.1-{revno} |
984 | + %(base)s |
985 | + nest bar %(nested)s baz |
986 | + nest zam %(nested2)s zoo |
987 | + ''' % self.branch_identities |
988 | + base_branch = self.get_recipe(recipe_text) |
989 | + self.check_base_recipe_branch( |
990 | + base_branch, self.base_branch.bzr_identity, num_child_branches=1, |
991 | + deb_version='0.1-{revno}') |
992 | + child_branch, location = base_branch.child_branches[0].as_tuple() |
993 | + self.assertEqual("baz", location) |
994 | + self.check_recipe_branch( |
995 | + child_branch, "bar", self.nested_branch.bzr_identity, |
996 | + num_child_branches=1) |
997 | + child_branch, location = child_branch.child_branches[0].as_tuple() |
998 | + self.assertEqual("zoo", location) |
999 | + self.check_recipe_branch(child_branch, "zam", nested2.bzr_identity) |
1000 | + |
1001 | + def tests_builds_recipe_with_revspecs(self): |
1002 | + recipe_text = '''\ |
1003 | + # bzr-builder format 0.2 deb-version 0.1-{revno} |
1004 | + %(base)s revid:a |
1005 | + nest bar %(nested)s baz tag:b |
1006 | + merge zam %(merged)s 2 |
1007 | + ''' % self.branch_identities |
1008 | + base_branch = self.get_recipe(recipe_text) |
1009 | + self.check_base_recipe_branch( |
1010 | + base_branch, self.base_branch.bzr_identity, num_child_branches=2, |
1011 | + revspec="revid:a", deb_version='0.1-{revno}') |
1012 | + instruction = base_branch.child_branches[0] |
1013 | + child_branch = instruction.recipe_branch |
1014 | + location = instruction.nest_path |
1015 | + self.assertEqual("baz", location) |
1016 | + self.check_recipe_branch( |
1017 | + child_branch, "bar", self.nested_branch.bzr_identity, |
1018 | + revspec="tag:b") |
1019 | + child_branch, location = base_branch.child_branches[1].as_tuple() |
1020 | + self.assertEqual(None, location) |
1021 | + self.check_recipe_branch( |
1022 | + child_branch, "zam", self.merged_branch.bzr_identity, revspec="2") |
1023 | + |
1024 | + |
1025 | +def test_suite(): |
1026 | + return unittest.TestLoader().loadTestsFromName(__name__) |
There seems to be an inconsistency between the comment about the formula and the formula itself (regarding days_since_last_activity). From our IRC chat (below) and the related bug title, the comment should be:

> === modified file 'lib/lp/bugs/scripts/bugheat.py'
> --- lib/lp/bugs/scripts/bugheat.py 2010-04-22 12:14:18 +0000
> +++ lib/lp/bugs/scripts/bugheat.py 2010-04-22 12:14:19 +0000
> @@ -86,5 +86,19 @@
>              self.bug.date_last_updated.replace(tzinfo=None)).days
>          total_heat = int(total_heat * (0.99 ** days))
>
> -        return total_heat
> +        # Bug heat increases by a quarter of the maximum bug heat divided by
> +        # the number of days between the bug's creating and its last activity.

    # the number of days since the bug's last activity.

> +        days_since_last_activity = (
> +            datetime.utcnow() -
> +            max(self.bug.date_last_updated.replace(tzinfo=None),
> +                self.bug.date_last_message.replace(tzinfo=None))).days
> +        days_since_created = (
> +            datetime.utcnow() - self.bug.datecreated.replace(tzinfo=None)).days
> +        if days_since_created > 0:
> +            max_heat = max(
> +                task.target.max_bug_heat for task in self.bug.bugtasks)
> +            if max_heat is not None:
> +                total_heat = total_heat + (max_heat * 0.25 / days_since_created)

s/days_since_created/days_since_last_activity

> +
> +        return int(total_heat)

Similarly in the test:

> +        expected = int((fresh_heat * (0.99 ** 10)) + (100 * 0.25 / 20))

s/20/10
{{{
14:17 < noodles775> intellectronica: I can't see that days_since_last_activity is being used for anything?
14:18 < noodles775> Did you mean to do (max_heat * 0.25 / (days_since_created - days_since_last_activity)) or something similar, looking at the comment?
14:19 < noodles775> Ah, or looking at the related bug title, I'm guessing it should be s/days_since_created/days_since_last_activity on line 21 on the MP diff?
14:21 < intellectronica> noodles775: no, i think that's a cut-n-paste error.
14:27 < noodles775> intellectronica: so should it be divided by the number of days *since* the bug's last activity (as stated in the bug title), or divided by the difference between the days since the bug was created and its last activity (as the comment seems to suggest)?
14:28 < intellectronica> noodles775: it should be divided by the days since the bug's creation
14:28 < noodles775> intellectronica: so the bug 567439 title is wrong then, ok.
14:28 < mup> Bug #567439: Add MAX_HEAT / 4 / days since last activity to bug heat <story-bug-heat> <Launchpad Bugs:In Progress by intellectronica> <https://launchpad.net/bugs/567439>
14:29 < intellectronica> noodles775: oh, right, it is. there's another bug for a calculation based on time since last activity, i must have confused them.
}}}
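The arithmetic being debated is easier to check in isolation. The sketch below is not the real BugHeatCalculator from `lib/lp/bugs/scripts/bugheat.py`, just a minimal model of the two steps under review: 1% compounding decay per day of inactivity, plus a quarter of the maximum heat divided by the bug's age in days. `fresh_heat = 50` and `MAX_HEAT = 100` are made-up values chosen only to trace the test expectation.

```python
MAX_HEAT = 100   # made-up stand-in for task.target.max_bug_heat
fresh_heat = 50  # made-up heat value before any degradation

def degraded_heat(heat, days_since_last_activity):
    """Decay step: heat loses 1% (compounding) per day without activity."""
    return int(heat * (0.99 ** days_since_last_activity))

def age_bonus(heat, max_heat, days_since_created):
    """Bonus step: add a quarter of the maximum heat divided by bug age."""
    if days_since_created > 0 and max_heat is not None:
        heat = heat + (max_heat * 0.25 / days_since_created)
    return int(heat)

# Mirrors the shape of the corrected test expectation
# (a 10-day-old bug, inactive for 10 days):
expected = age_bonus(degraded_heat(fresh_heat, 10), MAX_HEAT, 10)
print(expected)  # 47
```

With these numbers, `int(50 * 0.99 ** 10)` is 45 and `int(45 + 100 * 0.25 / 10)` is 47, which is why the divisor in the test (`20` vs `10`) changes the result and matters for the review's `s/20/10` correction.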