Merge lp:~james-w/launchpad/copy-archive-job into lp:launchpad/db-devel
Status: Merged
Approved by: Edwin Grubbs
Approved revision: no longer in the source branch
Merged at revision: 9598
Proposed branch: lp:~james-w/launchpad/copy-archive-job
Merge into: lp:launchpad/db-devel
Prerequisite: lp:~james-w/launchpad/archive-job-db
Diff against target: 800 lines (+765/-0), 6 files modified
  database/schema/security.cfg (+1/-0)
  lib/lp/soyuz/interfaces/archivejob.py (+55/-0)
  lib/lp/soyuz/model/archivejob.py (+130/-0)
  lib/lp/soyuz/model/copyarchivejob.py (+140/-0)
  lib/lp/soyuz/tests/test_archivejob.py (+48/-0)
  lib/lp/soyuz/tests/test_copyarchivejob.py (+391/-0)
To merge this branch: bzr merge lp:~james-w/launchpad/copy-archive-job
Related bugs: none
Reviewer: Edwin Grubbs (community), review type: code, status: Approve
Review via email: mp+28439@code.launchpad.net
Commit message
Add CopyArchiveJob, a job to, um... copy archives.
Description of the change
Summary
This adds the model classes and code for CopyArchiveJob, with no users.
Currently, copying archives is done by a script, which has a table for
storing requests while it works so that requests could be deferred if
desired, though deferral was never fully implemented.
This lays the groundwork for using the job system to copy archives.
Proposed fix
This is mainly boilerplate code for a new job type interface, IArchiveJob,
plus the implementation of a single concrete job type. We can have the
script use this later, and write the cronscript to process the jobs.
Pre-implementation notes
Julian confirmed that using the job system is the right thing to do.
Implementation details
Unlike the other job types, we don't return the existing request when the
same target archive is chosen; we raise an error instead. I chose this
because it only makes sense for a single copy to target an archive at any
one time, and since jobs are currently created only in response to a user
request, we should tell the user what is going on.
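The guard described above can be sketched in isolation (names here are illustrative; the real check queries the ArchiveJob table for ready jobs targeting the archive):

```python
class JobRegistry:
    """Minimal sketch of the create-time guard: one pending copy per target."""

    def __init__(self):
        # Target archives that already have a copy job in progress.
        self._pending = set()

    def create(self, target_archive, metadata):
        if target_archive in self._pending:
            # Raise instead of returning the existing job, so the
            # requesting user learns a copy is already in progress.
            raise ValueError(
                "CopyArchiveJob already in progress for %s" % target_archive)
        self._pending.add(target_archive)
        return (target_archive, metadata)
```

A second `create()` for the same target archive fails loudly rather than silently reusing the first request.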
We make much more use of the metadata than the other job types do. While
that is inconvenient and leaves room for errors, it is far easier than
creating a very specific job type with database references for all of
these values.
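The metadata handling amounts to a JSON round-trip; a minimal sketch (using the stdlib json module in place of the branch's simplejson, with hypothetical helper names):

```python
import json

def pack_metadata(metadata):
    # At creation time the type-specific variables are serialized to
    # JSON and stored in a single text column (json_data).
    return json.dumps(metadata)

def unpack_metadata(json_data):
    # The metadata property deserializes on every access, so a tuple
    # passed in comes back as a list.
    return json.loads(json_data)
```

This is why the tests compare against `[u'some', u'arbitrary', u'metadata']` rather than the tuple originally passed in.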
We rely on the tests of PackageCloner for most of the behaviour of run();
here we test just enough to be fairly confident that we are passing the
right arguments to PackageCloner.
Tests
./bin/test -s lp.soyuz.tests -m test_copyarchivejob
./bin/test -s lp.soyuz.tests -m test_archivejob
Demo and Q/A
This will need QA when we convert the populate_archive script to use it.
lint
== Pyflakes notices ==
lib/lp/
112: local variable 'ignore_this' is assigned to but never used
279: local variable 'ignore_result' is assigned to but never used
== Pylint notices ==
lib/lp/soyuz/interfaces/archivejob.py
39: [E0211, IArchiveJob.destroySelf] Method has no argument
46: [E0213, IArchiveJobSource.create] Method should have "self" as first argument
lib/lp/soyuz/model/copyarchivejob.py
29: [E1002, CopyArchiveJob.create] Use of super on an old style class
Thanks,
James
James Westby (james-w) wrote:
Guilherme Salgado (salgado) wrote:
I haven't done a thorough review as I'm not familiar with the job system, so I wouldn't be comfortable doing it. I have a few comments, though.
I assume the new tables created to describe a new job are standardised, but we need to have the new one reviewed by either stub or jml.
It's weird to see new code raising SQLObjectNotFound. Does the job system expect that?
I think your tests can do with just the DatabaseFunctionalLayer.
I'll be happy to do a thorough review tomorrow, if you think it's worth it, but I'd like someone more familiar with the job system to review it as well.
James Westby (james-w) wrote:
Hi,
I adjusted the layer, and the db tables have been approved.
The SQLObjectNotFound is what the code jobs use, and no-one was
able to suggest anything better when I asked on IRC just now.
Thanks,
James
Edwin Grubbs (edwin-grubbs) wrote:
Hi James,
This branch looks very good. I have minor comments below.
merge-conditional
-Edwin
>=== added file 'lib/lp/soyuz/interfaces/archivejob.py'
>--- lib/lp/soyuz/interfaces/archivejob.py
>+++ lib/lp/soyuz/interfaces/archivejob.py
>@@ -0,0 +1,55 @@
>+from zope.interface import Attribute, Interface
>+from zope.schema import Int, Object
>+
>+from canonical.launchpad import _
>+
>+from lazr.enum import DBEnumeratedType, DBItem
>+
>+from lp.services.job.interfaces.job import IJob, IJobSource, IRunnableJob
>+from lp.soyuz.interfaces.archive import IArchive
>+
>+
>+class ArchiveJobType(DBEnumeratedType):
>+ """Values that IArchiveJob.job_type can take."""
>+
>+ COPY_ARCHIVE = DBItem(0, """
>+ Create a copy archive.
>+
>+ This job creates a copy archive from the current state of
>+ the archive.
>+ """)
>+
>+
>+class IArchiveJob(Interface):
>+ """A Job related to an Archive."""
>+
>+ id = Int(
>+ title=_('DB ID'), required=True, readonly=True,
>+ description=_("The tracking number for this job."))
>+
>+ archive = Object(
>+ title=_('The Archive this job is about.'), schema=IArchive,
It looks peculiar to occasionally capitalize the noun. For example,
"job" isn't capitalized here, but it is capitalized in the following title.
I think capitalizing to match the class names is better left to doc
strings, since field titles are often exposed to the user, even if that
would never happen for this interface.
>+ required=True)
>+
>+ job = Object(
>+ title=_('The common Job attributes'), schema=IJob, required=True)
>+
>+ metadata = Attribute('A dict of data about the job.')
>+
>+ def destroySelf():
>+ """Destroy this object."""
>+
>+
>+class IArchiveJobSource(IJobSource):
>+ """An interface for acquiring IArchiveJobs."""
>+
>+ def create(archive):
>+ """Create a new IArchiveJobs for an archive."""
>+
>+
>+class ICopyArchiveJob(IRunnableJob):
>+ """A Job to copy archives."""
>+
>+
>+class ICopyArchiveJobSource(IArchiveJobSource):
>+ """Interface for acquiring CopyArchiveJobs."""
>
>=== added file 'lib/lp/soyuz/model/archivejob.py'
>--- lib/lp/soyuz/model/archivejob.py
>+++ lib/lp/soyuz/model/archivejob.py
>@@ -0,0 +1,130 @@
>+
>+__metaclass__ = object
>+
>+import simplejson
>+
>+from sqlobject import SQLObjectNotFound
>+from storm.base import Storm
>+from storm.expr import And
>+from storm.locals import Int, Reference, Unicode
>+
>+from zope.component import getUtility
>+from zope.interface import classProvides, implements
>+
>+from canonical.database.enumcol import EnumCol
>+from canonical.launchpad.webapp.interfaces import (
>+ DEFAULT_FLAVOR, IStoreSelector, MAIN_STORE, MASTER_FLAVOR)
>+
>+from lazr.delegates import delegates
>+
>+from lp.services.job.model.job import Job
>+from lp.services.job.runner import BaseRunnableJob
>+from lp.soyuz.interfaces.archivejob import (
>+ ArchiveJobType, IArchiveJob, IArchiveJobSource)
>+from lp.soyuz.model.archive import Archive
>+
>+
>+class ArchiveJob(Storm):
>+ """Base class for jobs r...
James Westby (james-w) wrote:
> Here you are creating a SPPH record for the source_archive, and
> makeSourceAndTa
> test the merge, shouldn't the two packages be in different archives?
It's not the best test, as the setup runs a job.
What we do is create a source archive, and then clone it to a target archive,
we then modify the source archive and request a "merge" operation, which should
copy the new publication we create in the source.
I'm not sure of a better way to write the test to make that clear, so I
added comments.
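The test flow described here can be sketched with archives modeled as plain sets (illustrative only; the real test drives PackageCloner against database fixtures):

```python
def clone_packages(source, target):
    # Initial copy: the target archive gets everything currently
    # published in the source archive.
    target.update(source)

def merge_copy(source, target):
    # Merge: copy only the publications present in the source but
    # not yet in the target.
    target.update(source - target)

source = {"foo_1.0"}
target = set()
clone_packages(source, target)   # target now mirrors the source
source.add("foo_1.1")            # new publication created in the source
merge_copy(source, target)       # the merge picks up only the new one
```

After the merge, the target holds both the originally cloned publication and the one added afterwards, which is what the test asserts.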
All other changes made as suggested.
Thanks,
James
Preview Diff
1 | === modified file 'database/schema/security.cfg' |
2 | --- database/schema/security.cfg 2010-07-30 15:59:06 +0000 |
3 | +++ database/schema/security.cfg 2010-07-30 18:33:56 +0000 |
4 | @@ -893,6 +893,7 @@ |
5 | public.account = SELECT, INSERT, UPDATE |
6 | public.accountpassword = SELECT, INSERT |
7 | public.archive = SELECT, INSERT, UPDATE |
8 | +public.archivejob = SELECT, INSERT |
9 | public.archivearch = SELECT, INSERT, UPDATE, DELETE |
10 | public.binarypackagerelease = SELECT, INSERT, UPDATE |
11 | public.binarypackagefile = SELECT, INSERT, UPDATE |
12 | |
13 | === added file 'lib/lp/soyuz/interfaces/archivejob.py' |
14 | --- lib/lp/soyuz/interfaces/archivejob.py 1970-01-01 00:00:00 +0000 |
15 | +++ lib/lp/soyuz/interfaces/archivejob.py 2010-07-30 18:33:56 +0000 |
16 | @@ -0,0 +1,55 @@ |
17 | +from zope.interface import Attribute, Interface |
18 | +from zope.schema import Int, Object |
19 | + |
20 | +from canonical.launchpad import _ |
21 | + |
22 | +from lazr.enum import DBEnumeratedType, DBItem |
23 | + |
24 | +from lp.services.job.interfaces.job import IJob, IJobSource, IRunnableJob |
25 | +from lp.soyuz.interfaces.archive import IArchive |
26 | + |
27 | + |
28 | +class ArchiveJobType(DBEnumeratedType): |
29 | + """Values that IArchiveJob.job_type can take.""" |
30 | + |
31 | + COPY_ARCHIVE = DBItem(0, """ |
32 | + Create a copy archive. |
33 | + |
34 | + This job creates a copy archive from the current state of |
35 | + the archive. |
36 | + """) |
37 | + |
38 | + |
39 | +class IArchiveJob(Interface): |
40 | + """A Job related to an Archive.""" |
41 | + |
42 | + id = Int( |
43 | + title=_('DB ID'), required=True, readonly=True, |
44 | + description=_("The tracking number for this job.")) |
45 | + |
46 | + archive = Object( |
47 | + title=_('The archive this job is about.'), schema=IArchive, |
48 | + required=True) |
49 | + |
50 | + job = Object( |
51 | + title=_('The common Job attributes'), schema=IJob, required=True) |
52 | + |
53 | + metadata = Attribute('A dict of data about the job.') |
54 | + |
55 | + def destroySelf(): |
56 | + """Destroy this object.""" |
57 | + |
58 | + |
59 | +class IArchiveJobSource(IJobSource): |
60 | + """An interface for acquiring IArchiveJobs.""" |
61 | + |
62 | + def create(archive): |
63 | + """Create a new IArchiveJobs for an archive.""" |
64 | + |
65 | + |
66 | +class ICopyArchiveJob(IRunnableJob): |
67 | + """A Job to copy archives.""" |
68 | + |
69 | + |
70 | +class ICopyArchiveJobSource(IArchiveJobSource): |
71 | + """Interface for acquiring CopyArchiveJobs.""" |
72 | |
73 | === added file 'lib/lp/soyuz/model/archivejob.py' |
74 | --- lib/lp/soyuz/model/archivejob.py 1970-01-01 00:00:00 +0000 |
75 | +++ lib/lp/soyuz/model/archivejob.py 2010-07-30 18:33:56 +0000 |
76 | @@ -0,0 +1,130 @@ |
77 | + |
78 | +__metaclass__ = object |
79 | + |
80 | +import simplejson |
81 | + |
82 | +from sqlobject import SQLObjectNotFound |
83 | +from storm.base import Storm |
84 | +from storm.expr import And |
85 | +from storm.locals import Int, Reference, Unicode |
86 | + |
87 | +from zope.component import getUtility |
88 | +from zope.interface import classProvides, implements |
89 | + |
90 | +from canonical.database.enumcol import EnumCol |
91 | +from canonical.launchpad.webapp.interfaces import ( |
92 | + DEFAULT_FLAVOR, IStoreSelector, MAIN_STORE, MASTER_FLAVOR) |
93 | + |
94 | +from lazr.delegates import delegates |
95 | + |
96 | +from lp.services.job.model.job import Job |
97 | +from lp.services.job.runner import BaseRunnableJob |
98 | +from lp.soyuz.interfaces.archivejob import ( |
99 | + ArchiveJobType, IArchiveJob, IArchiveJobSource) |
100 | +from lp.soyuz.model.archive import Archive |
101 | + |
102 | + |
103 | +class ArchiveJob(Storm): |
104 | + """Base class for jobs related to Archives.""" |
105 | + |
106 | + implements(IArchiveJob) |
107 | + |
108 | + __storm_table__ = 'archivejob' |
109 | + |
110 | + id = Int(primary=True) |
111 | + |
112 | + job_id = Int(name='job') |
113 | + job = Reference(job_id, Job.id) |
114 | + |
115 | + archive_id = Int(name='archive') |
116 | + archive = Reference(archive_id, Archive.id) |
117 | + |
118 | + job_type = EnumCol(enum=ArchiveJobType, notNull=True) |
119 | + |
120 | + _json_data = Unicode('json_data') |
121 | + |
122 | + @property |
123 | + def metadata(self): |
124 | + return simplejson.loads(self._json_data) |
125 | + |
126 | + def __init__(self, archive, job_type, metadata): |
127 | + """Create an ArchiveJob. |
128 | + |
129 | + :param archive: the archive this job relates to. |
130 | + :param job_type: the bugjobtype of this job. |
131 | + :param metadata: the type-specific variables, as a json-compatible |
132 | + dict. |
133 | + """ |
134 | + super(ArchiveJob, self).__init__() |
135 | + json_data = simplejson.dumps(metadata) |
136 | + self.job = Job() |
137 | + self.archive = archive |
138 | + self.job_type = job_type |
139 | + # XXX AaronBentley 2009-01-29 bug=322819: This should be a bytestring, |
140 | + # but the db representation is unicode. |
141 | + self._json_data = json_data.decode('utf-8') |
142 | + |
143 | + @classmethod |
144 | + def get(cls, key): |
145 | + """Return the instance of this class whose key is supplied.""" |
146 | + store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
147 | + instance = store.get(cls, key) |
148 | + if instance is None: |
149 | + raise SQLObjectNotFound( |
150 | + 'No occurence of %s has key %s' % (cls.__name__, key)) |
151 | + return instance |
152 | + |
153 | + |
154 | +class ArchiveJobDerived(BaseRunnableJob): |
155 | + """Intermediate class for deriving from ArchiveJob.""" |
156 | + delegates(IArchiveJob) |
157 | + classProvides(IArchiveJobSource) |
158 | + |
159 | + def __init__(self, job): |
160 | + self.context = job |
161 | + |
162 | + @classmethod |
163 | + def create(cls, archive, metadata=None): |
164 | + """See `IArchiveJob`.""" |
165 | + if metadata is None: |
166 | + metadata = {} |
167 | + job = ArchiveJob(archive, cls.class_job_type, metadata) |
168 | + return cls(job) |
169 | + |
170 | + @classmethod |
171 | + def get(cls, job_id): |
172 | + """Get a job by id. |
173 | + |
174 | + :return: the ArchiveJob with the specified id, as the current |
175 | + BugJobDerived subclass. |
176 | + :raises: SQLObjectNotFound if there is no job with the specified id, |
177 | + or its job_type does not match the desired subclass. |
178 | + """ |
179 | + job = ArchiveJob.get(job_id) |
180 | + if job.job_type != cls.class_job_type: |
181 | + raise SQLObjectNotFound( |
182 | + 'No object found with id %d and type %s' % (job_id, |
183 | + cls.class_job_type.title)) |
184 | + return cls(job) |
185 | + |
186 | + @classmethod |
187 | + def iterReady(cls): |
188 | + """Iterate through all ready BugJobs.""" |
189 | + store = getUtility(IStoreSelector).get(MAIN_STORE, MASTER_FLAVOR) |
190 | + jobs = store.find( |
191 | + ArchiveJob, |
192 | + And(ArchiveJob.job_type == cls.class_job_type, |
193 | + ArchiveJob.job == Job.id, |
194 | + Job.id.is_in(Job.ready_jobs), |
195 | + ArchiveJob.archive == Archive.id)) |
196 | + return (cls(job) for job in jobs) |
197 | + |
198 | + def getOopsVars(self): |
199 | + """See `IRunnableJob`.""" |
200 | + vars = BaseRunnableJob.getOopsVars(self) |
201 | + vars.extend([ |
202 | + ('archive_id', self.context.archive.id), |
203 | + ('archive_job_id', self.context.id), |
204 | + ('archive_job_type', self.context.job_type.title), |
205 | + ]) |
206 | + return vars |
207 | |
208 | === added file 'lib/lp/soyuz/model/copyarchivejob.py' |
209 | --- lib/lp/soyuz/model/copyarchivejob.py 1970-01-01 00:00:00 +0000 |
210 | +++ lib/lp/soyuz/model/copyarchivejob.py 2010-07-30 18:33:56 +0000 |
211 | @@ -0,0 +1,140 @@ |
212 | + |
213 | +__metaclass__ = object |
214 | + |
215 | +from zope.component import getUtility |
216 | +from zope.interface import classProvides, implements |
217 | + |
218 | +from canonical.launchpad.webapp.interfaces import ( |
219 | + DEFAULT_FLAVOR, IStoreSelector, MAIN_STORE) |
220 | + |
221 | +from lp.registry.interfaces.distroseries import IDistroSeriesSet |
222 | +from lp.registry.interfaces.pocket import PackagePublishingPocket |
223 | +from lp.services.job.model.job import Job |
224 | +from lp.soyuz.adapters.packagelocation import PackageLocation |
225 | +from lp.soyuz.interfaces.archive import IArchiveSet |
226 | +from lp.soyuz.interfaces.archivejob import ( |
227 | + ArchiveJobType, ICopyArchiveJob, ICopyArchiveJobSource) |
228 | +from lp.soyuz.interfaces.packagecloner import IPackageCloner |
229 | +from lp.soyuz.interfaces.packageset import IPackagesetSet |
230 | +from lp.soyuz.interfaces.processor import IProcessorFamilySet |
231 | +from lp.soyuz.interfaces.component import IComponentSet |
232 | +from lp.soyuz.model.archivejob import ArchiveJob, ArchiveJobDerived |
233 | + |
234 | + |
235 | +class CopyArchiveJob(ArchiveJobDerived): |
236 | + |
237 | + implements(ICopyArchiveJob) |
238 | + |
239 | + class_job_type = ArchiveJobType.COPY_ARCHIVE |
240 | + classProvides(ICopyArchiveJobSource) |
241 | + |
242 | + @classmethod |
243 | + def create(cls, target_archive, source_archive, |
244 | + source_series, source_pocket, target_series, target_pocket, |
245 | + target_component=None, proc_families=None, packagesets=None, |
246 | + merge=False): |
247 | + """See `ICopyArchiveJobSource`.""" |
248 | + store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
249 | + job_for_archive = store.find( |
250 | + ArchiveJob, |
251 | + ArchiveJob.archive == target_archive, |
252 | + ArchiveJob.job_type == cls.class_job_type, |
253 | + ArchiveJob.job == Job.id, |
254 | + Job.id.is_in(Job.ready_jobs) |
255 | + ).any() |
256 | + |
257 | + if job_for_archive is not None: |
258 | + raise ValueError( |
259 | + "CopyArchiveJob already in progress for %s" % target_archive) |
260 | + else: |
261 | + if proc_families is None: |
262 | + proc_families = [] |
263 | + if len(proc_families) > 0 and merge: |
264 | + raise ValueError("Can't specify the architectures for merge.") |
265 | + proc_family_names = [p.name for p in proc_families] |
266 | + if packagesets is None: |
267 | + packagesets = [] |
268 | + packageset_names = [p.name for p in packagesets] |
269 | + target_component_id = None |
270 | + if target_component is not None: |
271 | + target_component_id = target_component.id |
272 | + metadata = { |
273 | + 'source_archive_id': source_archive.id, |
274 | + 'source_distroseries_id': source_series.id, |
275 | + 'source_pocket_value': source_pocket.value, |
276 | + 'target_distroseries_id': target_series.id, |
277 | + 'target_pocket_value': target_pocket.value, |
278 | + 'target_component_id': target_component_id, |
279 | + 'proc_family_names': proc_family_names, |
280 | + 'packageset_names': packageset_names, |
281 | + 'merge': merge, |
282 | + } |
283 | + return super(CopyArchiveJob, cls).create(target_archive, metadata) |
284 | + |
285 | + def getOopsVars(self): |
286 | + """See `ArchiveJobDerived`.""" |
287 | + vars = ArchiveJobDerived.getOopsVars(self) |
288 | + vars.extend([ |
289 | + ('source_archive_id', self.metadata['source_archive_id']), |
290 | + ('source_distroseries_id', |
291 | + self.metadata['source_distroseries_id']), |
292 | + ('target_distroseries_id', |
293 | + self.metadata['target_distroseries_id']), |
294 | + ('source_pocket_value', self.metadata['source_pocket_value']), |
295 | + ('target_pocket_value', self.metadata['target_pocket_value']), |
296 | + ('target_component_id', self.metadata['target_component_id']), |
297 | + ('merge', self.metadata['merge']), |
298 | + ]) |
299 | + return vars |
300 | + |
301 | + def getSourceLocation(self): |
302 | + """Get the PackageLocation for the source.""" |
303 | + # TODO: handle things going bye-bye before we get here. |
304 | + source_archive_id = self.metadata['source_archive_id'] |
305 | + source_archive = getUtility(IArchiveSet).get(source_archive_id) |
306 | + source_distroseries_id = self.metadata['source_distroseries_id'] |
307 | + source_distroseries = getUtility(IDistroSeriesSet).get( |
308 | + source_distroseries_id) |
309 | + source_distribution = source_distroseries.distribution |
310 | + source_pocket_value = self.metadata['source_pocket_value'] |
311 | + source_pocket = PackagePublishingPocket.items[source_pocket_value] |
312 | + packageset_names = self.metadata['packageset_names'] |
313 | + packagesets = [getUtility(IPackagesetSet).getByName(name) |
314 | + for name in packageset_names] |
315 | + source_location = PackageLocation( |
316 | + source_archive, source_distribution, source_distroseries, |
317 | + source_pocket, packagesets=packagesets) |
318 | + return source_location |
319 | + |
320 | + def getTargetLocation(self): |
321 | + """Get the PackageLocation for the target.""" |
322 | + # TODO: handle things going bye-bye before we get here. |
323 | + target_distroseries_id = self.metadata['target_distroseries_id'] |
324 | + target_distroseries = getUtility(IDistroSeriesSet).get( |
325 | + target_distroseries_id) |
326 | + target_distribution = target_distroseries.distribution |
327 | + target_pocket_value = self.metadata['target_pocket_value'] |
328 | + target_pocket = PackagePublishingPocket.items[target_pocket_value] |
329 | + target_location = PackageLocation( |
330 | + self.archive, target_distribution, target_distroseries, |
331 | + target_pocket) |
332 | + target_component_id = self.metadata['target_component_id'] |
333 | + if target_component_id is not None: |
334 | + target_location.component = getUtility(IComponentSet).get( |
335 | + target_component_id) |
336 | + return target_location |
337 | + |
338 | + def run(self): |
339 | + """See `IRunnableJob`.""" |
340 | + source_location = self.getSourceLocation() |
341 | + target_location = self.getTargetLocation() |
342 | + proc_family_names = self.metadata['proc_family_names'] |
343 | + proc_family_set = getUtility(IProcessorFamilySet) |
344 | + proc_families = [proc_family_set.getByName(p) |
345 | + for p in proc_family_names] |
346 | + package_cloner = getUtility(IPackageCloner) |
347 | + if self.metadata['merge']: |
348 | + package_cloner.mergeCopy(source_location, target_location) |
349 | + else: |
350 | + package_cloner.clonePackages( |
351 | + source_location, target_location, proc_families=proc_families) |
352 | |
353 | === added file 'lib/lp/soyuz/tests/test_archivejob.py' |
354 | --- lib/lp/soyuz/tests/test_archivejob.py 1970-01-01 00:00:00 +0000 |
355 | +++ lib/lp/soyuz/tests/test_archivejob.py 2010-07-30 18:33:56 +0000 |
356 | @@ -0,0 +1,48 @@ |
357 | +import unittest |
358 | + |
359 | +from canonical.testing import DatabaseFunctionalLayer |
360 | + |
361 | +from lp.soyuz.interfaces.archivejob import ArchiveJobType |
362 | +from lp.soyuz.model.archivejob import ArchiveJob, ArchiveJobDerived |
363 | +from lp.testing import TestCaseWithFactory |
364 | + |
365 | + |
366 | +class ArchiveJobTestCase(TestCaseWithFactory): |
367 | + """Test case for basic ArchiveJob gubbins.""" |
368 | + |
369 | + layer = DatabaseFunctionalLayer |
370 | + |
371 | + def test_instantiate(self): |
372 | + # ArchiveJob.__init__() instantiates a ArchiveJob instance. |
373 | + archive = self.factory.makeArchive() |
374 | + |
375 | + metadata = ('some', 'arbitrary', 'metadata') |
376 | + archive_job = ArchiveJob( |
377 | + archive, ArchiveJobType.COPY_ARCHIVE, metadata) |
378 | + |
379 | + self.assertEqual(archive, archive_job.archive) |
380 | + self.assertEqual(ArchiveJobType.COPY_ARCHIVE, archive_job.job_type) |
381 | + |
382 | + # When we actually access the ArchiveJob's metadata it gets |
383 | + # deserialized from JSON, so the representation returned by |
384 | + # archive_job.metadata will be different from what we originally |
385 | + # passed in. |
386 | + metadata_expected = [u'some', u'arbitrary', u'metadata'] |
387 | + self.assertEqual(metadata_expected, archive_job.metadata) |
388 | + |
389 | + |
390 | +class ArchiveJobDerivedTestCase(TestCaseWithFactory): |
391 | + """Test case for the ArchiveJobDerived class.""" |
392 | + |
393 | + layer = DatabaseFunctionalLayer |
394 | + |
395 | + def test_create_explodes(self): |
396 | + # ArchiveJobDerived.create() will blow up because it needs to be |
397 | + # subclassed to work properly. |
398 | + archive = self.factory.makeArchive() |
399 | + self.assertRaises( |
400 | + AttributeError, ArchiveJobDerived.create, archive) |
401 | + |
402 | + |
403 | +def test_suite(): |
404 | + return unittest.TestLoader().loadTestsFromName(__name__) |
405 | |
406 | === added file 'lib/lp/soyuz/tests/test_copyarchivejob.py' |
407 | --- lib/lp/soyuz/tests/test_copyarchivejob.py 1970-01-01 00:00:00 +0000 |
408 | +++ lib/lp/soyuz/tests/test_copyarchivejob.py 2010-07-30 18:33:56 +0000 |
409 | @@ -0,0 +1,391 @@ |
410 | +# Copyright 2009 Canonical Ltd. This software is licensed under the |
411 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
412 | + |
413 | +from __future__ import with_statement |
414 | + |
415 | +__metaclass__ = type |
416 | + |
417 | +from zope.component import getUtility |
418 | +from zope.security.proxy import removeSecurityProxy |
419 | + |
420 | +from canonical.testing import DatabaseFunctionalLayer |
421 | + |
422 | +from lp.buildmaster.interfaces.buildbase import BuildStatus |
423 | +from lp.registry.interfaces.pocket import PackagePublishingPocket |
424 | +from lp.soyuz.adapters.packagelocation import PackageLocation |
425 | +from lp.soyuz.interfaces.archive import ArchivePurpose |
426 | +from lp.soyuz.interfaces.binarypackagebuild import IBinaryPackageBuildSet |
427 | +from lp.soyuz.interfaces.publishing import PackagePublishingStatus |
428 | +from lp.soyuz.model.copyarchivejob import CopyArchiveJob |
429 | +from lp.soyuz.model.processor import ProcessorFamilySet |
430 | +from lp.testing import celebrity_logged_in, TestCaseWithFactory |
431 | + |
432 | + |
433 | +class CopyArchiveJobTests(TestCaseWithFactory): |
434 | + """Tests for CopyArchiveJob.""" |
435 | + |
436 | + layer = DatabaseFunctionalLayer |
437 | + |
438 | + def test_getOopsVars(self): |
439 | + archive = self.factory.makeArchive() |
440 | + args = self.makeDummyArgs() |
441 | + target_distroseries = self.factory.makeDistroSeries() |
442 | + source_pocket = PackagePublishingPocket.RELEASE |
443 | + target_pocket = PackagePublishingPocket.BACKPORTS |
444 | + target_component = self.factory.makeComponent() |
445 | + job = CopyArchiveJob.create( |
446 | + archive, args['source_archive'], args['distroseries'], |
447 | + source_pocket, target_distroseries, target_pocket, |
448 | + target_component=target_component) |
449 | + vars = job.getOopsVars() |
450 | + self.assertIn(('archive_id', archive.id), vars) |
451 | + self.assertIn(('archive_job_id', job.context.id), vars) |
452 | + self.assertIn(('archive_job_type', job.context.job_type.title), vars) |
453 | + self.assertIn(('source_archive_id', args['source_archive'].id), vars) |
454 | + self.assertIn( |
455 | + ('source_distroseries_id', args['distroseries'].id), vars) |
456 | + self.assertIn( |
457 | + ('target_distroseries_id', target_distroseries.id), vars) |
458 | + self.assertIn(('source_pocket_value', source_pocket.value), vars) |
459 | + self.assertIn(('target_pocket_value', target_pocket.value), vars) |
460 | + self.assertIn( |
461 | + ('target_component_id', target_component.id), vars) |
462 | + self.assertIn(('merge', False), vars) |
463 | + |
464 | + def makeDummyArgs(self): |
465 | + args = {} |
466 | + distro = self.factory.makeDistribution() |
467 | + args['distroseries'] = self.factory.makeDistroSeries( |
468 | + distribution=distro) |
469 | + args['pocket'] = self.factory.getAnyPocket() |
470 | + args['source_archive'] = self.factory.makeArchive( |
471 | + distribution=distro) |
472 | + return args |
473 | + |
474 | + def test_error_if_already_exists(self): |
475 | + target_archive = self.factory.makeArchive() |
476 | + args = self.makeDummyArgs() |
477 | + CopyArchiveJob.create( |
478 | + target_archive, args['source_archive'], args['distroseries'], |
479 | + args['pocket'], args['distroseries'], args['pocket']) |
480 | + self.assertEqual(1, self._getJobCount()) |
481 | + args = self.makeDummyArgs() |
482 | + self.assertRaises( |
483 | + ValueError, CopyArchiveJob.create, target_archive, |
484 | + args['source_archive'], args['distroseries'], args['pocket'], |
485 | + args['distroseries'], args['pocket']) |
486 | + |
487 | + def test_create_sets_source_archive_id(self): |
488 | + target_archive = self.factory.makeArchive() |
489 | + args = self.makeDummyArgs() |
490 | + source_archive = self.factory.makeArchive() |
491 | + job = CopyArchiveJob.create( |
492 | + target_archive, source_archive, args['distroseries'], |
493 | + args['pocket'], args['distroseries'], args['pocket']) |
494 | + self.assertEqual( |
495 | + source_archive.id, job.metadata['source_archive_id']) |
496 | + |
497 | + def test_create_sets_source_series_id(self): |
498 | + target_archive = self.factory.makeArchive() |
499 | + args = self.makeDummyArgs() |
500 | + source_distroseries = self.factory.makeDistroSeries() |
501 | + job = CopyArchiveJob.create( |
502 | + target_archive, args['source_archive'], source_distroseries, |
503 | + args['pocket'], args['distroseries'], args['pocket']) |
504 | + self.assertEqual( |
505 | + source_distroseries.id, job.metadata['source_distroseries_id']) |
506 | + |
507 | + def test_create_sets_source_pocket_value(self): |
508 | + target_archive = self.factory.makeArchive() |
509 | + args = self.makeDummyArgs() |
510 | + source_pocket = PackagePublishingPocket.RELEASE |
511 | + target_pocket = PackagePublishingPocket.BACKPORTS |
512 | + job = CopyArchiveJob.create( |
513 | + target_archive, args['source_archive'], args['distroseries'], |
514 | + source_pocket, args['distroseries'], target_pocket) |
515 | + self.assertEqual( |
516 | + source_pocket.value, job.metadata['source_pocket_value']) |
517 | + |
518 | + def test_create_sets_target_pocket_value(self): |
519 | + target_archive = self.factory.makeArchive() |
520 | + args = self.makeDummyArgs() |
521 | + source_pocket = PackagePublishingPocket.RELEASE |
522 | + target_pocket = PackagePublishingPocket.BACKPORTS |
523 | + job = CopyArchiveJob.create( |
524 | + target_archive, args['source_archive'], args['distroseries'], |
525 | + source_pocket, args['distroseries'], target_pocket) |
526 | + self.assertEqual( |
527 | + target_pocket.value, job.metadata['target_pocket_value']) |
528 | + |
529 | + def test_create_sets_target_distroseries_id(self): |
530 | + target_archive = self.factory.makeArchive() |
531 | + args = self.makeDummyArgs() |
532 | + target_distroseries = self.factory.makeDistroSeries() |
533 | + job = CopyArchiveJob.create( |
534 | + target_archive, args['source_archive'], args['distroseries'], |
535 | + args['pocket'], target_distroseries, args['pocket']) |
536 | + self.assertEqual( |
537 | + target_distroseries.id, job.metadata['target_distroseries_id']) |
538 | + |
539 | + def test_create_sets_target_component_id(self): |
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        target_component = self.factory.makeComponent()
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'],
+            target_component=target_component)
+        self.assertEqual(
+            target_component.id, job.metadata['target_component_id'])
+
+    def test_create_sets_target_component_id_to_None_if_unspecified(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'])
+        self.assertEqual(None, job.metadata['target_component_id'])
+
+    def test_create_sets_proc_family_ids(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        family1 = self.factory.makeProcessorFamily(name="armel")
+        family2 = self.factory.makeProcessorFamily(name="ia64")
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'],
+            proc_families=[family1, family2])
+        self.assertEqual(
+            [f.name for f in [family1, family2]],
+            job.metadata['proc_family_names'])
+
+    def test_error_on_merge_with_proc_families(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        family1 = self.factory.makeProcessorFamily(name="armel")
+        family2 = self.factory.makeProcessorFamily(name="ia64")
+        self.assertRaises(
+            ValueError, CopyArchiveJob.create, target_archive,
+            args['source_archive'], args['distroseries'], args['pocket'],
+            args['distroseries'], args['pocket'],
+            proc_families=[family1, family2], merge=True)
+
+    def test_create_sets_source_package_set_ids(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        packagesets = [
+            self.factory.makePackageset(),
+            self.factory.makePackageset(),
+        ]
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'],
+            packagesets=packagesets)
+        self.assertEqual(
+            [p.name for p in packagesets], job.metadata['packageset_names'])
+
+    def test_create_sets_merge_False_by_default(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'])
+        self.assertEqual(False, job.metadata['merge'])
+
+    def test_create_sets_merge_True_on_request(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'], merge=True)
+        self.assertEqual(True, job.metadata['merge'])
+
+    def test_get_source_location(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        source_distroseries = self.factory.makeDistroSeries()
+        source_pocket = PackagePublishingPocket.RELEASE
+        target_pocket = PackagePublishingPocket.BACKPORTS
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], source_distroseries,
+            source_pocket, args['distroseries'], target_pocket)
+        location = job.getSourceLocation()
+        expected_location = PackageLocation(
+            args['source_archive'], source_distroseries.distribution,
+            source_distroseries, source_pocket)
+        self.assertEqual(expected_location, location)
+
+    def test_get_source_location_with_packagesets(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        source_distroseries = self.factory.makeDistroSeries()
+        source_pocket = PackagePublishingPocket.RELEASE
+        target_pocket = PackagePublishingPocket.BACKPORTS
+        packagesets = [
+            self.factory.makePackageset(),
+            self.factory.makePackageset(),
+        ]
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], source_distroseries,
+            source_pocket, args['distroseries'], target_pocket,
+            packagesets=packagesets)
+        location = job.getSourceLocation()
+        expected_location = PackageLocation(
+            args['source_archive'], source_distroseries.distribution,
+            source_distroseries, source_pocket, packagesets=packagesets)
+        self.assertEqual(expected_location, location)
+
+    def test_get_target_location(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        target_distroseries = self.factory.makeDistroSeries()
+        source_pocket = PackagePublishingPocket.RELEASE
+        target_pocket = PackagePublishingPocket.BACKPORTS
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            source_pocket, target_distroseries, target_pocket)
+        location = job.getTargetLocation()
+        expected_location = PackageLocation(
+            target_archive, target_distroseries.distribution,
+            target_distroseries, target_pocket)
+        self.assertEqual(expected_location, location)
+
+    def test_get_target_location_with_component(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        target_distroseries = self.factory.makeDistroSeries()
+        source_pocket = PackagePublishingPocket.RELEASE
+        target_pocket = PackagePublishingPocket.BACKPORTS
+        target_component = self.factory.makeComponent()
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            source_pocket, target_distroseries, target_pocket,
+            target_component=target_component)
+        location = job.getTargetLocation()
+        expected_location = PackageLocation(
+            target_archive, target_distroseries.distribution,
+            target_distroseries, target_pocket)
+        expected_location.component = target_component
+        self.assertEqual(expected_location, location)
+
+    def _getJobs(self):
+        """Return the pending CopyArchiveJobs as a list."""
+        return list(CopyArchiveJob.iterReady())
+
+    def _getJobCount(self):
+        """Return the number of CopyArchiveJobs in the queue."""
+        return len(self._getJobs())
+
+    def makeSourceAndTarget(self):
+        distribution = self.factory.makeDistribution(name="foobuntu")
+        distroseries = self.factory.makeDistroSeries(
+            distribution=distribution, name="maudlin")
+        source_archive_owner = self.factory.makePerson(name="source-owner")
+        source_archive = self.factory.makeArchive(
+            name="source", owner=source_archive_owner,
+            purpose=ArchivePurpose.PPA, distribution=distribution)
+        self.factory.makeSourcePackagePublishingHistory(
+            sourcepackagename=self.factory.getOrMakeSourcePackageName(
+                name='bzr'),
+            distroseries=distroseries, component=self.factory.makeComponent(),
+            version="2.1", architecturehintlist='any',
+            archive=source_archive, status=PackagePublishingStatus.PUBLISHED,
+            pocket=PackagePublishingPocket.RELEASE)
+        das = self.factory.makeDistroArchSeries(
+            distroseries=distroseries, architecturetag="i386",
+            processorfamily=ProcessorFamilySet().getByName("x86"),
+            supports_virtualized=True)
+        with celebrity_logged_in('admin'):
+            distroseries.nominatedarchindep = das
+        target_archive_owner = self.factory.makePerson()
+        target_archive = self.factory.makeArchive(
+            purpose=ArchivePurpose.COPY, owner=target_archive_owner,
+            name="test-copy-archive", distribution=distribution,
+            description="Test copy archive", enabled=False)
+        return source_archive, target_archive, distroseries
+
+    def checkPublishedSources(self, expected, archive, series):
+        # We need to be admin as the archive is disabled at this point.
+        with celebrity_logged_in('admin'):
+            sources = archive.getPublishedSources(
+                distroseries=series,
+                status=(
+                    PackagePublishingStatus.PENDING,
+                    PackagePublishingStatus.PUBLISHED))
+            actual = []
+            for source in sources:
+                actual.append(
+                    (source.source_package_name,
+                     source.source_package_version))
+            self.assertEqual(sorted(expected), sorted(actual))
+
+    def test_run(self):
+        """Test that CopyArchiveJob.run() actually copies the archive.
+
+        We just make a simple test here, and rely on PackageCloner tests
+        to cover the functionality.
+        """
+        source_archive, target_archive, series = self.makeSourceAndTarget()
+        job = CopyArchiveJob.create(
+            target_archive, source_archive, series,
+            PackagePublishingPocket.RELEASE, series,
+            PackagePublishingPocket.RELEASE)
+        job.run()
+        self.checkPublishedSources([("bzr", "2.1")], target_archive, series)
+
+    def test_run_mergeCopy(self):
+        """Test that CopyArchiveJob.run() when merge=True does a mergeCopy."""
+        source_archive, target_archive, series = self.makeSourceAndTarget()
+        # Create the copy archive.
+        job = CopyArchiveJob.create(
+            target_archive, source_archive, series,
+            PackagePublishingPocket.RELEASE, series,
+            PackagePublishingPocket.RELEASE)
+        job.start()
+        job.run()
+        job.complete()
+        # Now the two archives are in the same state, so we change the
+        # source archive and request a merge to check that it works.
+        # Create a new version of the apt package in the source.
+        self.factory.makeSourcePackagePublishingHistory(
+            sourcepackagename=self.factory.getOrMakeSourcePackageName(
+                name='apt'),
+            distroseries=series, component=self.factory.makeComponent(),
+            version="1.2", architecturehintlist='any',
+            archive=source_archive, status=PackagePublishingStatus.PUBLISHED,
+            pocket=PackagePublishingPocket.RELEASE)
+        # Create a job to merge.
+        job = CopyArchiveJob.create(
+            target_archive, source_archive, series,
+            PackagePublishingPocket.RELEASE, series,
+            PackagePublishingPocket.RELEASE, merge=True)
+        job.run()
+        # Check that the new apt package is in the target.
+        self.checkPublishedSources(
+            [("bzr", "2.1"), ("apt", "1.2")], target_archive, series)
+
+    def test_run_with_proc_families(self):
+        """Test that a CopyArchiveJob job with proc_families uses them.
+
+        If we create a CopyArchiveJob with proc_families != None then
+        they should be used when cloning packages.
+        """
+        source_archive, target_archive, series = self.makeSourceAndTarget()
+        proc_families = [ProcessorFamilySet().getByName("x86")]
+        job = CopyArchiveJob.create(
+            target_archive, source_archive, series,
+            PackagePublishingPocket.RELEASE, series,
+            PackagePublishingPocket.RELEASE, proc_families=proc_families)
+        job.run()
+        builds = list(
+            getUtility(IBinaryPackageBuildSet).getBuildsForArchive(
+                target_archive, status=BuildStatus.NEEDSBUILD))
+        actual_builds = list()
+        for build in builds:
+            naked_build = removeSecurityProxy(build)
+            spr = naked_build.source_package_release
+            actual_builds.append(
+                (spr.name, spr.version, naked_build.processor.family.name))
+        # One build for the one package, as we specified one processor
+        # family.
+        self.assertEqual([("bzr", "2.1", "x86")], actual_builds)
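The `create()` tests above all exercise the same pattern: model objects are flattened into a JSON-serialisable metadata dict (ids and names rather than object references), and incompatible options are rejected up front. A minimal, self-contained sketch of that pattern, using hypothetical names and plain dicts in place of Launchpad model objects:

```python
import json


class CopyArchiveJobSketch:
    """Hypothetical, simplified stand-in for CopyArchiveJob (not Launchpad code)."""

    def __init__(self, metadata):
        self.metadata = metadata

    @classmethod
    def create(cls, target_component=None, proc_families=None,
               packagesets=None, merge=False):
        # A merge copy may not also request processor families
        # (mirrors test_error_on_merge_with_proc_families).
        if merge and proc_families:
            raise ValueError("Can't specify proc_families with merge=True")
        metadata = {
            # Store ids/names rather than model objects, so the metadata
            # survives serialisation to the job table.
            'target_component_id': (
                None if target_component is None else target_component['id']),
            'proc_family_names': [f['name'] for f in (proc_families or [])],
            'packageset_names': [p['name'] for p in (packagesets or [])],
            'merge': merge,
        }
        # Round-trip through JSON to prove the dict is serialisable.
        return cls(json.loads(json.dumps(metadata)))
```

For example, `CopyArchiveJobSketch.create(proc_families=[{'name': 'armel'}])` yields metadata with `proc_family_names == ['armel']` and `merge == False`, matching the shape the assertions above check for.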
Please ignore the schema/sampledata changes; they are in the prerequisite branch.
Thanks,
James