Merge lp:~james-w/launchpad/copy-archive-job into lp:launchpad/db-devel

Proposed by James Westby
Status: Merged
Approved by: Edwin Grubbs
Approved revision: no longer in the source branch.
Merged at revision: 9598
Proposed branch: lp:~james-w/launchpad/copy-archive-job
Merge into: lp:launchpad/db-devel
Prerequisite: lp:~james-w/launchpad/archive-job-db
Diff against target: 800 lines (+765/-0)
6 files modified
database/schema/security.cfg (+1/-0)
lib/lp/soyuz/interfaces/archivejob.py (+55/-0)
lib/lp/soyuz/model/archivejob.py (+130/-0)
lib/lp/soyuz/model/copyarchivejob.py (+140/-0)
lib/lp/soyuz/tests/test_archivejob.py (+48/-0)
lib/lp/soyuz/tests/test_copyarchivejob.py (+391/-0)
To merge this branch: bzr merge lp:~james-w/launchpad/copy-archive-job
Reviewer: Edwin Grubbs (community)
Review type: code
Status: Approve
Review via email: mp+28439@code.launchpad.net

Commit message

Add CopyArchiveJob, a job to, um... copy archives.

Description of the change

Summary

This adds the model classes and code for CopyArchiveJob, with no users.

Copying archives is currently done by a script, which has a table for
storing requests while it works so that they could be deferred if
desired, though deferral was never fully implemented.

This lays the groundwork so that we can use the job system for copying
archives.

Proposed fix

This is mainly boilerplate for a new job type, IArchiveJob, plus the
implementation of a single concrete job. We can have the script use
this later, and write the cronscript to process the jobs, as sketched
below.
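
As a sketch of where this is heading (nothing in this branch wires it up
yet, and the archive/series variables below are placeholders for objects
the script would already have looked up):

    # Hypothetical future flow for populate_archive, once converted.
    from lp.registry.interfaces.pocket import PackagePublishingPocket
    from lp.soyuz.model.copyarchivejob import CopyArchiveJob

    job = CopyArchiveJob.create(
        target_archive, source_archive,
        source_series, PackagePublishingPocket.RELEASE,
        target_series, PackagePublishingPocket.RELEASE)
    # A cronscript would normally claim the job via iterReady(); driving
    # it inline comes to the same thing for a one-off request.
    job.start()
    job.run()
    job.complete()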

Pre-implementation notes

Julian confirmed that using the job system is the right thing to do.

Implementation details

Unlike the other job types, we don't return the existing request when the
same target archive is chosen; we raise an error instead. I chose this
because it only makes sense for a single copy to target an archive at any
one time, and since copies currently happen only in response to a user
request, we should tell the user what is going on.
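
To illustrate with placeholder arguments (the error message is the one
raised by CopyArchiveJob.create()):

    # The first request for a target archive is queued as usual.
    CopyArchiveJob.create(
        target_archive, source_archive, source_series, source_pocket,
        target_series, target_pocket)
    # A second request while that job is still pending raises instead of
    # silently returning the existing job:
    #     ValueError: CopyArchiveJob already in progress for <archive>
    CopyArchiveJob.create(
        target_archive, source_archive, source_series, source_pocket,
        target_series, target_pocket)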

We make much more use of the metadata than the other job types do. While
that is inconvenient and leaves room for errors, it is much easier than
creating a very specific job type that has database references for all of
these things.
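
The room for errors comes from everything having to survive a JSON round
trip; a minimal illustration of what that does to the stored values:

    import simplejson

    # Only json-compatible values survive: tuples come back as lists and
    # strings come back as unicode, as test_archivejob also demonstrates.
    metadata = {'proc_family_names': ('armel', 'ia64'), 'merge': False}
    stored = simplejson.dumps(metadata).decode('utf-8')
    assert simplejson.loads(stored) == {
        u'proc_family_names': [u'armel', u'ia64'], u'merge': False}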

We rely on the PackageCloner tests to cover most of the behaviour of
run(); we just test enough to be fairly confident that we are passing the
right arguments to PackageCloner.

Tests

./bin/test -s lp.soyuz.tests -m test_copyarchivejob
./bin/test -s lp.soyuz.tests -m test_archivejob

Demo and Q/A

This will need QA when we convert the populate_archive script to use it.

lint

== Pyflakes notices ==

lib/lp/soyuz/scripts/populate_archive.py
    112: local variable 'ignore_this' is assigned to but never used
    279: local variable 'ignore_result' is assigned to but never used

== Pylint notices ==

lib/lp/soyuz/interfaces/archivejob.py
    39: [E0211, IArchiveJob.destroySelf] Method has no argument
    46: [E0213, IArchiveJobSource.create] Method should have "self" as first argument

lib/lp/soyuz/model/copyarchivejob.py
    29: [E1002, CopyArchiveJob.create] Use super on an old style class

Thanks,

James

Revision history for this message
James Westby (james-w) wrote :

Please ignore the schema/sampledata changes, they are in the pre-requisite branch.

Thanks,

James

Revision history for this message
Guilherme Salgado (salgado) wrote :

I haven't done a thorough review as I'm not familiar with the job system, so I wouldn't be comfortable doing it. I have a few comments, though.

I assume the new tables created to describe a new job are standardised, but we need to have the new one reviewed by either stub or jml.

It's weird to see new code raising SQLObjectNotFound. Does the job system expect that?

I think your tests can do with just the DatabaseFunctionalLayer, which might be faster than LPZopelessLayer.

I'll be happy to do a thorough review tomorrow, if you think it's worth it, but I'd like someone more familiar with the job system to review it as well.

Revision history for this message
James Westby (james-w) wrote :

Hi,

I adjusted the layer, and the db tables have been approved.

The SQLObjectNotFound is what the code jobs use, and no-one was
able to suggest anything better when I asked on IRC just now.

Thanks,

James

Revision history for this message
Edwin Grubbs (edwin-grubbs) wrote :

Hi James,

This branch looks very good. I have minor comments below.

merge-conditional

-Edwin

>=== added file 'lib/lp/soyuz/interfaces/archivejob.py'
>--- lib/lp/soyuz/interfaces/archivejob.py 1970-01-01 00:00:00 +0000
>+++ lib/lp/soyuz/interfaces/archivejob.py 2010-07-28 19:38:05 +0000
>@@ -0,0 +1,55 @@
>+from zope.interface import Attribute, Interface
>+from zope.schema import Int, Object
>+
>+from canonical.launchpad import _
>+
>+from lazr.enum import DBEnumeratedType, DBItem
>+
>+from lp.services.job.interfaces.job import IJob, IJobSource, IRunnableJob
>+from lp.soyuz.interfaces.archive import IArchive
>+
>+
>+class ArchiveJobType(DBEnumeratedType):
>+ """Values that IArchiveJob.job_type can take."""
>+
>+ COPY_ARCHIVE = DBItem(0, """
>+ Create a copy archive.
>+
>+ This job creates a copy archive from the current state of
>+ the archive.
>+ """)
>+
>+
>+class IArchiveJob(Interface):
>+ """A Job related to an Archive."""
>+
>+ id = Int(
>+ title=_('DB ID'), required=True, readonly=True,
>+ description=_("The tracking number for this job."))
>+
>+ archive = Object(
>+ title=_('The Archive this job is about.'), schema=IArchive,

It looks peculiar to occasionally capitalize the noun. For example,
"job" isn't capitalized here, but it is capitalized in the following title.
I think capitalizing to match the class names is better left to doc
strings, since field titles are often exposed to the user, even if that
would never happen for this interface.

>+ required=True)
>+
>+ job = Object(
>+ title=_('The common Job attributes'), schema=IJob, required=True)
>+
>+ metadata = Attribute('A dict of data about the job.')
>+
>+ def destroySelf():
>+ """Destroy this object."""
>+
>+
>+class IArchiveJobSource(IJobSource):
>+ """An interface for acquiring IArchiveJobs."""
>+
>+ def create(archive):
>+ """Create a new IArchiveJobs for an archive."""
>+
>+
>+class ICopyArchiveJob(IRunnableJob):
>+ """A Job to copy archives."""
>+
>+
>+class ICopyArchiveJobSource(IArchiveJobSource):
>+ """Interface for acquiring CopyArchiveJobs."""
>
>=== added file 'lib/lp/soyuz/model/archivejob.py'
>--- lib/lp/soyuz/model/archivejob.py 1970-01-01 00:00:00 +0000
>+++ lib/lp/soyuz/model/archivejob.py 2010-07-28 19:38:05 +0000
>@@ -0,0 +1,130 @@
>+
>+__metaclass__ = object
>+
>+import simplejson
>+
>+from sqlobject import SQLObjectNotFound
>+from storm.base import Storm
>+from storm.expr import And
>+from storm.locals import Int, Reference, Unicode
>+
>+from zope.component import getUtility
>+from zope.interface import classProvides, implements
>+
>+from canonical.database.enumcol import EnumCol
>+from canonical.launchpad.webapp.interfaces import (
>+ DEFAULT_FLAVOR, IStoreSelector, MAIN_STORE, MASTER_FLAVOR)
>+
>+from lazr.delegates import delegates
>+
>+from lp.services.job.model.job import Job
>+from lp.services.job.runner import BaseRunnableJob
>+from lp.soyuz.interfaces.archivejob import (
>+ ArchiveJobType, IArchiveJob, IArchiveJobSource)
>+from lp.soyuz.model.archive import Archive
>+
>+
>+class ArchiveJob(Storm):
>+ """Base class for jobs r...


review: Approve (code)
Revision history for this message
James Westby (james-w) wrote :

> Here you are creating a SPPH record for the source_archive, and
> makeSourceAndTarget() creates a SPPH record for the source_archive. To
> test the merge, shouldn't the two packages be in different archives?

It's not the best test, as the setup runs a job.

What we do is create a source archive and then clone it to a target
archive; we then modify the source archive and request a "merge"
operation, which should copy the new publication we created in the
source.

I'm not sure of a better way to write the test to make that clear, so I
added comments.

All other changes made as suggested.

Thanks,

James

Preview Diff

=== modified file 'database/schema/security.cfg'
--- database/schema/security.cfg 2010-07-30 15:59:06 +0000
+++ database/schema/security.cfg 2010-07-30 18:33:56 +0000
@@ -893,6 +893,7 @@
 public.account = SELECT, INSERT, UPDATE
 public.accountpassword = SELECT, INSERT
 public.archive = SELECT, INSERT, UPDATE
+public.archivejob = SELECT, INSERT
 public.archivearch = SELECT, INSERT, UPDATE, DELETE
 public.binarypackagerelease = SELECT, INSERT, UPDATE
 public.binarypackagefile = SELECT, INSERT, UPDATE
=== added file 'lib/lp/soyuz/interfaces/archivejob.py'
--- lib/lp/soyuz/interfaces/archivejob.py 1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/interfaces/archivejob.py 2010-07-30 18:33:56 +0000
@@ -0,0 +1,55 @@
+from zope.interface import Attribute, Interface
+from zope.schema import Int, Object
+
+from canonical.launchpad import _
+
+from lazr.enum import DBEnumeratedType, DBItem
+
+from lp.services.job.interfaces.job import IJob, IJobSource, IRunnableJob
+from lp.soyuz.interfaces.archive import IArchive
+
+
+class ArchiveJobType(DBEnumeratedType):
+    """Values that IArchiveJob.job_type can take."""
+
+    COPY_ARCHIVE = DBItem(0, """
+        Create a copy archive.
+
+        This job creates a copy archive from the current state of
+        the archive.
+        """)
+
+
+class IArchiveJob(Interface):
+    """A Job related to an Archive."""
+
+    id = Int(
+        title=_('DB ID'), required=True, readonly=True,
+        description=_("The tracking number for this job."))
+
+    archive = Object(
+        title=_('The archive this job is about.'), schema=IArchive,
+        required=True)
+
+    job = Object(
+        title=_('The common Job attributes'), schema=IJob, required=True)
+
+    metadata = Attribute('A dict of data about the job.')
+
+    def destroySelf():
+        """Destroy this object."""
+
+
+class IArchiveJobSource(IJobSource):
+    """An interface for acquiring IArchiveJobs."""
+
+    def create(archive):
+        """Create a new IArchiveJob for an archive."""
+
+
+class ICopyArchiveJob(IRunnableJob):
+    """A Job to copy archives."""
+
+
+class ICopyArchiveJobSource(IArchiveJobSource):
+    """Interface for acquiring CopyArchiveJobs."""
=== added file 'lib/lp/soyuz/model/archivejob.py'
--- lib/lp/soyuz/model/archivejob.py 1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/model/archivejob.py 2010-07-30 18:33:56 +0000
@@ -0,0 +1,130 @@
+
+__metaclass__ = type
+
+import simplejson
+
+from sqlobject import SQLObjectNotFound
+from storm.base import Storm
+from storm.expr import And
+from storm.locals import Int, Reference, Unicode
+
+from zope.component import getUtility
+from zope.interface import classProvides, implements
+
+from canonical.database.enumcol import EnumCol
+from canonical.launchpad.webapp.interfaces import (
+    DEFAULT_FLAVOR, IStoreSelector, MAIN_STORE, MASTER_FLAVOR)
+
+from lazr.delegates import delegates
+
+from lp.services.job.model.job import Job
+from lp.services.job.runner import BaseRunnableJob
+from lp.soyuz.interfaces.archivejob import (
+    ArchiveJobType, IArchiveJob, IArchiveJobSource)
+from lp.soyuz.model.archive import Archive
+
+
+class ArchiveJob(Storm):
+    """Base class for jobs related to Archives."""
+
+    implements(IArchiveJob)
+
+    __storm_table__ = 'archivejob'
+
+    id = Int(primary=True)
+
+    job_id = Int(name='job')
+    job = Reference(job_id, Job.id)
+
+    archive_id = Int(name='archive')
+    archive = Reference(archive_id, Archive.id)
+
+    job_type = EnumCol(enum=ArchiveJobType, notNull=True)
+
+    _json_data = Unicode('json_data')
+
+    @property
+    def metadata(self):
+        return simplejson.loads(self._json_data)
+
+    def __init__(self, archive, job_type, metadata):
+        """Create an ArchiveJob.
+
+        :param archive: the archive this job relates to.
+        :param job_type: the ArchiveJobType of this job.
+        :param metadata: the type-specific variables, as a json-compatible
+            dict.
+        """
+        super(ArchiveJob, self).__init__()
+        json_data = simplejson.dumps(metadata)
+        self.job = Job()
+        self.archive = archive
+        self.job_type = job_type
+        # XXX AaronBentley 2009-01-29 bug=322819: This should be a bytestring,
+        # but the db representation is unicode.
+        self._json_data = json_data.decode('utf-8')
+
+    @classmethod
+    def get(cls, key):
+        """Return the instance of this class whose key is supplied."""
+        store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR)
+        instance = store.get(cls, key)
+        if instance is None:
+            raise SQLObjectNotFound(
+                'No occurrence of %s has key %s' % (cls.__name__, key))
+        return instance
+
+
+class ArchiveJobDerived(BaseRunnableJob):
+    """Intermediate class for deriving from ArchiveJob."""
+    delegates(IArchiveJob)
+    classProvides(IArchiveJobSource)
+
+    def __init__(self, job):
+        self.context = job
+
+    @classmethod
+    def create(cls, archive, metadata=None):
+        """See `IArchiveJob`."""
+        if metadata is None:
+            metadata = {}
+        job = ArchiveJob(archive, cls.class_job_type, metadata)
+        return cls(job)
+
+    @classmethod
+    def get(cls, job_id):
+        """Get a job by id.
+
+        :return: the ArchiveJob with the specified id, as the current
+            ArchiveJobDerived subclass.
+        :raises: SQLObjectNotFound if there is no job with the specified id,
+            or its job_type does not match the desired subclass.
+        """
+        job = ArchiveJob.get(job_id)
+        if job.job_type != cls.class_job_type:
+            raise SQLObjectNotFound(
+                'No object found with id %d and type %s' % (job_id,
+                cls.class_job_type.title))
+        return cls(job)
+
+    @classmethod
+    def iterReady(cls):
+        """Iterate through all ready ArchiveJobs."""
+        store = getUtility(IStoreSelector).get(MAIN_STORE, MASTER_FLAVOR)
+        jobs = store.find(
+            ArchiveJob,
+            And(ArchiveJob.job_type == cls.class_job_type,
+                ArchiveJob.job == Job.id,
+                Job.id.is_in(Job.ready_jobs),
+                ArchiveJob.archive == Archive.id))
+        return (cls(job) for job in jobs)
+
+    def getOopsVars(self):
+        """See `IRunnableJob`."""
+        vars = BaseRunnableJob.getOopsVars(self)
+        vars.extend([
+            ('archive_id', self.context.archive.id),
+            ('archive_job_id', self.context.id),
+            ('archive_job_type', self.context.job_type.title),
+            ])
+        return vars
=== added file 'lib/lp/soyuz/model/copyarchivejob.py'
--- lib/lp/soyuz/model/copyarchivejob.py 1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/model/copyarchivejob.py 2010-07-30 18:33:56 +0000
@@ -0,0 +1,140 @@
+
+__metaclass__ = type
+
+from zope.component import getUtility
+from zope.interface import classProvides, implements
+
+from canonical.launchpad.webapp.interfaces import (
+    DEFAULT_FLAVOR, IStoreSelector, MAIN_STORE)
+
+from lp.registry.interfaces.distroseries import IDistroSeriesSet
+from lp.registry.interfaces.pocket import PackagePublishingPocket
+from lp.services.job.model.job import Job
+from lp.soyuz.adapters.packagelocation import PackageLocation
+from lp.soyuz.interfaces.archive import IArchiveSet
+from lp.soyuz.interfaces.archivejob import (
+    ArchiveJobType, ICopyArchiveJob, ICopyArchiveJobSource)
+from lp.soyuz.interfaces.packagecloner import IPackageCloner
+from lp.soyuz.interfaces.packageset import IPackagesetSet
+from lp.soyuz.interfaces.processor import IProcessorFamilySet
+from lp.soyuz.interfaces.component import IComponentSet
+from lp.soyuz.model.archivejob import ArchiveJob, ArchiveJobDerived
+
+
+class CopyArchiveJob(ArchiveJobDerived):
+
+    implements(ICopyArchiveJob)
+
+    class_job_type = ArchiveJobType.COPY_ARCHIVE
+    classProvides(ICopyArchiveJobSource)
+
+    @classmethod
+    def create(cls, target_archive, source_archive,
+               source_series, source_pocket, target_series, target_pocket,
+               target_component=None, proc_families=None, packagesets=None,
+               merge=False):
+        """See `ICopyArchiveJobSource`."""
+        store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR)
+        job_for_archive = store.find(
+            ArchiveJob,
+            ArchiveJob.archive == target_archive,
+            ArchiveJob.job_type == cls.class_job_type,
+            ArchiveJob.job == Job.id,
+            Job.id.is_in(Job.ready_jobs)
+            ).any()
+
+        if job_for_archive is not None:
+            raise ValueError(
+                "CopyArchiveJob already in progress for %s" % target_archive)
+        else:
+            if proc_families is None:
+                proc_families = []
+            if len(proc_families) > 0 and merge:
+                raise ValueError("Can't specify the architectures for merge.")
+            proc_family_names = [p.name for p in proc_families]
+            if packagesets is None:
+                packagesets = []
+            packageset_names = [p.name for p in packagesets]
+            target_component_id = None
+            if target_component is not None:
+                target_component_id = target_component.id
+            metadata = {
+                'source_archive_id': source_archive.id,
+                'source_distroseries_id': source_series.id,
+                'source_pocket_value': source_pocket.value,
+                'target_distroseries_id': target_series.id,
+                'target_pocket_value': target_pocket.value,
+                'target_component_id': target_component_id,
+                'proc_family_names': proc_family_names,
+                'packageset_names': packageset_names,
+                'merge': merge,
+                }
+            return super(CopyArchiveJob, cls).create(target_archive, metadata)
+
+    def getOopsVars(self):
+        """See `ArchiveJobDerived`."""
+        vars = ArchiveJobDerived.getOopsVars(self)
+        vars.extend([
+            ('source_archive_id', self.metadata['source_archive_id']),
+            ('source_distroseries_id',
+             self.metadata['source_distroseries_id']),
+            ('target_distroseries_id',
+             self.metadata['target_distroseries_id']),
+            ('source_pocket_value', self.metadata['source_pocket_value']),
+            ('target_pocket_value', self.metadata['target_pocket_value']),
+            ('target_component_id', self.metadata['target_component_id']),
+            ('merge', self.metadata['merge']),
+            ])
+        return vars
+
+    def getSourceLocation(self):
+        """Get the PackageLocation for the source."""
+        # TODO: handle things going bye-bye before we get here.
+        source_archive_id = self.metadata['source_archive_id']
+        source_archive = getUtility(IArchiveSet).get(source_archive_id)
+        source_distroseries_id = self.metadata['source_distroseries_id']
+        source_distroseries = getUtility(IDistroSeriesSet).get(
+            source_distroseries_id)
+        source_distribution = source_distroseries.distribution
+        source_pocket_value = self.metadata['source_pocket_value']
+        source_pocket = PackagePublishingPocket.items[source_pocket_value]
+        packageset_names = self.metadata['packageset_names']
+        packagesets = [getUtility(IPackagesetSet).getByName(name)
+                       for name in packageset_names]
+        source_location = PackageLocation(
+            source_archive, source_distribution, source_distroseries,
+            source_pocket, packagesets=packagesets)
+        return source_location
+
+    def getTargetLocation(self):
+        """Get the PackageLocation for the target."""
+        # TODO: handle things going bye-bye before we get here.
+        target_distroseries_id = self.metadata['target_distroseries_id']
+        target_distroseries = getUtility(IDistroSeriesSet).get(
+            target_distroseries_id)
+        target_distribution = target_distroseries.distribution
+        target_pocket_value = self.metadata['target_pocket_value']
+        target_pocket = PackagePublishingPocket.items[target_pocket_value]
+        target_location = PackageLocation(
+            self.archive, target_distribution, target_distroseries,
+            target_pocket)
+        target_component_id = self.metadata['target_component_id']
+        if target_component_id is not None:
+            target_location.component = getUtility(IComponentSet).get(
+                target_component_id)
+        return target_location
+
+    def run(self):
+        """See `IRunnableJob`."""
+        source_location = self.getSourceLocation()
+        target_location = self.getTargetLocation()
+        proc_family_names = self.metadata['proc_family_names']
+        proc_family_set = getUtility(IProcessorFamilySet)
+        proc_families = [proc_family_set.getByName(p)
+                         for p in proc_family_names]
+        package_cloner = getUtility(IPackageCloner)
+        if self.metadata['merge']:
+            package_cloner.mergeCopy(source_location, target_location)
+        else:
+            package_cloner.clonePackages(
+                source_location, target_location, proc_families=proc_families)
=== added file 'lib/lp/soyuz/tests/test_archivejob.py'
--- lib/lp/soyuz/tests/test_archivejob.py 1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/tests/test_archivejob.py 2010-07-30 18:33:56 +0000
@@ -0,0 +1,48 @@
+import unittest
+
+from canonical.testing import DatabaseFunctionalLayer
+
+from lp.soyuz.interfaces.archivejob import ArchiveJobType
+from lp.soyuz.model.archivejob import ArchiveJob, ArchiveJobDerived
+from lp.testing import TestCaseWithFactory
+
+
+class ArchiveJobTestCase(TestCaseWithFactory):
+    """Test case for basic ArchiveJob gubbins."""
+
+    layer = DatabaseFunctionalLayer
+
+    def test_instantiate(self):
+        # ArchiveJob.__init__() instantiates an ArchiveJob instance.
+        archive = self.factory.makeArchive()
+
+        metadata = ('some', 'arbitrary', 'metadata')
+        archive_job = ArchiveJob(
+            archive, ArchiveJobType.COPY_ARCHIVE, metadata)
+
+        self.assertEqual(archive, archive_job.archive)
+        self.assertEqual(ArchiveJobType.COPY_ARCHIVE, archive_job.job_type)
+
+        # When we actually access the ArchiveJob's metadata it gets
+        # deserialized from JSON, so the representation returned by
+        # archive_job.metadata will be different from what we originally
+        # passed in.
+        metadata_expected = [u'some', u'arbitrary', u'metadata']
+        self.assertEqual(metadata_expected, archive_job.metadata)
+
+
+class ArchiveJobDerivedTestCase(TestCaseWithFactory):
+    """Test case for the ArchiveJobDerived class."""
+
+    layer = DatabaseFunctionalLayer
+
+    def test_create_explodes(self):
+        # ArchiveJobDerived.create() will blow up because it needs to be
+        # subclassed to work properly.
+        archive = self.factory.makeArchive()
+        self.assertRaises(
+            AttributeError, ArchiveJobDerived.create, archive)
+
+
+def test_suite():
+    return unittest.TestLoader().loadTestsFromName(__name__)
=== added file 'lib/lp/soyuz/tests/test_copyarchivejob.py'
--- lib/lp/soyuz/tests/test_copyarchivejob.py 1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/tests/test_copyarchivejob.py 2010-07-30 18:33:56 +0000
@@ -0,0 +1,391 @@
+# Copyright 2009 Canonical Ltd. This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+from __future__ import with_statement
+
+__metaclass__ = type
+
+from zope.component import getUtility
+from zope.security.proxy import removeSecurityProxy
+
+from canonical.testing import DatabaseFunctionalLayer
+
+from lp.buildmaster.interfaces.buildbase import BuildStatus
+from lp.registry.interfaces.pocket import PackagePublishingPocket
+from lp.soyuz.adapters.packagelocation import PackageLocation
+from lp.soyuz.interfaces.archive import ArchivePurpose
+from lp.soyuz.interfaces.binarypackagebuild import IBinaryPackageBuildSet
+from lp.soyuz.interfaces.publishing import PackagePublishingStatus
+from lp.soyuz.model.copyarchivejob import CopyArchiveJob
+from lp.soyuz.model.processor import ProcessorFamilySet
+from lp.testing import celebrity_logged_in, TestCaseWithFactory
+
+
+class CopyArchiveJobTests(TestCaseWithFactory):
+    """Tests for CopyArchiveJob."""
+
+    layer = DatabaseFunctionalLayer
+
+    def test_getOopsVars(self):
+        archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        target_distroseries = self.factory.makeDistroSeries()
+        source_pocket = PackagePublishingPocket.RELEASE
+        target_pocket = PackagePublishingPocket.BACKPORTS
+        target_component = self.factory.makeComponent()
+        job = CopyArchiveJob.create(
+            archive, args['source_archive'], args['distroseries'],
+            source_pocket, target_distroseries, target_pocket,
+            target_component=target_component)
+        vars = job.getOopsVars()
+        self.assertIn(('archive_id', archive.id), vars)
+        self.assertIn(('archive_job_id', job.context.id), vars)
+        self.assertIn(('archive_job_type', job.context.job_type.title), vars)
+        self.assertIn(('source_archive_id', args['source_archive'].id), vars)
+        self.assertIn(
+            ('source_distroseries_id', args['distroseries'].id), vars)
+        self.assertIn(
+            ('target_distroseries_id', target_distroseries.id), vars)
+        self.assertIn(('source_pocket_value', source_pocket.value), vars)
+        self.assertIn(('target_pocket_value', target_pocket.value), vars)
+        self.assertIn(
+            ('target_component_id', target_component.id), vars)
+        self.assertIn(('merge', False), vars)
+
+    def makeDummyArgs(self):
+        args = {}
+        distro = self.factory.makeDistribution()
+        args['distroseries'] = self.factory.makeDistroSeries(
+            distribution=distro)
+        args['pocket'] = self.factory.getAnyPocket()
+        args['source_archive'] = self.factory.makeArchive(
+            distribution=distro)
+        return args
+
+    def test_error_if_already_exists(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'])
+        self.assertEqual(1, self._getJobCount())
+        args = self.makeDummyArgs()
+        self.assertRaises(
+            ValueError, CopyArchiveJob.create, target_archive,
+            args['source_archive'], args['distroseries'], args['pocket'],
+            args['distroseries'], args['pocket'])
+
+    def test_create_sets_source_archive_id(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        source_archive = self.factory.makeArchive()
+        job = CopyArchiveJob.create(
+            target_archive, source_archive, args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'])
+        self.assertEqual(
+            source_archive.id, job.metadata['source_archive_id'])
+
+    def test_create_sets_source_series_id(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        source_distroseries = self.factory.makeDistroSeries()
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], source_distroseries,
+            args['pocket'], args['distroseries'], args['pocket'])
+        self.assertEqual(
+            source_distroseries.id, job.metadata['source_distroseries_id'])
+
+    def test_create_sets_source_pocket_value(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        source_pocket = PackagePublishingPocket.RELEASE
+        target_pocket = PackagePublishingPocket.BACKPORTS
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            source_pocket, args['distroseries'], target_pocket)
+        self.assertEqual(
+            source_pocket.value, job.metadata['source_pocket_value'])
+
+    def test_create_sets_target_pocket_value(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        source_pocket = PackagePublishingPocket.RELEASE
+        target_pocket = PackagePublishingPocket.BACKPORTS
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            source_pocket, args['distroseries'], target_pocket)
+        self.assertEqual(
+            target_pocket.value, job.metadata['target_pocket_value'])
+
+    def test_create_sets_target_distroseries_id(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        target_distroseries = self.factory.makeDistroSeries()
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], target_distroseries, args['pocket'])
+        self.assertEqual(
+            target_distroseries.id, job.metadata['target_distroseries_id'])
+
+    def test_create_sets_target_component_id(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        target_component = self.factory.makeComponent()
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'],
+            target_component=target_component)
+        self.assertEqual(
+            target_component.id, job.metadata['target_component_id'])
+
+    def test_create_sets_target_component_id_to_None_if_unspecified(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'])
+        self.assertEqual(None, job.metadata['target_component_id'])
+
+    def test_create_sets_proc_family_ids(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        family1 = self.factory.makeProcessorFamily(name="armel")
+        family2 = self.factory.makeProcessorFamily(name="ia64")
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'],
+            proc_families=[family1, family2])
+        self.assertEqual(
+            [f.name for f in [family1, family2]],
+            job.metadata['proc_family_names'])
+
+    def test_error_on_merge_with_proc_families(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        family1 = self.factory.makeProcessorFamily(name="armel")
+        family2 = self.factory.makeProcessorFamily(name="ia64")
+        self.assertRaises(
+            ValueError, CopyArchiveJob.create, target_archive,
+            args['source_archive'], args['distroseries'], args['pocket'],
+            args['distroseries'], args['pocket'],
+            proc_families=[family1, family2], merge=True)
+
+    def test_create_sets_source_package_set_ids(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        packagesets = [
+            self.factory.makePackageset(),
+            self.factory.makePackageset(),
+            ]
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'],
+            packagesets=packagesets)
+        self.assertEqual(
+            [p.name for p in packagesets], job.metadata['packageset_names'])
+
+    def test_create_sets_merge_False_by_default(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'])
+        self.assertEqual(False, job.metadata['merge'])
+
+    def test_create_sets_merge_True_on_request(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            args['pocket'], args['distroseries'], args['pocket'], merge=True)
+        self.assertEqual(True, job.metadata['merge'])
+
+    def test_get_source_location(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        source_distroseries = self.factory.makeDistroSeries()
+        source_pocket = PackagePublishingPocket.RELEASE
+        target_pocket = PackagePublishingPocket.BACKPORTS
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], source_distroseries,
+            source_pocket, args['distroseries'], target_pocket)
+        location = job.getSourceLocation()
+        expected_location = PackageLocation(
+            args['source_archive'], source_distroseries.distribution,
+            source_distroseries, source_pocket)
+        self.assertEqual(expected_location, location)
+
+    def test_get_source_location_with_packagesets(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        source_distroseries = self.factory.makeDistroSeries()
+        source_pocket = PackagePublishingPocket.RELEASE
+        target_pocket = PackagePublishingPocket.BACKPORTS
+        packagesets = [
+            self.factory.makePackageset(),
+            self.factory.makePackageset(),
+            ]
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], source_distroseries,
+            source_pocket, args['distroseries'], target_pocket,
+            packagesets=packagesets)
+        location = job.getSourceLocation()
+        expected_location = PackageLocation(
+            args['source_archive'], source_distroseries.distribution,
+            source_distroseries, source_pocket, packagesets=packagesets)
+        self.assertEqual(expected_location, location)
+
+    def test_get_target_location(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        target_distroseries = self.factory.makeDistroSeries()
+        source_pocket = PackagePublishingPocket.RELEASE
+        target_pocket = PackagePublishingPocket.BACKPORTS
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            source_pocket, target_distroseries, target_pocket)
+        location = job.getTargetLocation()
+        expected_location = PackageLocation(
+            target_archive, target_distroseries.distribution,
+            target_distroseries, target_pocket)
+        self.assertEqual(expected_location, location)
+
+    def test_get_target_location_with_component(self):
+        target_archive = self.factory.makeArchive()
+        args = self.makeDummyArgs()
+        target_distroseries = self.factory.makeDistroSeries()
+        source_pocket = PackagePublishingPocket.RELEASE
+        target_pocket = PackagePublishingPocket.BACKPORTS
+        target_component = self.factory.makeComponent()
+        job = CopyArchiveJob.create(
+            target_archive, args['source_archive'], args['distroseries'],
+            source_pocket, target_distroseries, target_pocket,
+            target_component=target_component)
+        location = job.getTargetLocation()
+        expected_location = PackageLocation(
+            target_archive, target_distroseries.distribution,
+            target_distroseries, target_pocket)
+        expected_location.component = target_component
+        self.assertEqual(expected_location, location)
+
+    def _getJobs(self):
+        """Return the pending CopyArchiveJobs as a list."""
+        return list(CopyArchiveJob.iterReady())
+
+    def _getJobCount(self):
+        """Return the number of CopyArchiveJobs in the queue."""
+        return len(self._getJobs())
+
+    def makeSourceAndTarget(self):
+        distribution = self.factory.makeDistribution(name="foobuntu")
+        distroseries = self.factory.makeDistroSeries(
+            distribution=distribution, name="maudlin")
+        source_archive_owner = self.factory.makePerson(name="source-owner")
+        source_archive = self.factory.makeArchive(
+            name="source", owner=source_archive_owner,
+            purpose=ArchivePurpose.PPA, distribution=distribution)
+        self.factory.makeSourcePackagePublishingHistory(
+            sourcepackagename=self.factory.getOrMakeSourcePackageName(
+                name='bzr'),
+            distroseries=distroseries, component=self.factory.makeComponent(),
+            version="2.1", architecturehintlist='any',
+            archive=source_archive, status=PackagePublishingStatus.PUBLISHED,
+            pocket=PackagePublishingPocket.RELEASE)
+        das = self.factory.makeDistroArchSeries(
+            distroseries=distroseries, architecturetag="i386",
+            processorfamily=ProcessorFamilySet().getByName("x86"),
+            supports_virtualized=True)
+        with celebrity_logged_in('admin'):
+            distroseries.nominatedarchindep = das
+        target_archive_owner = self.factory.makePerson()
+        target_archive = self.factory.makeArchive(
+            purpose=ArchivePurpose.COPY, owner=target_archive_owner,
+            name="test-copy-archive", distribution=distribution,
+            description="Test copy archive", enabled=False)
+        return source_archive, target_archive, distroseries
+
+    def checkPublishedSources(self, expected, archive, series):
+        # We need to be admin as the archive is disabled at this point.
+        with celebrity_logged_in('admin'):
+            sources = archive.getPublishedSources(
+                distroseries=series,
+                status=(
+                    PackagePublishingStatus.PENDING,
+                    PackagePublishingStatus.PUBLISHED))
+            actual = []
+            for source in sources:
+                actual.append(
+                    (source.source_package_name,
+                     source.source_package_version))
+            self.assertEqual(sorted(expected), sorted(actual))
+
+    def test_run(self):
+        """Test that CopyArchiveJob.run() actually copies the archive.
+
+        We just make a simple test here, and rely on PackageCloner tests
+        to cover the functionality.
+        """
+        source_archive, target_archive, series = self.makeSourceAndTarget()
+        job = CopyArchiveJob.create(
+            target_archive, source_archive, series,
+            PackagePublishingPocket.RELEASE, series,
+            PackagePublishingPocket.RELEASE)
+        job.run()
+        self.checkPublishedSources([("bzr", "2.1")], target_archive, series)
+
+    def test_run_mergeCopy(self):
+        """Test that CopyArchiveJob.run() when merge=True does a mergeCopy."""
+        source_archive, target_archive, series = self.makeSourceAndTarget()
+        # Create the copy archive.
+        job = CopyArchiveJob.create(
+            target_archive, source_archive, series,
+            PackagePublishingPocket.RELEASE, series,
+            PackagePublishingPocket.RELEASE)
+        job.start()
+        job.run()
+        job.complete()
+        # Now the two archives are in the same state, so we change the
+        # source archive and request a merge to check that it works.
+        # Create a new version of the apt package in the source.
+        self.factory.makeSourcePackagePublishingHistory(
+            sourcepackagename=self.factory.getOrMakeSourcePackageName(
+                name='apt'),
+            distroseries=series, component=self.factory.makeComponent(),
+            version="1.2", architecturehintlist='any',
+            archive=source_archive, status=PackagePublishingStatus.PUBLISHED,
+            pocket=PackagePublishingPocket.RELEASE)
+        # Create a job to merge.
+        job = CopyArchiveJob.create(
+            target_archive, source_archive, series,
+            PackagePublishingPocket.RELEASE, series,
+            PackagePublishingPocket.RELEASE, merge=True)
+        job.run()
+        # Check that the new apt package is in the target.
+        self.checkPublishedSources(
+            [("bzr", "2.1"), ("apt", "1.2")], target_archive, series)
+
+    def test_run_with_proc_families(self):
+        """Test that a CopyArchiveJob job with proc_families uses them.
+
+        If we create a CopyArchiveJob with proc_families != None then
+        they should be used when cloning packages.
+        """
+        source_archive, target_archive, series = self.makeSourceAndTarget()
+        proc_families = [ProcessorFamilySet().getByName("x86")]
+        job = CopyArchiveJob.create(
+            target_archive, source_archive, series,
+            PackagePublishingPocket.RELEASE, series,
+            PackagePublishingPocket.RELEASE, proc_families=proc_families)
+        job.run()
+        builds = list(
+            getUtility(IBinaryPackageBuildSet).getBuildsForArchive(
+                target_archive, status=BuildStatus.NEEDSBUILD))
+        actual_builds = list()
+        for build in builds:
+            naked_build = removeSecurityProxy(build)
+            spr = naked_build.source_package_release
+            actual_builds.append(
+                (spr.name, spr.version, naked_build.processor.family.name))
+        # One build for the one package, as we specified one processor
+        # family.
+        self.assertEqual([("bzr", "2.1", "x86")], actual_builds)
