Merge lp:~wgrant/launchpad/distroseries-source-format-selection-part1 into lp:launchpad
- distroseries-source-format-selection-part1
- Merge into devel
Status: Merged
Approved by: Julian Edwards
Approved revision: not available
Merge reported by: William Grant
Merged at revision: not available
Proposed branch: lp:~wgrant/launchpad/distroseries-source-format-selection-part1
Merge into: lp:launchpad
Diff against target: 4589 lines
To merge this branch: bzr merge lp:~wgrant/launchpad/distroseries-source-format-selection-part1
Related bugs:
Reviewer | Review Type | Date Requested | Status
---|---|---|---
Julian Edwards (community) | code | | Approve
Gavin Panella (community) | | | Approve

Review via email: mp+14729@code.launchpad.net
Commit message
Refactor bits and pieces to prepare for Debian source format 3.0 support.
Description of the change
William Grant (wgrant) wrote:
Julian Edwards (julian-edwards) wrote:
Hi William, thanks for making this change. There's some much-needed refactoring and clearing up in nascentupload.
The basic direction is good, I just have a few comments to make for things that need fixing. Gavin will do a more thorough review as we agreed on IRC.
General questions:
* I can't tell from the diff but are the db permissions added for the new table correctly tested in the tests? This usually just means ensuring that the test runs as the correct db user.
* The XXX about the orphaned files is interesting. I don't think we deal with this right now do we? I'm trying to think if it would be bad to reject the upload if we detect orphan files, and I can't think of one. What's your opinion?
Things that need fixing:
* The upload format check is not tested. (The "%s: format '%s' is not permitted in %s." one)
* IDistroSeries methods permitSourcePac
* SourcePackageFo
* SourcePackageFo
would fit with our current style. You can keep the isSourcePackage
* Once that's done, you should remove the __init__ on the SourcePackageFormat model class and make the new utility's add() method initialise the correct fields.
Everything else looks great, thanks!
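Julian's last two fixes describe a common Launchpad pattern: the Storm model class stays a plain declaration with no custom `__init__`, and the utility's `add()` method creates the row and initialises its fields. A rough sketch of that shape, using plain Python stand-ins (a list in place of the Storm store) rather than the real Storm classes:

```python
class SourcePackageFormatSelection:
    """Storm-style model: declarative attributes, no custom __init__."""
    id = None
    distroseries = None
    format = None


class SourcePackageFormatSelectionSet:
    """Utility whose add() initialises the fields, per the review."""

    def __init__(self, store):
        # Stand-in for the Storm master store; the real implementation
        # looks the store up via getUtility(IStoreSelector).
        self.store = store

    def add(self, distroseries, format):
        """Allow the given source package format in the given series."""
        spfs = SourcePackageFormatSelection()
        spfs.distroseries = distroseries
        spfs.format = format
        self.store.append(spfs)  # Storm: store.add(spfs)
        return spfs

    def getBySeriesAndFormat(self, distroseries, format):
        """Return the matching selection, or None."""
        for spfs in self.store:
            if (spfs.distroseries == distroseries
                    and spfs.format == format):
                return spfs
        return None
```

This mirrors the `SourcePackageFormatSelectionSet` that appears in the intermediate diff further down; only the storage mechanics are simplified here.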
Gavin Panella (allenap) wrote:
Hi William,
As Julian has already reviewed the logic in this branch, I'll just do
a style/convention review.
I really have very little to say, and that which I have said is pretty
trivial. It's inline with the diff below.
Running `make lint` generated a lot of warnings. Please can you clean
up those that make sense.
Thanks, Gavin.
> === modified file 'database/
> --- database/
> +++ database/
> @@ -271,6 +271,7 @@
> public.shippingrun = SELECT, INSERT, UPDATE
> public.
> public.
> +public.
> public.
> public.
> public.
> @@ -986,6 +987,7 @@
> public.section = SELECT, INSERT, UPDATE
> public.
> public.
> +public.
This should go after the following line. Wow, how pedantic I am :)
> public.
> public.
> public.
> @@ -1103,6 +1105,7 @@
> public.
> public.processor = SELECT
> public.
> +public.
>
> # Source and Binary packages and builds
> public.
>
> === modified file 'lib/canonical/
> --- lib/canonical/
> +++ lib/canonical/
> @@ -477,9 +477,9 @@
> if fname.endswith(
> return SourcePackageFi
> if fname.endswith(
> - return SourcePackageFi
> + return SourcePackageFi
> if fname.endswith(
> - return SourcePackageFi
> + return SourcePackageFi
>
>
> BINARYPACKAGE_
>
> === modified file 'lib/lp/
> --- lib/lp/
> +++ lib/lp/
> @@ -31,10 +31,13 @@
> parse_tagfile, TagFileParseError)
> from lp.archiveuploa
> prefix_
> - re_valid_pkg_name, re_valid_version, re_issource)
> + re_valid_pkg_name, re_valid_version, re_issource,
> + determine_
> from canonical.encoding import guess as guess_encoding
> from lp.registry.
> +from lp.registry.
William Grant (wgrant) wrote:
> Hi William, thanks for making this change. There's some much-needed
> refactoring and clearing up in nascentupload.
>
> The basic direction is good, I just have a few comments to make for things
> that need fixing. Gavin will do a more thorough review as we agreed on IRC.
>
> General questions:
>
> * I can't tell from the diff but are the db permissions added for the new
> table correctly tested in the tests? This usually just means ensuring that
> the test runs as the correct db user.
I'll have to look at that later. I'm not quite sure.
> * The XXX about the orphaned files is interesting. I don't think we deal
> with this right now do we? I'm trying to think if it would be bad to reject
> the upload if we detect orphan files, and I can't think of one. What's your
> opinion?
I think there is a word or two missing there. You mean you can't think of a reason that it would be bad, or that it wouldn't be bad?
I can't see any compelling reason to reject if it has extra files, apart from it seeming slightly cleaner. If you feel it needs to be done, it's probably easy enough to add.
> Things that need fixing:
>
> * The upload format check is not tested. (The "%s: format '%s' is not
> permitted in %s." one)
Ah, yes, forgot about that. The other half of the branch has 3.0 format test source packages, and tests that. I suppose I should bring one of those in.
> * IDistroSeries methods permitSourcePac
> isSourcePackage
> SourcePackageFo
> security wrapped. Although the upload processor is zopeless, we might need to
> do that one day. Something like
> * SourcePackageFo
> * SourcePackageFo
> would fit with our current style. You can keep the
> isSourcePackage
> convenience, but the permitSourcePac
>
> * Once that's done, you should remove the __init__ on the SourcePackageFormat
> model class and make the new utility's add() method initialise the correct
> fields.
OK, sure. Will fix.
> Everything else looks great, thanks!
Thanks for the review.
William Grant (wgrant) wrote:
Hi Gavin,
Thanks for the review.
> Hi William,
>
> As Julian has already reviewed the logic in this branch, I'll just do
> a style/convention review.
>
> I really have very little to say, and that which I have said is pretty
> trivial. It's inline with the diff below.
>
> Running `make lint` generated a lot of warnings. Please can you clean
> up those that make sense.
I don't think any of those are actually mine, and I felt my diff was big enough. If you think I should clean them up anyway, I will.
> Thanks, Gavin.
>
>
> > === modified file 'database/
> > --- database/
> > +++ database/
> > @@ -271,6 +271,7 @@
> > public.shippingrun = SELECT, INSERT, UPDATE
> > public.
> > public.
> > +public.
> > public.
> > public.
> > public.
> > @@ -986,6 +987,7 @@
> > public.section = SELECT, INSERT, UPDATE
> > public.
> > public.
> > +public.
>
> This should go after the following line. Wow, how pedantic I am :)
Oops -- I renamed the table after I added this line. Fixed.
> > public.
> > public.
> > public.
> > @@ -1103,6 +1105,7 @@
> > public.
> > public.processor = SELECT
> > public.
> > +public.
> >
> > # Source and Binary packages and builds
> > public.
> >
> > === modified file 'lib/canonical/
> > --- lib/canonical/
> > +++ lib/canonical/
> > @@ -477,9 +477,9 @@
> > if fname.endswith(
> > return SourcePackageFi
> > if fname.endswith(
> > - return SourcePackageFi
> > + return SourcePackageFi
> > if fname.endswith(
> > - return SourcePackageFi
> > + return SourcePackageFi
> >
> >
> > BINARYPACKAGE_
> >
> > === modified file 'lib/lp/
> > --- lib/lp/
> > +++ lib/lp/
> > @@ -31,10 +31,13 @@
> > parse_tagfile, TagFileParseError)
> > from lp.archiveuploa
> > prefix_
Gavin Panella (allenap) wrote:
On Wed, 11 Nov 2009 21:53:34 -0000
William Grant <email address hidden> wrote:
> Hi Gavin,
>
> Thanks for the review.
>
> > Hi William,
> >
> > As Julian has already reviewed the logic in this branch, I'll just do
> > a style/convention review.
> >
> > I really have very little to say, and that which I have said is pretty
> > trivial. It's inline with the diff below.
> >
> > Running `make lint` generated a lot of warnings. Please can you clean
> > up those that make sense.
>
> I don't think any of those are actually mine, and I felt my diff was big enough. If you think I should clean them up anyway, I will.
It's not essential, but in general it's good to clean up while moving
through the tree because there's a lot of crud from before we
cared. If you do feel energetic enough to sort some of them out
(anything is better than nothing), then do it in a follow-on branch,
and just ask for a sanity-check review before landing it.
Gavin.
William Grant (wgrant) wrote:
Hi Julian,
On Wed, 2009-11-11 at 21:30 +0000, William Grant wrote:
> > Hi William, thanks for making this change. There's some much-needed
> > refactoring and clearing up in nascentupload.
> >
> > The basic direction is good, I just have a few comments to make for things
> > that need fixing. Gavin will do a more thorough review as we agreed on IRC.
> >
> > General questions:
> >
> > * I can't tell from the diff but are the db permissions added for the new
> > table correctly tested in the tests? This usually just means ensuring that
> > the test runs as the correct db user.
>
> I'll have to look at that later. I'm not quite sure.
I'm not entirely sure how to check this, but I would really hope that
the upload and copy code paths were already executed as the relevant
users at some point in the test suite.
> > * The XXX about the orphaned files is interesting. I don't think we deal
> > with this right now do we? I'm trying to think if it would be bad to reject
> > the upload if we detect orphan files, and I can't think of one. What's your
> > opinion?
>
> I think there is a word or two missing there. You mean you can't think of a reason that it would be bad, or that it wouldn't be bad?
>
> I can't see any compelling reason to reject if it has extra files, apart from it seeming slightly cleaner. If you feel it needs to be done, it's probably easy enough to add.
As discussed on IRC, I've removed the XXX.
> > Things that need fixing:
> >
> > * The upload format check is not tested. (The "%s: format '%s' is not
> > permitted in %s." one)
>
> Ah, yes, forgot about that. The other half of the branch has 3.0 format test source packages, and tests that. I suppose I should bring one of those in.
I've created a 3.0 (quilt) source package and brought across the
relevant test from the second branch.
> > * IDistroSeries methods permitSourcePac
> > isSourcePackage
> > SourcePackageFo
> > security wrapped. Although the upload processor is zopeless, we might need to
> > do that one day. Something like
> > * SourcePackageFo
> > * SourcePackageFo
> > would fit with our current style. You can keep the
> > isSourcePackage
> > convenience, but the permitSourcePac
> >
> > * Once that's done, you should remove the __init__ on the SourcePackageFormat
> > model class and make the new utility's add() method initialise the correct
> > fields.
>
> OK, sure. Will fix.
Fixed.
The intermediate diff is attached.
1 | === modified file 'database/schema/security.cfg' |
2 | --- database/schema/security.cfg 2009-11-10 13:09:26 +0000 |
3 | +++ database/schema/security.cfg 2009-11-11 21:34:02 +0000 |
4 | @@ -987,8 +987,8 @@ |
5 | public.section = SELECT, INSERT, UPDATE |
6 | public.sectionselection = SELECT, INSERT, UPDATE |
7 | public.signedcodeofconduct = SELECT, INSERT, UPDATE |
8 | +public.sourcepackagefilepublishing = SELECT, INSERT, UPDATE |
9 | public.sourcepackageformatselection = SELECT, INSERT |
10 | -public.sourcepackagefilepublishing = SELECT, INSERT, UPDATE |
11 | public.sourcepackagename = SELECT, INSERT, UPDATE |
12 | public.sourcepackagepublishinghistory = SELECT |
13 | public.securesourcepackagepublishinghistory = SELECT, INSERT, UPDATE |
14 | |
15 | === modified file 'lib/lp/archiveuploader/nascentupload.py' |
16 | --- lib/lp/archiveuploader/nascentupload.py 2009-11-10 13:09:26 +0000 |
17 | +++ lib/lp/archiveuploader/nascentupload.py 2009-11-12 11:44:36 +0000 |
18 | @@ -325,9 +325,6 @@ |
19 | |
20 | |
21 | # It is never sane to upload more than one source at a time. |
22 | - # XXX: What about orphaned files? How will that work? |
23 | - # I think we might need to verify that all source files are |
24 | - # claimed by a dsc. |
25 | if dsc > 1: |
26 | self.reject("Changes file lists more than one .dsc") |
27 | |
28 | |
29 | === added directory 'lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt' |
30 | === added file 'lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1.debian.tar.gz' |
31 | Binary files lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1.debian.tar.gz 1970-01-01 00:00:00 +0000 and lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1.debian.tar.gz 2009-11-12 11:51:53 +0000 differ |
32 | === added file 'lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1.dsc' |
33 | --- lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1.dsc 1970-01-01 00:00:00 +0000 |
34 | +++ lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1.dsc 2009-11-12 11:51:53 +0000 |
35 | @@ -0,0 +1,16 @@ |
36 | +Format: 3.0 (quilt) |
37 | +Source: bar |
38 | +Binary: bar |
39 | +Architecture: any |
40 | +Version: 1.0-1 |
41 | +Maintainer: Launchpad team <launchpad@lists.canonical.com> |
42 | +Standards-Version: 3.6.2 |
43 | +Checksums-Sha1: |
44 | + 73a04163fee97fd2257ab266bd48f1d3d528e012 164 bar_1.0.orig.tar.gz |
45 | + abce262314a7c0ca00e43598f21b41a3e6ff6b21 688 bar_1.0-1.debian.tar.gz |
46 | +Checksums-Sha256: |
47 | + f1ecff929899b567f45d6734b69d59a4f3c04dabce3cc8e6ed6d64073eda360e 164 bar_1.0.orig.tar.gz |
48 | + ffdcce60fca14618f68483ca77a206f332a3773dc7ece1c3e6de55c0118c69c6 688 bar_1.0-1.debian.tar.gz |
49 | +Files: |
50 | + fc1464e5985b962a042d5354452f361d 164 bar_1.0.orig.tar.gz |
51 | + 056db4dfe7de8322296b6d417592ee01 688 bar_1.0-1.debian.tar.gz |
52 | |
53 | === added file 'lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1_source.changes' |
54 | --- lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1_source.changes 1970-01-01 00:00:00 +0000 |
55 | +++ lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1_source.changes 2009-11-12 11:51:54 +0000 |
56 | @@ -0,0 +1,28 @@ |
57 | +Format: 1.8 |
58 | +Date: Thu, 16 Feb 2006 15:34:09 +0000 |
59 | +Source: bar |
60 | +Binary: bar |
61 | +Architecture: source |
62 | +Version: 1.0-1 |
63 | +Distribution: breezy |
64 | +Urgency: low |
65 | +Maintainer: Launchpad team <launchpad@lists.canonical.com> |
66 | +Changed-By: Daniel Silverstone <daniel.silverstone@canonical.com> |
67 | +Description: |
68 | + bar - Stuff for testing |
69 | +Changes: |
70 | + bar (1.0-1) breezy; urgency=low |
71 | + . |
72 | + * Initial version |
73 | +Checksums-Sha1: |
74 | + bc97e185cf31af33bf8d109044ce51f32d09c229 645 bar_1.0-1.dsc |
75 | + 73a04163fee97fd2257ab266bd48f1d3d528e012 164 bar_1.0.orig.tar.gz |
76 | + abce262314a7c0ca00e43598f21b41a3e6ff6b21 688 bar_1.0-1.debian.tar.gz |
77 | +Checksums-Sha256: |
78 | + ae0fb16941a95518332a8ee962d00d55963b491c2df94b3f230a65d2bdbeedf8 645 bar_1.0-1.dsc |
79 | + f1ecff929899b567f45d6734b69d59a4f3c04dabce3cc8e6ed6d64073eda360e 164 bar_1.0.orig.tar.gz |
80 | + ffdcce60fca14618f68483ca77a206f332a3773dc7ece1c3e6de55c0118c69c6 688 bar_1.0-1.debian.tar.gz |
81 | +Files: |
82 | + c320d2827f08f09ec2e1bbbac635225c 645 devel optional bar_1.0-1.dsc |
83 | + fc1464e5985b962a042d5354452f361d 164 devel optional bar_1.0.orig.tar.gz |
84 | + 056db4dfe7de8322296b6d417592ee01 688 devel optional bar_1.0-1.debian.tar.gz |
85 | |
86 | === added file 'lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0.orig.tar.gz' |
87 | Binary files lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0.orig.tar.gz 1970-01-01 00:00:00 +0000 and lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0.orig.tar.gz 2009-11-12 11:47:52 +0000 differ |
88 | === modified file 'lib/lp/archiveuploader/tests/test_uploadprocessor.py' |
89 | --- lib/lp/archiveuploader/tests/test_uploadprocessor.py 2009-11-10 13:09:26 +0000 |
90 | +++ lib/lp/archiveuploader/tests/test_uploadprocessor.py 2009-11-12 11:49:00 +0000 |
91 | @@ -49,7 +49,8 @@ |
92 | from lp.soyuz.interfaces.archivepermission import ( |
93 | ArchivePermissionType, IArchivePermissionSet) |
94 | from lp.soyuz.interfaces.component import IComponentSet |
95 | -from lp.soyuz.interfaces.sourcepackageformat import SourcePackageFormat |
96 | +from lp.soyuz.interfaces.sourcepackageformat import ( |
97 | + ISourcePackageFormatSelectionSet, SourcePackageFormat) |
98 | from lp.registry.interfaces.person import IPersonSet |
99 | from lp.registry.interfaces.sourcepackagename import ( |
100 | ISourcePackageNameSet) |
101 | @@ -192,7 +193,9 @@ |
102 | permitted_formats = [SourcePackageFormat.FORMAT_1_0] |
103 | |
104 | for format in permitted_formats: |
105 | - self.breezy.permitSourcePackageFormat(format) |
106 | + if not self.breezy.isSourcePackageFormatPermitted(format): |
107 | + getUtility(ISourcePackageFormatSelectionSet).add( |
108 | + self.breezy, format) |
109 | |
110 | def addMockFile(self, filename, content="anything"): |
111 | """Return a librarian file.""" |
112 | @@ -1404,6 +1407,28 @@ |
113 | ] |
114 | self.assertEmail(contents, recipients=recipients) |
115 | |
116 | + def test30QuiltUploadToUnsupportingSeriesIsRejected(self): |
117 | + """Ensure that uploads to series without format support are rejected. |
118 | + |
119 | + Series can restrict the source formats that they accept. Uploads |
120 | + should be rejected if an unsupported format is uploaded. |
121 | + """ |
122 | + self.setupBreezy() |
123 | + self.layer.txn.commit() |
124 | + self.options.context = 'absolutely-anything' |
125 | + uploadprocessor = UploadProcessor( |
126 | + self.options, self.layer.txn, self.log) |
127 | + |
128 | + # Upload the source. |
129 | + upload_dir = self.queueUpload("bar_1.0-1_3.0-quilt") |
130 | + self.processUpload(uploadprocessor, upload_dir) |
131 | + # Make sure it was rejected. |
132 | + from_addr, to_addrs, raw_msg = stub.test_emails.pop() |
133 | + self.assertTrue( |
134 | + "bar_1.0-1.dsc: format '3.0 (quilt)' is not permitted in " |
135 | + "breezy." in raw_msg, |
136 | + "Source was not rejected properly:\n%s" % raw_msg) |
137 | + |
138 | |
139 | def test_suite(): |
140 | return unittest.TestLoader().loadTestsFromName(__name__) |
141 | |
142 | === modified file 'lib/lp/archiveuploader/tests/test_utils.py' |
143 | --- lib/lp/archiveuploader/tests/test_utils.py 2009-11-10 13:09:26 +0000 |
144 | +++ lib/lp/archiveuploader/tests/test_utils.py 2009-11-11 21:49:51 +0000 |
145 | @@ -24,17 +24,19 @@ |
146 | from lp.archiveuploader.utils import determine_source_file_type |
147 | |
148 | self.assertEquals( |
149 | - determine_source_file_type('foo_1.0-1.dsc'), |
150 | - SourcePackageFileType.DSC) |
151 | - self.assertEquals( |
152 | - determine_source_file_type('foo_1.0-1.diff.gz'), |
153 | - SourcePackageFileType.DIFF) |
154 | - self.assertEquals( |
155 | - determine_source_file_type('foo_1.0.orig.tar.gz'), |
156 | - SourcePackageFileType.ORIG_TARBALL) |
157 | - self.assertEquals( |
158 | - determine_source_file_type('foo_1.0.tar.gz'), |
159 | - SourcePackageFileType.NATIVE_TARBALL) |
160 | + SourcePackageFileType.DSC, |
161 | + determine_source_file_type('foo_1.0-1.dsc')) |
162 | + self.assertEquals( |
163 | + SourcePackageFileType.DIFF, |
164 | + determine_source_file_type('foo_1.0-1.diff.gz')) |
165 | + self.assertEquals( |
166 | + SourcePackageFileType.ORIG_TARBALL, |
167 | + determine_source_file_type('foo_1.0.orig.tar.gz')) |
168 | + self.assertEquals( |
169 | + SourcePackageFileType.NATIVE_TARBALL, |
170 | + determine_source_file_type('foo_1.0.tar.gz')) |
171 | + self.assertEquals(None, determine_source_file_type('foo_1.0')) |
172 | + self.assertEquals(None, determine_source_file_type('foo_1.0.blah.gz')) |
173 | |
174 | def testPrefixMultilineString(self): |
175 | """lp.archiveuploader.utils.prefix_multi_line_string should work""" |
176 | |
177 | === modified file 'lib/lp/archiveuploader/utils.py' |
178 | --- lib/lp/archiveuploader/utils.py 2009-11-10 13:09:26 +0000 |
179 | +++ lib/lp/archiveuploader/utils.py 2009-11-11 21:49:20 +0000 |
180 | @@ -34,12 +34,11 @@ |
181 | |
182 | re_isadeb = re.compile(r"(.+?)_(.+?)_(.+)\.(u?d?deb)$") |
183 | |
184 | +source_file_exts = ['orig.tar.gz', 'diff.gz', 'tar.gz', 'dsc'] |
185 | re_issource = re.compile( |
186 | - r"(.+)_(.+?)\." |
187 | - "(orig\.tar\.gz" |
188 | - "|diff\.gz" |
189 | - "|tar\.gz" |
190 | - "|dsc)$") |
191 | + r"(.+)_(.+?)\.(%s)" % "|".join( |
192 | + re.escape(ext) for ext in source_file_exts)) |
193 | + |
194 | re_is_orig_tar_ext = re.compile(r"^orig.tar.gz$") |
195 | re_is_native_tar_ext = re.compile(r"^tar.gz$") |
196 | |
197 | @@ -78,6 +77,8 @@ |
198 | return SourcePackageFileType.ORIG_TARBALL |
199 | elif re_is_native_tar_ext.match(extension): |
200 | return SourcePackageFileType.NATIVE_TARBALL |
201 | + else: |
202 | + return None |
203 | |
204 | |
205 | def prefix_multi_line_string(str, prefix, include_blank_lines=0): |
206 | |
207 | === modified file 'lib/lp/registry/interfaces/distroseries.py' |
208 | --- lib/lp/registry/interfaces/distroseries.py 2009-11-10 13:09:26 +0000 |
209 | +++ lib/lp/registry/interfaces/distroseries.py 2009-11-12 11:00:33 +0000 |
210 | @@ -205,12 +205,6 @@ |
211 | def newMilestone(name, dateexpected=None, summary=None, code_name=None): |
212 | """Create a new milestone for this DistroSeries.""" |
213 | |
214 | - def permitSourcePackageFormat(format): |
215 | - """Permit a source format to be uploaded to this series. |
216 | - |
217 | - :param format: The SourcePackageFormat to permit. |
218 | - """ |
219 | - |
220 | |
221 | class ISeriesMixin(Interface): |
222 | """Methods & properties shared between distro & product series.""" |
223 | |
224 | === modified file 'lib/lp/registry/model/distroseries.py' |
225 | --- lib/lp/registry/model/distroseries.py 2009-11-10 13:09:26 +0000 |
226 | +++ lib/lp/registry/model/distroseries.py 2009-11-12 11:34:49 +0000 |
227 | @@ -82,7 +82,6 @@ |
228 | from lp.registry.model.sourcepackagename import SourcePackageName |
229 | from lp.soyuz.model.sourcepackagerelease import ( |
230 | SourcePackageRelease) |
231 | -from lp.soyuz.model.sourcepackageformat import SourcePackageFormatSelection |
232 | from lp.blueprints.model.specification import ( |
233 | HasSpecificationsMixin, Specification) |
234 | from lp.translations.model.translationimportqueue import ( |
235 | @@ -119,6 +118,8 @@ |
236 | from canonical.launchpad.webapp.interfaces import ( |
237 | IStoreSelector, MAIN_STORE, NotFoundError, SLAVE_FLAVOR, |
238 | TranslationUnavailable) |
239 | +from lp.soyuz.interfaces.sourcepackageformat import ( |
240 | + ISourcePackageFormatSelectionSet) |
241 | |
242 | |
243 | class SeriesMixin: |
244 | @@ -1744,14 +1745,8 @@ |
245 | return '%s%s' % (self.name, pocketsuffix[pocket]) |
246 | |
247 | def isSourcePackageFormatPermitted(self, format): |
248 | - return Store.of(self).find( |
249 | - SourcePackageFormatSelection, distroseries=self, |
250 | - format=format).count() == 1 |
251 | - |
252 | - def permitSourcePackageFormat(self, format): |
253 | - if not self.isSourcePackageFormatPermitted(format): |
254 | - return Store.of(self).add( |
255 | - SourcePackageFormatSelection(self, format)) |
256 | + return getUtility(ISourcePackageFormatSelectionSet |
257 | + ).getBySeriesAndFormat(self, format) is not None |
258 | |
259 | |
260 | class DistroSeriesSet: |
261 | |
262 | === modified file 'lib/lp/soyuz/configure.zcml' |
263 | --- lib/lp/soyuz/configure.zcml 2009-10-30 06:28:19 +0000 |
264 | +++ lib/lp/soyuz/configure.zcml 2009-11-12 10:49:55 +0000 |
265 | @@ -791,6 +791,28 @@ |
266 | interface="lp.soyuz.interfaces.section.ISectionSet"/> |
267 | </securedutility> |
268 | |
269 | + <!-- SourcePackageFormatSelection --> |
270 | + |
271 | + <class |
272 | + class="lp.soyuz.model.sourcepackageformat.SourcePackageFormatSelection"> |
273 | + <allow |
274 | + interface="lp.soyuz.interfaces.sourcepackageformat.ISourcePackageFormatSelection"/> |
275 | + </class> |
276 | + |
277 | + <!-- SourcePackageFormatSelectionSet --> |
278 | + |
279 | + <class |
280 | + class="lp.soyuz.model.sourcepackageformat.SourcePackageFormatSelectionSet"> |
281 | + <allow |
282 | + interface="lp.soyuz.interfaces.sourcepackageformat.ISourcePackageFormatSelectionSet"/> |
283 | + </class> |
284 | + <securedutility |
285 | + class="lp.soyuz.model.sourcepackageformat.SourcePackageFormatSelectionSet" |
286 | + provides="lp.soyuz.interfaces.sourcepackageformat.ISourcePackageFormatSelectionSet"> |
287 | + <allow |
288 | + interface="lp.soyuz.interfaces.sourcepackageformat.ISourcePackageFormatSelectionSet"/> |
289 | + </securedutility> |
290 | + |
291 | <!-- SourcePackageReleaseFile --> |
292 | |
293 | <class |
294 | |
295 | === modified file 'lib/lp/soyuz/interfaces/sourcepackageformat.py' |
296 | --- lib/lp/soyuz/interfaces/sourcepackageformat.py 2009-11-10 13:09:26 +0000 |
297 | +++ lib/lp/soyuz/interfaces/sourcepackageformat.py 2009-11-12 10:46:22 +0000 |
298 | @@ -8,6 +8,7 @@ |
299 | __all__ = [ |
300 | 'SourcePackageFormat', |
301 | 'ISourcePackageFormatSelection', |
302 | + 'ISourcePackageFormatSelectionSet', |
303 | ] |
304 | |
305 | from zope.interface import Attribute, Interface |
306 | @@ -50,3 +51,14 @@ |
307 | id = Attribute("ID") |
308 | distroseries = Attribute("Target series") |
309 | format = Attribute("Permitted source package format") |
310 | + |
311 | + |
312 | +class ISourcePackageFormatSelectionSet(Interface): |
313 | + """Set manipulation tools for the SourcePackageFormatSelection table.""" |
314 | + |
315 | + def getBySeriesAndFormat(distroseries, format): |
316 | + """Return the ISourcePackageFormatSelection for the given series and |
317 | + format.""" |
318 | + |
319 | + def add(distroseries, format): |
320 | + """Allow the given source package format in the given series.""" |
321 | |
322 | === modified file 'lib/lp/soyuz/model/sourcepackageformat.py' |
323 | --- lib/lp/soyuz/model/sourcepackageformat.py 2009-11-10 13:09:26 +0000 |
324 | +++ lib/lp/soyuz/model/sourcepackageformat.py 2009-11-12 11:02:48 +0000 |
325 | @@ -5,25 +5,26 @@ |
326 | |
327 | __all__ = [ |
328 | 'SourcePackageFormatSelection', |
329 | + 'SourcePackageFormatSelectionSet', |
330 | ] |
331 | |
332 | from storm.locals import Storm, Int, Reference |
333 | +from zope.component import getUtility |
334 | from zope.interface import implements |
335 | |
336 | +from canonical.launchpad.webapp.interfaces import ( |
337 | + IStoreSelector, MAIN_STORE, DEFAULT_FLAVOR, MASTER_FLAVOR) |
338 | from canonical.database.enumcol import DBEnum |
339 | from lp.soyuz.interfaces.sourcepackageformat import ( |
340 | - ISourcePackageFormatSelection, SourcePackageFormat) |
341 | + ISourcePackageFormatSelection, ISourcePackageFormatSelectionSet, |
342 | + SourcePackageFormat) |
343 | + |
344 | |
345 | class SourcePackageFormatSelection(Storm): |
346 | """See ISourcePackageFormatSelection.""" |
347 | |
348 | implements(ISourcePackageFormatSelection) |
349 | |
350 | - def __init__(self, distroseries, format): |
351 | - super(SourcePackageFormatSelection, self).__init__() |
352 | - self.distroseries = distroseries |
353 | - self.format = format |
354 | - |
355 | __storm_table__ = 'sourcepackageformatselection' |
356 | |
357 | id = Int(primary=True) |
358 | @@ -33,3 +34,23 @@ |
359 | |
360 | format = DBEnum(enum=SourcePackageFormat) |
361 | |
362 | + |
363 | +class SourcePackageFormatSelectionSet: |
364 | + """See ISourcePackageFormatSelectionSet.""" |
365 | + |
366 | + implements(ISourcePackageFormatSelectionSet) |
367 | + |
368 | + def getBySeriesAndFormat(self, distroseries, format): |
369 | + """See `ISourcePackageFormatSelection`.""" |
370 | + return getUtility(IStoreSelector).get( |
371 | + MAIN_STORE, DEFAULT_FLAVOR).find( |
372 | + SourcePackageFormatSelection, distroseries=distroseries, |
373 | + format=format).one() |
374 | + |
375 | + def add(self, distroseries, format): |
376 | + """See `ISourcePackageFormatSelection`.""" |
377 | + spfs = SourcePackageFormatSelection() |
378 | + spfs.distroseries = distroseries |
379 | + spfs.format = format |
380 | + return getUtility(IStoreSelector).get(MAIN_STORE, MASTER_FLAVOR).add( |
381 | + spfs) |
382 | |
383 | === modified file 'lib/lp/soyuz/scripts/tests/test_copypackage.py' |
384 | --- lib/lp/soyuz/scripts/tests/test_copypackage.py 2009-11-10 13:09:26 +0000 |
385 | +++ lib/lp/soyuz/scripts/tests/test_copypackage.py 2009-11-12 11:18:42 +0000 |
386 | @@ -37,7 +37,8 @@ |
387 | PackagePublishingStatus, active_publishing_status) |
388 | from lp.soyuz.interfaces.queue import ( |
389 | PackageUploadCustomFormat, PackageUploadStatus) |
390 | -from lp.soyuz.interfaces.sourcepackageformat import SourcePackageFormat |
391 | +from lp.soyuz.interfaces.sourcepackageformat import ( |
392 | + ISourcePackageFormatSelectionSet, SourcePackageFormat) |
393 | from lp.soyuz.model.publishing import ( |
394 | SecureSourcePackagePublishingHistory, |
395 | SecureBinaryPackagePublishingHistory) |
396 | @@ -728,7 +729,8 @@ |
397 | # Get hoary, and configure it to accept 3.0 (quilt) uploads. |
398 | ubuntu = getUtility(IDistributionSet).getByName('ubuntu') |
399 | hoary = ubuntu.getSeries('hoary') |
400 | - hoary.permitSourcePackageFormat(SourcePackageFormat.FORMAT_3_0_QUILT) |
401 | + getUtility(ISourcePackageFormatSelectionSet).add( |
402 | + hoary, SourcePackageFormat.FORMAT_3_0_QUILT) |
403 | |
404 | # Create a 3.0 (quilt) source. |
405 | source = self.test_publisher.getPubSource( |
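The `utils.py` change in the diff above replaces a hand-written alternation with a regex built from a list of extensions. A self-contained sketch of that construction and the resulting `determine_source_file_type` behaviour (type names here are plain strings rather than Launchpad's `SourcePackageFileType` enum, and the trailing `$` anchor is added for illustration):

```python
import re

# Build the source-file regex from the extension list, as in the
# refactored lib/lp/archiveuploader/utils.py.
source_file_exts = ['orig.tar.gz', 'diff.gz', 'tar.gz', 'dsc']
re_issource = re.compile(
    r"(.+)_(.+?)\.(%s)$" % "|".join(
        re.escape(ext) for ext in source_file_exts))


def determine_source_file_type(filename):
    """Map a source filename to its file type, or None if unrecognised."""
    match = re_issource.match(filename)
    if match is None:
        return None
    # Group 3 is the matched extension; the lazy version group plus the
    # alternation order mean 'foo_1.0.orig.tar.gz' is classified as an
    # ORIG_TARBALL rather than a NATIVE_TARBALL.
    return {
        'dsc': 'DSC',
        'diff.gz': 'DIFF',
        'orig.tar.gz': 'ORIG_TARBALL',
        'tar.gz': 'NATIVE_TARBALL',
    }[match.group(3)]
```

The `assertEquals(None, ...)` cases added in `test_utils.py` above correspond to filenames that fail to match at all, which is why the refactored function needs the explicit `else: return None` branch.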
Julian Edwards (julian-edwards) wrote:
Looking good, thanks William. I will land this for you.
Julian Edwards (julian-edwards) wrote:
William, I can't land this, it's failing in deathrow.txt.
William Grant (wgrant) wrote:
Julian, I've fixed the broken tests.
Curtis Hovey (sinzui) wrote:
> Julian, I've fixed the broken tests.
I'll submit the branch.
William Grant (wgrant) wrote:
So, it turns out this needs to land on db-devel. I've merged and resolved conflicts, with no other changes.
Preview Diff
1 | === modified file 'database/sampledata/current-dev.sql' |
2 | --- database/sampledata/current-dev.sql 2009-11-09 13:01:13 +0000 |
3 | +++ database/sampledata/current-dev.sql 2009-11-16 23:27:14 +0000 |
4 | @@ -745,6 +745,15 @@ |
5 | |
6 | |
7 | |
8 | + |
9 | + |
10 | + |
11 | + |
12 | + |
13 | + |
14 | + |
15 | + |
16 | + |
17 | ALTER TABLE account DISABLE TRIGGER ALL; |
18 | |
19 | INSERT INTO account (id, date_created, creation_rationale, status, date_status_set, displayname, openid_identifier, status_comment, old_openid_identifier) VALUES (1, '2005-06-06 08:59:51.591618', 8, 20, '2005-06-06 08:59:51.591618', 'Mark Shuttleworth', 'mark_oid', NULL, '123/mark'); |
20 | @@ -1695,10 +1704,19 @@ |
21 | ALTER TABLE builder ENABLE TRIGGER ALL; |
22 | |
23 | |
24 | +ALTER TABLE buildpackagejob DISABLE TRIGGER ALL; |
25 | + |
26 | +INSERT INTO buildpackagejob (id, job, build) VALUES (1, 1, 8); |
27 | +INSERT INTO buildpackagejob (id, job, build) VALUES (2, 2, 11); |
28 | + |
29 | + |
30 | +ALTER TABLE buildpackagejob ENABLE TRIGGER ALL; |
31 | + |
32 | + |
33 | ALTER TABLE buildqueue DISABLE TRIGGER ALL; |
34 | |
35 | -INSERT INTO buildqueue (id, build, builder, logtail, created, buildstart, lastscore, manual) VALUES (1, 8, 1, 'Dummy sampledata entry, not processing', '2005-06-15 09:14:12.820778', '2005-06-15 09:20:12.820778', 1, false); |
36 | -INSERT INTO buildqueue (id, build, builder, logtail, created, buildstart, lastscore, manual) VALUES (2, 11, NULL, NULL, '2005-06-15 10:14:12.820778', NULL, 10, false); |
37 | +INSERT INTO buildqueue (id, builder, logtail, lastscore, manual, job, job_type) VALUES (1, 1, 'Dummy sampledata entry, not processing', 1, false, 1, 1); |
38 | +INSERT INTO buildqueue (id, builder, logtail, lastscore, manual, job, job_type) VALUES (2, NULL, NULL, 10, false, 2, 1); |
39 | |
40 | |
41 | ALTER TABLE buildqueue ENABLE TRIGGER ALL; |
42 | @@ -2781,6 +2799,8 @@ |
43 | |
44 | ALTER TABLE job DISABLE TRIGGER ALL; |
45 | |
46 | +INSERT INTO job (id, requester, reason, status, progress, last_report_seen, next_report_due, attempt_count, max_retries, log, scheduled_start, lease_expires, date_created, date_started, date_finished) VALUES (1, NULL, NULL, 0, NULL, NULL, NULL, 0, 0, NULL, NULL, NULL, '2005-06-15 09:14:12.820778', '2005-06-15 09:20:12.820778', NULL); |
47 | +INSERT INTO job (id, requester, reason, status, progress, last_report_seen, next_report_due, attempt_count, max_retries, log, scheduled_start, lease_expires, date_created, date_started, date_finished) VALUES (2, NULL, NULL, 0, NULL, NULL, NULL, 0, 0, NULL, NULL, NULL, '2005-06-15 10:14:12.820778', NULL, NULL); |
48 | |
49 | |
50 | ALTER TABLE job ENABLE TRIGGER ALL; |
51 | @@ -4615,6 +4635,13 @@ |
52 | ALTER TABLE packageset ENABLE TRIGGER ALL; |
53 | |
54 | |
55 | +ALTER TABLE packagesetgroup DISABLE TRIGGER ALL; |
56 | + |
57 | + |
58 | + |
59 | +ALTER TABLE packagesetgroup ENABLE TRIGGER ALL; |
60 | + |
61 | + |
62 | ALTER TABLE packagesetinclusion DISABLE TRIGGER ALL; |
63 | |
64 | |
65 | @@ -8569,6 +8596,26 @@ |
66 | ALTER TABLE signedcodeofconduct ENABLE TRIGGER ALL; |
67 | |
68 | |
69 | +ALTER TABLE sourcepackageformatselection DISABLE TRIGGER ALL; |
70 | + |
71 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (1, 1, 0); |
72 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (2, 2, 0); |
73 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (3, 3, 0); |
74 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (4, 4, 0); |
75 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (5, 5, 0); |
76 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (6, 6, 0); |
77 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (7, 7, 0); |
78 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (8, 8, 0); |
79 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (9, 9, 0); |
80 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (10, 10, 0); |
81 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (11, 11, 0); |
82 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (12, 12, 0); |
83 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (13, 13, 0); |
84 | + |
85 | + |
86 | +ALTER TABLE sourcepackageformatselection ENABLE TRIGGER ALL; |
87 | + |
88 | + |
89 | ALTER TABLE sourcepackagename DISABLE TRIGGER ALL; |
90 | |
91 | INSERT INTO sourcepackagename (id, name) VALUES (1, 'mozilla-firefox'); |
92 | @@ -8596,17 +8643,17 @@ |
93 | ALTER TABLE sourcepackagerelease DISABLE TRIGGER ALL; |
94 | |
95 | INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (14, 1, '0.9', '2004-09-27 11:57:13', 1, NULL, 1, 'Mozilla dummy Changelog......', 'gcc-3.4-base, libc6 (>= 2.3.2.ds1-4), gcc-3.4 (>= 3.4.1-4sarge1), gcc-3.4 (<< 3.4.2), libstdc++6-dev (>= 3.4.1-4sarge1), pmount', 'bacula-common (= 1.34.6-2), bacula-director-common (= 1.34.6-2), postgresql-client (>= 7.4), pmount', 'any', NULL, 1, 1, 1, 1, 1, 'Mark Shuttleworth <mark@canonical.com>', '3.6.2', '1.0', 'mozilla-firefox', 1, NULL, 'gcc-4.0, pmount', 'gcc-4.0-base, pmount'); |
96 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (15, 1, '1.0', '2004-09-27 11:57:13', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 2, 1, 9, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
97 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (16, 1, '1.0-1', '2005-03-10 16:30:00', 1, NULL, 1, NULL, NULL, NULL, 'any', NULL, 3, 1, 10, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
98 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (17, 1, '0.99.6-1', '2005-03-14 18:00:00', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 2, 1, 10, 1, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
99 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (15, 1, '1.0', '2004-09-27 11:57:13', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 2, 1, 9, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
100 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (16, 1, '1.0-1', '2005-03-10 16:30:00', 1, NULL, 1, NULL, NULL, NULL, 'any', NULL, 3, 1, 10, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
101 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (17, 1, '0.99.6-1', '2005-03-14 18:00:00', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 2, 1, 10, 1, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
102 | INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (20, 1, '0.1-1', '2005-03-24 20:59:31.439579', 1, NULL, 1, 'pmount (0.1-1) hoary; urgency=low |
103 | |
104 | * Fix description (Malone #1) |
105 | * Fix debian (Debian #2000) |
106 | * Fix warty (Warty Ubuntu #1) |
107 | |
108 | - -- Sample Person <test@canonical.com> Tue, 7 Feb 2006 12:10:08 +0300', NULL, NULL, 'all', NULL, 2, 1, 14, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
109 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (21, 1, '0.1-2', '2005-06-24 20:59:31.439579', 1, NULL, 1, 'This is a placeholder changelog for pmount 0.1-2', NULL, NULL, 'powerpc', NULL, 1, 1, 14, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
110 | + -- Sample Person <test@canonical.com> Tue, 7 Feb 2006 12:10:08 +0300', NULL, NULL, 'all', NULL, 2, 1, 14, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
111 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (21, 1, '0.1-2', '2005-06-24 20:59:31.439579', 1, NULL, 1, 'This is a placeholder changelog for pmount 0.1-2', NULL, NULL, 'powerpc', NULL, 1, 1, 14, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
112 | INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (23, 1, '1.0.8-1ubuntu1', '2005-02-03 08:50:00', 1, NULL, 1, 'alsa-utils (1.0.8-1ubuntu1) warty; urgency=low |
113 | |
114 | * Placeholder |
115 | @@ -8616,7 +8663,7 @@ |
116 | |
117 | * Placeholder |
118 | |
119 | - -- Sample Person <test@canonical.com> Tue, 7 Feb 2006 12:10:08 +0300', NULL, NULL, 'any', NULL, 2, 1, 19, 8, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
120 | + -- Sample Person <test@canonical.com> Tue, 7 Feb 2006 12:10:08 +0300', NULL, NULL, 'any', NULL, 2, 1, 19, 8, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
121 | INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (25, 1, '1.0.9a-4ubuntu1', '2005-08-01 14:10:00', 1, NULL, 1, 'alsa-utils (1.0.9a-4ubuntu1) hoary; urgency=low |
122 | |
123 | * Placeholder |
124 | @@ -8626,21 +8673,21 @@ |
125 | LP: #7, #8, |
126 | #11 |
127 | |
128 | - -- Sample Person <test@canonical.com> Tue, 7 Feb 2006 12:10:08 +0300', NULL, NULL, 'all', NULL, 1, 16, 19, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
129 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (26, 1, 'cr.g7-37', '2005-12-22 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 1, 16, 20, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
130 | + -- Sample Person <test@canonical.com> Tue, 7 Feb 2006 12:10:08 +0300', NULL, NULL, 'all', NULL, 1, 16, 19, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
131 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (26, 1, 'cr.g7-37', '2005-12-22 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 1, 16, 20, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
132 | INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (27, 1, 'b8p', '2006-02-10 11:19:00', 1, NULL, 1, 'libstdc++ (9.9-1) hoary; urgency=high |
133 | |
134 | * Placeholder |
135 | |
136 | - -- Sample Person <test@canonical.com> Tue, 10 Feb 2006 10:10:08 +0300', NULL, NULL, 'powerpc i386', NULL, 1, 16, 21, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
137 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (28, 1, '2.6.15.3', '2005-12-22 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'any', NULL, 1, 16, 22, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
138 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (29, 1, '0.00', '2005-12-22 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 17, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
139 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (30, 1, '1.0', '2006-09-28 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 20, 10, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
140 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (31, 1, '1.0', '2006-09-28 18:19:01', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 20, 10, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
141 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (32, 1, '1.0', '2006-12-01 13:19:01', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 23, 10, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
142 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (33, 1, '1.0', '2006-12-01 13:19:01', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 24, 10, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
143 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (34, 1, '1.0', '2007-02-15 14:19:01', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 29, 16, 25, 10, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
144 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (35, 1, '1.0', '2006-04-11 11:19:01', 1, NULL, 1, NULL, NULL, NULL, 'any', NULL, 1, 16, 26, 1, 1, NULL, NULL, NULL, NULL, 10, NULL, NULL, NULL); |
145 | + -- Sample Person <test@canonical.com> Tue, 10 Feb 2006 10:10:08 +0300', NULL, NULL, 'powerpc i386', NULL, 1, 16, 21, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
146 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (28, 1, '2.6.15.3', '2005-12-22 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'any', NULL, 1, 16, 22, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
147 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (29, 1, '0.00', '2005-12-22 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 17, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
148 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (30, 1, '1.0', '2006-09-28 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 20, 10, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
149 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (31, 1, '1.0', '2006-09-28 18:19:01', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 20, 10, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
150 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (32, 1, '1.0', '2006-12-01 13:19:01', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 23, 10, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
151 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (33, 1, '1.0', '2006-12-01 13:19:01', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 24, 10, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
152 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (34, 1, '1.0', '2007-02-15 14:19:01', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 29, 16, 25, 10, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
153 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (35, 1, '1.0', '2006-04-11 11:19:01', 1, NULL, 1, NULL, NULL, NULL, 'any', NULL, 1, 16, 26, 1, 1, NULL, NULL, '1.0', NULL, 10, NULL, NULL, NULL); |
154 | INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (36, 243606, '1.0-1', '2007-08-09 21:25:37.832976', 1, NULL, 5, 'commercialpackage (1.0-1) breezy; urgency=low |
155 | |
156 | * Initial version |
157 | @@ -8666,7 +8713,7 @@ |
158 | 3F4bEPeRcnUjCFI/hjR0kxg= |
159 | =Tjln |
160 | ', 7, 243606, 27, 10, 1, 'Julian Edwards <launchpad@julian-edwards.com>', '3.6.2', '1.0', 'commercialpackage', 12, NULL, NULL, NULL); |
161 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (37, 1, '1.0', '2006-04-11 11:19:01', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 1, 16, 26, 1, 1, NULL, NULL, NULL, NULL, 11, NULL, NULL, NULL); |
162 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (37, 1, '1.0', '2006-04-11 11:19:01', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 1, 16, 26, 1, 1, NULL, NULL, '1.0', NULL, 11, NULL, NULL, NULL); |
163 | |
164 | |
165 | ALTER TABLE sourcepackagerelease ENABLE TRIGGER ALL; |
166 | |
167 | === modified file 'database/sampledata/current.sql' |
168 | --- database/sampledata/current.sql 2009-11-05 10:51:36 +0000 |
169 | +++ database/sampledata/current.sql 2009-11-16 23:27:14 +0000 |
170 | @@ -745,6 +745,15 @@ |
171 | |
172 | |
173 | |
174 | + |
175 | + |
176 | + |
177 | + |
178 | + |
179 | + |
180 | + |
181 | + |
182 | + |
183 | ALTER TABLE account DISABLE TRIGGER ALL; |
184 | |
185 | INSERT INTO account (id, date_created, creation_rationale, status, date_status_set, displayname, openid_identifier, status_comment, old_openid_identifier) VALUES (11, '2005-06-06 08:59:51.591618', 8, 20, '2005-06-06 08:59:51.591618', 'Mark Shuttleworth', 'mark_oid', NULL, '123/mark'); |
186 | @@ -949,20 +958,20 @@ |
187 | |
188 | ALTER TABLE archive DISABLE TRIGGER ALL; |
189 | |
190 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (1, 17, NULL, true, NULL, 1, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:12.241774', 15, 1, 8, 5, 1, '2008-09-23 17:29:03.442606', NULL, NULL, NULL, 'Primary Archive for Ubuntu Linux', 0); |
191 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (2, 1, NULL, true, NULL, 2, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:15.863812', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.445921', NULL, NULL, NULL, 'Primary Archive for Redhat Advanced Server', 0); |
192 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (3, 1, NULL, true, NULL, 3, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:15.864941', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.446557', NULL, NULL, NULL, 'Primary Archive for Debian GNU/Linux', 0); |
193 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (4, 1, NULL, true, NULL, 4, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:15.865502', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.44689', NULL, NULL, NULL, 'Primary Archive for The Gentoo Linux', 0); |
194 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (5, 1, NULL, true, NULL, 5, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:15.866015', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.447202', NULL, NULL, NULL, 'Primary Archive for Kubuntu - Free KDE-based Linux', 0); |
195 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (7, 4, NULL, true, NULL, 7, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:15.866529', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.447515', NULL, NULL, NULL, 'Primary Archive for GuadaLinex: Linux for Andalucia', 0); |
196 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (8, 17, NULL, true, NULL, 8, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:15.867154', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.447851', NULL, NULL, NULL, 'Primary Archive for Ubuntu Test', 0); |
197 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (9, 28, 'packages to help my friends.', true, 1024, 1, 2, false, 3, 3, NULL, NULL, NULL, true, 'ppa', true, '2008-05-27 18:15:15.867684', 4, 0, 3, 1, 0, '2008-09-23 17:29:03.448178', NULL, NULL, NULL, 'PPA for Celso Providelo', 0); |
198 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (10, 1, 'packages to help the humanity (you know, ubuntu)', true, 1024, 1, 2, false, 1, 1, NULL, NULL, NULL, true, 'ppa', true, '2008-05-27 18:15:15.868202', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.448488', NULL, NULL, NULL, 'PPA for Mark Shuttleworth', 0); |
199 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (11, 52, 'I am not allowed to say, I have no privs.', true, 1024, 1, 2, false, 0, 0, NULL, NULL, NULL, true, 'ppa', true, '2008-05-27 18:15:15.868709', 1, 0, 0, 1, 0, '2008-09-23 17:29:03.448797', NULL, NULL, NULL, 'PPA for No Privileges Person', 0); |
200 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (12, 17, 'Partner archive', true, NULL, 1, 4, false, NULL, NULL, NULL, NULL, NULL, false, 'partner', true, '2008-05-27 18:15:15.869209', 1, 0, 1, 0, 0, '2008-09-23 17:29:03.449157', NULL, NULL, NULL, 'Partner Archive for Ubuntu Linux', 0); |
201 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (13, 17, 'Partner archive', true, NULL, 8, 4, false, NULL, NULL, NULL, NULL, NULL, false, 'partner', true, '2008-05-27 18:15:15.869732', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.449471', NULL, NULL, NULL, 'Partner Archive for Ubuntu Test', 0); |
202 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (14, 17, 'Sample copy archive', true, NULL, 8, 6, false, NULL, NULL, NULL, NULL, NULL, false, 'samplecopyarchive', false, '2008-11-19 18:15:15.869732', 0, 0, 0, 0, 0, '2008-11-18 17:29:03.449471', NULL, NULL, NULL, 'Copy archive samplecopyarchive for Ubuntu Team', 0); |
203 | -INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score) VALUES (15, 17, 'Debug archive', true, NULL, 1, 7, false, NULL, NULL, NULL, NULL, NULL, false, 'debug', true, '2009-04-17 10:09:10.859746', 0, 0, 0, 0, 0, '2009-04-17 10:01:03.449876', NULL, NULL, NULL, 'Ubuntu DEBUG archive', 0); |
204 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (1, 17, NULL, true, NULL, 1, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:12.241774', 15, 1, 8, 5, 1, '2008-09-23 17:29:03.442606', NULL, NULL, NULL, 'Primary Archive for Ubuntu Linux', 0, NULL); |
205 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (2, 1, NULL, true, NULL, 2, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:15.863812', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.445921', NULL, NULL, NULL, 'Primary Archive for Redhat Advanced Server', 0, NULL); |
206 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (3, 1, NULL, true, NULL, 3, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:15.864941', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.446557', NULL, NULL, NULL, 'Primary Archive for Debian GNU/Linux', 0, NULL); |
207 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (4, 1, NULL, true, NULL, 4, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:15.865502', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.44689', NULL, NULL, NULL, 'Primary Archive for The Gentoo Linux', 0, NULL); |
208 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (5, 1, NULL, true, NULL, 5, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:15.866015', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.447202', NULL, NULL, NULL, 'Primary Archive for Kubuntu - Free KDE-based Linux', 0, NULL); |
209 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (7, 4, NULL, true, NULL, 7, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:15.866529', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.447515', NULL, NULL, NULL, 'Primary Archive for GuadaLinex: Linux for Andalucia', 0, NULL); |
210 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (8, 17, NULL, true, NULL, 8, 1, false, NULL, NULL, NULL, NULL, NULL, false, 'primary', true, '2008-05-27 18:15:15.867154', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.447851', NULL, NULL, NULL, 'Primary Archive for Ubuntu Test', 0, NULL); |
211 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (9, 28, 'packages to help my friends.', true, 1024, 1, 2, false, 3, 3, NULL, NULL, NULL, true, 'ppa', true, '2008-05-27 18:15:15.867684', 4, 0, 3, 1, 0, '2008-09-23 17:29:03.448178', NULL, NULL, NULL, 'PPA for Celso Providelo', 0, NULL); |
212 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (10, 1, 'packages to help the humanity (you know, ubuntu)', true, 1024, 1, 2, false, 1, 1, NULL, NULL, NULL, true, 'ppa', true, '2008-05-27 18:15:15.868202', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.448488', NULL, NULL, NULL, 'PPA for Mark Shuttleworth', 0, NULL); |
213 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (11, 52, 'I am not allowed to say, I have no privs.', true, 1024, 1, 2, false, 0, 0, NULL, NULL, NULL, true, 'ppa', true, '2008-05-27 18:15:15.868709', 1, 0, 0, 1, 0, '2008-09-23 17:29:03.448797', NULL, NULL, NULL, 'PPA for No Privileges Person', 0, NULL); |
214 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (12, 17, 'Partner archive', true, NULL, 1, 4, false, NULL, NULL, NULL, NULL, NULL, false, 'partner', true, '2008-05-27 18:15:15.869209', 1, 0, 1, 0, 0, '2008-09-23 17:29:03.449157', NULL, NULL, NULL, 'Partner Archive for Ubuntu Linux', 0, NULL); |
215 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (13, 17, 'Partner archive', true, NULL, 8, 4, false, NULL, NULL, NULL, NULL, NULL, false, 'partner', true, '2008-05-27 18:15:15.869732', 0, 0, 0, 0, 0, '2008-09-23 17:29:03.449471', NULL, NULL, NULL, 'Partner Archive for Ubuntu Test', 0, NULL); |
216 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (14, 17, 'Sample copy archive', true, NULL, 8, 6, false, NULL, NULL, NULL, NULL, NULL, false, 'samplecopyarchive', false, '2008-11-19 18:15:15.869732', 0, 0, 0, 0, 0, '2008-11-18 17:29:03.449471', NULL, NULL, NULL, 'Copy archive samplecopyarchive for Ubuntu Team', 0, NULL); |
217 | +INSERT INTO archive (id, owner, description, enabled, authorized_size, distribution, purpose, private, sources_cached, binaries_cached, package_description_cache, fti, buildd_secret, require_virtualized, name, publish, date_updated, total_count, pending_count, succeeded_count, failed_count, building_count, date_created, signing_key, removed_binary_retention_days, num_old_versions_published, displayname, relative_build_score, external_dependencies) VALUES (15, 17, 'Debug archive', true, NULL, 1, 7, false, NULL, NULL, NULL, NULL, NULL, false, 'debug', true, '2009-04-17 10:09:10.859746', 0, 0, 0, 0, 0, '2009-04-17 10:01:03.449876', NULL, NULL, NULL, 'Ubuntu DEBUG archive', 0, NULL); |
218 | |
219 | |
220 | ALTER TABLE archive ENABLE TRIGGER ALL; |
221 | @@ -1677,10 +1686,19 @@ |
222 | ALTER TABLE builder ENABLE TRIGGER ALL; |
223 | |
224 | |
225 | +ALTER TABLE buildpackagejob DISABLE TRIGGER ALL; |
226 | + |
227 | +INSERT INTO buildpackagejob (id, job, build) VALUES (1, 1, 8); |
228 | +INSERT INTO buildpackagejob (id, job, build) VALUES (2, 2, 11); |
229 | + |
230 | + |
231 | +ALTER TABLE buildpackagejob ENABLE TRIGGER ALL; |
232 | + |
233 | + |
234 | ALTER TABLE buildqueue DISABLE TRIGGER ALL; |
235 | |
236 | -INSERT INTO buildqueue (id, build, builder, logtail, created, buildstart, lastscore, manual) VALUES (1, 8, 1, 'Dummy sampledata entry, not processing', '2005-06-15 09:14:12.820778', '2005-06-15 09:20:12.820778', 1, false); |
237 | -INSERT INTO buildqueue (id, build, builder, logtail, created, buildstart, lastscore, manual) VALUES (2, 11, NULL, NULL, '2005-06-15 10:14:12.820778', NULL, 10, false); |
238 | +INSERT INTO buildqueue (id, builder, logtail, lastscore, manual, job, job_type) VALUES (1, 1, 'Dummy sampledata entry, not processing', 1, false, 1, 1); |
239 | +INSERT INTO buildqueue (id, builder, logtail, lastscore, manual, job, job_type) VALUES (2, NULL, NULL, 10, false, 2, 1); |
240 | |
241 | |
242 | ALTER TABLE buildqueue ENABLE TRIGGER ALL; |
243 | @@ -2753,6 +2771,8 @@ |
244 | |
245 | ALTER TABLE job DISABLE TRIGGER ALL; |
246 | |
247 | +INSERT INTO job (id, requester, reason, status, progress, last_report_seen, next_report_due, attempt_count, max_retries, log, scheduled_start, lease_expires, date_created, date_started, date_finished) VALUES (1, NULL, NULL, 0, NULL, NULL, NULL, 0, 0, NULL, NULL, NULL, '2005-06-15 09:14:12.820778', '2005-06-15 09:20:12.820778', NULL); |
248 | +INSERT INTO job (id, requester, reason, status, progress, last_report_seen, next_report_due, attempt_count, max_retries, log, scheduled_start, lease_expires, date_created, date_started, date_finished) VALUES (2, NULL, NULL, 0, NULL, NULL, NULL, 0, 0, NULL, NULL, NULL, '2005-06-15 10:14:12.820778', NULL, NULL); |
249 | |
250 | |
251 | ALTER TABLE job ENABLE TRIGGER ALL; |
252 | @@ -4579,6 +4599,13 @@ |
253 | ALTER TABLE packageset ENABLE TRIGGER ALL; |
254 | |
255 | |
256 | +ALTER TABLE packagesetgroup DISABLE TRIGGER ALL; |
257 | + |
258 | + |
259 | + |
260 | +ALTER TABLE packagesetgroup ENABLE TRIGGER ALL; |
261 | + |
262 | + |
263 | ALTER TABLE packagesetinclusion DISABLE TRIGGER ALL; |
264 | |
265 | |
266 | @@ -8500,6 +8527,26 @@ |
267 | ALTER TABLE signedcodeofconduct ENABLE TRIGGER ALL; |
268 | |
269 | |
270 | +ALTER TABLE sourcepackageformatselection DISABLE TRIGGER ALL; |
271 | + |
272 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (1, 1, 0); |
273 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (2, 2, 0); |
274 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (3, 3, 0); |
275 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (4, 4, 0); |
276 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (5, 5, 0); |
277 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (6, 6, 0); |
278 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (7, 7, 0); |
279 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (8, 8, 0); |
280 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (9, 9, 0); |
281 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (10, 10, 0); |
282 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (11, 11, 0); |
283 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (12, 12, 0); |
284 | +INSERT INTO sourcepackageformatselection (id, distroseries, format) VALUES (13, 13, 0); |
285 | + |
286 | + |
287 | +ALTER TABLE sourcepackageformatselection ENABLE TRIGGER ALL; |
288 | + |
289 | + |
290 | ALTER TABLE sourcepackagename DISABLE TRIGGER ALL; |
291 | |
292 | INSERT INTO sourcepackagename (id, name) VALUES (1, 'mozilla-firefox'); |
293 | @@ -8527,17 +8574,17 @@ |
294 | ALTER TABLE sourcepackagerelease DISABLE TRIGGER ALL; |
295 | |
296 | INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (14, 1, '0.9', '2004-09-27 11:57:13', 1, NULL, 1, 'Mozilla dummy Changelog......', 'gcc-3.4-base, libc6 (>= 2.3.2.ds1-4), gcc-3.4 (>= 3.4.1-4sarge1), gcc-3.4 (<< 3.4.2), libstdc++6-dev (>= 3.4.1-4sarge1), pmount', 'bacula-common (= 1.34.6-2), bacula-director-common (= 1.34.6-2), postgresql-client (>= 7.4), pmount', 'any', NULL, 1, 1, 1, 1, 1, 'Mark Shuttleworth <mark@canonical.com>', '3.6.2', '1.0', 'mozilla-firefox', 1, NULL, 'gcc-4.0, pmount', 'gcc-4.0-base, pmount'); |
297 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (15, 1, '1.0', '2004-09-27 11:57:13', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 2, 1, 9, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
298 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (16, 1, '1.0-1', '2005-03-10 16:30:00', 1, NULL, 1, NULL, NULL, NULL, 'any', NULL, 3, 1, 10, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
299 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (17, 1, '0.99.6-1', '2005-03-14 18:00:00', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 2, 1, 10, 1, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
300 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (15, 1, '1.0', '2004-09-27 11:57:13', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 2, 1, 9, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
301 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (16, 1, '1.0-1', '2005-03-10 16:30:00', 1, NULL, 1, NULL, NULL, NULL, 'any', NULL, 3, 1, 10, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
302 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (17, 1, '0.99.6-1', '2005-03-14 18:00:00', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 2, 1, 10, 1, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
303 | INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (20, 1, '0.1-1', '2005-03-24 20:59:31.439579', 1, NULL, 1, 'pmount (0.1-1) hoary; urgency=low |
304 | |
305 | * Fix description (Malone #1) |
306 | * Fix debian (Debian #2000) |
307 | * Fix warty (Warty Ubuntu #1) |
308 | |
309 | - -- Sample Person <test@canonical.com> Tue, 7 Feb 2006 12:10:08 +0300', NULL, NULL, 'all', NULL, 2, 1, 14, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
310 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (21, 1, '0.1-2', '2005-06-24 20:59:31.439579', 1, NULL, 1, 'This is a placeholder changelog for pmount 0.1-2', NULL, NULL, 'powerpc', NULL, 1, 1, 14, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
311 | + -- Sample Person <test@canonical.com> Tue, 7 Feb 2006 12:10:08 +0300', NULL, NULL, 'all', NULL, 2, 1, 14, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
312 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (21, 1, '0.1-2', '2005-06-24 20:59:31.439579', 1, NULL, 1, 'This is a placeholder changelog for pmount 0.1-2', NULL, NULL, 'powerpc', NULL, 1, 1, 14, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
313 | INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (23, 1, '1.0.8-1ubuntu1', '2005-02-03 08:50:00', 1, NULL, 1, 'alsa-utils (1.0.8-1ubuntu1) warty; urgency=low |
314 | |
315 | * Placeholder |
316 | @@ -8547,7 +8594,7 @@ |
317 | |
318 | * Placeholder |
319 | |
320 | - -- Sample Person <test@canonical.com> Tue, 7 Feb 2006 12:10:08 +0300', NULL, NULL, 'any', NULL, 2, 1, 19, 8, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
321 | + -- Sample Person <test@canonical.com> Tue, 7 Feb 2006 12:10:08 +0300', NULL, NULL, 'any', NULL, 2, 1, 19, 8, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
322 | INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (25, 1, '1.0.9a-4ubuntu1', '2005-08-01 14:10:00', 1, NULL, 1, 'alsa-utils (1.0.9a-4ubuntu1) hoary; urgency=low |
323 | |
324 | * Placeholder |
325 | @@ -8557,21 +8604,21 @@ |
326 | LP: #7, #8, |
327 | #11 |
328 | |
329 | - -- Sample Person <test@canonical.com> Tue, 7 Feb 2006 12:10:08 +0300', NULL, NULL, 'all', NULL, 1, 16, 19, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
330 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (26, 1, 'cr.g7-37', '2005-12-22 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 1, 16, 20, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
331 | + -- Sample Person <test@canonical.com> Tue, 7 Feb 2006 12:10:08 +0300', NULL, NULL, 'all', NULL, 1, 16, 19, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
332 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (26, 1, 'cr.g7-37', '2005-12-22 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 1, 16, 20, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
333 | INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (27, 1, 'b8p', '2006-02-10 11:19:00', 1, NULL, 1, 'libstdc++ (9.9-1) hoary; urgency=high |
334 | |
335 | * Placeholder |
336 | |
337 | - -- Sample Person <test@canonical.com> Tue, 10 Feb 2006 10:10:08 +0300', NULL, NULL, 'powerpc i386', NULL, 1, 16, 21, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
338 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (28, 1, '2.6.15.3', '2005-12-22 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'any', NULL, 1, 16, 22, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
339 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (29, 1, '0.00', '2005-12-22 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 17, 3, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
340 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (30, 1, '1.0', '2006-09-28 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 20, 10, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
341 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (31, 1, '1.0', '2006-09-28 18:19:01', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 20, 10, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
342 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (32, 1, '1.0', '2006-12-01 13:19:01', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 23, 10, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
343 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (33, 1, '1.0', '2006-12-01 13:19:01', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 24, 10, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
344 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (34, 1, '1.0', '2007-02-15 14:19:01', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 29, 16, 25, 10, 1, NULL, NULL, NULL, NULL, 1, NULL, NULL, NULL); |
345 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (35, 1, '1.0', '2006-04-11 11:19:01', 1, NULL, 1, NULL, NULL, NULL, 'any', NULL, 1, 16, 26, 1, 1, NULL, NULL, NULL, NULL, 10, NULL, NULL, NULL); |
346 | + -- Sample Person <test@canonical.com> Tue, 10 Feb 2006 10:10:08 +0300', NULL, NULL, 'powerpc i386', NULL, 1, 16, 21, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
347 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (28, 1, '2.6.15.3', '2005-12-22 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'any', NULL, 1, 16, 22, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
348 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (29, 1, '0.00', '2005-12-22 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 17, 3, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
349 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (30, 1, '1.0', '2006-09-28 18:19:00', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 20, 10, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
350 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (31, 1, '1.0', '2006-09-28 18:19:01', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 20, 10, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
351 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (32, 1, '1.0', '2006-12-01 13:19:01', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 23, 10, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
352 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (33, 1, '1.0', '2006-12-01 13:19:01', 1, NULL, 1, NULL, NULL, NULL, 'all', NULL, 1, 16, 24, 10, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
353 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (34, 1, '1.0', '2007-02-15 14:19:01', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 29, 16, 25, 10, 1, NULL, NULL, '1.0', NULL, 1, NULL, NULL, NULL); |
354 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (35, 1, '1.0', '2006-04-11 11:19:01', 1, NULL, 1, NULL, NULL, NULL, 'any', NULL, 1, 16, 26, 1, 1, NULL, NULL, '1.0', NULL, 10, NULL, NULL, NULL); |
355 | INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (36, 243606, '1.0-1', '2007-08-09 21:25:37.832976', 1, NULL, 5, 'commercialpackage (1.0-1) breezy; urgency=low |
356 | |
357 | * Initial version |
358 | @@ -8597,7 +8644,7 @@ |
359 | 3F4bEPeRcnUjCFI/hjR0kxg= |
360 | =Tjln |
361 | ', 7, 243606, 27, 10, 1, 'Julian Edwards <launchpad@julian-edwards.com>', '3.6.2', '1.0', 'commercialpackage', 12, NULL, NULL, NULL); |
362 | -INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (37, 1, '1.0', '2006-04-11 11:19:01', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 1, 16, 26, 1, 1, NULL, NULL, NULL, NULL, 11, NULL, NULL, NULL); |
363 | +INSERT INTO sourcepackagerelease (id, creator, version, dateuploaded, urgency, dscsigningkey, component, changelog_entry, builddepends, builddependsindep, architecturehintlist, dsc, section, maintainer, sourcepackagename, upload_distroseries, format, dsc_maintainer_rfc822, dsc_standards_version, dsc_format, dsc_binaries, upload_archive, copyright, build_conflicts, build_conflicts_indep) VALUES (37, 1, '1.0', '2006-04-11 11:19:01', 1, NULL, 1, NULL, NULL, NULL, 'i386', NULL, 1, 16, 26, 1, 1, NULL, NULL, '1.0', NULL, 11, NULL, NULL, NULL); |
364 | |
365 | |
366 | ALTER TABLE sourcepackagerelease ENABLE TRIGGER ALL; |
367 | |
368 | === modified file 'database/schema/comments.sql' |
369 | --- database/schema/comments.sql 2009-11-02 15:33:35 +0000 |
370 | +++ database/schema/comments.sql 2009-11-16 23:27:14 +0000 |
371 | @@ -1531,14 +1531,13 @@ |
372 | COMMENT ON COLUMN Builder.active IS 'Whether to present or not the builder in the public list of builders avaialble. It is used to hide transient or defunct builders while they get fixed.'; |
373 | |
374 | -- BuildQueue |
375 | -COMMENT ON TABLE BuildQueue IS 'BuildQueue: The queue of builds in progress/scheduled to run. This table is the core of the build daemon master. It lists all builds in progress or scheduled to start.'; |
376 | -COMMENT ON COLUMN BuildQueue.build IS 'The build for which this queue item exists. This is how the buildd master will find all the files it needs to perform the build'; |
377 | +COMMENT ON TABLE BuildQueue IS 'BuildQueue: The queue of jobs in progress/scheduled to run on the Soyuz build farm.'; |
378 | COMMENT ON COLUMN BuildQueue.builder IS 'The builder assigned to this build. Some builds will have a builder assigned to queue them up; some will be building on the specified builder already; others will not have a builder yet (NULL) and will be waiting to be assigned into a builder''s queue'; |
379 | -COMMENT ON COLUMN BuildQueue.created IS 'The timestamp of the creation of this row. This is used by the buildd master scheduling algorithm to decide how soon to schedule a build to run on a given builder.'; |
380 | -COMMENT ON COLUMN BuildQueue.buildstart IS 'The timestamp of the start of the build run on the given builder. If this is NULL then the build is not running yet.'; |
381 | COMMENT ON COLUMN BuildQueue.logtail IS 'The tail end of the log of the current build. This is updated regularly as the buildd master polls the buildd slaves. Once the build is complete; the full log will be lodged with the librarian and linked into the build table.'; |
382 | COMMENT ON COLUMN BuildQueue.lastscore IS 'The last score ascribed to this build record. This can be used in the UI among other places.'; |
383 | COMMENT ON COLUMN BuildQueue.manual IS 'Indicates if the current record was or not rescored manually, if so it get skipped from the auto-score procedure.'; |
384 | +COMMENT ON COLUMN BuildQueue.job IS 'Foreign key to the `Job` table row with the generic job data.'; |
385 | +COMMENT ON COLUMN BuildQueue.job_type IS 'Type of job (enumeration value), enables us to find/query the correct table with the data specific to this type of job.'; |
386 | |
387 | -- Mirrors |
388 | |
389 | |
390 | === added file 'database/schema/patch-2207-09.0.sql' |
391 | --- database/schema/patch-2207-09.0.sql 1970-01-01 00:00:00 +0000 |
392 | +++ database/schema/patch-2207-09.0.sql 2009-11-16 23:27:14 +0000 |
393 | @@ -0,0 +1,10 @@ |
394 | +SET client_min_messages=ERROR; |
395 | + |
396 | +-- Per Bug #196774 |
397 | +ALTER TABLE Packaging |
398 | + DROP CONSTRAINT packaging_uniqueness, |
399 | + ADD CONSTRAINT packaging__distroseries__sourcepackagename__key |
400 | + UNIQUE (distroseries, sourcepackagename); |
401 | + |
402 | +INSERT INTO LaunchpadDatabaseRevision VALUES (2207, 9, 0); |
403 | + |
404 | |
405 | === added file 'database/schema/patch-2207-11-0.sql' |
406 | --- database/schema/patch-2207-11-0.sql 1970-01-01 00:00:00 +0000 |
407 | +++ database/schema/patch-2207-11-0.sql 2009-11-16 23:27:14 +0000 |
408 | @@ -0,0 +1,87 @@ |
409 | +-- Copyright 2009 Canonical Ltd. This software is licensed under the |
410 | +-- GNU Affero General Public License version 3 (see the file LICENSE). |
411 | + |
412 | +SET client_min_messages=ERROR; |
413 | + |
414 | +-- The schema patch required for the Soyuz buildd generalisation, see |
415 | +-- https://dev.launchpad.net/Soyuz/Specs/BuilddGeneralisation for details. |
416 | +-- Bug #478919. |
417 | + |
418 | +-- Step 1 |
419 | +-- The `BuildPackageJob` table captures whatever data is required for |
420 | +-- "normal" Soyuz build farm jobs that build source packages. |
421 | + |
422 | +CREATE TABLE buildpackagejob ( |
423 | + id serial PRIMARY KEY, |
424 | + -- FK to the `Job` record with "generic" data about this source package |
425 | + -- build job. Please note that the corresponding `BuildQueue` row will |
426 | + -- have a FK referencing the same `Job` row. |
427 | + job integer NOT NULL CONSTRAINT buildpackagejob__job__fk REFERENCES job, |
428 | + -- FK to the associated `Build` record. |
429 | + build integer NOT NULL CONSTRAINT buildpackagejob__build__fk REFERENCES build |
430 | +); |
431 | + |
432 | +-- Step 2 |
433 | +-- Changes needed to the `BuildQueue` table. |
434 | + |
435 | +-- The 'job' and the 'job_type' columns will enable us to find the correct |
436 | +-- database rows that hold the generic and the specific data pertaining to |
437 | +-- the job respectively. |
438 | +ALTER TABLE ONLY buildqueue ADD COLUMN job integer; |
439 | +ALTER TABLE ONLY buildqueue ADD COLUMN job_type integer NOT NULL DEFAULT 1; |
440 | + |
441 | +-- Step 3 |
442 | +-- Data migration for the existing `BuildQueue` records. |
443 | +CREATE OR REPLACE FUNCTION migrate_buildqueue_rows() RETURNS integer |
444 | +LANGUAGE plpgsql AS |
445 | +$$ |
446 | +DECLARE |
447 | + queue_row RECORD; |
448 | + job_id integer; |
449 | + buildpackagejob_id integer; |
450 | + rows_migrated integer; |
451 | +BEGIN |
452 | + rows_migrated := 0; |
453 | + FOR queue_row IN SELECT * FROM buildqueue LOOP |
454 | + INSERT INTO job(status, date_created, date_started) VALUES(0, queue_row.created, queue_row.buildstart); |
455 | + -- Get the key of the `Job` row just inserted. |
456 | + SELECT currval('job_id_seq') INTO job_id; |
457 | + INSERT INTO buildpackagejob(job, build) VALUES(job_id, queue_row.build); |
458 | + -- Get the key of the `BuildPackageJob` row just inserted. |
459 | + SELECT currval('buildpackagejob_id_seq') INTO buildpackagejob_id; |
460 | + UPDATE buildqueue SET job=job_id WHERE id=queue_row.id; |
461 | + rows_migrated := rows_migrated + 1; |
462 | + END LOOP; |
463 | + RETURN rows_migrated; |
464 | +END; |
465 | +$$; |
466 | + |
467 | +-- Run the data migration function. |
468 | +SELECT * FROM migrate_buildqueue_rows(); |
469 | +-- The `BuildQueue` data is migrated at this point, we can get rid of the |
470 | +-- data migration function. |
471 | +DROP FUNCTION migrate_buildqueue_rows(); |
472 | + |
473 | +-- Now that the data was migrated we can make the 'job' column mandatory |
474 | +-- and define the foreign key constraint for it. |
475 | +ALTER TABLE ONLY buildqueue ALTER COLUMN job SET NOT NULL; |
476 | +ALTER TABLE ONLY buildqueue |
477 | + ADD CONSTRAINT buildqueue__job__fk |
478 | + FOREIGN KEY (job) REFERENCES job(id); |
479 | + |
480 | +-- Step 4 |
481 | +-- Now remove the obsolete columns, constraints and indexes from `BuildQueue`. |
482 | +-- The latter will from now on refer to the `Build` record via the |
483 | +-- `Job`/`BuildPackageJob` tables (and not directly any more). |
484 | +DROP INDEX buildqueue__build__idx; |
485 | +ALTER TABLE ONLY buildqueue DROP CONSTRAINT "$1"; |
486 | +ALTER TABLE ONLY buildqueue DROP COLUMN build; |
487 | +ALTER TABLE ONLY buildqueue DROP COLUMN created; |
488 | +ALTER TABLE ONLY buildqueue DROP COLUMN buildstart; |
489 | + |
490 | +-- Step 5 |
491 | +-- Add indexes for the new `BuildQueue` columns. |
492 | +CREATE INDEX buildqueue__job__idx ON buildqueue(job); |
493 | +CREATE INDEX buildqueue__job_type__idx ON buildqueue(job_type); |
494 | + |
495 | +INSERT INTO LaunchpadDatabaseRevision VALUES (2207, 11, 0); |
496 | |
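The migration in Step 3 follows a common three-table pattern: for each legacy queue row, insert a generic `Job` row, insert a type-specific `BuildPackageJob` row, then point the queue row back at the `Job`. A standalone sketch of the same loop, using sqlite3 and simplified stand-in tables (not the real Launchpad schema):

```python
import sqlite3

# Illustrative re-run of the patch's migration loop. Table shapes are
# simplified stand-ins; the real schema lives in the SQL patch above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE job (id INTEGER PRIMARY KEY, status INTEGER,
                      date_created TEXT, date_started TEXT);
    CREATE TABLE buildpackagejob (id INTEGER PRIMARY KEY,
                                  job INTEGER, build INTEGER);
    CREATE TABLE buildqueue (id INTEGER PRIMARY KEY, build INTEGER,
                             created TEXT, buildstart TEXT, job INTEGER);
    INSERT INTO buildqueue (build, created, buildstart)
        VALUES (42, '2009-01-01', NULL), (43, '2009-01-02', '2009-01-03');
""")

def migrate_buildqueue_rows(conn):
    """Move per-item data into job/buildpackagejob and link back."""
    rows_migrated = 0
    for qid, build, created, buildstart, _ in conn.execute(
            "SELECT * FROM buildqueue").fetchall():
        # Generic job data (timestamps) goes into `job`...
        cur = conn.execute(
            "INSERT INTO job (status, date_created, date_started) "
            "VALUES (0, ?, ?)", (created, buildstart))
        job_id = cur.lastrowid
        # ...the build reference goes into the type-specific table...
        conn.execute(
            "INSERT INTO buildpackagejob (job, build) VALUES (?, ?)",
            (job_id, build))
        # ...and the queue row is linked to its generic job.
        conn.execute(
            "UPDATE buildqueue SET job = ? WHERE id = ?", (job_id, qid))
        rows_migrated += 1
    return rows_migrated
```

As in the patch, the `job` column can only be made NOT NULL after this backfill has run, which is why the constraint is added in a separate step.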
497 | === modified file 'database/schema/security.cfg' |
498 | --- database/schema/security.cfg 2009-11-06 01:16:21 +0000 |
499 | +++ database/schema/security.cfg 2009-11-16 23:27:14 +0000 |
500 | @@ -271,6 +271,7 @@ |
501 | public.shippingrun = SELECT, INSERT, UPDATE |
502 | public.sourcepackagepublishinghistory = SELECT |
503 | public.seriessourcepackagebranch = SELECT, INSERT, UPDATE, DELETE |
504 | +public.sourcepackageformatselection = SELECT |
505 | public.specificationbranch = SELECT, INSERT, UPDATE, DELETE |
506 | public.specificationbug = SELECT, INSERT, DELETE |
507 | public.specificationdependency = SELECT, INSERT, DELETE |
508 | @@ -837,6 +838,8 @@ |
509 | public.archivearch = SELECT, UPDATE |
510 | public.archivedependency = SELECT |
511 | public.buildqueue = SELECT, INSERT, UPDATE, DELETE |
512 | +public.job = SELECT, INSERT, UPDATE, DELETE |
513 | +public.buildpackagejob = SELECT, INSERT, UPDATE, DELETE |
514 | public.builder = SELECT, INSERT, UPDATE |
515 | public.build = SELECT, INSERT, UPDATE |
516 | public.distribution = SELECT, UPDATE |
517 | @@ -928,6 +931,8 @@ |
518 | public.build = SELECT, INSERT, UPDATE |
519 | public.builder = SELECT, INSERT, UPDATE |
520 | public.buildqueue = SELECT, INSERT, UPDATE, DELETE |
521 | +public.job = SELECT, INSERT, UPDATE, DELETE |
522 | +public.buildpackagejob = SELECT, INSERT, UPDATE, DELETE |
523 | public.component = SELECT, INSERT, UPDATE |
524 | public.componentselection = SELECT, INSERT, UPDATE |
525 | public.country = SELECT, INSERT, UPDATE |
526 | @@ -987,6 +992,7 @@ |
527 | public.sectionselection = SELECT, INSERT, UPDATE |
528 | public.signedcodeofconduct = SELECT, INSERT, UPDATE |
529 | public.sourcepackagefilepublishing = SELECT, INSERT, UPDATE |
530 | +public.sourcepackageformatselection = SELECT, INSERT |
531 | public.sourcepackagename = SELECT, INSERT, UPDATE |
532 | public.sourcepackagepublishinghistory = SELECT |
533 | public.securesourcepackagepublishinghistory = SELECT, INSERT, UPDATE |
534 | @@ -1103,6 +1109,7 @@ |
535 | public.archivepermission = SELECT |
536 | public.processor = SELECT |
537 | public.processorfamily = SELECT |
538 | +public.sourcepackageformatselection = SELECT |
539 | |
540 | # Source and Binary packages and builds |
541 | public.sourcepackagename = SELECT, INSERT |
542 | @@ -1114,6 +1121,8 @@ |
543 | public.pocketchroot = SELECT |
544 | public.build = SELECT, INSERT, UPDATE |
545 | public.buildqueue = SELECT, INSERT, UPDATE |
546 | +public.job = SELECT, INSERT, UPDATE |
547 | +public.buildpackagejob = SELECT, INSERT, UPDATE |
548 | |
549 | # Thusly the librarian |
550 | public.libraryfilecontent = SELECT, INSERT |
551 | @@ -1195,6 +1204,8 @@ |
552 | public.distrocomponentuploader = SELECT |
553 | public.build = SELECT, INSERT, UPDATE |
554 | public.buildqueue = SELECT, INSERT, UPDATE |
555 | +public.job = SELECT, INSERT, UPDATE |
556 | +public.buildpackagejob = SELECT, INSERT, UPDATE |
557 | public.pocketchroot = SELECT |
558 | public.sourcepackagerelease = SELECT, UPDATE |
559 | public.binarypackagerelease = SELECT, UPDATE |
560 | |
561 | === modified file 'lib/canonical/launchpad/helpers.py' |
562 | --- lib/canonical/launchpad/helpers.py 2009-07-17 00:26:05 +0000 |
563 | +++ lib/canonical/launchpad/helpers.py 2009-11-16 23:27:14 +0000 |
564 | @@ -477,9 +477,9 @@ |
565 | if fname.endswith(".diff.gz"): |
566 | return SourcePackageFileType.DIFF |
567 | if fname.endswith(".orig.tar.gz"): |
568 | - return SourcePackageFileType.ORIG |
569 | + return SourcePackageFileType.ORIG_TARBALL |
570 | if fname.endswith(".tar.gz"): |
571 | - return SourcePackageFileType.TARBALL |
572 | + return SourcePackageFileType.NATIVE_TARBALL |
573 | |
574 | |
575 | BINARYPACKAGE_EXTENSIONS = { |
576 | |
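Renaming `ORIG`/`TARBALL` to `ORIG_TARBALL`/`NATIVE_TARBALL` makes the upstream-versus-native distinction explicit. A minimal sketch of the suffix-based classification this helper performs, for format 1.0 suffixes only (enum values shown as plain strings; the real code returns `SourcePackageFileType` items, and 3.0 suffixes such as `.debian.tar.gz` are handled in later branches):

```python
# Illustrative stand-ins for the SourcePackageFileType enum values.
DSC, DIFF = "DSC", "DIFF"
ORIG_TARBALL, NATIVE_TARBALL = "ORIG_TARBALL", "NATIVE_TARBALL"

def determine_source_file_type(filename):
    """Classify a source upload file by suffix, or return None."""
    if filename.endswith(".dsc"):
        return DSC
    if filename.endswith(".diff.gz"):
        return DIFF
    # ".orig.tar.gz" must be tested before the generic ".tar.gz",
    # otherwise every orig tarball would be misread as native.
    if filename.endswith(".orig.tar.gz"):
        return ORIG_TARBALL
    if filename.endswith(".tar.gz"):
        return NATIVE_TARBALL
    return None
```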
577 | === modified file 'lib/lp/archivepublisher/tests/deathrow.txt' |
578 | --- lib/lp/archivepublisher/tests/deathrow.txt 2009-08-28 07:34:44 +0000 |
579 | +++ lib/lp/archivepublisher/tests/deathrow.txt 2009-11-16 23:27:14 +0000 |
580 | @@ -106,7 +106,7 @@ |
581 | They all share a source file. |
582 | |
583 | >>> shared_file = test_publisher.addMockFile( |
584 | - ... 'shared.tar.gz', filecontent='Y') |
585 | + ... 'shared_1.0.tar.gz', filecontent='Y') |
586 | >>> discard = removed_source.sourcepackagerelease.addFile(shared_file) |
587 | >>> discard = postponed_source.sourcepackagerelease.addFile(shared_file) |
588 | >>> discard = published_source.sourcepackagerelease.addFile(shared_file) |
589 | @@ -228,7 +228,7 @@ |
590 | obsolete-bin_666_i386.deb: OK |
591 | obsolete_666.dsc: OK |
592 | stuck-bin_666_i386.deb: OK |
593 | - shared.tar.gz: OK |
594 | + shared_1.0.tar.gz: OK |
595 | stuck_666.dsc: OK |
596 | stuck_667.dsc: OK |
597 | stuck_668.dsc: OK |
598 | @@ -265,9 +265,9 @@ |
599 | used with different names for two distinct sourcepackages |
600 | (openoffice and openoffice-l10n is an example); |
601 | |
602 | - * The source file shared across publications ('shared.tar.gz') wasn't |
603 | - removed as it is still related to a 'live' and a 'future-deletion' |
604 | - publications. |
605 | + * The source file shared across publications ('shared_1.0.tar.gz') |
606 | + wasn't removed as it is still related to a 'live' and a |
607 | + 'future-deletion' publications. |
608 | |
609 | * Dependent binaries are only possible via publication copies and are |
610 | only removed 'atomically', i.e. since there is a 'live' publication |
611 | @@ -316,7 +316,7 @@ |
612 | obsolete-bin_666_i386.deb: REMOVED |
613 | obsolete_666.dsc: REMOVED |
614 | stuck-bin_666_i386.deb: OK |
615 | - shared.tar.gz: OK |
616 | + shared_1.0.tar.gz: OK |
617 | stuck_666.dsc: REMOVED |
618 | stuck_667.dsc: OK |
619 | stuck_668.dsc: OK |
620 | @@ -389,7 +389,7 @@ |
621 | obsolete-bin_666_i386.deb: REMOVED |
622 | obsolete_666.dsc: REMOVED |
623 | stuck-bin_666_i386.deb: REMOVED |
624 | - shared.tar.gz: OK |
625 | + shared_1.0.tar.gz: OK |
626 | stuck_666.dsc: REMOVED |
627 | stuck_667.dsc: OK |
628 | stuck_668.dsc: OK |
629 | |
630 | === modified file 'lib/lp/archivepublisher/tests/test_publisher.py' |
631 | --- lib/lp/archivepublisher/tests/test_publisher.py 2009-08-28 07:34:44 +0000 |
632 | +++ lib/lp/archivepublisher/tests/test_publisher.py 2009-11-16 23:27:14 +0000 |
633 | @@ -280,7 +280,7 @@ |
634 | owner=ubuntu_team, purpose=ArchivePurpose.PPA) |
635 | |
636 | pub_source = self.getPubSource( |
637 | - sourcename="foo", filename="foo.dsc", filecontent='Hello world', |
638 | + sourcename="foo", filename="foo_1.dsc", filecontent='Hello world', |
639 | status=PackagePublishingStatus.PENDING, archive=test_archive) |
640 | |
641 | publisher.A_publish(False) |
642 | @@ -290,7 +290,7 @@ |
643 | self.assertEqual(pub_source.status, PackagePublishingStatus.PENDING) |
644 | |
645 | # nothing got published |
646 | - foo_path = "%s/main/f/foo/foo.dsc" % self.pool_dir |
647 | + foo_path = "%s/main/f/foo/foo_1.dsc" % self.pool_dir |
648 | self.assertEqual(os.path.exists(foo_path), False) |
649 | |
650 | def testPublishingWorksForOtherArchives(self): |
651 | @@ -309,7 +309,7 @@ |
652 | test_archive) |
653 | |
654 | pub_source = self.getPubSource( |
655 | - sourcename="foo", filename="foo.dsc", |
656 | + sourcename="foo", filename="foo_1.dsc", |
657 | filecontent='I am supposed to be a embargoed archive', |
658 | status=PackagePublishingStatus.PENDING, archive=test_archive) |
659 | |
660 | @@ -322,7 +322,7 @@ |
661 | self.assertEqual(pub_source.status, PackagePublishingStatus.PUBLISHED) |
662 | |
663 | # nothing got published |
664 | - foo_path = "%s/main/f/foo/foo.dsc" % test_pool_dir |
665 | + foo_path = "%s/main/f/foo/foo_1.dsc" % test_pool_dir |
666 | self.assertEqual( |
667 | open(foo_path).read().strip(), |
668 | 'I am supposed to be a embargoed archive',) |
669 | @@ -401,11 +401,11 @@ |
670 | owner=name16, distribution=ubuntu, purpose=ArchivePurpose.PPA) |
671 | |
672 | pub_source = self.getPubSource( |
673 | - sourcename="foo", filename="foo.dsc", filecontent='Hello world', |
674 | + sourcename="foo", filename="foo_1.dsc", filecontent='Hello world', |
675 | status=PackagePublishingStatus.PENDING, archive=spiv.archive) |
676 | |
677 | pub_source = self.getPubSource( |
678 | - sourcename="foo", filename="foo.dsc", filecontent='Hello world', |
679 | + sourcename="foo", filename="foo_1.dsc", filecontent='Hello world', |
680 | status=PackagePublishingStatus.PUBLISHED, archive=name16.archive) |
681 | |
682 | self.assertEqual(4, ubuntu.getAllPPAs().count()) |
683 | @@ -465,7 +465,7 @@ |
684 | # Pending source and binary publications. |
685 | # The binary description explores index formatting properties. |
686 | pub_source = self.getPubSource( |
687 | - sourcename="foo", filename="foo.dsc", filecontent='Hello world', |
688 | + sourcename="foo", filename="foo_1.dsc", filecontent='Hello world', |
689 | status=PackagePublishingStatus.PENDING, archive=cprov.archive) |
690 | pub_bin = self.getPubBinaries( |
691 | pub_source=pub_source, |
692 | @@ -507,7 +507,7 @@ |
693 | 'Format: 1.0', |
694 | 'Directory: pool/main/f/foo', |
695 | 'Files:', |
696 | - ' 3e25960a79dbc69b674cd4ec67a72c62 11 foo.dsc', |
697 | + ' 3e25960a79dbc69b674cd4ec67a72c62 11 foo_1.dsc', |
698 | ''], |
699 | index_contents) |
700 | |
701 | |
702 | === modified file 'lib/lp/archiveuploader/dscfile.py' |
703 | --- lib/lp/archiveuploader/dscfile.py 2009-06-24 23:33:29 +0000 |
704 | +++ lib/lp/archiveuploader/dscfile.py 2009-11-16 23:27:14 +0000 |
705 | @@ -31,10 +31,13 @@ |
706 | parse_tagfile, TagFileParseError) |
707 | from lp.archiveuploader.utils import ( |
708 | prefix_multi_line_string, safe_fix_maintainer, ParseMaintError, |
709 | - re_valid_pkg_name, re_valid_version, re_issource) |
710 | + re_valid_pkg_name, re_valid_version, re_issource, |
711 | + determine_source_file_type) |
712 | from canonical.encoding import guess as guess_encoding |
713 | from lp.registry.interfaces.person import IPersonSet, PersonCreationRationale |
714 | -from lp.soyuz.interfaces.archive import ArchivePurpose |
715 | +from lp.registry.interfaces.sourcepackage import SourcePackageFileType |
716 | +from lp.soyuz.interfaces.archive import ArchivePurpose, IArchiveSet |
717 | +from lp.soyuz.interfaces.sourcepackageformat import SourcePackageFormat |
718 | from canonical.launchpad.interfaces import ( |
719 | GPGVerificationError, IGPGHandler, IGPGKeySet, |
720 | ISourcePackageNameSet, NotFoundError) |
721 | @@ -228,6 +231,9 @@ |
722 | This method is an error generator, i.e, it returns an iterator over all |
723 | exceptions that are generated while processing DSC file checks. |
724 | """ |
725 | + # Avoid circular imports. |
726 | + from lp.archiveuploader.nascentupload import EarlyReturnUploadError |
727 | + |
728 | for error in SourceUploadFile.verify(self): |
729 | yield error |
730 | |
731 | @@ -265,10 +271,17 @@ |
732 | yield UploadError( |
733 | "%s: invalid version %s" % (self.filename, self.dsc_version)) |
734 | |
735 | - if self.format != "1.0": |
736 | + try: |
737 | + format_term = SourcePackageFormat.getTermByToken(self.format) |
738 | + except LookupError: |
739 | + raise EarlyReturnUploadError( |
740 | + "Unsupported source format: %s" % self.format) |
741 | + |
742 | + if not self.policy.distroseries.isSourcePackageFormatPermitted( |
743 | + format_term.value): |
744 | yield UploadError( |
745 | - "%s: Format is not 1.0. This is incompatible with " |
746 | - "dpkg-source." % self.filename) |
747 | + "%s: format '%s' is not permitted in %s." % |
748 | + (self.filename, self.format, self.policy.distroseries.name)) |
749 | |
750 | # Validate the build dependencies |
751 | for field_name in ['build-depends', 'build-depends-indep']: |
752 | @@ -323,8 +336,19 @@ |
753 | |
754 | :raise: `NotFoundError` when the wanted file could not be found. |
755 | """ |
756 | - if (self.policy.archive.purpose == ArchivePurpose.PPA and |
757 | - filename.endswith('.orig.tar.gz')): |
758 | + # We cannot check the archive purpose for partner archives here, |
759 | + # because the archive override rules have not been applied yet. |
760 | + # Uploads destined for the Ubuntu main archive and the 'partner' |
761 | + # component will eventually end up in the partner archive though. |
762 | + if (self.policy.archive.purpose == ArchivePurpose.PRIMARY and |
763 | + self.component_name == 'partner'): |
764 | + archives = [ |
765 | + getUtility(IArchiveSet).getByDistroPurpose( |
766 | + distribution=self.policy.distro, |
767 | + purpose=ArchivePurpose.PARTNER)] |
768 | + elif (self.policy.archive.purpose == ArchivePurpose.PPA and |
769 | + determine_source_file_type(filename) == |
770 | + SourcePackageFileType.ORIG_TARBALL): |
771 | archives = [self.policy.archive, self.policy.distro.main_archive] |
772 | else: |
773 | archives = [self.policy.archive] |
774 | @@ -348,11 +372,25 @@ |
775 | We don't use the NascentUploadFile.verify here, only verify size |
776 | and checksum. |
777 | """ |
778 | - has_tar = False |
779 | + |
780 | + diff_count = 0 |
781 | + orig_tar_count = 0 |
782 | + native_tar_count = 0 |
783 | + |
784 | files_missing = False |
785 | for sub_dsc_file in self.files: |
786 | - if sub_dsc_file.filename.endswith("tar.gz"): |
787 | - has_tar = True |
788 | + filetype = determine_source_file_type(sub_dsc_file.filename) |
789 | + |
790 | + if filetype == SourcePackageFileType.DIFF: |
791 | + diff_count += 1 |
792 | + elif filetype == SourcePackageFileType.ORIG_TARBALL: |
793 | + orig_tar_count += 1 |
794 | + elif filetype == SourcePackageFileType.NATIVE_TARBALL: |
795 | + native_tar_count += 1 |
796 | + else: |
797 | + yield UploadError('Unknown file: ' + sub_dsc_file.filename) |
798 | + continue |
799 | + |
800 | try: |
801 | library_file, file_archive = self._getFileByName( |
802 | sub_dsc_file.filename) |
803 | @@ -397,11 +435,37 @@ |
804 | yield error |
805 | files_missing = True |
806 | |
807 | - |
808 | - if not has_tar: |
809 | - yield UploadError( |
810 | - "%s: does not mention any tar.gz or orig.tar.gz." |
811 | - % self.filename) |
812 | + # Reject if we have more than one file of any type. |
813 | + if orig_tar_count > 1: |
814 | + yield UploadError( |
815 | + "%s: has more than one orig.tar.*." |
816 | + % self.filename) |
817 | + if native_tar_count > 1: |
818 | + yield UploadError( |
819 | + "%s: has more than one tar.*." |
820 | + % self.filename) |
821 | + if diff_count > 1: |
822 | + yield UploadError( |
823 | + "%s: has more than one diff.gz." |
824 | + % self.filename) |
825 | + |
826 | + if ((orig_tar_count == 0 and native_tar_count == 0) or |
827 | + (orig_tar_count > 0 and native_tar_count > 0)): |
828 | + yield UploadError( |
829 | + "%s: must have exactly one tar.* or orig.tar.*." |
830 | + % self.filename) |
831 | + |
832 | + # Format 1.0 must be native (exactly one tar.gz), or |
833 | + # have an orig.tar.gz and a diff.gz. It cannot have |
834 | + # compression types other than 'gz'. |
835 | + if self.format == '1.0': |
836 | + if ((diff_count == 0 and native_tar_count == 0) or |
837 | + (diff_count > 0 and native_tar_count > 0)): |
838 | + yield UploadError( |
839 | + "%s: must have exactly one diff.gz or tar.gz." |
840 | + % self.filename) |
841 | + else: |
842 | + raise AssertionError("Unknown source format.") |
843 | |
844 | if files_missing: |
845 | yield UploadError( |
846 | |
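The rewritten `DSCFile.verify` counts each file type and then applies the rules shown above: at most one file of each type, exactly one of orig.tar.* or a native tar.*, and, for format 1.0, exactly one of diff.gz or a native tar.gz. A self-contained sketch of just that rule set (the function name and error strings are illustrative):

```python
def check_format_1_0(diff_count, orig_tar_count, native_tar_count):
    """Return the errors the format-1.0 rules would yield for the
    given file-type counts. Sketch only; the real checks run inside
    DSCFile.verify() and yield UploadError instances."""
    errors = []
    # Reject if we have more than one file of any type.
    if orig_tar_count > 1:
        errors.append("has more than one orig.tar.*")
    if native_tar_count > 1:
        errors.append("has more than one tar.*")
    if diff_count > 1:
        errors.append("has more than one diff.gz")
    # A source is either native or has an upstream tarball, never both.
    if ((orig_tar_count == 0 and native_tar_count == 0) or
            (orig_tar_count > 0 and native_tar_count > 0)):
        errors.append("must have exactly one tar.* or orig.tar.*")
    # Format 1.0: native (tar.gz alone) or orig.tar.gz plus diff.gz.
    if ((diff_count == 0 and native_tar_count == 0) or
            (diff_count > 0 and native_tar_count > 0)):
        errors.append("must have exactly one diff.gz or tar.gz")
    return errors
```

The two accepted shapes pass cleanly: a non-native upload (one orig.tar.gz, one diff.gz) and a native upload (a single tar.gz); mixed or empty combinations are rejected.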
847 | === modified file 'lib/lp/archiveuploader/nascentupload.py' |
848 | --- lib/lp/archiveuploader/nascentupload.py 2009-11-09 17:54:02 +0000 |
849 | +++ lib/lp/archiveuploader/nascentupload.py 2009-11-16 23:27:14 +0000 |
850 | @@ -28,8 +28,10 @@ |
851 | from lp.archiveuploader.nascentuploadfile import ( |
852 | UploadError, UploadWarning, CustomUploadFile, SourceUploadFile, |
853 | BaseBinaryUploadFile) |
854 | +from lp.archiveuploader.utils import determine_source_file_type |
855 | from lp.archiveuploader.permission import verify_upload |
856 | from lp.registry.interfaces.pocket import PackagePublishingPocket |
857 | +from lp.registry.interfaces.sourcepackage import SourcePackageFileType |
858 | from lp.soyuz.interfaces.archive import ArchivePurpose, MAIN_ARCHIVE_PURPOSES |
859 | from canonical.launchpad.interfaces import ( |
860 | IBinaryPackageNameSet, IDistributionSet, ILibraryFileAliasSet, |
861 | @@ -302,53 +304,35 @@ |
862 | """Heuristic checks on a sourceful upload. |
863 | |
864 | Raises AssertionError when called for a non-sourceful upload. |
865 | - Ensures a sourceful upload has, at least: |
866 | - |
867 | - * One DSC |
868 | - * One or none DIFF |
869 | - * One or none ORIG |
870 | - * One or none TAR |
871 | - * If no DIFF is present it must have a TAR (native) |
872 | - |
873 | - 'hasorig' and 'native' attributes are set when an ORIG and/or an |
874 | - TAR file, respectively, are present. |
875 | + Ensures a sourceful upload has exactly one DSC. |
876 | """ |
877 | assert self.sourceful, ( |
878 | "Source consistency check called for a non-source upload") |
879 | |
880 | dsc = 0 |
881 | - diff = 0 |
882 | - orig = 0 |
883 | - tar = 0 |
884 | + native_tarball = 0 |
885 | + orig_tarball = 0 |
886 | |
887 | for uploaded_file in self.changes.files: |
888 | - if uploaded_file.filename.endswith(".dsc"): |
889 | + filetype = determine_source_file_type(uploaded_file.filename) |
890 | + if filetype == SourcePackageFileType.DSC: |
891 | dsc += 1 |
892 | - elif uploaded_file.filename.endswith(".diff.gz"): |
893 | - diff += 1 |
894 | - elif uploaded_file.filename.endswith(".orig.tar.gz"): |
895 | - orig += 1 |
896 | - elif (uploaded_file.filename.endswith(".tar.gz") |
897 | + elif (filetype == SourcePackageFileType.NATIVE_TARBALL |
898 | and not isinstance(uploaded_file, CustomUploadFile)): |
899 | - tar += 1 |
900 | - |
901 | - # Okay, let's check the sanity of the upload. |
902 | + native_tarball += 1 |
903 | + elif filetype == SourcePackageFileType.ORIG_TARBALL: |
904 | + orig_tarball += 1 |
905 | + |
906 | + |
907 | + # It is never sane to upload more than one source at a time. |
908 | if dsc > 1: |
909 | self.reject("Changes file lists more than one .dsc") |
910 | - if diff > 1: |
911 | - self.reject("Changes file lists more than one .diff.gz") |
912 | - if orig > 1: |
913 | - self.reject("Changes file lists more than one orig.tar.gz") |
914 | - if tar > 1: |
915 | - self.reject("Changes file lists more than one native tar.gz") |
916 | |
917 | if dsc == 0: |
918 | self.reject("Sourceful upload without a .dsc") |
919 | - if diff == 0 and tar == 0: |
920 | - self.reject("Sourceful upload without a diff or native tar") |
921 | |
922 | - self.native = bool(tar) |
923 | - self.hasorig = bool(orig) |
924 | + self.native = bool(native_tarball) |
925 | + self.hasorig = bool(orig_tarball) |
926 | |
927 | def _check_binaryful_consistency(self): |
928 | """Heuristic checks on a binaryful upload. |
929 | |
930 | === modified file 'lib/lp/archiveuploader/nascentuploadfile.py' |
931 | --- lib/lp/archiveuploader/nascentuploadfile.py 2009-07-17 00:26:05 +0000 |
932 | +++ lib/lp/archiveuploader/nascentuploadfile.py 2009-11-16 23:27:14 +0000 |
933 | @@ -33,8 +33,9 @@ |
934 | from lp.archiveuploader.utils import ( |
935 | prefix_multi_line_string, re_taint_free, re_isadeb, re_issource, |
936 | re_no_epoch, re_no_revision, re_valid_version, re_valid_pkg_name, |
937 | - re_extract_src_version) |
938 | + re_extract_src_version, determine_source_file_type) |
939 | from canonical.encoding import guess as guess_encoding |
940 | +from lp.registry.interfaces.sourcepackage import SourcePackageFileType |
941 | from lp.soyuz.interfaces.binarypackagename import ( |
942 | IBinaryPackageNameSet) |
943 | from lp.soyuz.interfaces.binarypackagerelease import ( |
944 | @@ -351,7 +352,8 @@ |
945 | "Architecture field." % (self.filename)) |
946 | |
947 | version_chopped = re_no_epoch.sub('', self.version) |
948 | - if self.filename.endswith("orig.tar.gz"): |
949 | + if determine_source_file_type(self.filename) == ( |
950 | + SourcePackageFileType.ORIG_TARBALL): |
951 | version_chopped = re_no_revision.sub('', version_chopped) |
952 | |
953 | source_match = re_issource.match(self.filename) |
954 | |
955 | === added directory 'lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt' |
956 | === added file 'lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1.debian.tar.gz' |
957 | Binary files lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1.debian.tar.gz 1970-01-01 00:00:00 +0000 and lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1.debian.tar.gz 2009-11-16 23:27:14 +0000 differ |
958 | === added file 'lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1.dsc' |
959 | --- lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1.dsc 1970-01-01 00:00:00 +0000 |
960 | +++ lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1.dsc 2009-11-16 23:27:14 +0000 |
961 | @@ -0,0 +1,16 @@ |
962 | +Format: 3.0 (quilt) |
963 | +Source: bar |
964 | +Binary: bar |
965 | +Architecture: any |
966 | +Version: 1.0-1 |
967 | +Maintainer: Launchpad team <launchpad@lists.canonical.com> |
968 | +Standards-Version: 3.6.2 |
969 | +Checksums-Sha1: |
970 | + 73a04163fee97fd2257ab266bd48f1d3d528e012 164 bar_1.0.orig.tar.gz |
971 | + abce262314a7c0ca00e43598f21b41a3e6ff6b21 688 bar_1.0-1.debian.tar.gz |
972 | +Checksums-Sha256: |
973 | + f1ecff929899b567f45d6734b69d59a4f3c04dabce3cc8e6ed6d64073eda360e 164 bar_1.0.orig.tar.gz |
974 | + ffdcce60fca14618f68483ca77a206f332a3773dc7ece1c3e6de55c0118c69c6 688 bar_1.0-1.debian.tar.gz |
975 | +Files: |
976 | + fc1464e5985b962a042d5354452f361d 164 bar_1.0.orig.tar.gz |
977 | + 056db4dfe7de8322296b6d417592ee01 688 bar_1.0-1.debian.tar.gz |
978 | |
979 | === added file 'lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1_source.changes' |
980 | --- lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1_source.changes 1970-01-01 00:00:00 +0000 |
981 | +++ lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0-1_source.changes 2009-11-16 23:27:14 +0000 |
982 | @@ -0,0 +1,28 @@ |
983 | +Format: 1.8 |
984 | +Date: Thu, 16 Feb 2006 15:34:09 +0000 |
985 | +Source: bar |
986 | +Binary: bar |
987 | +Architecture: source |
988 | +Version: 1.0-1 |
989 | +Distribution: breezy |
990 | +Urgency: low |
991 | +Maintainer: Launchpad team <launchpad@lists.canonical.com> |
992 | +Changed-By: Daniel Silverstone <daniel.silverstone@canonical.com> |
993 | +Description: |
994 | + bar - Stuff for testing |
995 | +Changes: |
996 | + bar (1.0-1) breezy; urgency=low |
997 | + . |
998 | + * Initial version |
999 | +Checksums-Sha1: |
1000 | + bc97e185cf31af33bf8d109044ce51f32d09c229 645 bar_1.0-1.dsc |
1001 | + 73a04163fee97fd2257ab266bd48f1d3d528e012 164 bar_1.0.orig.tar.gz |
1002 | + abce262314a7c0ca00e43598f21b41a3e6ff6b21 688 bar_1.0-1.debian.tar.gz |
1003 | +Checksums-Sha256: |
1004 | + ae0fb16941a95518332a8ee962d00d55963b491c2df94b3f230a65d2bdbeedf8 645 bar_1.0-1.dsc |
1005 | + f1ecff929899b567f45d6734b69d59a4f3c04dabce3cc8e6ed6d64073eda360e 164 bar_1.0.orig.tar.gz |
1006 | + ffdcce60fca14618f68483ca77a206f332a3773dc7ece1c3e6de55c0118c69c6 688 bar_1.0-1.debian.tar.gz |
1007 | +Files: |
1008 | + c320d2827f08f09ec2e1bbbac635225c 645 devel optional bar_1.0-1.dsc |
1009 | + fc1464e5985b962a042d5354452f361d 164 devel optional bar_1.0.orig.tar.gz |
1010 | + 056db4dfe7de8322296b6d417592ee01 688 devel optional bar_1.0-1.debian.tar.gz |
1011 | |
1012 | === added file 'lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0.orig.tar.gz' |
1013 | Binary files lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0.orig.tar.gz 1970-01-01 00:00:00 +0000 and lib/lp/archiveuploader/tests/data/suite/bar_1.0-1_3.0-quilt/bar_1.0.orig.tar.gz 2009-11-16 23:27:14 +0000 differ |
1014 | === added directory 'lib/lp/archiveuploader/tests/data/suite/foocomm_1.0-3' |
1015 | === added file 'lib/lp/archiveuploader/tests/data/suite/foocomm_1.0-3/foocomm_1.0-3.diff.gz' |
1016 | Binary files lib/lp/archiveuploader/tests/data/suite/foocomm_1.0-3/foocomm_1.0-3.diff.gz 1970-01-01 00:00:00 +0000 and lib/lp/archiveuploader/tests/data/suite/foocomm_1.0-3/foocomm_1.0-3.diff.gz 2009-11-16 23:27:14 +0000 differ |
1017 | === added file 'lib/lp/archiveuploader/tests/data/suite/foocomm_1.0-3/foocomm_1.0-3.dsc' |
1018 | --- lib/lp/archiveuploader/tests/data/suite/foocomm_1.0-3/foocomm_1.0-3.dsc 1970-01-01 00:00:00 +0000 |
1019 | +++ lib/lp/archiveuploader/tests/data/suite/foocomm_1.0-3/foocomm_1.0-3.dsc 2009-11-16 23:27:14 +0000 |
1020 | @@ -0,0 +1,10 @@ |
1021 | +Format: 1.0 |
1022 | +Source: foocomm |
1023 | +Version: 1.0-3 |
1024 | +Binary: foocomm |
1025 | +Maintainer: Launchpad team <launchpad@lists.canonical.com> |
1026 | +Architecture: any |
1027 | +Standards-Version: 3.6.2 |
1028 | +Files: |
1029 | + ad436f97a58df5b233209857439f1e7c 171 foocomm_1.0.orig.tar.gz |
1030 | + e03c530973064ebbbde9226e03868cb1 595 foocomm_1.0-3.diff.gz |
1031 | |
1032 | === added file 'lib/lp/archiveuploader/tests/data/suite/foocomm_1.0-3/foocomm_1.0-3_source.changes' |
1033 | --- lib/lp/archiveuploader/tests/data/suite/foocomm_1.0-3/foocomm_1.0-3_source.changes 1970-01-01 00:00:00 +0000 |
1034 | +++ lib/lp/archiveuploader/tests/data/suite/foocomm_1.0-3/foocomm_1.0-3_source.changes 2009-11-16 23:27:14 +0000 |
1035 | @@ -0,0 +1,27 @@ |
1036 | +Format: 1.7 |
1037 | +Date: Thu, 27 Feb 2006 15:34:09 +0000 |
1038 | +Source: foocomm |
1039 | +Binary: foocomm |
1040 | +Architecture: source |
1041 | +Version: 1.0-3 |
1042 | +Distribution: breezy |
1043 | +Urgency: low |
1044 | +Maintainer: Launchpad team <launchpad@lists.canonical.com> |
1045 | +Changed-By: Foo Bar <foo.bar@canonical.com> |
1046 | +Description: |
1047 | + foocomm - Stuff for testing |
1048 | +Changes: |
1049 | + foocomm (1.0-1) breezy; urgency=low |
1050 | + . |
1051 | + * Initial version |
1052 | + . |
1053 | + foocomm (1.0-2) breezy; urgency=low |
1054 | + . |
1055 | + * Version 2 testing |
1056 | + . |
1057 | + foocomm (1.0-3) breezy; urgency=low |
1058 | + . |
1059 | + * Version 3 testing, orig.tar.gz reuse from partner archive. |
1060 | +Files: |
1061 | + 5885c292c1f4a3611a6506e4fa4e80a8 291 partner/devel optional foocomm_1.0-3.dsc |
1062 | + e03c530973064ebbbde9226e03868cb1 595 partner/devel optional foocomm_1.0-3.diff.gz |
1063 | |
1064 | === modified file 'lib/lp/archiveuploader/tests/test_ppauploadprocessor.py' |
1065 | --- lib/lp/archiveuploader/tests/test_ppauploadprocessor.py 2009-10-26 18:40:04 +0000 |
1066 | +++ lib/lp/archiveuploader/tests/test_ppauploadprocessor.py 2009-11-16 23:27:14 +0000 |
1067 | @@ -946,6 +946,7 @@ |
1068 | except NotFoundError: |
1069 | self.fail('bar_1.0.orig.tar.gz is not yet published.') |
1070 | |
1071 | + # Please note: this upload goes to the Ubuntu main archive. |
1072 | upload_dir = self.queueUpload("bar_1.0-10") |
1073 | self.processUpload(self.uploadprocessor, upload_dir) |
1074 | # Discard the announcement email and check the acceptance message |
1075 | @@ -961,6 +962,7 @@ |
1076 | # Make the official bar orig.tar.gz available in the system. |
1077 | self.uploadNewBarToUbuntu() |
1078 | |
1079 | + # Please note: the upload goes to the PPA. |
1080 | # Upload a higher version of 'bar' to a PPA that relies on the |
1081 | # availability of orig.tar.gz published in ubuntu. |
1082 | upload_dir = self.queueUpload("bar_1.0-10", "~name16/ubuntu") |
1083 | @@ -1032,6 +1034,7 @@ |
1084 | # Make the official bar orig.tar.gz available in the system. |
1085 | self.uploadNewBarToUbuntu() |
1086 | |
1087 | + # Please note: the upload goes to the PPA. |
1088 | # Upload a higher version of 'bar' to a PPA that relies on the |
1089 | # availability of orig.tar.gz published in the PPA itself. |
1090 | upload_dir = self.queueUpload("bar_1.0-10-ppa-orig", "~name16/ubuntu") |
1091 | |
1092 | === modified file 'lib/lp/archiveuploader/tests/test_uploadprocessor.py' |
1093 | --- lib/lp/archiveuploader/tests/test_uploadprocessor.py 2009-11-09 16:44:39 +0000 |
1094 | +++ lib/lp/archiveuploader/tests/test_uploadprocessor.py 2009-11-16 23:27:14 +0000 |
1095 | @@ -49,6 +49,8 @@ |
1096 | from lp.soyuz.interfaces.archivepermission import ( |
1097 | ArchivePermissionType, IArchivePermissionSet) |
1098 | from lp.soyuz.interfaces.component import IComponentSet |
1099 | +from lp.soyuz.interfaces.sourcepackageformat import ( |
1100 | + ISourcePackageFormatSelectionSet, SourcePackageFormat) |
1101 | from lp.registry.interfaces.person import IPersonSet |
1102 | from lp.registry.interfaces.sourcepackagename import ( |
1103 | ISourcePackageNameSet) |
1104 | @@ -157,7 +159,7 @@ |
1105 | excName = str(excClass) |
1106 | raise self.failureException, "%s not raised" % excName |
1107 | |
1108 | - def setupBreezy(self, name="breezy"): |
1109 | + def setupBreezy(self, name="breezy", permitted_formats=None): |
1110 | """Create a fresh distroseries in ubuntu. |
1111 | |
1112 | Use *initialiseFromParent* procedure to create 'breezy' |
1113 | @@ -168,6 +170,8 @@ |
1114 | |
1115 | :param name: supply the name of the distroseries if you don't want |
1116 | it to be called "breezy" |
1117 | + :param permitted_formats: list of SourcePackageFormats to allow |
1118 | + in the new distroseries. Only permits '1.0' by default. |
1119 | """ |
1120 | self.ubuntu = getUtility(IDistributionSet).getByName('ubuntu') |
1121 | bat = self.ubuntu['breezy-autotest'] |
1122 | @@ -185,6 +189,14 @@ |
1123 | self.breezy.changeslist = 'breezy-changes@ubuntu.com' |
1124 | self.breezy.initialiseFromParent() |
1125 | |
1126 | + if permitted_formats is None: |
1127 | + permitted_formats = [SourcePackageFormat.FORMAT_1_0] |
1128 | + |
1129 | + for format in permitted_formats: |
1130 | + if not self.breezy.isSourcePackageFormatPermitted(format): |
1131 | + getUtility(ISourcePackageFormatSelectionSet).add( |
1132 | + self.breezy, format) |
1133 | + |
1134 | def addMockFile(self, filename, content="anything"): |
1135 | """Return a librarian file.""" |
1136 | return getUtility(ILibraryFileAliasSet).create( |
1137 | @@ -742,6 +754,53 @@ |
1138 | "Expected email containing 'Cannot mix partner files with " |
1139 | "non-partner.', got:\n%s" % raw_msg) |
1140 | |
1141 | + def testPartnerReusingOrigFromPartner(self): |
1142 | + """Partner uploads reuse 'orig.tar.gz' from the partner archive.""" |
1143 | + # Make the official bar orig.tar.gz available in the system. |
1144 | + uploadprocessor = self.setupBreezyAndGetUploadProcessor( |
1145 | + policy='absolutely-anything') |
1146 | + |
1147 | + upload_dir = self.queueUpload("foocomm_1.0-1") |
1148 | + self.processUpload(uploadprocessor, upload_dir) |
1149 | + |
1150 | + self.assertEqual( |
1151 | + uploadprocessor.last_processed_upload.queue_root.status, |
1152 | + PackageUploadStatus.NEW) |
1153 | + |
1154 | + [queue_item] = self.breezy.getQueueItems( |
1155 | + status=PackageUploadStatus.NEW, name="foocomm", |
1156 | + version="1.0-1", exact_match=True) |
1157 | + queue_item.setAccepted() |
1158 | + queue_item.realiseUpload() |
1159 | + self.layer.commit() |
1160 | + |
1161 | + archive = getUtility(IArchiveSet).getByDistroPurpose( |
1162 | + distribution=self.ubuntu, purpose=ArchivePurpose.PARTNER) |
1163 | + try: |
1164 | + self.ubuntu.getFileByName( |
1165 | + 'foocomm_1.0.orig.tar.gz', archive=archive, source=True, |
1166 | + binary=False) |
1167 | + except NotFoundError: |
1168 | + self.fail('foocomm_1.0.orig.tar.gz is not yet published.') |
1169 | + |
1170 | + # Please note: this upload goes to the Ubuntu main archive. |
1171 | + upload_dir = self.queueUpload("foocomm_1.0-3") |
1172 | + self.processUpload(uploadprocessor, upload_dir) |
1173 | + # Discard the announcement email and check the acceptance message |
1174 | + # content. |
1175 | + from_addr, to_addrs, raw_msg = stub.test_emails.pop() |
1176 | + msg = message_from_string(raw_msg) |
1177 | + # This is now a MIMEMultipart message. |
1178 | + body = msg.get_payload(0) |
1179 | + body = body.get_payload(decode=True) |
1180 | + |
1181 | + self.assertEqual( |
1182 | + '[ubuntu/breezy] foocomm 1.0-3 (Accepted)', msg['Subject']) |
1183 | + self.assertFalse( |
1184 | + 'Unable to find foocomm_1.0.orig.tar.gz in upload or ' |
1185 | + 'distribution.' in body, |
1186 | + 'Unable to find foocomm_1.0.orig.tar.gz') |
1187 | + |
1188 | def testPartnerUpload(self): |
1189 | """Partner packages should be uploaded to the partner archive. |
1190 | |
1191 | @@ -1420,6 +1479,28 @@ |
1192 | ] |
1193 | self.assertEmail(contents, recipients=recipients) |
1194 | |
1195 | + def test30QuiltUploadToUnsupportingSeriesIsRejected(self): |
1196 | + """Ensure that uploads to series without format support are rejected. |
1197 | + |
1198 | + Series can restrict the source formats that they accept. Uploads |
1199 | + should be rejected if an unsupported format is uploaded. |
1200 | + """ |
1201 | + self.setupBreezy() |
1202 | + self.layer.txn.commit() |
1203 | + self.options.context = 'absolutely-anything' |
1204 | + uploadprocessor = UploadProcessor( |
1205 | + self.options, self.layer.txn, self.log) |
1206 | + |
1207 | + # Upload the source. |
1208 | + upload_dir = self.queueUpload("bar_1.0-1_3.0-quilt") |
1209 | + self.processUpload(uploadprocessor, upload_dir) |
1210 | + # Make sure it was rejected. |
1211 | + from_addr, to_addrs, raw_msg = stub.test_emails.pop() |
1212 | + self.assertTrue( |
1213 | + "bar_1.0-1.dsc: format '3.0 (quilt)' is not permitted in " |
1214 | + "breezy." in raw_msg, |
1215 | + "Source was not rejected properly:\n%s" % raw_msg) |
1216 | + |
1217 | |
1218 | def test_suite(): |
1219 | return unittest.TestLoader().loadTestsFromName(__name__) |
1220 | |
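The per-series format gate exercised by `test30QuiltUploadToUnsupportingSeriesIsRejected` above can be modelled with a small in-memory sketch. The class and helper below are hypothetical stand-ins for `ISourcePackageFormatSelectionSet` and the upload-time check; only the rejection message text is taken from the test.

```python
class FormatSelectionSet:
    """Hypothetical in-memory stand-in for ISourcePackageFormatSelectionSet."""

    def __init__(self):
        self._selections = set()

    def add(self, series, fmt):
        # Record that the given series permits the given source format.
        self._selections.add((series, fmt))

    def is_permitted(self, series, fmt):
        return (series, fmt) in self._selections


def check_format(selections, series, dsc_name, fmt):
    """Return a rejection message, or None if the format is permitted."""
    if selections.is_permitted(series, fmt):
        return None
    return "%s: format '%s' is not permitted in %s." % (dsc_name, fmt, series)
```

With only '1.0' selected for a series, a '3.0 (quilt)' upload produces the rejection string the test looks for.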
1221 | === modified file 'lib/lp/archiveuploader/tests/test_utils.py' |
1222 | --- lib/lp/archiveuploader/tests/test_utils.py 2009-06-24 23:33:29 +0000 |
1223 | +++ lib/lp/archiveuploader/tests/test_utils.py 2009-11-16 23:27:14 +0000 |
1224 | @@ -8,6 +8,8 @@ |
1225 | import unittest |
1226 | import sys |
1227 | import shutil |
1228 | + |
1229 | +from lp.registry.interfaces.sourcepackage import SourcePackageFileType |
1230 | from lp.archiveuploader.tests import datadir |
1231 | |
1232 | |
1233 | @@ -17,6 +19,25 @@ |
1234 | """lp.archiveuploader.utils should be importable""" |
1235 | import lp.archiveuploader.utils |
1236 | |
1237 | + def test_determine_source_file_type(self): |
1238 | + """lp.archiveuploader.utils.determine_source_file_type should work.""" |
1239 | + from lp.archiveuploader.utils import determine_source_file_type |
1240 | + |
1241 | + self.assertEquals( |
1242 | + SourcePackageFileType.DSC, |
1243 | + determine_source_file_type('foo_1.0-1.dsc')) |
1244 | + self.assertEquals( |
1245 | + SourcePackageFileType.DIFF, |
1246 | + determine_source_file_type('foo_1.0-1.diff.gz')) |
1247 | + self.assertEquals( |
1248 | + SourcePackageFileType.ORIG_TARBALL, |
1249 | + determine_source_file_type('foo_1.0.orig.tar.gz')) |
1250 | + self.assertEquals( |
1251 | + SourcePackageFileType.NATIVE_TARBALL, |
1252 | + determine_source_file_type('foo_1.0.tar.gz')) |
1253 | + self.assertEquals(None, determine_source_file_type('foo_1.0')) |
1254 | + self.assertEquals(None, determine_source_file_type('foo_1.0.blah.gz')) |
1255 | + |
1256 | def testPrefixMultilineString(self): |
1257 | """lp.archiveuploader.utils.prefix_multi_line_string should work""" |
1258 | from lp.archiveuploader.utils import prefix_multi_line_string |
1259 | |
1260 | === modified file 'lib/lp/archiveuploader/utils.py' |
1261 | --- lib/lp/archiveuploader/utils.py 2009-06-24 23:33:29 +0000 |
1262 | +++ lib/lp/archiveuploader/utils.py 2009-11-16 23:27:14 +0000 |
1263 | @@ -15,6 +15,8 @@ |
1264 | 're_valid_pkg_name', |
1265 | 're_changes_file_name', |
1266 | 're_extract_src_version', |
1267 | + 'get_source_file_extension', |
1268 | + 'determine_source_file_type', |
1269 | 'prefix_multi_line_string', |
1270 | 'safe_fix_maintainer', |
1271 | 'ParseMaintError', |
1272 | @@ -31,7 +33,14 @@ |
1273 | re_taint_free = re.compile(r"^[-+~/\.\w]+$") |
1274 | |
1275 | re_isadeb = re.compile(r"(.+?)_(.+?)_(.+)\.(u?d?deb)$") |
1276 | -re_issource = re.compile(r"(.+)_(.+?)\.(orig\.tar\.gz|diff\.gz|tar\.gz|dsc)$") |
1277 | + |
1278 | +source_file_exts = ['orig.tar.gz', 'diff.gz', 'tar.gz', 'dsc'] |
1279 | +re_issource = re.compile( |
1280 | + r"(.+)_(.+?)\.(%s)" % "|".join( |
1281 | + re.escape(ext) for ext in source_file_exts)) |
1282 | + |
1283 | +re_is_orig_tar_ext = re.compile(r"^orig.tar.gz$") |
1284 | +re_is_native_tar_ext = re.compile(r"^tar.gz$") |
1285 | |
1286 | re_no_epoch = re.compile(r"^\d+\:") |
1287 | re_no_revision = re.compile(r"-[^-]+$") |
1288 | @@ -44,6 +53,34 @@ |
1289 | re_parse_maintainer = re.compile(r"^\s*(\S.*\S)\s*\<([^\>]+)\>") |
1290 | |
1291 | |
1292 | +def get_source_file_extension(filename): |
1293 | + """Get the extension part of a source file name.""" |
1294 | + match = re_issource.match(filename) |
1295 | + if match is None: |
1296 | + return None |
1297 | + return match.group(3) |
1298 | + |
1299 | + |
1300 | +def determine_source_file_type(filename): |
1301 | + """Determine the SourcePackageFileType of the given filename.""" |
1302 | + # Avoid circular imports. |
1303 | + from lp.registry.interfaces.sourcepackage import SourcePackageFileType |
1304 | + |
1305 | + extension = get_source_file_extension(filename) |
1306 | + if extension is None: |
1307 | + return None |
1308 | + elif extension == "dsc": |
1309 | + return SourcePackageFileType.DSC |
1310 | + elif extension == "diff.gz": |
1311 | + return SourcePackageFileType.DIFF |
1312 | + elif re_is_orig_tar_ext.match(extension): |
1313 | + return SourcePackageFileType.ORIG_TARBALL |
1314 | + elif re_is_native_tar_ext.match(extension): |
1315 | + return SourcePackageFileType.NATIVE_TARBALL |
1316 | + else: |
1317 | + return None |
1318 | + |
1319 | + |
1320 | def prefix_multi_line_string(str, prefix, include_blank_lines=0): |
1321 | """Utility function to split an input string and prefix, |
1322 | |
1323 | |
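The `determine_source_file_type` helper added to `utils.py` above boils down to a filename regex match plus a mapping from extension to type. A self-contained sketch of the same idea, using string labels in place of the `SourcePackageFileType` enumeration items (the `$` anchor is added here for strictness):

```python
import re

SOURCE_FILE_EXTS = ['orig.tar.gz', 'diff.gz', 'tar.gz', 'dsc']
RE_ISSOURCE = re.compile(
    r"(.+)_(.+?)\.(%s)$" % "|".join(re.escape(ext) for ext in SOURCE_FILE_EXTS))


def source_file_type(filename):
    """Classify a source package file by name; None if unrecognised."""
    match = RE_ISSOURCE.match(filename)
    if match is None:
        return None
    # The third group is the matched extension; map it to a type label.
    return {
        'dsc': 'DSC',
        'diff.gz': 'DIFF',
        'orig.tar.gz': 'ORIG_TARBALL',
        'tar.gz': 'NATIVE_TARBALL',
    }[match.group(3)]
```

This reproduces the behaviour checked in `test_determine_source_file_type`: the four known extensions classify, and anything else returns None.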
1324 | === modified file 'lib/lp/buildmaster/buildergroup.py' |
1325 | --- lib/lp/buildmaster/buildergroup.py 2009-08-16 12:38:12 +0000 |
1326 | +++ lib/lp/buildmaster/buildergroup.py 2009-11-16 23:27:14 +0000 |
1327 | @@ -130,8 +130,9 @@ |
1328 | try: |
1329 | build = getUtility(IBuildSet).getByBuildID(int(build_id)) |
1330 | queue_item = getUtility(IBuildQueueSet).get(int(queue_item_id)) |
1331 | - # Also check it build and buildqueue are properly related. |
1332 | - if queue_item.build.id != build.id: |
1333 | + queued_build = getUtility(IBuildSet).getByQueueEntry(queue_item) |
1334 | + # Also check whether build and buildqueue are properly related. |
1335 | + if queued_build.id != build.id: |
1336 | raise BuildJobMismatch('Job build entry mismatch') |
1337 | |
1338 | except (SQLObjectNotFound, NotFoundError, BuildJobMismatch), reason: |
1339 | @@ -159,9 +160,10 @@ |
1340 | |
1341 | Invoke getFileFromSlave method with 'buildlog' identifier. |
1342 | """ |
1343 | + build = getUtility(IBuildSet).getByQueueEntry(queueItem) |
1344 | return queueItem.builder.transferSlaveFileToLibrarian( |
1345 | 'buildlog', queueItem.getLogFileName(), |
1346 | - queueItem.build.archive.private) |
1347 | + build.archive.private) |
1348 | |
1349 | def updateBuild(self, queueItem): |
1350 | """Verify the current build job status. |
1351 | @@ -199,7 +201,7 @@ |
1352 | "Unknown status code (%s) returned from status() probe." |
1353 | % builder_status) |
1354 | queueItem.builder = None |
1355 | - queueItem.buildstart = None |
1356 | + queueItem.setDateStarted(None) |
1357 | self.commit() |
1358 | return |
1359 | |
1360 | @@ -261,17 +263,18 @@ |
1361 | |
1362 | Store Buildlog, datebuilt, duration, dependencies. |
1363 | """ |
1364 | - queueItem.build.buildlog = self.getLogFromSlave(queueItem) |
1365 | - queueItem.build.builder = queueItem.builder |
1366 | - queueItem.build.dependencies = dependencies |
1367 | + build = getUtility(IBuildSet).getByQueueEntry(queueItem) |
1368 | + build.buildlog = self.getLogFromSlave(queueItem) |
1369 | + build.builder = queueItem.builder |
1370 | + build.dependencies = dependencies |
1371 | # XXX cprov 20060615 bug=120584: Currently buildduration includes |
1372 | # the scanner latency, it should really be asking the slave for |
1373 | # the duration spent building locally. |
1374 | - queueItem.build.datebuilt = UTC_NOW |
1375 | + build.datebuilt = UTC_NOW |
1376 | # We need dynamic datetime.now() instance to be able to perform |
1377 | # the time operations for duration. |
1378 | RIGHT_NOW = datetime.datetime.now(pytz.timezone('UTC')) |
1379 | - queueItem.build.buildduration = RIGHT_NOW - queueItem.buildstart |
1380 | + build.buildduration = RIGHT_NOW - queueItem.date_started |
1381 | |
1382 | |
1383 | def buildStatus_OK(self, queueItem, librarian, buildid, |
1384 | @@ -287,7 +290,7 @@ |
1385 | self.logger.debug("Processing successful build %s" % buildid) |
1386 | # Explode before collect a binary that is denied in this |
1387 | # distroseries/pocket |
1388 | - build = queueItem.build |
1389 | + build = getUtility(IBuildSet).getByQueueEntry(queueItem) |
1390 | if not build.archive.allowUpdatesToReleasePocket(): |
1391 | assert build.distroseries.canUploadToPocket(build.pocket), ( |
1392 | "%s (%s) can not be built for pocket %s: illegal status" |
1393 | @@ -309,8 +312,9 @@ |
1394 | # can be correctly found during the upload: |
1395 | # <archive_id>/distribution_name |
1396 | # for all destination archive types. |
1397 | - archive = queueItem.build.archive |
1398 | - distribution_name = queueItem.build.distribution.name |
1399 | + build = getUtility(IBuildSet).getByQueueEntry(queueItem) |
1400 | + archive = build.archive |
1401 | + distribution_name = build.distribution.name |
1402 | target_path = '%s/%s' % (archive.id, distribution_name) |
1403 | upload_path = os.path.join(upload_dir, target_path) |
1404 | os.makedirs(upload_path) |
1405 | @@ -330,10 +334,10 @@ |
1406 | # add extra arguments for processing a binary upload |
1407 | extra_args = [ |
1408 | "--log-file", "%s" % uploader_logfilename, |
1409 | - "-d", "%s" % queueItem.build.distribution.name, |
1410 | - "-s", "%s" % (queueItem.build.distroseries.name + |
1411 | - pocketsuffix[queueItem.build.pocket]), |
1412 | - "-b", "%s" % queueItem.build.id, |
1413 | + "-d", "%s" % build.distribution.name, |
1414 | + "-s", "%s" % (build.distroseries.name + |
1415 | + pocketsuffix[build.pocket]), |
1416 | + "-b", "%s" % build.id, |
1417 | "-J", "%s" % upload_leaf, |
1418 | "%s" % root, |
1419 | ] |
1420 | @@ -409,12 +413,11 @@ |
1421 | # uploader about this occurrence. The failure notification will |
1422 | # also contain the information required to manually reprocess the |
1423 | # binary upload when it was the case. |
1424 | - build = getUtility(IBuildSet).getByBuildID(queueItem.build.id) |
1425 | + build = getUtility(IBuildSet).getByQueueEntry(queueItem) |
1426 | if (build.buildstate != BuildStatus.FULLYBUILT or |
1427 | build.binarypackages.count() == 0): |
1428 | self.logger.debug("Build %s upload failed." % build.id) |
1429 | - # update builder |
1430 | - queueItem.build.buildstate = BuildStatus.FAILEDTOUPLOAD |
1431 | + build.buildstate = BuildStatus.FAILEDTOUPLOAD |
1432 | # Retrieve log file content. |
1433 | possible_locations = ( |
1434 | 'failed', 'failed-to-move', 'rejected', 'accepted') |
1435 | @@ -434,11 +437,13 @@ |
1436 | uploader_log_content = 'Could not find upload log file' |
1437 | # Store the upload_log_contents in librarian so it can be |
1438 | # accessed by anyone with permission to see the build. |
1439 | - queueItem.build.storeUploadLog(uploader_log_content) |
1440 | + build.storeUploadLog(uploader_log_content) |
1441 | # Notify the build failure. |
1442 | - queueItem.build.notify(extra_info=uploader_log_content) |
1443 | + build.notify(extra_info=uploader_log_content) |
1444 | else: |
1445 | - self.logger.debug("Gathered build %s completely" % queueItem.name) |
1446 | + self.logger.debug( |
1447 | + "Gathered build %s completely" % |
1448 | + build.sourcepackagerelease.name) |
1449 | |
1450 | # Release the builder for another job. |
1451 | queueItem.builder.cleanSlave() |
1452 | @@ -456,10 +461,11 @@ |
1453 | set the job status as FAILEDTOBUILD, store available info and |
1454 | remove Buildqueue entry. |
1455 | """ |
1456 | - queueItem.build.buildstate = BuildStatus.FAILEDTOBUILD |
1457 | + build = getUtility(IBuildSet).getByQueueEntry(queueItem) |
1458 | + build.buildstate = BuildStatus.FAILEDTOBUILD |
1459 | self.storeBuildInfo(queueItem, librarian, buildid, dependencies) |
1460 | queueItem.builder.cleanSlave() |
1461 | - queueItem.build.notify() |
1462 | + build.notify() |
1463 | queueItem.destroySelf() |
1464 | |
1465 | def buildStatus_DEPFAIL(self, queueItem, librarian, buildid, |
1466 | @@ -470,7 +476,8 @@ |
1467 | MANUALDEPWAIT, store available information, remove BuildQueue |
1468 | entry and release builder slave for another job. |
1469 | """ |
1470 | - queueItem.build.buildstate = BuildStatus.MANUALDEPWAIT |
1471 | + build = getUtility(IBuildSet).getByQueueEntry(queueItem) |
1472 | + build.buildstate = BuildStatus.MANUALDEPWAIT |
1473 | self.storeBuildInfo(queueItem, librarian, buildid, dependencies) |
1474 | self.logger.critical("***** %s is MANUALDEPWAIT *****" |
1475 | % queueItem.builder.name) |
1476 | @@ -485,12 +492,13 @@ |
1477 | job as CHROOTFAIL, store available information, remove BuildQueue |
1478 | and release the builder. |
1479 | """ |
1480 | - queueItem.build.buildstate = BuildStatus.CHROOTWAIT |
1481 | + build = getUtility(IBuildSet).getByQueueEntry(queueItem) |
1482 | + build.buildstate = BuildStatus.CHROOTWAIT |
1483 | self.storeBuildInfo(queueItem, librarian, buildid, dependencies) |
1484 | self.logger.critical("***** %s is CHROOTWAIT *****" % |
1485 | queueItem.builder.name) |
1486 | queueItem.builder.cleanSlave() |
1487 | - queueItem.build.notify() |
1488 | + build.notify() |
1489 | queueItem.destroySelf() |
1490 | |
1491 | def buildStatus_BUILDERFAIL(self, queueItem, librarian, buildid, |
1492 | @@ -507,10 +515,11 @@ |
1493 | ("Builder returned BUILDERFAIL when asked " |
1494 | "for its status")) |
1495 | # simply reset job |
1496 | - queueItem.build.buildstate = BuildStatus.NEEDSBUILD |
1497 | + build = getUtility(IBuildSet).getByQueueEntry(queueItem) |
1498 | + build.buildstate = BuildStatus.NEEDSBUILD |
1499 | self.storeBuildInfo(queueItem, librarian, buildid, dependencies) |
1500 | queueItem.builder = None |
1501 | - queueItem.buildstart = None |
1502 | + queueItem.setDateStarted(None) |
1503 | |
1504 | def buildStatus_GIVENBACK(self, queueItem, librarian, buildid, |
1505 | filemap=None, dependencies=None): |
1506 | @@ -522,7 +531,8 @@ |
1507 | """ |
1508 | self.logger.warning("***** %s is GIVENBACK by %s *****" |
1509 | % (buildid, queueItem.builder.name)) |
1510 | - queueItem.build.buildstate = BuildStatus.NEEDSBUILD |
1511 | + build = getUtility(IBuildSet).getByQueueEntry(queueItem) |
1512 | + build.buildstate = BuildStatus.NEEDSBUILD |
1513 | self.storeBuildInfo(queueItem, librarian, buildid, dependencies) |
1514 | # XXX cprov 2006-05-30: Currently this information is not |
1515 | # properly presented in the Web UI. We will discuss it in |
1516 | @@ -530,7 +540,7 @@ |
1517 | # to use this content. For now we just ensure it's stored. |
1518 | queueItem.builder.cleanSlave() |
1519 | queueItem.builder = None |
1520 | - queueItem.buildstart = None |
1521 | + queueItem.setDateStarted(None) |
1522 | queueItem.logtail = None |
1523 | queueItem.lastscore = 0 |
1524 | |
1525 | |
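The recurring change in `buildergroup.py` above replaces direct `queueItem.build` attribute access with a lookup through the `IBuildSet` utility (`getByQueueEntry`). The effect can be sketched with hypothetical stand-ins: the queue entry no longer needs a direct reference to its build, which is what lets the queue later carry other kinds of build farm jobs.

```python
class Build:
    # Hypothetical minimal build record.
    def __init__(self, build_id):
        self.id = build_id
        self.buildstate = 'NEEDSBUILD'


class BuildSet:
    """Hypothetical in-memory stand-in for the IBuildSet utility."""

    def __init__(self):
        self._by_queue_entry = {}

    def link(self, queue_item_id, build):
        self._by_queue_entry[queue_item_id] = build

    def getByQueueEntry(self, queue_item_id):
        # Resolve the build through the utility rather than via an
        # attribute on the queue item itself.
        return self._by_queue_entry[queue_item_id]
```

Callers then fetch the build once per handler (`build = build_set.getByQueueEntry(item)`) and work with it, as the diff does.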
1526 | === added directory 'lib/lp/buildmaster/interfaces' |
1527 | === added file 'lib/lp/buildmaster/interfaces/__init__.py' |
1528 | === added file 'lib/lp/buildmaster/interfaces/buildfarmjob.py' |
1529 | --- lib/lp/buildmaster/interfaces/buildfarmjob.py 1970-01-01 00:00:00 +0000 |
1530 | +++ lib/lp/buildmaster/interfaces/buildfarmjob.py 2009-11-16 23:27:14 +0000 |
1531 | @@ -0,0 +1,71 @@ |
1532 | +# Copyright 2009 Canonical Ltd. This software is licensed under the |
1533 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
1534 | + |
1535 | +# pylint: disable-msg=E0211,E0213 |
1536 | + |
1537 | +"""Interface for Soyuz build farm jobs.""" |
1538 | + |
1539 | +__metaclass__ = type |
1540 | + |
1541 | +__all__ = [ |
1542 | + 'IBuildFarmJob', |
1543 | + 'BuildFarmJobType', |
1544 | + ] |
1545 | + |
1546 | +from zope.interface import Interface |
1547 | +from lazr.enum import DBEnumeratedType, DBItem |
1548 | + |
1549 | + |
1550 | +class BuildFarmJobType(DBEnumeratedType): |
1551 | + """Soyuz build farm job type. |
1552 | + |
1553 | + An enumeration with the types of jobs that may be run on the Soyuz build |
1554 | + farm. |
1555 | + """ |
1556 | + |
1557 | + PACKAGEBUILD = DBItem(1, """ |
1558 | + PackageBuildJob |
1559 | + |
1560 | + Build a source package. |
1561 | + """) |
1562 | + |
1563 | + BRANCHBUILD = DBItem(2, """ |
1564 | + BranchBuildJob |
1565 | + |
1566 | + Build a package from a bazaar branch. |
1567 | + """) |
1568 | + |
1569 | + RECIPEBRANCHBUILD = DBItem(3, """ |
1570 | + RecipeBranchBuildJob |
1571 | + |
1572 | + Build a package from a bazaar branch and a recipe. |
1573 | + """) |
1574 | + |
1575 | + TRANSLATION = DBItem(4, """ |
1576 | + TranslationJob |
1577 | + |
1578 | + Perform a translation job. |
1579 | + """) |
1580 | + |
1581 | + |
1582 | +class IBuildFarmJob(Interface): |
1583 | + """Operations that Soyuz build farm jobs must implement.""" |
1584 | + |
1585 | + def score(): |
1586 | + """Calculate a job score appropriate for the job type in question.""" |
1587 | + |
1588 | + def getLogFileName(): |
1589 | + """The preferred file name for the log of this Soyuz job.""" |
1590 | + |
1591 | + def getName(): |
1592 | + """An appropriate name for this Soyuz job.""" |
1593 | + |
1594 | + def jobStarted(): |
1595 | + """'Job started' life cycle event, handle as appropriate.""" |
1596 | + |
1597 | + def jobReset(): |
1598 | + """'Job reset' life cycle event, handle as appropriate.""" |
1599 | + |
1600 | + def jobAborted(): |
1601 | + """'Job aborted' life cycle event, handle as appropriate.""" |
1602 | + |
1603 | |
1604 | === modified file 'lib/lp/buildmaster/master.py' |
1605 | --- lib/lp/buildmaster/master.py 2009-10-26 18:40:04 +0000 |
1606 | +++ lib/lp/buildmaster/master.py 2009-11-16 23:27:14 +0000 |
1607 | @@ -280,8 +280,10 @@ |
1608 | "scanActiveBuilders() found %d active build(s) to check" |
1609 | % queueItems.count()) |
1610 | |
1611 | + build_set = getUtility(IBuildSet) |
1612 | for job in queueItems: |
1613 | - proc = job.archseries.processorfamily |
1614 | + build = build_set.getByQueueEntry(job) |
1615 | + proc = build.distroarchseries.processorfamily |
1616 | try: |
1617 | builders = notes[proc]["builders"] |
1618 | except KeyError: |
1619 | @@ -309,7 +311,7 @@ |
1620 | % candidates.count()) |
1621 | |
1622 | for job in candidates: |
1623 | - uptodate_build = getUtility(IBuildSet).getByBuildID(job.build.id) |
1624 | + uptodate_build = getUtility(IBuildSet).getByQueueEntry(job) |
1625 | if uptodate_build.buildstate != BuildStatus.NEEDSBUILD: |
1626 | continue |
1627 | job.score() |
1628 | |
1629 | === added directory 'lib/lp/buildmaster/model' |
1630 | === added file 'lib/lp/buildmaster/model/__init__.py' |
1631 | === added file 'lib/lp/buildmaster/model/buildfarmjob.py' |
1632 | --- lib/lp/buildmaster/model/buildfarmjob.py 1970-01-01 00:00:00 +0000 |
1633 | +++ lib/lp/buildmaster/model/buildfarmjob.py 2009-11-16 23:27:14 +0000 |
1634 | @@ -0,0 +1,40 @@ |
1635 | +# Copyright 2009 Canonical Ltd. This software is licensed under the |
1636 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
1637 | + |
1638 | +__metaclass__ = type |
1639 | +__all__ = ['BuildFarmJob'] |
1640 | + |
1641 | + |
1642 | +from zope.interface import implements |
1643 | + |
1644 | +from lp.buildmaster.interfaces.buildfarmjob import IBuildFarmJob |
1645 | + |
1646 | + |
1647 | +class BuildFarmJob: |
1648 | + """Mix-in class for `IBuildFarmJob` implementations.""" |
1649 | + implements(IBuildFarmJob) |
1650 | + |
1651 | + def score(self): |
1652 | + """See `IBuildFarmJob`.""" |
1653 | + raise NotImplementedError |
1654 | + |
1655 | + def getLogFileName(self): |
1656 | + """See `IBuildFarmJob`.""" |
1657 | + raise NotImplementedError |
1658 | + |
1659 | + def getName(self): |
1660 | + """See `IBuildFarmJob`.""" |
1661 | + raise NotImplementedError |
1662 | + |
1663 | + def jobStarted(self): |
1664 | + """See `IBuildFarmJob`.""" |
1665 | + pass |
1666 | + |
1667 | + def jobReset(self): |
1668 | + """See `IBuildFarmJob`.""" |
1669 | + pass |
1670 | + |
1671 | + def jobAborted(self): |
1672 | + """See `IBuildFarmJob`.""" |
1673 | + pass |
1674 | + |
1675 | |
1676 | === modified file 'lib/lp/buildmaster/tests/queuebuilder.txt' |
1677 | --- lib/lp/buildmaster/tests/queuebuilder.txt 2009-10-26 18:40:04 +0000 |
1678 | +++ lib/lp/buildmaster/tests/queuebuilder.txt 2009-11-16 23:27:14 +0000 |
1679 | @@ -229,7 +229,7 @@ |
1680 | >>> copied_pub = pub_failed.copyTo( |
1681 | ... hoary, PackagePublishingPocket.RELEASE, warty.main_archive) |
1682 | |
1683 | - >>> from lp.soyuz.interfaces.build import BuildStatus |
1684 | + >>> from lp.soyuz.interfaces.build import BuildStatus, IBuildSet |
1685 | >>> failed_build = pub_failed.sourcepackagerelease.createBuild( |
1686 | ... warty['i386'], PackagePublishingPocket.RELEASE, |
1687 | ... warty.main_archive, status=BuildStatus.FAILEDTOBUILD) |
1688 | @@ -343,7 +343,8 @@ |
1689 | happen in parallel with build creation. |
1690 | |
1691 | >>> build_queue = active_jobs[0] |
1692 | - >>> print build_queue.build.title |
1693 | + >>> build = getUtility(IBuildSet).getByQueueEntry(build_queue) |
1694 | + >>> print build.title |
1695 | i386 build of test-buildd 667 in ubuntu hoary RELEASE |
1696 | >>> build_queue.lastscore |
1697 | 2505 |
1698 | @@ -351,15 +352,15 @@ |
1699 | Check the published component name retriever, they might be different, |
1700 | i.e., the published component can be different than the original component. |
1701 | |
1702 | - >>> print build_queue.build.current_component.name |
1703 | + >>> print build.current_component.name |
1704 | main |
1705 | - >>> print build_queue.build.sourcepackagerelease.component.name |
1706 | + >>> print build.sourcepackagerelease.component.name |
1707 | main |
1708 | |
1709 | Missing BuildQueue records, resulting from given-back builds, are |
1710 | created in the last stage of the queue-builder script. |
1711 | |
1712 | - >>> given_back_build = build_queue.build |
1713 | + >>> given_back_build = getUtility(IBuildSet).getByQueueEntry(build_queue) |
1714 | >>> build_queue.destroySelf() |
1715 | >>> flush_database_updates() |
1716 | |
1717 | |
1718 | === modified file 'lib/lp/buildmaster/tests/test_manager.py' |
1719 | --- lib/lp/buildmaster/tests/test_manager.py 2009-09-07 13:02:02 +0000 |
1720 | +++ lib/lp/buildmaster/tests/test_manager.py 2009-11-16 23:27:14 +0000 |
1721 | @@ -24,7 +24,7 @@ |
1722 | from lp.buildmaster.tests.harness import BuilddManagerTestSetup |
1723 | from canonical.launchpad.ftests import ANONYMOUS, login |
1724 | from lp.soyuz.tests.soyuzbuilddhelpers import SaneBuildingSlave |
1725 | -from lp.soyuz.interfaces.build import BuildStatus |
1726 | +from lp.soyuz.interfaces.build import BuildStatus, IBuildSet |
1727 | from lp.soyuz.interfaces.builder import IBuilderSet |
1728 | from lp.soyuz.interfaces.buildqueue import IBuildQueueSet |
1729 | from lp.registry.interfaces.distribution import IDistributionSet |
1730 | @@ -494,8 +494,9 @@ |
1731 | |
1732 | self.assertTrue(job is not None) |
1733 | self.assertEqual(job.builder, builder) |
1734 | - self.assertTrue(job.buildstart is not None) |
1735 | - self.assertEqual(job.build.buildstate, BuildStatus.BUILDING) |
1736 | + self.assertTrue(job.date_started is not None) |
1737 | + build = getUtility(IBuildSet).getByQueueEntry(job) |
1738 | + self.assertEqual(build.buildstate, BuildStatus.BUILDING) |
1739 | self.assertEqual(job.logtail, logtail) |
1740 | |
1741 | def _getManager(self): |
1742 | @@ -617,8 +618,9 @@ |
1743 | |
1744 | job = getUtility(IBuildQueueSet).get(job.id) |
1745 | self.assertTrue(job.builder is None) |
1746 | - self.assertTrue(job.buildstart is None) |
1747 | - self.assertEqual(job.build.buildstate, BuildStatus.NEEDSBUILD) |
1748 | + self.assertTrue(job.date_started is None) |
1749 | + build = getUtility(IBuildSet).getByQueueEntry(job) |
1750 | + self.assertEqual(build.buildstate, BuildStatus.NEEDSBUILD) |
1751 | |
1752 | def testScanRescuesJobFromBrokenBuilder(self): |
1753 | # The job assigned to a broken builder is rescued. |
1754 | @@ -701,13 +703,14 @@ |
1755 | builder.builderok = True |
1756 | |
1757 | job = builder.currentjob |
1758 | + build = getUtility(IBuildSet).getByQueueEntry(job) |
1759 | self.assertEqual( |
1760 | 'i386 build of mozilla-firefox 0.9 in ubuntu hoary RELEASE', |
1761 | - job.build.title) |
1762 | + build.title) |
1763 | |
1764 | - self.assertEqual('BUILDING', job.build.buildstate.name) |
1765 | + self.assertEqual('BUILDING', build.buildstate.name) |
1766 | self.assertNotEqual(None, job.builder) |
1767 | - self.assertNotEqual(None, job.buildstart) |
1768 | + self.assertNotEqual(None, job.date_started) |
1769 | self.assertNotEqual(None, job.logtail) |
1770 | |
1771 | transaction.commit() |
1772 | @@ -717,9 +720,10 @@ |
1773 | def assertJobIsClean(self, job_id): |
1774 | """Re-fetch the `IBuildQueue` record and check if it's clean.""" |
1775 | job = getUtility(IBuildQueueSet).get(job_id) |
1776 | - self.assertEqual('NEEDSBUILD', job.build.buildstate.name) |
1777 | + build = getUtility(IBuildSet).getByQueueEntry(job) |
1778 | + self.assertEqual('NEEDSBUILD', build.buildstate.name) |
1779 | self.assertEqual(None, job.builder) |
1780 | - self.assertEqual(None, job.buildstart) |
1781 | + self.assertEqual(None, job.date_started) |
1782 | self.assertEqual(None, job.logtail) |
1783 | |
1784 | def testResetDispatchResult(self): |
1785 | |
1786 | === modified file 'lib/lp/registry/interfaces/distroseries.py' |
1787 | --- lib/lp/registry/interfaces/distroseries.py 2009-11-07 16:14:22 +0000 |
1788 | +++ lib/lp/registry/interfaces/distroseries.py 2009-11-16 23:27:14 +0000 |
1789 | @@ -851,6 +851,12 @@ |
1790 | :return: A string. |
1791 | """ |
1792 | |
1793 | + def isSourcePackageFormatPermitted(format): |
1794 | + """Check if the specified source format is allowed in this series. |
1795 | + |
1796 | + :param format: The SourcePackageFormat to check. |
1797 | + """ |
1798 | + |
1799 | |
1800 | class IDistroSeries(IDistroSeriesEditRestricted, IDistroSeriesPublic, |
1801 | IStructuralSubscriptionTarget): |
1802 | |
1803 | === modified file 'lib/lp/registry/interfaces/sourcepackage.py' |
1804 | --- lib/lp/registry/interfaces/sourcepackage.py 2009-11-15 23:23:12 +0000 |
1805 | +++ lib/lp/registry/interfaces/sourcepackage.py 2009-11-16 23:27:14 +0000 |
1806 | @@ -284,7 +284,7 @@ |
1807 | which in turn lists the orig.tar.gz and diff.tar.gz files used to |
1808 | make up the package. """) |
1809 | |
1810 | - ORIG = DBItem(4, """ |
1811 | + ORIG_TARBALL = DBItem(4, """ |
1812 | Orig Tarball |
1813 | |
1814 | This file is an Ubuntu "orig" file, typically an upstream tarball or |
1815 | @@ -298,8 +298,8 @@ |
1816 | diff creates additional directories with patches and documentation |
1817 | used to build the binary packages for Ubuntu. """) |
1818 | |
1819 | - TARBALL = DBItem(6, """ |
1820 | - Tarball |
1821 | + NATIVE_TARBALL = DBItem(6, """ |
1822 | + Native Tarball |
1823 | |
1824 | This is a tarball, usually of a mixture of Ubuntu and upstream code, |
1825 | used in the build process for this source package. """) |
1826 | |
1827 | === modified file 'lib/lp/registry/model/distroseries.py' |
1828 | --- lib/lp/registry/model/distroseries.py 2009-11-07 16:14:22 +0000 |
1829 | +++ lib/lp/registry/model/distroseries.py 2009-11-16 23:27:14 +0000 |
1830 | @@ -118,6 +118,8 @@ |
1831 | from canonical.launchpad.webapp.interfaces import ( |
1832 | IStoreSelector, MAIN_STORE, NotFoundError, SLAVE_FLAVOR, |
1833 | TranslationUnavailable) |
1834 | +from lp.soyuz.interfaces.sourcepackageformat import ( |
1835 | + ISourcePackageFormatSelectionSet) |
1836 | |
1837 | |
1838 | class SeriesMixin: |
1839 | @@ -1496,7 +1498,7 @@ |
1840 | cur = cursor() |
1841 | |
1842 | # Perform the copies |
1843 | - self._copy_component_and_section_selections(cur) |
1844 | + self._copy_component_section_and_format_selections(cur) |
1845 | |
1846 | # Prepare the list of distroarchseries for which binary packages |
1847 | # shall be copied. |
1848 | @@ -1557,9 +1559,9 @@ |
1849 | PackagePublishingPocket.RELEASE) |
1850 | clone_packages(origin, destination, distroarchseries_list) |
1851 | |
1852 | - def _copy_component_and_section_selections(self, cur): |
1853 | - """Copy the section and component selections from the parent distro |
1854 | - series into this one. |
1855 | + def _copy_component_section_and_format_selections(self, cur): |
1856 | + """Copy the section, component and format selections from the parent |
1857 | + distro series into this one. |
1858 | """ |
1859 | # Copy the component selections |
1860 | cur.execute(''' |
1861 | @@ -1573,6 +1575,13 @@ |
1862 | SELECT %s as distroseries, ss.section AS section |
1863 | FROM SectionSelection AS ss WHERE ss.distroseries = %s |
1864 | ''' % sqlvalues(self.id, self.parent_series.id)) |
1865 | + # Copy the source format selections |
1866 | + cur.execute(''' |
1867 | + INSERT INTO SourcePackageFormatSelection (distroseries, format) |
1868 | + SELECT %s as distroseries, spfs.format AS format |
1869 | + FROM SourcePackageFormatSelection AS spfs |
1870 | + WHERE spfs.distroseries = %s |
1871 | + ''' % sqlvalues(self.id, self.parent_series.id)) |
1872 | |
1873 | def copyTranslationsFromParent(self, transaction, logger=None): |
1874 | """See `IDistroSeries`.""" |
1875 | @@ -1739,6 +1748,10 @@ |
1876 | else: |
1877 | return '%s%s' % (self.name, pocketsuffix[pocket]) |
1878 | |
1879 | + def isSourcePackageFormatPermitted(self, format): |
1880 | + return getUtility(ISourcePackageFormatSelectionSet |
1881 | + ).getBySeriesAndFormat(self, format) is not None |
1882 | + |
1883 | |
1884 | class DistroSeriesSet: |
1885 | implements(IDistroSeriesSet) |
1886 | |
1887 | === modified file 'lib/lp/registry/model/sourcepackage.py' |
1888 | --- lib/lp/registry/model/sourcepackage.py 2009-11-15 23:23:12 +0000 |
1889 | +++ lib/lp/registry/model/sourcepackage.py 2009-11-16 23:27:14 +0000 |
1890 | @@ -554,8 +554,10 @@ |
1891 | # It should present the builds in a more natural order. |
1892 | if build_state in [BuildStatus.NEEDSBUILD, BuildStatus.BUILDING]: |
1893 | orderBy = ["-BuildQueue.lastscore"] |
1894 | + clauseTables.append('BuildPackageJob') |
1895 | + condition_clauses.append('BuildPackageJob.build = Build.id') |
1896 | clauseTables.append('BuildQueue') |
1897 | - condition_clauses.append('BuildQueue.build = Build.id') |
1898 | + condition_clauses.append('BuildQueue.job = BuildPackageJob.job') |
1899 | elif build_state == BuildStatus.SUPERSEDED or build_state is None: |
1900 | orderBy = ["-Build.datecreated"] |
1901 | else: |
1902 | |
1903 | === modified file 'lib/lp/services/job/tests/test_job.py' |
1904 | --- lib/lp/services/job/tests/test_job.py 2009-07-17 00:26:05 +0000 |
1905 | +++ lib/lp/services/job/tests/test_job.py 2009-11-16 23:27:14 +0000 |
1906 | @@ -8,6 +8,8 @@ |
1907 | from unittest import TestLoader |
1908 | |
1909 | import pytz |
1910 | +from zope.component import getUtility |
1911 | + |
1912 | from canonical.database.constants import UTC_NOW |
1913 | from canonical.testing import LaunchpadZopelessLayer |
1914 | from storm.locals import Store |
1915 | @@ -17,6 +19,8 @@ |
1916 | from lp.services.job.interfaces.job import IJob, JobStatus |
1917 | from lp.testing import TestCase |
1918 | from canonical.launchpad.webapp.testing import verifyObject |
1919 | +from canonical.launchpad.webapp.interfaces import ( |
1920 | + IStoreSelector, MAIN_STORE, DEFAULT_FLAVOR) |
1921 | |
1922 | |
1923 | class TestJob(TestCase): |
1924 | @@ -155,40 +159,53 @@ |
1925 | |
1926 | layer = LaunchpadZopelessLayer |
1927 | |
1928 | + def _sampleData(self): |
1929 | + store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
1930 | + return list(store.execute(Job.ready_jobs)) |
1931 | + |
1932 | def test_ready_jobs(self): |
1933 | """Job.ready_jobs should include new jobs.""" |
1934 | + preexisting = self._sampleData() |
1935 | job = Job() |
1936 | self.assertEqual( |
1937 | - [(job.id,)], list(Store.of(job).execute(Job.ready_jobs))) |
1938 | + preexisting + [(job.id,)], |
1939 | + list(Store.of(job).execute(Job.ready_jobs))) |
1940 | |
1941 | def test_ready_jobs_started(self): |
1942 | """Job.ready_jobs should not include jobs that have been started.""" |
1943 | + preexisting = self._sampleData() |
1944 | job = Job(_status=JobStatus.RUNNING) |
1945 | self.assertEqual( |
1946 | - [], list(Store.of(job).execute(Job.ready_jobs))) |
1947 | + preexisting, list(Store.of(job).execute(Job.ready_jobs))) |
1948 | |
1949 | def test_ready_jobs_lease_expired(self): |
1950 | """Job.ready_jobs should include jobs with expired leases.""" |
1951 | + preexisting = self._sampleData() |
1952 | UNIX_EPOCH = datetime.fromtimestamp(0, pytz.timezone('UTC')) |
1953 | job = Job(lease_expires=UNIX_EPOCH) |
1954 | self.assertEqual( |
1955 | - [(job.id,)], list(Store.of(job).execute(Job.ready_jobs))) |
1956 | + preexisting + [(job.id,)], |
1957 | + list(Store.of(job).execute(Job.ready_jobs))) |
1958 | |
1959 | def test_ready_jobs_lease_in_future(self): |
1960 | """Job.ready_jobs should not include jobs with active leases.""" |
1961 | + preexisting = self._sampleData() |
1962 | future = datetime.fromtimestamp( |
1963 | time.time() + 1000, pytz.timezone('UTC')) |
1964 | job = Job(lease_expires=future) |
1965 | - self.assertEqual([], list(Store.of(job).execute(Job.ready_jobs))) |
1966 | + self.assertEqual( |
1967 | + preexisting, list(Store.of(job).execute(Job.ready_jobs))) |
1968 | |
1969 | def test_ready_jobs_not_jobs_scheduled_in_future(self): |
1970 | """Job.ready_jobs does not include jobs scheduled for a time in the |
1971 | future. |
1972 | """ |
1973 | + preexisting = self._sampleData() |
1974 | future = datetime.fromtimestamp( |
1975 | time.time() + 1000, pytz.timezone('UTC')) |
1976 | job = Job(scheduled_start=future) |
1977 | - self.assertEqual([], list(Store.of(job).execute(Job.ready_jobs))) |
1978 | + self.assertEqual( |
1979 | + preexisting, list(Store.of(job).execute(Job.ready_jobs))) |
1980 | |
1981 | def test_acquireLease(self): |
1982 | """Job.acquireLease should set job.lease_expires.""" |
1983 | |
1984 | === modified file 'lib/lp/soyuz/browser/build.py' |
1985 | --- lib/lp/soyuz/browser/build.py 2009-10-26 18:40:04 +0000 |
1986 | +++ lib/lp/soyuz/browser/build.py 2009-11-16 23:27:14 +0000 |
1987 | @@ -288,10 +288,10 @@ |
1988 | prefetched_data = dict() |
1989 | build_ids = [build.id for build in builds] |
1990 | results = getUtility(IBuildQueueSet).getForBuilds(build_ids) |
1991 | - for (buildqueue, builder) in results: |
1992 | + for (buildqueue, _builder, build_job) in results: |
1993 | # Get the build's id, 'buildqueue', 'sourcepackagerelease' and |
1994 | # 'buildlog' (from the result set) respectively. |
1995 | - prefetched_data[buildqueue.build.id] = buildqueue |
1996 | + prefetched_data[build_job.build.id] = buildqueue |
1997 | |
1998 | complete_builds = [] |
1999 | for build in builds: |
2000 | |
2001 | === modified file 'lib/lp/soyuz/browser/builder.py' |
2002 | --- lib/lp/soyuz/browser/builder.py 2009-09-17 14:45:15 +0000 |
2003 | +++ lib/lp/soyuz/browser/builder.py 2009-11-16 23:27:14 +0000 |
2004 | @@ -237,12 +237,12 @@ |
2005 | def current_build_duration(self): |
2006 | """Return the delta representing the duration of the current job.""" |
2007 | if (self.context.currentjob is None or |
2008 | - self.context.currentjob.buildstart is None): |
2009 | + self.context.currentjob.date_started is None): |
2010 | return None |
2011 | else: |
2012 | UTC = pytz.timezone('UTC') |
2013 | - buildstart = self.context.currentjob.buildstart |
2014 | - return datetime.datetime.now(UTC) - buildstart |
2015 | + date_started = self.context.currentjob.date_started |
2016 | + return datetime.datetime.now(UTC) - date_started |
2017 | |
2018 | @property |
2019 | def page_title(self): |
2020 | |
2021 | === modified file 'lib/lp/soyuz/browser/tests/builder-views.txt' |
2022 | --- lib/lp/soyuz/browser/tests/builder-views.txt 2009-09-16 19:06:48 +0000 |
2023 | +++ lib/lp/soyuz/browser/tests/builder-views.txt 2009-11-16 23:27:14 +0000 |
2024 | @@ -1,7 +1,7 @@ |
2025 | = Builder View Classes and Pages = |
2026 | |
2027 | >>> from zope.component import getMultiAdapter, getUtility |
2028 | - >>> from canonical.launchpad.interfaces import IBuilderSet |
2029 | + >>> from canonical.launchpad.interfaces import IBuildSet, IBuilderSet |
2030 | >>> from canonical.launchpad.webapp.servers import LaunchpadTestRequest |
2031 | |
2032 | >>> builder = getUtility(IBuilderSet).get(1) |
2033 | @@ -158,7 +158,8 @@ |
2034 | >>> frog = getUtility(IBuilderSet)['frog'] |
2035 | >>> frog.builderok = True |
2036 | >>> private_build.builder = frog |
2037 | - >>> private_job = BuildQueue(build=private_build, builder=frog) |
2038 | + >>> private_job = private_build.createBuildQueueEntry() |
2039 | + >>> private_job.builder = frog |
2040 | >>> private_job_id = private_job.id |
2041 | |
2042 | >>> from canonical.database.sqlbase import flush_database_caches |
2043 | @@ -175,7 +176,9 @@ |
2044 | >>> print frog.builderok |
2045 | True |
2046 | |
2047 | - >>> print frog.currentjob.build.title |
2048 | + >>> build_set = getUtility(IBuildSet) |
2049 | + >>> build = build_set.getByQueueEntry(frog.currentjob) |
2050 | + >>> print build.title |
2051 | i386 build of privacy-test 666 in ubuntutest breezy-autotest RELEASE |
2052 | |
2053 | >>> print frog.failnotes |
2054 | @@ -199,7 +202,8 @@ |
2055 | >>> print admin_view.context.builderok |
2056 | True |
2057 | |
2058 | - >>> print admin_view.context.currentjob.build.title |
2059 | + >>> build = build_set.getByQueueEntry(admin_view.context.currentjob) |
2060 | + >>> print build.title |
2061 | i386 build of privacy-test 666 in ubuntutest breezy-autotest RELEASE |
2062 | |
2063 | >>> print admin_view.context.failnotes |
2064 | @@ -211,7 +215,7 @@ |
2065 | |
2066 | >>> import datetime |
2067 | >>> import pytz |
2068 | - >>> private_job.buildstart = ( |
2069 | + >>> private_job.setDateStarted( |
2070 | ... datetime.datetime.now(pytz.UTC) - datetime.timedelta(10)) |
2071 | >>> print admin_view.current_build_duration |
2072 | 10 days... |
2073 | |
2074 | === modified file 'lib/lp/soyuz/configure.zcml' |
2075 | --- lib/lp/soyuz/configure.zcml 2009-11-09 17:59:18 +0000 |
2076 | +++ lib/lp/soyuz/configure.zcml 2009-11-16 23:27:14 +0000 |
2077 | @@ -560,7 +560,7 @@ |
2078 | |
2079 | <require |
2080 | permission="zope.Public" |
2081 | - set_attributes="lastscore builder buildstart logtail"/> |
2082 | + set_attributes="lastscore builder logtail date_started"/> |
2083 | </class> |
2084 | |
2085 | <!-- BuildQueueSet --> |
2086 | @@ -791,6 +791,28 @@ |
2087 | interface="lp.soyuz.interfaces.section.ISectionSet"/> |
2088 | </securedutility> |
2089 | |
2090 | + <!-- SourcePackageFormatSelection --> |
2091 | + |
2092 | + <class |
2093 | + class="lp.soyuz.model.sourcepackageformat.SourcePackageFormatSelection"> |
2094 | + <allow |
2095 | + interface="lp.soyuz.interfaces.sourcepackageformat.ISourcePackageFormatSelection"/> |
2096 | + </class> |
2097 | + |
2098 | + <!-- SourcePackageFormatSelectionSet --> |
2099 | + |
2100 | + <class |
2101 | + class="lp.soyuz.model.sourcepackageformat.SourcePackageFormatSelectionSet"> |
2102 | + <allow |
2103 | + interface="lp.soyuz.interfaces.sourcepackageformat.ISourcePackageFormatSelectionSet"/> |
2104 | + </class> |
2105 | + <securedutility |
2106 | + class="lp.soyuz.model.sourcepackageformat.SourcePackageFormatSelectionSet" |
2107 | + provides="lp.soyuz.interfaces.sourcepackageformat.ISourcePackageFormatSelectionSet"> |
2108 | + <allow |
2109 | + interface="lp.soyuz.interfaces.sourcepackageformat.ISourcePackageFormatSelectionSet"/> |
2110 | + </securedutility> |
2111 | + |
2112 | <!-- SourcePackageReleaseFile --> |
2113 | |
2114 | <class |
2115 | @@ -863,5 +885,11 @@ |
2116 | interface="lp.soyuz.interfaces.packagesetgroup.IPackagesetGroup"/> |
2117 | </class> |
2118 | |
2119 | + <!-- BuildPackageJob --> |
2120 | + <class |
2121 | + class="lp.soyuz.model.buildpackagejob.BuildPackageJob"> |
2122 | + <allow |
2123 | + interface="lp.soyuz.interfaces.buildpackagejob.IBuildPackageJob"/> |
2124 | + </class> |
2125 | |
2126 | </configure> |
2127 | |
2128 | === modified file 'lib/lp/soyuz/doc/archive.txt' |
2129 | --- lib/lp/soyuz/doc/archive.txt 2009-11-09 13:01:13 +0000 |
2130 | +++ lib/lp/soyuz/doc/archive.txt 2009-11-16 23:27:14 +0000 |
2131 | @@ -850,7 +850,7 @@ |
2132 | |
2133 | >>> print_published_files(cprov_archive) |
2134 | cdrkit - 1.0: foobar-1.0.dsc (DSC, 716 bytes) |
2135 | - iceweasel - 1.0: firefox_0.9.2.orig.tar.gz (ORIG, 9922560 bytes) |
2136 | + iceweasel - 1.0: firefox_0.9.2.orig.tar.gz (ORIG_TARBALL, 9922560 bytes) |
2137 | iceweasel - 1.0: iceweasel-1.0.dsc (DSC, 123 bytes) |
2138 | |
2139 | Now we will emulate a duplicated reference to the same 'orig.tar.gz', |
2140 | @@ -867,9 +867,9 @@ |
2141 | 'firefox_0.9.2.orig.tar.gz' file. |
2142 | |
2143 | >>> print_published_files(cprov_archive) |
2144 | - cdrkit - 1.0: firefox_0.9.2.orig.tar.gz (ORIG, 9922560 bytes) |
2145 | + cdrkit - 1.0: firefox_0.9.2.orig.tar.gz (ORIG_TARBALL, 9922560 bytes) |
2146 | cdrkit - 1.0: foobar-1.0.dsc (DSC, 716 bytes) |
2147 | - iceweasel - 1.0: firefox_0.9.2.orig.tar.gz (ORIG, 9922560 bytes) |
2148 | + iceweasel - 1.0: firefox_0.9.2.orig.tar.gz (ORIG_TARBALL, 9922560 bytes) |
2149 | iceweasel - 1.0: iceweasel-1.0.dsc (DSC, 123 bytes) |
2150 | |
2151 | Similarly to what happens in the archive disk 'pool', where already |
2152 | |
2153 | === modified file 'lib/lp/soyuz/doc/build-estimated-dispatch-time.txt' |
2154 | --- lib/lp/soyuz/doc/build-estimated-dispatch-time.txt 2009-08-28 07:34:44 +0000 |
2155 | +++ lib/lp/soyuz/doc/build-estimated-dispatch-time.txt 2009-11-16 23:27:14 +0000 |
2156 | @@ -59,7 +59,8 @@ |
2157 | >>> UTC = pytz.timezone('UTC') |
2158 | >>> bob_the_builder = builder_set.get(1) |
2159 | >>> cur_bqueue = bob_the_builder.currentjob |
2160 | - >>> cur_build = cur_bqueue.build |
2161 | + >>> from lp.soyuz.interfaces.build import IBuildSet |
2162 | + >>> cur_build = getUtility(IBuildSet).getByQueueEntry(cur_bqueue) |
2163 | |
2164 | Make sure the job at hand is currently being built. |
2165 | |
2166 | @@ -73,16 +74,16 @@ |
2167 | of job N in the build queue. These values will now be set for the job |
2168 | that is currently building. |
2169 | |
2170 | + >>> from zope.security.proxy import removeSecurityProxy |
2171 | >>> cur_bqueue.lastscore = 1111 |
2172 | - >>> cur_bqueue.buildstart = datetime(2008, 4, 1, 10, 45, 39, |
2173 | - ... tzinfo=UTC) |
2174 | - >>> print cur_bqueue.buildstart |
2175 | + >>> cur_bqueue.setDateStarted( |
2176 | + ... datetime(2008, 4, 1, 10, 45, 39, tzinfo=UTC)) |
2177 | + >>> print cur_bqueue.date_started |
2178 | 2008-04-01 10:45:39+00:00 |
2179 | |
2180 | Please note that the "estimated build duration" is an internal property |
2181 | and not meant to be viewed or modified by an end user. |
2182 | |
2183 | - >>> from zope.security.proxy import removeSecurityProxy |
2184 | >>> naked_build = removeSecurityProxy(cur_build) |
2185 | >>> naked_build.estimated_build_duration = timedelta(minutes=56) |
2186 | |
2187 | |
2188 | === modified file 'lib/lp/soyuz/doc/buildd-dispatching.txt' |
2189 | --- lib/lp/soyuz/doc/buildd-dispatching.txt 2009-10-14 08:20:40 +0000 |
2190 | +++ lib/lp/soyuz/doc/buildd-dispatching.txt 2009-11-16 23:27:14 +0000 |
2191 | @@ -134,18 +134,20 @@ |
2192 | |
2193 | >>> job.id |
2194 | 2 |
2195 | - >>> job.build.buildstate.name |
2196 | + >>> from lp.soyuz.interfaces.build import IBuildSet |
2197 | + >>> build = getUtility(IBuildSet).getByQueueEntry(job) |
2198 | + >>> build.buildstate.name |
2199 | 'NEEDSBUILD' |
2200 | >>> job.builder is None |
2201 | True |
2202 | - >>> job.buildstart is None |
2203 | + >>> job.date_started is None |
2204 | True |
2205 | - >>> job.is_virtualized |
2206 | + >>> build.is_virtualized |
2207 | False |
2208 | |
2209 | The build start time is not set yet either. |
2210 | |
2211 | - >>> print job.build.date_first_dispatched |
2212 | + >>> print build.date_first_dispatched |
2213 | None |
2214 | |
2215 | Update the SourcePackageReleaseFile corresponding to this job: |
2216 | @@ -154,7 +156,7 @@ |
2217 | >>> alias_id = librarian_client.addFile( |
2218 | ... 'foo.dsc', len(content), StringIO(content), 'application/dsc') |
2219 | |
2220 | - >>> sprf = job.build.sourcepackagerelease.files[0] |
2221 | + >>> sprf = build.sourcepackagerelease.files[0] |
2222 | >>> from zope.security.proxy import removeSecurityProxy |
2223 | >>> naked_sprf = removeSecurityProxy(sprf) |
2224 | >>> naked_sprf.libraryfile = getUtility(ILibraryFileAliasSet)[alias_id] |
2225 | @@ -167,35 +169,20 @@ |
2226 | |
2227 | Verify if the job (BuildQueue) was updated appropriately: |
2228 | |
2229 | - >>> def checkTimes(expected, actual): |
2230 | - ... if expected != actual: |
2231 | - ... return "expected: %s, actual: %s" % (expected, actual) |
2232 | - ... else: |
2233 | - ... return "OK" |
2234 | - |
2235 | >>> job.builder.id == bob_builder.id |
2236 | True |
2237 | |
2238 | - >>> job.build.buildstate.name |
2239 | + >>> build = getUtility(IBuildSet).getByQueueEntry(job) |
2240 | + >>> build.buildstate.name |
2241 | 'BUILDING' |
2242 | |
2243 | - >>> from canonical.database.sqlbase import get_transaction_timestamp |
2244 | - >>> checkTimes(get_transaction_timestamp(), job.buildstart) |
2245 | - 'OK' |
2246 | - |
2247 | -The build start time will be set to the same value. |
2248 | - |
2249 | - >>> checkTimes(get_transaction_timestamp(), |
2250 | - ... job.build.date_first_dispatched) |
2251 | - 'OK' |
2252 | - |
2253 | Shutdown builder, mark the build record as failed and remove the |
2254 | buildqueue record, so the build was eliminated: |
2255 | |
2256 | >>> BuilddSlaveTestSetup().tearDown() |
2257 | |
2258 | >>> from lp.soyuz.interfaces.build import BuildStatus |
2259 | - >>> job.build.buildstate = BuildStatus.FAILEDTOBUILD |
2260 | + >>> build.buildstate = BuildStatus.FAILEDTOBUILD |
2261 | >>> job.destroySelf() |
2262 | >>> flush_database_updates() |
2263 | |
2264 | @@ -217,12 +204,13 @@ |
2265 | 3 |
2266 | >>> ppa_job.builder == None |
2267 | True |
2268 | - >>> ppa_job.buildstart == None |
2269 | + >>> ppa_job.date_started == None |
2270 | True |
2271 | |
2272 | The build job's archive requires virtualized builds. |
2273 | |
2274 | - >>> ppa_job.build.archive.require_virtualized |
2275 | + >>> build = getUtility(IBuildSet).getByQueueEntry(ppa_job) |
2276 | + >>> build.archive.require_virtualized |
2277 | True |
2278 | |
2279 | But the builder is not virtualized. |
2280 | @@ -249,10 +237,10 @@ |
2281 | >>> from lp.soyuz.model.publishing import ( |
2282 | ... SourcePackagePublishingHistory) |
2283 | >>> [old_pub] = SourcePackagePublishingHistory.selectBy( |
2284 | - ... distroseries=ppa_job.build.distroseries, |
2285 | - ... sourcepackagerelease=ppa_job.build.sourcepackagerelease) |
2286 | + ... distroseries=build.distroseries, |
2287 | + ... sourcepackagerelease=build.sourcepackagerelease) |
2288 | >>> new_pub = old_pub.copyTo( |
2289 | - ... old_pub.distroseries, old_pub.pocket, ppa_job.build.archive) |
2290 | + ... old_pub.distroseries, old_pub.pocket, build.archive) |
2291 | |
2292 | >>> bob_builder.virtualized = True |
2293 | >>> syncUpdate(bob_builder) |
2294 | @@ -293,19 +281,16 @@ |
2295 | >>> ppa_job.builder.name |
2296 | u'bob' |
2297 | |
2298 | - >>> ppa_job.build.buildstate.name |
2299 | + >>> build.buildstate.name |
2300 | 'BUILDING' |
2301 | |
2302 | - >>> ppa_job.buildstart == get_transaction_timestamp() |
2303 | - True |
2304 | - |
2305 | Shutdown builder slave, mark the ppa build record as failed, remove the |
2306 | buildqueue record and make 'bob' builder non-virtual again, so the |
2307 | environment is back to the initial state. |
2308 | |
2309 | >>> BuilddSlaveTestSetup().tearDown() |
2310 | |
2311 | - >>> ppa_job.build.buildstate = BuildStatus.FAILEDTOBUILD |
2312 | + >>> build.buildstate = BuildStatus.FAILEDTOBUILD |
2313 | >>> ppa_job.destroySelf() |
2314 | >>> bob_builder.virtualized = False |
2315 | >>> flush_database_updates() |
2316 | @@ -332,9 +317,9 @@ |
2317 | 4 |
2318 | >>> print sec_job.builder |
2319 | None |
2320 | - >>> print sec_job.buildstart |
2321 | + >>> print sec_job.date_started |
2322 | None |
2323 | - >>> sec_job.is_virtualized |
2324 | + >>> sec_build.is_virtualized |
2325 | False |
2326 | |
2327 | In normal conditions the next available candidate would be the job |
2328 | |
2329 | === modified file 'lib/lp/soyuz/doc/buildd-scoring.txt' |
2330 | --- lib/lp/soyuz/doc/buildd-scoring.txt 2009-08-30 23:57:41 +0000 |
2331 | +++ lib/lp/soyuz/doc/buildd-scoring.txt 2009-11-16 23:27:14 +0000 |
2332 | @@ -49,7 +49,7 @@ |
2333 | >>> def setUpBuildQueueEntry( |
2334 | ... component_name='main', urgency=SourcePackageUrgency.HIGH, |
2335 | ... pocket=PackagePublishingPocket.RELEASE, |
2336 | - ... date_created=LOCAL_NOW, manual=False): |
2337 | + ... date_created=LOCAL_NOW, manual=False, archive=None): |
2338 | ... global version |
2339 | ... commit() |
2340 | ... LaunchpadZopelessLayer.switchDbUser('launchpad') |
2341 | @@ -57,7 +57,7 @@ |
2342 | ... sourcename='test-build', version=str(version), |
2343 | ... distroseries=hoary, component=component_name, |
2344 | ... urgency=urgency, pocket=pocket, |
2345 | - ... status=PackagePublishingStatus.PUBLISHED) |
2346 | + ... status=PackagePublishingStatus.PUBLISHED, archive=archive) |
2347 | ... commit() |
2348 | ... LaunchpadZopelessLayer.switchDbUser(test_dbuser) |
2349 | ... version += 1 |
2350 | @@ -65,7 +65,7 @@ |
2351 | ... hoary386, pub.pocket, pub.archive) |
2352 | ... |
2353 | ... build_queue = build.createBuildQueueEntry() |
2354 | - ... build_queue.created = date_created |
2355 | + ... build_queue.job.date_created = date_created |
2356 | ... build_queue.manual = manual |
2357 | ... |
2358 | ... return build_queue |
2359 | @@ -86,8 +86,10 @@ |
2360 | |
2361 | >>> commit() |
2362 | >>> LaunchpadZopelessLayer.switchDbUser('launchpad') |
2363 | - >>> bq0.build.archive.buildd_secret = "secret" |
2364 | - >>> bq0.build.archive.private = True |
2365 | + >>> from lp.soyuz.interfaces.build import IBuildSet |
2366 | + >>> build = getUtility(IBuildSet).getByQueueEntry(bq0) |
2367 | + >>> build.archive.buildd_secret = "secret" |
2368 | + >>> build.archive.private = True |
2369 | >>> bq0.score() |
2370 | >>> bq0.lastscore |
2371 | 12515 |
2372 | @@ -96,19 +98,19 @@ |
2373 | IArchive.relative_build_score to boost by 100 changes the lastscore value |
2374 | appropriately. |
2375 | |
2376 | - >>> bq0.build.archive.relative_build_score = 100 |
2377 | + >>> build.archive.relative_build_score = 100 |
2378 | >>> bq0.score() |
2379 | >>> bq0.lastscore |
2380 | 12615 |
2381 | |
2382 | The delta can also be negative. |
2383 | |
2384 | - >>> bq0.build.archive.relative_build_score = -100 |
2385 | + >>> build.archive.relative_build_score = -100 |
2386 | >>> bq0.score() |
2387 | >>> bq0.lastscore |
2388 | 12415 |
2389 | |
2390 | - >>> bq0.build.archive.relative_build_score = 0 |
2391 | + >>> build.archive.relative_build_score = 0 |
2392 | >>> LaunchpadZopelessLayer.switchDbUser(test_dbuser) |
2393 | |
2394 | |
2395 | @@ -250,9 +252,15 @@ |
2396 | they all have a fixed score of -10. They will get built in the order |
2397 | they were created. |
2398 | |
2399 | - >>> from canonical.launchpad.interfaces import ArchivePurpose |
2400 | - >>> bqc = setUpBuildQueueEntry() |
2401 | - >>> bqc.build.archive.purpose = ArchivePurpose.COPY |
2402 | + >>> LaunchpadZopelessLayer.switchDbUser('launchpad') |
2403 | + >>> from canonical.launchpad.interfaces import ( |
2404 | + ... ArchivePurpose, IArchiveSet) |
2405 | + >>> copy = getUtility(IArchiveSet).new( |
2406 | + ... owner=ubuntu.owner, purpose=ArchivePurpose.COPY, |
2407 | + ... name='test-rebuild') |
2408 | + |
2409 | + >>> bqc = setUpBuildQueueEntry(archive=copy) |
2410 | + >>> build = getUtility(IBuildSet).getByQueueEntry(bqc) |
2411 | >>> bqc.score() |
2412 | >>> bqc.lastscore |
2413 | -10 |
2414 | |
2415 | === modified file 'lib/lp/soyuz/doc/buildd-slavescanner.txt' |
2416 | --- lib/lp/soyuz/doc/buildd-slavescanner.txt 2009-11-05 10:51:36 +0000 |
2417 | +++ lib/lp/soyuz/doc/buildd-slavescanner.txt 2009-11-16 23:27:14 +0000 |
2418 | @@ -188,10 +188,11 @@ |
2419 | To make testing easier we provide a convenience function to put a BuildQueue |
2420 | object into a preset fixed state: |
2421 | |
2422 | + >>> from zope.security.proxy import removeSecurityProxy |
2423 | >>> default_start = datetime.datetime(2005, 1, 1, 8, 0, 0, tzinfo=UTC) |
2424 | >>> def setupBuildQueue(build_queue, builder): |
2425 | ... build_queue.builder = builder |
2426 | - ... build_queue.buildstart = default_start |
2427 | + ... build_queue.setDateStarted(default_start) |
2428 | |
2429 | Remove any previous buildmaster ROOT directory, to avoid any garbage |
2430 | lock conflict (it would be recreated automatically if necessary) |
2431 | @@ -216,7 +217,7 @@ |
2432 | >>> from canonical.launchpad.ftests import syncUpdate |
2433 | >>> if a_builder.currentjob is not None: |
2434 | ... currentjob = a_builder.currentjob |
2435 | - ... currentjob.buildstart = None |
2436 | + ... currentjob.setDateStarted(None) |
2437 | ... currentjob.builder = None |
2438 | ... syncUpdate(currentjob) |
2439 | |
2440 | @@ -236,17 +237,18 @@ |
2441 | Do the test execution: |
2442 | |
2443 | >>> buildergroup.updateBuild(bqItem3) |
2444 | - >>> bqItem3.build.builder is not None |
2445 | - True |
2446 | - >>> bqItem3.build.datebuilt is not None |
2447 | - True |
2448 | - >>> bqItem3.build.buildduration is not None |
2449 | - True |
2450 | - >>> bqItem3.build.buildlog is not None |
2451 | + >>> build = getUtility(IBuildSet).getByQueueEntry(bqItem3) |
2452 | + >>> build.builder is not None |
2453 | + True |
2454 | + >>> build.datebuilt is not None |
2455 | + True |
2456 | + >>> build.buildduration is not None |
2457 | + True |
2458 | + >>> build.buildlog is not None |
2459 | True |
2460 | >>> check_mail_sent(last_stub_mail_count) |
2461 | True |
2462 | - >>> bqItem3.build.buildstate.title |
2463 | + >>> build.buildstate.title |
2464 | 'Failed to build' |
2465 | |
2466 | Cleanup in preparation for the next test: |
2467 | @@ -270,19 +272,20 @@ |
2468 | |
2469 | >>> buildergroup.updateBuild(bqItem4) |
2470 | CRITICAL:root:***** bob is MANUALDEPWAIT ***** |
2471 | - >>> bqItem4.build.builder is not None |
2472 | - True |
2473 | - >>> bqItem4.build.datebuilt is not None |
2474 | - True |
2475 | - >>> bqItem4.build.buildduration is not None |
2476 | - True |
2477 | - >>> bqItem4.build.buildlog is not None |
2478 | + >>> build = getUtility(IBuildSet).getByQueueEntry(bqItem4) |
2479 | + >>> build.builder is not None |
2480 | + True |
2481 | + >>> build.datebuilt is not None |
2482 | + True |
2483 | + >>> build.buildduration is not None |
2484 | + True |
2485 | + >>> build.buildlog is not None |
2486 | True |
2487 | >>> check_mail_sent(last_stub_mail_count) |
2488 | False |
2489 | - >>> bqItem4.build.dependencies |
2490 | + >>> build.dependencies |
2491 | u'baz (>= 1.0.1)' |
2492 | - >>> bqItem4.build.buildstate.title |
2493 | + >>> build.buildstate.title |
2494 | 'Dependency wait' |
2495 | |
2496 | Cleanup in preparation for the next test: |
2497 | @@ -302,17 +305,18 @@ |
2498 | ... WaitingSlave('BuildStatus.CHROOTFAIL')) |
2499 | >>> buildergroup.updateBuild(bqItem5) |
2500 | CRITICAL:root:***** bob is CHROOTWAIT ***** |
2501 | - >>> bqItem5.build.builder is not None |
2502 | - True |
2503 | - >>> bqItem5.build.datebuilt is not None |
2504 | - True |
2505 | - >>> bqItem5.build.buildduration is not None |
2506 | - True |
2507 | - >>> bqItem5.build.buildlog is not None |
2508 | + >>> build = getUtility(IBuildSet).getByQueueEntry(bqItem5) |
2509 | + >>> build.builder is not None |
2510 | + True |
2511 | + >>> build.datebuilt is not None |
2512 | + True |
2513 | + >>> build.buildduration is not None |
2514 | + True |
2515 | + >>> build.buildlog is not None |
2516 | True |
2517 | >>> check_mail_sent(last_stub_mail_count) |
2518 | True |
2519 | - >>> bqItem5.build.buildstate.title |
2520 | + >>> build.buildstate.title |
2521 | 'Chroot problem' |
2522 | |
2523 | Cleanup in preparation for the next test: |
2524 | @@ -343,7 +347,8 @@ |
2525 | True |
2526 | >>> check_mail_sent(last_stub_mail_count) |
2527 | False |
2528 | - >>> bqItem6.build.buildstate.title |
2529 | + >>> build = getUtility(IBuildSet).getByQueueEntry(bqItem6) |
2530 | + >>> build.buildstate.title |
2531 | 'Needs building' |
2532 | |
2533 | Cleanup in preparation for the next test: |
2534 | @@ -384,6 +389,8 @@ |
2535 | >>> setupBuildQueue(bqItem8, a_builder) |
2536 | >>> last_stub_mail_count = len(stub.test_emails) |
2537 | |
2538 | + >>> bqItem8.builder.setSlaveForTesting(BuildingSlave()) |
2539 | + >>> buildergroup.updateBuild(bqItem8) |
2540 | >>> bqItem8.builder.setSlaveForTesting(AbortedSlave()) |
2541 | >>> bqItem8.builder.name |
2542 | u'bob' |
2543 | @@ -445,17 +452,18 @@ |
2544 | FAILEDTOUPLOAD: |
2545 | |
2546 | >>> buildergroup.updateBuild(bqItem10) |
2547 | - >>> bqItem10.build.builder is not None |
2548 | - True |
2549 | - >>> bqItem10.build.datebuilt is not None |
2550 | - True |
2551 | - >>> bqItem10.build.buildduration is not None |
2552 | - True |
2553 | - >>> bqItem10.build.buildlog is not None |
2554 | + >>> build = getUtility(IBuildSet).getByQueueEntry(bqItem10) |
2555 | + >>> build.builder is not None |
2556 | + True |
2557 | + >>> build.datebuilt is not None |
2558 | + True |
2559 | + >>> build.buildduration is not None |
2560 | + True |
2561 | + >>> build.buildlog is not None |
2562 | True |
2563 | >>> check_mail_sent(last_stub_mail_count) |
2564 | True |
2565 | - >>> bqItem10.build.buildstate.title |
2566 | + >>> build.buildstate.title |
2567 | 'Failed to upload' |
2568 | |
2569 | Let's check the emails generated by this 'failure' |
2570 | @@ -493,7 +501,7 @@ |
2571 | output is both emailed in an immediate notification, and stored in the |
2572 | librarian for future reference. |
2573 | |
2574 | - >>> bqItem10.build.upload_log is not None |
2575 | + >>> build.upload_log is not None |
2576 | True |
2577 | |
2578 | What we can clearly notice is that the buildlog is still containing |
2579 | @@ -514,16 +522,16 @@ |
2580 | |
2581 | >>> bqItem10 = getUtility(IBuildSet).getByBuildID( |
2582 | ... 6).createBuildQueueEntry() |
2583 | + >>> build = getUtility(IBuildSet).getByQueueEntry(bqItem10) |
2584 | |
2585 | XXX: The pocket attribute is not intended to be changed in regular code, but |
2586 | for this test we want to change it on the fly. An alternative would be to add |
2587 | new sample data for a build that can be uploaded with binary packages attached |
2588 | to it. |
2589 | |
2590 | - >>> from zope.security.proxy import removeSecurityProxy |
2591 | >>> from lp.registry.interfaces.pocket import PackagePublishingPocket |
2592 | >>> removeSecurityProxy( |
2593 | - ... bqItem10.build).pocket = PackagePublishingPocket.UPDATES |
2594 | + ... build).pocket = PackagePublishingPocket.UPDATES |
2595 | >>> setupBuildQueue(bqItem10, a_builder) |
2596 | >>> last_stub_mail_count = len(stub.test_emails) |
2597 | |
2598 | @@ -535,22 +543,22 @@ |
2599 | the build record to FULLYBUILT, as the process-upload would do: |
2600 | |
2601 | >>> from canonical.launchpad.interfaces import BuildStatus |
2602 | - >>> bqItem10.build.buildstate = BuildStatus.FULLYBUILT |
2603 | + >>> build.buildstate = BuildStatus.FULLYBUILT |
2604 | |
2605 | Now the updateBuild should recognize this build record as a |
2606 | Successfully built and uploaded procedure, not sending any |
2607 | notification and updating the build information: |
2608 | |
2609 | >>> buildergroup.updateBuild(bqItem10) |
2610 | - >>> bqItem10.build.builder is not None |
2611 | - True |
2612 | - >>> bqItem10.build.datebuilt is not None |
2613 | - True |
2614 | - >>> bqItem10.build.buildduration is not None |
2615 | - True |
2616 | - >>> bqItem10.build.buildlog is not None |
2617 | - True |
2618 | - >>> bqItem10.build.buildstate.title |
2619 | + >>> build.builder is not None |
2620 | + True |
2621 | + >>> build.datebuilt is not None |
2622 | + True |
2623 | + >>> build.buildduration is not None |
2624 | + True |
2625 | + >>> build.buildlog is not None |
2626 | + True |
2627 | + >>> build.buildstate.title |
2628 | 'Successfully built' |
2629 | >>> check_mail_sent(last_stub_mail_count) |
2630 | False |
2631 | @@ -558,7 +566,7 @@ |
2632 | We do not store any build log information when the binary upload |
2633 | processing succeeded. |
2634 | |
2635 | - >>> bqItem10.build.upload_log is None |
2636 | + >>> build.upload_log is None |
2637 | True |
2638 | |
2639 | Cleanup in preparation for the next test: |
2640 | @@ -585,13 +593,14 @@ |
2641 | |
2642 | >>> bqItem11.builder is None |
2643 | True |
2644 | - >>> bqItem11.buildstart is None |
2645 | + >>> bqItem11.date_started is None |
2646 | True |
2647 | >>> bqItem11.lastscore |
2648 | 0 |
2649 | >>> check_mail_sent(last_stub_mail_count) |
2650 | False |
2651 | - >>> bqItem11.build.buildstate.title |
2652 | + >>> build = getUtility(IBuildSet).getByQueueEntry(bqItem11) |
2653 | + >>> build.buildstate.title |
2654 | 'Needs building' |
2655 | |
2656 | Cleanup in preparation for the next test: |
2657 | @@ -790,11 +799,11 @@ |
2658 | tests. |
2659 | |
2660 | >>> current_job = a_builder.currentjob |
2661 | - >>> resurrect_build = current_job.build |
2662 | + >>> resurrect_build = getUtility(IBuildSet).getByQueueEntry(current_job) |
2663 | >>> resurrect_build.buildstate = BuildStatus.NEEDSBUILD |
2664 | >>> syncUpdate(resurrect_build) |
2665 | >>> current_job.builder = None |
2666 | - >>> current_job.buildstart = None |
2667 | + >>> current_job.setDateStarted(None) |
2668 | >>> syncUpdate(current_job) |
2669 | |
2670 | IBuilder.findCandidate also identifies if there are builds for |
2671 | @@ -802,7 +811,8 @@ |
2672 | corresponding build record as SUPERSEDED. |
2673 | |
2674 | >>> old_candidate = a_builder.findBuildCandidate() |
2675 | - >>> print old_candidate.build.buildstate.name |
2676 | + >>> build = getUtility(IBuildSet).getByQueueEntry(old_candidate) |
2677 | + >>> print build.buildstate.name |
2678 | NEEDSBUILD |
2679 | |
2680 | The 'candidate' is constant until we dispatch it. |
2681 | @@ -814,7 +824,7 @@ |
2682 | Now let's disable the archive of the associated build record and see |
2683 | whether the candidate will still be found. |
2684 | |
2685 | - >>> old_candidate.build.archive.enabled = False |
2686 | + >>> build.archive.enabled = False |
2687 | >>> new_candidate = a_builder.findBuildCandidate() |
2688 | >>> new_candidate is None |
2689 | True |
2690 | @@ -823,7 +833,7 @@ |
2691 | archives are ignored. Now let's re-enable that archive and the build |
2692 | candidate will be found again. |
2693 | |
2694 | - >>> old_candidate.build.archive.enabled = True |
2695 | + >>> build.archive.enabled = True |
2696 | >>> new_candidate = a_builder.findBuildCandidate() |
2697 | >>> new_candidate.id == old_candidate.id |
2698 | True |
2699 | @@ -836,9 +846,9 @@ |
2700 | >>> from canonical.launchpad.interfaces import PackagePublishingStatus |
2701 | >>> from canonical.testing.layers import LaunchpadZopelessLayer |
2702 | |
2703 | - >>> spr = old_candidate.build.sourcepackagerelease |
2704 | + >>> spr = build.sourcepackagerelease |
2705 | >>> secure_pub = removeSecurityProxy( |
2706 | - ... old_candidate).build.current_source_publication.secure_record |
2707 | + ... build).current_source_publication.secure_record |
2708 | >>> commit() |
2709 | >>> LaunchpadZopelessLayer.switchDbUser('launchpad') |
2710 | >>> secure_pub.status = PackagePublishingStatus.SUPERSEDED |
2711 | @@ -854,7 +864,7 @@ |
2712 | Because the 'previous' candidate was marked as superseded, so it's not |
2713 | part of the candidates list anymore. |
2714 | |
2715 | - >>> print old_candidate.build.buildstate.name |
2716 | + >>> print build.buildstate.name |
2717 | SUPERSEDED |
2718 | |
2719 | If the candidate is for a private build whose source has not been |
2720 | @@ -862,9 +872,10 @@ |
2721 | published. We need to tweak the status of the publishing record again |
2722 | to demonstrate this, and also make the archive private: |
2723 | |
2724 | - >>> source = new_candidate.build.sourcepackagerelease |
2725 | + >>> build = getUtility(IBuildSet).getByQueueEntry(new_candidate) |
2726 | + >>> source = build.sourcepackagerelease |
2727 | >>> secure_pub = removeSecurityProxy( |
2728 | - ... new_candidate).build.current_source_publication.secure_record |
2729 | + ... build).current_source_publication.secure_record |
2730 | >>> commit() |
2731 | >>> LaunchpadZopelessLayer.switchDbUser('launchpad') |
2732 | >>> secure_pub.status = PackagePublishingStatus.PENDING |
2733 | @@ -903,22 +914,23 @@ |
2734 | |
2735 | >>> LaunchpadZopelessLayer.switchDbUser('launchpad') |
2736 | >>> secure_pub = removeSecurityProxy( |
2737 | - ... new_candidate).build.current_source_publication.secure_record |
2738 | + ... build).current_source_publication.secure_record |
2739 | >>> secure_pub.status = PackagePublishingStatus.DELETED |
2740 | >>> secure_pub = removeSecurityProxy( |
2741 | - ... new_candidate).build.current_source_publication.secure_record |
2742 | + ... build).current_source_publication.secure_record |
2743 | >>> secure_pub.status = PackagePublishingStatus.SUPERSEDED |
2744 | >>> commit() |
2745 | >>> LaunchpadZopelessLayer.switchDbUser(config.builddmaster.dbuser) |
2746 | |
2747 | - >>> print current_job.build.buildstate.name |
2748 | + >>> build = getUtility(IBuildSet).getByQueueEntry(current_job) |
2749 | + >>> print build.buildstate.name |
2750 | NEEDSBUILD |
2751 | |
2752 | >>> another_candidate = a_builder.findBuildCandidate() |
2753 | >>> print another_candidate |
2754 | None |
2755 | |
2756 | - >>> print current_job.build.buildstate.name |
2757 | + >>> print build.buildstate.name |
2758 | SUPERSEDED |
2759 | |
2760 | We'll reset the archive back to non-private for further tests: |
2761 | @@ -1147,7 +1159,8 @@ |
2762 | >>> cprov_archive.private = True |
2763 | >>> cprov_archive.buildd_secret = "secret" |
2764 | >>> cprov_archive.require_virtualized = False |
2765 | - >>> for build_file in candidate.files: |
2766 | + >>> build = getUtility(IBuildSet).getByQueueEntry(candidate) |
2767 | + >>> for build_file in build.sourcepackagerelease.files: |
2768 | ... removeSecurityProxy(build_file).libraryfile.restricted = True |
2769 | >>> commit() |
2770 | >>> LaunchpadZopelessLayer.switchDbUser(test_dbuser) |
2771 | @@ -1169,7 +1182,8 @@ |
2772 | archive and not the one from the PPA, which on the absence of ancestry |
2773 | defaults to 'universe'. |
2774 | |
2775 | - >>> print candidate.build.current_component.name |
2776 | + >>> build = getUtility(IBuildSet).getByQueueEntry(candidate) |
2777 | + >>> print build.current_component.name |
2778 | main |
2779 | |
2780 | This is so that the mangling tools will run over the built packages. |
2781 | @@ -1196,7 +1210,7 @@ |
2782 | We will create an ancestry in the primary archive target to the 'main' |
2783 | component and this time the dispatching will follow that component. |
2784 | |
2785 | - >>> sourcename = candidate.build.sourcepackagerelease.name |
2786 | + >>> sourcename = build.sourcepackagerelease.name |
2787 | |
2788 | >>> LaunchpadZopelessLayer.switchDbUser('launchpad') |
2789 | >>> login('foo.bar@canonical.com') |
2790 | @@ -1227,14 +1241,14 @@ |
2791 | |
2792 | >>> candidate = a_build.createBuildQueueEntry() |
2793 | >>> setupBuildQueue(candidate, a_builder) |
2794 | - >>> candidate.build.upload_log = None |
2795 | + >>> build.upload_log = None |
2796 | >>> candidate.builder.setSlaveForTesting(WaitingSlave('BuildStatus.OK')) |
2797 | >>> buildergroup.updateBuild(candidate) |
2798 | |
2799 | - >>> candidate.build.archive.private |
2800 | + >>> build.archive.private |
2801 | True |
2802 | |
2803 | - >>> lfa = candidate.build.buildlog |
2804 | + >>> lfa = build.buildlog |
2805 | >>> lfa.restricted |
2806 | True |
2807 | >>> print lfa.filename |
2808 | @@ -1269,7 +1283,8 @@ |
2809 | |
2810 | >>> cprov_archive.private = False |
2811 | >>> cprov_archive.require_virtualized = True |
2812 | - >>> for build_file in candidate.files: |
2813 | + >>> build = getUtility(IBuildSet).getByQueueEntry(candidate) |
2814 | + >>> for build_file in build.sourcepackagerelease.files: |
2815 | ... removeSecurityProxy(build_file).libraryfile.restricted = False |
2816 | >>> mark_archive = getUtility(IPersonSet).getByName('mark').archive |
2817 | |
2818 | @@ -1388,7 +1403,8 @@ |
2819 | >>> a_builder.currentjob.destroySelf() |
2820 | |
2821 | >>> bqItem3 = a_build.createBuildQueueEntry() |
2822 | - >>> removeSecurityProxy(bqItem3.build).pocket = ( |
2823 | + >>> build = getUtility(IBuildSet).getByQueueEntry(bqItem3) |
2824 | + >>> removeSecurityProxy(build).pocket = ( |
2825 | ... PackagePublishingPocket.UPDATES) |
2826 | >>> last_stub_mail_count = len(stub.test_emails) |
2827 | >>> a_builder.dispatchBuildCandidate(bqItem3) |
2828 | @@ -1410,7 +1426,7 @@ |
2829 | >>> a_builder.currentjob.destroySelf() |
2830 | |
2831 | >>> bqItem3 = a_build.createBuildQueueEntry() |
2832 | - >>> removeSecurityProxy(bqItem3.build).pocket = ( |
2833 | + >>> removeSecurityProxy(build).pocket = ( |
2834 | ... PackagePublishingPocket.PROPOSED) |
2835 | >>> last_stub_mail_count = len(stub.test_emails) |
2836 | >>> a_builder.dispatchBuildCandidate(bqItem3) |
2837 | @@ -1433,7 +1449,7 @@ |
2838 | >>> a_builder.currentjob.destroySelf() |
2839 | |
2840 | >>> bqItem3 = a_build.createBuildQueueEntry() |
2841 | - >>> removeSecurityProxy(bqItem3.build).pocket = ( |
2842 | + >>> removeSecurityProxy(build).pocket = ( |
2843 | ... PackagePublishingPocket.BACKPORTS) |
2844 | >>> last_stub_mail_count = len(stub.test_emails) |
2845 | >>> a_builder.dispatchBuildCandidate(bqItem3) |
2846 | @@ -1456,9 +1472,9 @@ |
2847 | >>> a_builder.currentjob.destroySelf() |
2848 | |
2849 | >>> bqItem3 = a_build.createBuildQueueEntry() |
2850 | - >>> removeSecurityProxy(bqItem3.build).buildstate = ( |
2851 | + >>> removeSecurityProxy(build).buildstate = ( |
2852 | ... BuildStatus.NEEDSBUILD) |
2853 | - >>> removeSecurityProxy(bqItem3.build).pocket = ( |
2854 | + >>> removeSecurityProxy(build).pocket = ( |
2855 | ... PackagePublishingPocket.SECURITY) |
2856 | >>> last_stub_mail_count = len(stub.test_emails) |
2857 | |
2858 | |
2859 | === modified file 'lib/lp/soyuz/doc/builder.txt' |
2860 | --- lib/lp/soyuz/doc/builder.txt 2009-08-27 19:09:44 +0000 |
2861 | +++ lib/lp/soyuz/doc/builder.txt 2009-11-16 23:27:14 +0000 |
2862 | @@ -45,8 +45,10 @@ |
2863 | |
2864 | >>> from lp.soyuz.interfaces.archive import ArchivePurpose |
2865 | >>> from zope.security.proxy import removeSecurityProxy |
2866 | + >>> from lp.soyuz.interfaces.build import IBuildSet |
2867 | + >>> build = getUtility(IBuildSet).getByQueueEntry(builder.currentjob) |
2868 | >>> builder_archive = removeSecurityProxy( |
2869 | - ... builder.currentjob.build.archive) |
2870 | + ... build.archive) |
2871 | >>> saved_purpose = builder_archive.purpose |
2872 | >>> builder_archive.purpose = ArchivePurpose.COPY |
2873 | |
2874 | |
2875 | === modified file 'lib/lp/soyuz/doc/buildqueue.txt' |
2876 | --- lib/lp/soyuz/doc/buildqueue.txt 2009-08-28 07:34:44 +0000 |
2877 | +++ lib/lp/soyuz/doc/buildqueue.txt 2009-11-16 23:27:14 +0000 |
2878 | @@ -31,19 +31,21 @@ |
2879 | The IBuild record related to this job is provided by the 'build' |
2880 | attribute: |
2881 | |
2882 | - >>> bq.build.id |
2883 | + >>> from lp.soyuz.interfaces.build import IBuildSet |
2884 | + >>> build = getUtility(IBuildSet).getByQueueEntry(bq) |
2885 | + >>> build.id |
2886 | 8 |
2887 | - >>> bq.build.buildstate.name |
2888 | + >>> build.buildstate.name |
2889 | 'BUILDING' |
2890 | |
2891 | The static timestamps, representing when the record was initialised |
2892 | (inserted) and when the job was dispatched are provided as datetime |
2893 | instances: |
2894 | |
2895 | - >>> bq.created |
2896 | + >>> bq.job.date_created |
2897 | datetime.datetime(2005, 6, 15, 9, 14, 12, 820778, tzinfo=<UTC>) |
2898 | |
2899 | - >>> bq.buildstart |
2900 | + >>> bq.date_started |
2901 | datetime.datetime(2005, 6, 15, 9, 20, 12, 820778, tzinfo=<UTC>) |
2902 | |
2903 | Check Builder foreign key, which indicated which builder 'is processing' |
2904 | @@ -77,29 +79,6 @@ |
2905 | >>> bq.manual |
2906 | False |
2907 | |
2908 | -BuildQueue provides a property which calculates the partial duration |
2909 | -of the build procedure (NOW - buildstart), it's mainly used in the UI. |
2910 | - |
2911 | - >>> bq.buildduration |
2912 | - datetime.timedelta(...) |
2913 | - |
2914 | -Some local properties inherited from related content classes: |
2915 | - |
2916 | - >>> bq.archseries.id == bq.build.distroarchseries.id |
2917 | - True |
2918 | - >>> bq.urgency == bq.build.sourcepackagerelease.urgency |
2919 | - True |
2920 | - >>> bq.archhintlist == bq.build.sourcepackagerelease.architecturehintlist |
2921 | - True |
2922 | - >>> bq.name == bq.build.sourcepackagerelease.name |
2923 | - True |
2924 | - >>> bq.version == bq.build.sourcepackagerelease.version |
2925 | - True |
2926 | - >>> bq.files.count() == bq.build.sourcepackagerelease.files.count() |
2927 | - True |
2928 | - >>> bq.builddependsindep == bq.build.sourcepackagerelease.builddependsindep |
2929 | - True |
2930 | - |
2931 | BuildQueue provides the name for the logfile resulting from the build: |
2932 | |
2933 | >>> bq.getLogFileName() |
2934 | @@ -131,11 +110,12 @@ |
2935 | |
2936 | >>> print job.builder.name |
2937 | bob |
2938 | - >>> job.buildstart is not None |
2939 | + >>> job.date_started is not None |
2940 | True |
2941 | >>> print job.logtail |
2942 | Dummy sampledata entry, not processing |
2943 | - >>> print job.build.buildstate.name |
2944 | + >>> build = getUtility(IBuildSet).getByQueueEntry(job) |
2945 | + >>> print build.buildstate.name |
2946 | BUILDING |
2947 | >>> print job.lastscore |
2948 | 1 |
2949 | @@ -150,11 +130,11 @@ |
2950 | |
2951 | >>> print job.builder |
2952 | None |
2953 | - >>> print job.buildstart |
2954 | + >>> print job.date_started |
2955 | None |
2956 | >>> print job.logtail |
2957 | None |
2958 | - >>> print job.build.buildstate.name |
2959 | + >>> print build.buildstate.name |
2960 | NEEDSBUILD |
2961 | >>> print job.lastscore |
2962 | 1 |
2963 | @@ -169,9 +149,9 @@ |
2964 | |
2965 | >>> print job.builder.name |
2966 | bob |
2967 | - >>> job.buildstart is not None |
2968 | + >>> job.date_started is not None |
2969 | True |
2970 | - >>> print job.build.buildstate.name |
2971 | + >>> print build.buildstate.name |
2972 | BUILDING |
2973 | |
2974 | |
2975 | @@ -266,7 +246,7 @@ |
2976 | and restricted |
2977 | |
2978 | >>> for bq in bqset.calculateCandidates(archseries): |
2979 | - ... build = bq.build |
2980 | + ... build = getUtility(IBuildSet).getByQueueEntry(bq) |
2981 | ... print "%s (%s, %d)" % (build.title, bq.lastscore, bq.id) |
2982 | hppa build of pmount 0.1-2 in ubuntu hoary RELEASE (1500, 4) |
2983 | i386 build of alsa-utils 1.0.9a-4ubuntu1 in ubuntu hoary RELEASE (1000, 2) |
2984 | @@ -298,7 +278,7 @@ |
2985 | as intended. |
2986 | |
2987 | >>> for bq in bqset.calculateCandidates(archseries): |
2988 | - ... build = bq.build |
2989 | + ... build = getUtility(IBuildSet).getByQueueEntry(bq) |
2990 | ... print "%s (%s, %d)" % (build.title, bq.lastscore, bq.id) |
2991 | hppa build of pmount 0.1-2 in ubuntu hoary RELEASE (1500, 4) |
2992 | i386 build of alsa-utils 1.0.9a-4ubuntu1 in ubuntu hoary RELEASE (1000, 2) |
2993 | @@ -310,7 +290,7 @@ |
2994 | |
2995 | >>> archseries = [hoary['hppa']] |
2996 | >>> for bq in bqset.calculateCandidates(archseries): |
2997 | - ... build = bq.build |
2998 | + ... build = getUtility(IBuildSet).getByQueueEntry(bq) |
2999 | ... print "%s (%s, %d)" % (build.title, bq.lastscore, bq.id) |
3000 | hppa build of pmount 0.1-2 in ubuntu hoary RELEASE (1500, 4) |
3001 | hppa build of alsa-utils 1.0.9a-4 in ubuntu hoary RELEASE (500, 3) |
3002 | |
3003 | === modified file 'lib/lp/soyuz/doc/initialise-from-parent.txt' |
3004 | --- lib/lp/soyuz/doc/initialise-from-parent.txt 2009-10-26 18:40:04 +0000 |
3005 | +++ lib/lp/soyuz/doc/initialise-from-parent.txt 2009-11-16 23:27:14 +0000 |
3006 | @@ -172,3 +172,10 @@ |
3007 | >>> pmount_source.sourcepackagerelease.getBuildByArch( |
3008 | ... foobuntu['hppa'], ubuntu.main_archive) is None |
3009 | True |
3010 | + |
3011 | +initialiseFromParent also copies the permitted source formats from the |
3012 | +parent series. |
3013 | + |
3014 | + >>> from lp.soyuz.interfaces.sourcepackageformat import SourcePackageFormat |
3015 | + >>> foobuntu.isSourcePackageFormatPermitted(SourcePackageFormat.FORMAT_1_0) |
3016 | + True |
3017 | |
3018 | === modified file 'lib/lp/soyuz/interfaces/build.py' |
3019 | --- lib/lp/soyuz/interfaces/build.py 2009-10-26 18:40:04 +0000 |
3020 | +++ lib/lp/soyuz/interfaces/build.py 2009-11-16 23:27:14 +0000 |
3021 | @@ -524,6 +524,13 @@ |
3022 | :rtype: ``dict``. |
3023 | """ |
3024 | |
3025 | + def getByQueueEntry(queue_entry): |
3026 | + """Return an IBuild instance for the given build queue entry. |
3027 | + |
3028 | + Retrieve the single build record associated with the given build
3029 | + queue entry, or None if no such build exists.
3030 | + """ |
3031 | + |
3032 | |
3033 | class IBuildRescoreForm(Interface): |
3034 | """Form for rescoring a build.""" |
3035 | |
3036 | === added file 'lib/lp/soyuz/interfaces/buildpackagejob.py' |
3037 | --- lib/lp/soyuz/interfaces/buildpackagejob.py 1970-01-01 00:00:00 +0000 |
3038 | +++ lib/lp/soyuz/interfaces/buildpackagejob.py 2009-11-16 23:27:14 +0000 |
3039 | @@ -0,0 +1,34 @@ |
3040 | +# Copyright 2009 Canonical Ltd. This software is licensed under the |
3041 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
3042 | + |
3043 | +# pylint: disable-msg=E0211,E0213 |
3044 | + |
3045 | +"""BuildPackageJob interfaces.""" |
3046 | + |
3047 | +__metaclass__ = type |
3048 | + |
3049 | +__all__ = [ |
3050 | + 'IBuildPackageJob', |
3051 | + ] |
3052 | + |
3053 | +from zope.schema import Int |
3054 | + |
3055 | +from canonical.launchpad import _ |
3056 | +from lazr.restful.fields import Reference |
3057 | +from lp.buildmaster.interfaces.buildfarmjob import IBuildFarmJob |
3058 | +from lp.services.job.interfaces.job import IJob |
3059 | +from lp.soyuz.interfaces.build import IBuild |
3060 | + |
3061 | + |
3062 | +class IBuildPackageJob(IBuildFarmJob): |
3063 | + """A read-only interface for build package jobs.""" |
3064 | + id = Int(title=_('ID'), required=True, readonly=True) |
3065 | + |
3066 | + job = Reference( |
3067 | + IJob, title=_("Job"), required=True, readonly=True, |
3068 | + description=_("Data common to all job types.")) |
3069 | + |
3070 | + build = Reference( |
3071 | + IBuild, title=_("Build"), |
3072 | + required=True, readonly=True,
3073 | + description=_("Build record associated with this job.")) |
3074 | |
3075 | === modified file 'lib/lp/soyuz/interfaces/buildqueue.py' |
3076 | --- lib/lp/soyuz/interfaces/buildqueue.py 2009-06-25 04:06:00 +0000 |
3077 | +++ lib/lp/soyuz/interfaces/buildqueue.py 2009-11-16 23:27:14 +0000 |
3078 | @@ -13,6 +13,14 @@ |
3079 | ] |
3080 | |
3081 | from zope.interface import Interface, Attribute |
3082 | +from zope.schema import Choice, Datetime |
3083 | + |
3084 | +from lazr.restful.fields import Reference |
3085 | + |
3086 | +from canonical.launchpad import _ |
3087 | +from lp.buildmaster.interfaces.buildfarmjob import ( |
3088 | + IBuildFarmJob, BuildFarmJobType) |
3089 | +from lp.services.job.interfaces.job import IJob |
3090 | |
3091 | |
3092 | class IBuildQueue(Interface): |
3093 | @@ -30,59 +38,24 @@ |
3094 | """ |
3095 | |
3096 | id = Attribute("Job identifier") |
3097 | - build = Attribute("The IBuild record that originated this job") |
3098 | builder = Attribute("The IBuilder instance processing this job") |
3099 | - created = Attribute("The datetime that the queue entry was created") |
3100 | - buildstart = Attribute("The datetime of the last build attempt") |
3101 | logtail = Attribute("The current tail of the log of the build") |
3102 | lastscore = Attribute("Last score to be computed for this job") |
3103 | manual = Attribute("Whether or not the job was manually scored") |
3104 | |
3105 | - # properties inherited from related Content classes. |
3106 | - archseries = Attribute( |
3107 | - "DistroArchSeries target of the IBuild releated to this job.") |
3108 | - name = Attribute( |
3109 | - "Name of the ISourcePackageRelease releated to this job.") |
3110 | - version = Attribute( |
3111 | - "Version of the ISourcePackageRelease releated to this job.") |
3112 | - files = Attribute( |
3113 | - "Collection of files related to the ISourcePackageRelease " |
3114 | - "releated to this job.") |
3115 | - urgency = Attribute( |
3116 | - "Urgency of the ISourcePackageRelease releated to this job.") |
3117 | - archhintlist = Attribute( |
3118 | - "architecturehintlist of the ISourcePackageRelease releated " |
3119 | - "to this job.") |
3120 | - builddependsindep = Attribute( |
3121 | - "builddependsindep of the ISourcePackageRelease releated to " |
3122 | - "this job.") |
3123 | - buildduration = Attribute( |
3124 | - "Duration of the job, calculated on-the-fly based on buildstart.") |
3125 | - is_virtualized = Attribute("See IBuild.is_virtualized.") |
3126 | + job = Reference( |
3127 | + IJob, title=_("Job"), required=True, readonly=True, |
3128 | + description=_("Data common to all job types.")) |
3129 | + |
3130 | + job_type = Choice( |
3131 | + title=_('Job type'), required=True, vocabulary=BuildFarmJobType, |
3132 | + description=_("The type of this job.")) |
3133 | |
3134 | def manualScore(value): |
3135 | """Manually set a score value to a queue item and lock it.""" |
3136 | |
3137 | def score(): |
3138 | - """Perform scoring based on heuristic values. |
3139 | - |
3140 | - Creates a 'score' (priority) value based on: |
3141 | - |
3142 | - * Component: main component gets higher values |
3143 | - (main, 1000, restricted, 750, universe, 250, multiverse, 0) |
3144 | - |
3145 | - * Urgency: EMERGENCY sources gets higher values |
3146 | - (EMERGENCY, 20, HIGH, 15, MEDIUM, 10, LOW, 5) |
3147 | - |
3148 | - * Queue time: old records gets a relative higher priority |
3149 | - (The rate against component is something like: a 'multiverse' |
3150 | - build will be as important as a 'main' after 40 hours in queue) |
3151 | - |
3152 | - This method automatically updates IBuildQueue.lastscore value and |
3153 | - skips 'manually-scored' records. |
3154 | - |
3155 | - This method use any logger available in the standard logging system. |
3156 | - """ |
3157 | + """The job score calculated for the job type in question.""" |
3158 | |
3159 | def destroySelf(): |
3160 | """Delete this entry from the database.""" |
3161 | @@ -121,6 +94,17 @@ |
3162 | Clean the builder for another jobs. |
3163 | """ |
3164 | |
3165 | + specific_job = Reference( |
3166 | + IBuildFarmJob, title=_("Job"), |
3167 | + description=_("Data and operations common to all build farm jobs.")) |
3168 | + |
3169 | + def setDateStarted(timestamp): |
3170 | + """Sets the date started property to the given value.""" |
3171 | + |
3172 | + date_started = Datetime( |
3173 | + title=_('Start time'), |
3174 | + description=_('Time when the job started.')) |
3175 | + |
3176 | |
3177 | class IBuildQueueSet(Interface): |
3178 | """Launchpad Auto Build queue set handler and auxiliary methods.""" |
3179 | @@ -165,4 +149,3 @@ |
3180 | Retrieve the build queue and related builder rows associated with the |
3181 | builds in question where they exist. |
3182 | """ |
3183 | - |
3184 | |
3185 | === added file 'lib/lp/soyuz/interfaces/sourcepackageformat.py' |
3186 | --- lib/lp/soyuz/interfaces/sourcepackageformat.py 1970-01-01 00:00:00 +0000 |
3187 | +++ lib/lp/soyuz/interfaces/sourcepackageformat.py 2009-11-16 23:27:14 +0000 |
3188 | @@ -0,0 +1,64 @@ |
3189 | +# Copyright 2009 Canonical Ltd. This software is licensed under the |
3190 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
3191 | + |
3192 | +"""Source package format interfaces.""" |
3193 | + |
3194 | +__metaclass__ = type |
3195 | + |
3196 | +__all__ = [ |
3197 | + 'SourcePackageFormat', |
3198 | + 'ISourcePackageFormatSelection', |
3199 | + 'ISourcePackageFormatSelectionSet', |
3200 | + ] |
3201 | + |
3202 | +from zope.interface import Attribute, Interface |
3203 | +from lazr.enum import DBEnumeratedType, DBItem |
3204 | + |
3205 | + |
3206 | +class SourcePackageFormat(DBEnumeratedType): |
3207 | + """Source package format |
3208 | + |
3209 | + There are currently three formats of Debian source packages. The Format |
3210 | + field in the .dsc file must specify one of these formats. |
3211 | + """ |
3212 | + |
3213 | + FORMAT_1_0 = DBItem(0, """ |
3214 | + 1.0 |
3215 | + |
3216 | + Specifies either a native (having a single tar.gz) or non-native |
3217 | + (having an orig.tar.gz and a diff.gz) package. Supports only gzip |
3218 | + compression. |
3219 | + """) |
3220 | + |
3221 | + FORMAT_3_0_QUILT = DBItem(1, """ |
3222 | + 3.0 (quilt) |
3223 | + |
3224 | + Specifies a non-native package, with an orig.tar.* and a debian.tar.*. |
3225 | + Supports gzip and bzip2 compression. |
3226 | + """) |
3227 | + |
3228 | + FORMAT_3_0_NATIVE = DBItem(2, """ |
3229 | + 3.0 (native) |
3230 | + |
3231 | + Specifies a native package, with a single tar.*. Supports gzip and |
3232 | + bzip2 compression. |
3233 | + """) |
3234 | + |
3235 | + |
3236 | +class ISourcePackageFormatSelection(Interface): |
3237 | + """A source package format allowed within a DistroSeries.""" |
3238 | + |
3239 | + id = Attribute("ID") |
3240 | + distroseries = Attribute("Target series") |
3241 | + format = Attribute("Permitted source package format") |
3242 | + |
3243 | + |
3244 | +class ISourcePackageFormatSelectionSet(Interface): |
3245 | + """Set manipulation tools for the SourcePackageFormatSelection table.""" |
3246 | + |
3247 | + def getBySeriesAndFormat(distroseries, format): |
3248 | + """Return the ISourcePackageFormatSelection for the given series and |
3249 | + format.""" |
3250 | + |
3251 | + def add(distroseries, format): |
3252 | + """Allow the given source package format in the given series.""" |
3253 | |
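The `ISourcePackageFormatSelectionSet` utility above is a thin lookup/insert interface over the new `SourcePackageFormatSelection` table. As a rough mental model (this is an illustrative sketch using a plain dict, not the Storm-backed Launchpad implementation; class and method names mirror the interface but the storage is hypothetical):

```python
class SourcePackageFormatSelectionSet:
    """Toy in-memory stand-in for the real Storm-backed utility."""

    def __init__(self):
        # Keyed by (distroseries, format), mirroring the table's columns.
        self._selections = {}

    def add(self, distroseries, format):
        """Allow the given source package format in the given series."""
        self._selections[(distroseries, format)] = True

    def getBySeriesAndFormat(self, distroseries, format):
        """Return the selection for the series/format pair, or None."""
        return self._selections.get((distroseries, format))
```

In the real code the utility is obtained via `getUtility(ISourcePackageFormatSelectionSet)` and a non-None result from `getBySeriesAndFormat` means uploads in that format are permitted for the series.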
3254 | === modified file 'lib/lp/soyuz/model/build.py' |
3255 | --- lib/lp/soyuz/model/build.py 2009-10-26 18:40:04 +0000 |
3256 | +++ lib/lp/soyuz/model/build.py 2009-11-16 23:27:14 +0000 |
3257 | @@ -18,7 +18,6 @@ |
3258 | from zope.security.proxy import removeSecurityProxy |
3259 | from storm.expr import ( |
3260 | Desc, In, Join, LeftJoin) |
3261 | -from storm.references import Reference |
3262 | from storm.store import Store |
3263 | from sqlobject import ( |
3264 | StringCol, ForeignKey, IntervalCol, SQLObjectNotFound) |
3265 | @@ -46,7 +45,9 @@ |
3266 | IStoreSelector, MAIN_STORE, DEFAULT_FLAVOR) |
3267 | from canonical.launchpad.webapp.tales import DurationFormatterAPI |
3268 | from lp.archivepublisher.utils import get_ppa_reference |
3269 | +from lp.buildmaster.interfaces.buildfarmjob import BuildFarmJobType |
3270 | from lp.registry.interfaces.pocket import PackagePublishingPocket |
3271 | +from lp.services.job.model.job import Job |
3272 | from lp.soyuz.adapters.archivedependencies import get_components_for_building |
3273 | from lp.soyuz.interfaces.archive import ArchivePurpose |
3274 | from lp.soyuz.interfaces.build import ( |
3275 | @@ -55,6 +56,7 @@ |
3276 | from lp.soyuz.interfaces.publishing import active_publishing_status |
3277 | from lp.soyuz.model.binarypackagerelease import BinaryPackageRelease |
3278 | from lp.soyuz.model.builder import Builder |
3279 | +from lp.soyuz.model.buildpackagejob import BuildPackageJob |
3280 | from lp.soyuz.model.buildqueue import BuildQueue |
3281 | from lp.soyuz.model.files import BinaryPackageFile |
3282 | from lp.soyuz.model.publishing import SourcePackagePublishingHistory |
3283 | @@ -88,8 +90,6 @@ |
3284 | archive = ForeignKey(foreignKey='Archive', dbName='archive', notNull=True) |
3285 | estimated_build_duration = IntervalCol(default=None) |
3286 | |
3287 | - buildqueue_record = Reference("<primary key>", BuildQueue.buildID, |
3288 | - on_remote=True) |
3289 | date_first_dispatched = UtcDateTimeCol(dbName='date_first_dispatched') |
3290 | |
3291 | upload_log = ForeignKey( |
3292 | @@ -105,6 +105,16 @@ |
3293 | return proxied_file.http_url |
3294 | |
3295 | @property |
3296 | + def buildqueue_record(self): |
3297 | + """See `IBuild`.""" |
3298 | + store = Store.of(self) |
3299 | + results = store.find( |
3300 | + BuildQueue, |
3301 | + BuildPackageJob.job == BuildQueue.jobID, |
3302 | + BuildPackageJob.build == self.id) |
3303 | + return results.one() |
3304 | + |
3305 | + @property |
3306 | def upload_log_url(self): |
3307 | """See `IBuild`.""" |
3308 | if self.upload_log is None: |
3309 | @@ -351,8 +361,10 @@ |
3310 | Archive |
3311 | JOIN Build ON |
3312 | Build.archive = Archive.id |
3313 | + JOIN BuildPackageJob ON |
3314 | + Build.id = BuildPackageJob.build |
3315 | JOIN BuildQueue ON |
3316 | - Build.id = BuildQueue.build |
3317 | + BuildPackageJob.job = BuildQueue.job |
3318 | WHERE |
3319 | Build.buildstate = 0 AND |
3320 | Build.processor = %s AND |
3321 | @@ -412,16 +424,20 @@ |
3322 | SELECT |
3323 | CAST (EXTRACT(EPOCH FROM |
3324 | (Build.estimated_build_duration - |
3325 | - (NOW() - BuildQueue.buildstart))) AS INTEGER) |
3326 | + (NOW() - Job.date_started))) AS INTEGER) |
3327 | AS remainder |
3328 | FROM |
3329 | Archive |
3330 | JOIN Build ON |
3331 | Build.archive = Archive.id |
3332 | + JOIN BuildPackageJob ON |
3333 | + Build.id = BuildPackageJob.build |
3334 | JOIN BuildQueue ON |
3335 | - Build.id = BuildQueue.build |
3336 | + BuildQueue.job = BuildPackageJob.job |
3337 | JOIN Builder ON |
3338 | Builder.id = BuildQueue.builder |
3339 | + JOIN Job ON |
3340 | + Job.id = BuildPackageJob.job |
3341 | WHERE |
3342 | Archive.require_virtualized = %s AND |
3343 | Archive.enabled = TRUE AND |
3344 | @@ -605,7 +621,18 @@ |
3345 | |
3346 | def createBuildQueueEntry(self): |
3347 | """See `IBuild`""" |
3348 | - return BuildQueue(build=self) |
3349 | + store = Store.of(self) |
3350 | + job = Job() |
3351 | + store.add(job) |
3352 | + specific_job = BuildPackageJob() |
3353 | + specific_job.build = self.id |
3354 | + specific_job.job = job.id |
3355 | + store.add(specific_job) |
3356 | + queue_entry = BuildQueue() |
3357 | + queue_entry.job = job.id |
3358 | + queue_entry.job_type = BuildFarmJobType.PACKAGEBUILD |
3359 | + store.add(queue_entry) |
3360 | + return queue_entry |
3361 | |
3362 | def notify(self, extra_info=None): |
3363 | """See `IBuild`""" |
3364 | @@ -966,7 +993,9 @@ |
3365 | if status in [BuildStatus.NEEDSBUILD, BuildStatus.BUILDING]: |
3366 | orderBy = ["-BuildQueue.lastscore", "Build.id"] |
3367 | clauseTables.append('BuildQueue') |
3368 | - condition_clauses.append('BuildQueue.build = Build.id') |
3369 | + clauseTables.append('BuildPackageJob') |
3370 | + condition_clauses.append('BuildPackageJob.build = Build.id') |
3371 | + condition_clauses.append('BuildPackageJob.job = BuildQueue.job') |
3372 | elif status == BuildStatus.SUPERSEDED or status is None: |
3373 | orderBy = ["-Build.datecreated"] |
3374 | else: |
3375 | @@ -1144,3 +1173,13 @@ |
3376 | # this (pre_iter_hook()) method that will iterate over the |
3377 | # result set and force the query execution that way. |
3378 | return list(result_set) |
3379 | + |
3380 | + def getByQueueEntry(self, queue_entry): |
3381 | + """See `IBuildSet`.""" |
3382 | + store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
3383 | + result_set = store.find( |
3384 | + Build, |
3385 | + BuildPackageJob.build == Build.id, |
3386 | + BuildPackageJob.job == queue_entry.job) |
3387 | + |
3388 | + return result_set.one() |
3389 | |
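The build.py hunks above all follow from one schema change: `BuildQueue` no longer references `Build` directly; instead both sides hang off a shared `Job` row through the new `BuildPackageJob` link table, which is why `createBuildQueueEntry` now creates three rows and `getByQueueEntry` resolves the indirection. A minimal sketch of that join, using dataclasses rather than the actual Storm models (names mirror the tables, the resolver is illustrative):

```python
from dataclasses import dataclass

@dataclass
class Job:
    id: int

@dataclass
class BuildPackageJob:
    # Link table: one row ties a generic Job to a package Build.
    job_id: int
    build_id: int

@dataclass
class BuildQueue:
    # Queue entries now point at the Job, not at the Build.
    job_id: int

def build_for_queue_entry(queue_entry, build_package_jobs):
    """Mimic IBuildSet.getByQueueEntry: queue entry -> build id via the job."""
    for bpj in build_package_jobs:
        if bpj.job_id == queue_entry.job_id:
            return bpj.build_id
    return None
```

This is the same traversal the SQL hunks express as `BuildPackageJob.job = BuildQueue.job AND BuildPackageJob.build = Build.id`.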
3390 | === modified file 'lib/lp/soyuz/model/builder.py' |
3391 | --- lib/lp/soyuz/model/builder.py 2009-11-11 10:43:07 +0000 |
3392 | +++ lib/lp/soyuz/model/builder.py 2009-11-16 23:27:14 +0000 |
3393 | @@ -52,6 +52,7 @@ |
3394 | from lp.soyuz.interfaces.buildqueue import IBuildQueueSet |
3395 | from lp.soyuz.interfaces.publishing import ( |
3396 | PackagePublishingStatus) |
3397 | +from lp.soyuz.model.buildpackagejob import BuildPackageJob |
3398 | from canonical.launchpad.webapp import urlappend |
3399 | from canonical.librarian.utils import copy_and_close |
3400 | |
3401 | @@ -151,11 +152,11 @@ |
3402 | # Avoid circular imports. |
3403 | from lp.soyuz.model.publishing import makePoolPath |
3404 | |
3405 | - build = build_queue_item.build |
3406 | + build = getUtility(IBuildSet).getByQueueEntry(build_queue_item) |
3407 | archive = build.archive |
3408 | archive_url = archive.archive_url |
3409 | component_name = build.current_component.name |
3410 | - for source_file in build_queue_item.files: |
3411 | + for source_file in build.sourcepackagerelease.files: |
3412 | file_name = source_file.libraryfile.filename |
3413 | sha1 = source_file.libraryfile.content.sha1 |
3414 | source_name = build.sourcepackagerelease.sourcepackagename.name |
3415 | @@ -264,8 +265,8 @@ |
3416 | * Ensure that the build pocket allows builds for the current |
3417 | distroseries state. |
3418 | """ |
3419 | - assert not (not self.virtualized and |
3420 | - build_queue_item.is_virtualized), ( |
3421 | + build = getUtility(IBuildSet).getByQueueEntry(build_queue_item) |
3422 | + assert not (not self.virtualized and build.is_virtualized), ( |
3423 | "Attempt to build non-virtual item on a virtual builder.") |
3424 | |
3425 | # Assert that we are not silently building SECURITY jobs. |
3426 | @@ -274,27 +275,27 @@ |
3427 | # XXX Julian 2007-12-18 spec=security-in-soyuz: This is being |
3428 | # addressed in the work on the blueprint: |
3429 | # https://blueprints.launchpad.net/soyuz/+spec/security-in-soyuz |
3430 | - target_pocket = build_queue_item.build.pocket |
3431 | + target_pocket = build.pocket |
3432 | assert target_pocket != PackagePublishingPocket.SECURITY, ( |
3433 | "Soyuz is not yet capable of building SECURITY uploads.") |
3434 | |
3435 | # Ensure build has the needed chroot |
3436 | - chroot = build_queue_item.archseries.getChroot() |
3437 | + build = getUtility(IBuildSet).getByQueueEntry(build_queue_item) |
3438 | + chroot = build.distroarchseries.getChroot() |
3439 | if chroot is None: |
3440 | raise CannotBuild( |
3441 | "Missing CHROOT for %s/%s/%s" % ( |
3442 | - build_queue_item.build.distroseries.distribution.name, |
3443 | - build_queue_item.build.distroseries.name, |
3444 | - build_queue_item.build.distroarchseries.architecturetag) |
3445 | + build.distroseries.distribution.name, |
3446 | + build.distroseries.name, |
3447 | + build.distroarchseries.architecturetag) |
3448 | ) |
3449 | |
3450 | # The main distribution has policies to prevent uploads to some |
3451 | # pockets (e.g. security) during different parts of the distribution |
3452 | # series lifecycle. These do not apply to PPA builds nor any archive |
3453 | # that allows release pocket updates. |
3454 | - if (build_queue_item.build.archive.purpose != ArchivePurpose.PPA and |
3455 | - not build_queue_item.build.archive.allowUpdatesToReleasePocket()): |
3456 | - build = build_queue_item.build |
3457 | + if (build.archive.purpose != ArchivePurpose.PPA and |
3458 | + not build.archive.allowUpdatesToReleasePocket()): |
3459 | # XXX Robert Collins 2007-05-26: not an explicit CannotBuild |
3460 | # exception yet because the callers have not been audited |
3461 | assert build.distroseries.canUploadToPocket(build.pocket), ( |
3462 | @@ -306,7 +307,8 @@ |
3463 | def _dispatchBuildToSlave(self, build_queue_item, args, buildid, logger): |
3464 | """Start the build on the slave builder.""" |
3465 | # Send chroot. |
3466 | - chroot = build_queue_item.archseries.getChroot() |
3467 | + build = getUtility(IBuildSet).getByQueueEntry(build_queue_item) |
3468 | + chroot = build.distroarchseries.getChroot() |
3469 | self.cacheFileOnSlave(logger, chroot) |
3470 | |
3471 | # Build filemap structure with the files required in this build |
3472 | @@ -314,11 +316,11 @@ |
3473 | # If the build is private we tell the slave to get the files from the |
3474 | # archive instead of the librarian because the slaves cannot |
3475 | # access the restricted librarian. |
3476 | - private = build_queue_item.build.archive.private |
3477 | + private = build.archive.private |
3478 | if private: |
3479 | self.cachePrivateSourceOnSlave(logger, build_queue_item) |
3480 | filemap = {} |
3481 | - for source_file in build_queue_item.files: |
3482 | + for source_file in build.sourcepackagerelease.files: |
3483 | lfa = source_file.libraryfile |
3484 | filemap[lfa.filename] = lfa.content.sha1 |
3485 | if not private: |
3486 | @@ -349,9 +351,10 @@ |
3487 | |
3488 | def startBuild(self, build_queue_item, logger): |
3489 | """See IBuilder.""" |
3490 | + build = getUtility(IBuildSet).getByQueueEntry(build_queue_item) |
3491 | + spr = build.sourcepackagerelease |
3492 | logger.info("startBuild(%s, %s, %s, %s)", self.url, |
3493 | - build_queue_item.name, build_queue_item.version, |
3494 | - build_queue_item.build.pocket.title) |
3495 | + spr.name, spr.version, build.pocket.title) |
3496 | |
3497 | # Make sure the request is valid; an exception is raised if it's not. |
3498 | self._verifyBuildRequest(build_queue_item, logger) |
3499 | @@ -365,39 +368,39 @@ |
3500 | # turn 'arch_indep' ON only if build is archindep or if |
3501 | # the specific architecture is the nominatedarchindep for |
3502 | # this distroseries (in case it requires any archindep source) |
3503 | - args['arch_indep'] = build_queue_item.archseries.isNominatedArchIndep |
3504 | + build = getUtility(IBuildSet).getByQueueEntry(build_queue_item) |
3505 | + args['arch_indep'] = build.distroarchseries.isNominatedArchIndep |
3506 | |
3507 | - suite = build_queue_item.build.distroarchseries.distroseries.name |
3508 | - if build_queue_item.build.pocket != PackagePublishingPocket.RELEASE: |
3509 | - suite += "-%s" % (build_queue_item.build.pocket.name.lower()) |
3510 | + suite = build.distroarchseries.distroseries.name |
3511 | + if build.pocket != PackagePublishingPocket.RELEASE: |
3512 | + suite += "-%s" % (build.pocket.name.lower()) |
3513 | args['suite'] = suite |
3514 | |
3515 | - archive_purpose = build_queue_item.build.archive.purpose |
3516 | + archive_purpose = build.archive.purpose |
3517 | if (archive_purpose == ArchivePurpose.PPA and |
3518 | - not build_queue_item.build.archive.require_virtualized): |
3519 | + not build.archive.require_virtualized): |
3520 | # If we're building a non-virtual PPA, override the purpose |
3521 | # to PRIMARY and use the primary component override. |
3522 | # This ensures that the package mangling tools will run over |
3523 | # the built packages. |
3524 | args['archive_purpose'] = ArchivePurpose.PRIMARY.name |
3525 | args["ogrecomponent"] = ( |
3526 | - get_primary_current_component(build_queue_item.build)) |
3527 | + get_primary_current_component(build)) |
3528 | else: |
3529 | args['archive_purpose'] = archive_purpose.name |
3530 | args["ogrecomponent"] = ( |
3531 | - build_queue_item.build.current_component.name) |
3532 | + build.current_component.name) |
3533 | |
3534 | - args['archives'] = get_sources_list_for_building( |
3535 | - build_queue_item.build) |
3536 | + args['archives'] = get_sources_list_for_building(build) |
3537 | |
3538 | # Let the build slave know whether this is a build in a private |
3539 | # archive. |
3540 | - args['archive_private'] = build_queue_item.build.archive.private |
3541 | + args['archive_private'] = build.archive.private |
3542 | |
3543 | # Generate a string which can be used to cross-check when obtaining |
3544 | # results so we know we are referring to the right database object in |
3545 | # subsequent runs. |
3546 | - buildid = "%s-%s" % (build_queue_item.build.id, build_queue_item.id) |
3547 | + buildid = "%s-%s" % (build.id, build_queue_item.id) |
3548 | logger.debug("Initiating build %s on %s" % (buildid, self.url)) |
3549 | |
3550 | # Do it. |
3551 | @@ -421,8 +424,9 @@ |
3552 | if currentjob is None: |
3553 | return 'Idle' |
3554 | |
3555 | - msg = 'Building %s' % currentjob.build.title |
3556 | - archive = currentjob.build.archive |
3557 | + build = getUtility(IBuildSet).getByQueueEntry(currentjob) |
3558 | + msg = 'Building %s' % build.title |
3559 | + archive = build.archive |
3560 | if not archive.owner.private and (archive.is_ppa or archive.is_copy): |
3561 | return '%s [%s/%s]' % (msg, archive.owner.name, archive.name) |
3562 | else: |
3563 | @@ -553,7 +557,8 @@ |
3564 | SourcePackagePublishingHistory.status IN %s)) |
3565 | OR |
3566 | archive.private IS FALSE) AND |
3567 | - buildqueue.build = build.id AND |
3568 | + buildqueue.job = buildpackagejob.job AND |
3569 | + buildpackagejob.build = build.id AND |
3570 | build.distroarchseries = distroarchseries.id AND |
3571 | build.archive = archive.id AND |
3572 | archive.enabled = TRUE AND |
3573 | @@ -563,7 +568,8 @@ |
3574 | """ % sqlvalues( |
3575 | private_statuses, BuildStatus.NEEDSBUILD, self.processor.family)] |
3576 | |
3577 | - clauseTables = ['Build', 'DistroArchSeries', 'Archive'] |
3578 | + clauseTables = [ |
3579 | + 'Build', 'BuildPackageJob', 'DistroArchSeries', 'Archive'] |
3580 | |
3581 | clauses.append(""" |
3582 | archive.require_virtualized = %s |
3583 | @@ -605,7 +611,7 @@ |
3584 | |
3585 | query = " AND ".join(clauses) |
3586 | candidate = BuildQueue.selectFirst( |
3587 | - query, clauseTables=clauseTables, prejoins=['build'], |
3588 | + query, clauseTables=clauseTables, |
3589 | orderBy=['-buildqueue.lastscore', 'build.id']) |
3590 | |
3591 | return candidate |
3592 | @@ -629,26 +635,28 @@ |
3593 | # Builds in those situation should not be built because they will |
3594 | # be wasting build-time, the former case already has a newer source |
3595 | # and the latter could not be built in DAK. |
3596 | + build_set = getUtility(IBuildSet) |
3597 | while candidate is not None: |
3598 | - if candidate.build.pocket == PackagePublishingPocket.SECURITY: |
3599 | + build = build_set.getByQueueEntry(candidate) |
3600 | + if build.pocket == PackagePublishingPocket.SECURITY: |
3601 | # We never build anything in the security pocket. |
3602 | logger.debug( |
3603 | "Build %s FAILEDTOBUILD, queue item %s REMOVED" |
3604 | - % (candidate.build.id, candidate.id)) |
3605 | - candidate.build.buildstate = BuildStatus.FAILEDTOBUILD |
3606 | + % (build.id, candidate.id)) |
3607 | + build.buildstate = BuildStatus.FAILEDTOBUILD |
3608 | candidate.destroySelf() |
3609 | candidate = self._findBuildCandidate() |
3610 | continue |
3611 | |
3612 | - publication = candidate.build.current_source_publication |
3613 | + publication = build.current_source_publication |
3614 | |
3615 | if publication is None: |
3616 | # The build should be superseded if it no longer has a |
3617 | # current publishing record. |
3618 | logger.debug( |
3619 | "Build %s SUPERSEDED, queue item %s REMOVED" |
3620 | - % (candidate.build.id, candidate.id)) |
3621 | - candidate.build.buildstate = BuildStatus.SUPERSEDED |
3622 | + % (build.id, candidate.id)) |
3623 | + build.buildstate = BuildStatus.SUPERSEDED |
3624 | candidate.destroySelf() |
3625 | candidate = self._findBuildCandidate() |
3626 | continue |
3627 | @@ -751,13 +759,15 @@ |
3628 | origin = ( |
3629 | Archive, |
3630 | Build, |
3631 | + BuildPackageJob, |
3632 | BuildQueue, |
3633 | DistroArchSeries, |
3634 | Processor, |
3635 | ) |
3636 | queue = store.using(*origin).find( |
3637 | BuildQueue, |
3638 | - BuildQueue.build == Build.id, |
3639 | + BuildPackageJob.job == BuildQueue.jobID, |
3640 | + BuildPackageJob.build == Build.id, |
3641 | Build.distroarchseries == DistroArchSeries.id, |
3642 | Build.archive == Archive.id, |
3643 | DistroArchSeries.processorfamilyID == Processor.familyID, |
3644 | |
3645 | === added file 'lib/lp/soyuz/model/buildpackagejob.py' |
3646 | --- lib/lp/soyuz/model/buildpackagejob.py 1970-01-01 00:00:00 +0000 |
3647 | +++ lib/lp/soyuz/model/buildpackagejob.py 2009-11-16 23:27:14 +0000 |
3648 | @@ -0,0 +1,168 @@ |
3649 | +# Copyright 2009 Canonical Ltd. This software is licensed under the |
3650 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
3651 | + |
3652 | +__metaclass__ = type |
3653 | +__all__ = ['BuildPackageJob'] |
3654 | + |
3655 | + |
3656 | +from datetime import datetime |
3657 | +import pytz |
3658 | + |
3659 | +from storm.locals import Int, Reference, Storm |
3660 | + |
3661 | +from zope.interface import implements |
3662 | + |
3663 | +from canonical.database.constants import UTC_NOW |
3664 | +from canonical.launchpad.interfaces import SourcePackageUrgency |
3665 | +from lp.buildmaster.interfaces.buildfarmjob import IBuildFarmJob |
3666 | +from lp.registry.interfaces.pocket import PackagePublishingPocket |
3667 | +from lp.soyuz.interfaces.archive import ArchivePurpose |
3668 | +from lp.soyuz.interfaces.build import BuildStatus |
3669 | +from lp.soyuz.interfaces.buildpackagejob import IBuildPackageJob |
3670 | + |
3671 | + |
3672 | +class BuildPackageJob(Storm): |
3673 | + """See `IBuildPackageJob`.""" |
3674 | + implements(IBuildFarmJob, IBuildPackageJob) |
3675 | + __storm_table__ = 'buildpackagejob' |
3676 | + id = Int(primary=True) |
3677 | + |
3678 | + job_id = Int(name='job', allow_none=False) |
3679 | + job = Reference(job_id, 'Job.id') |
3680 | + |
3681 | + build_id = Int(name='build', allow_none=False) |
3682 | + build = Reference(build_id, 'Build.id') |
3683 | + |
3684 | + def score(self): |
3685 | + """See `IBuildPackageJob`.""" |
3686 | + score_pocketname = { |
3687 | + PackagePublishingPocket.BACKPORTS: 0, |
3688 | + PackagePublishingPocket.RELEASE: 1500, |
3689 | + PackagePublishingPocket.PROPOSED: 3000, |
3690 | + PackagePublishingPocket.UPDATES: 3000, |
3691 | + PackagePublishingPocket.SECURITY: 4500, |
3692 | + } |
3693 | + |
3694 | + score_componentname = { |
3695 | + 'multiverse': 0, |
3696 | + 'universe': 250, |
3697 | + 'restricted': 750, |
3698 | + 'main': 1000, |
3699 | + 'partner': 1250,
3700 | + } |
3701 | + |
3702 | + score_urgency = { |
3703 | + SourcePackageUrgency.LOW: 5, |
3704 | + SourcePackageUrgency.MEDIUM: 10, |
3705 | + SourcePackageUrgency.HIGH: 15, |
3706 | + SourcePackageUrgency.EMERGENCY: 20, |
3707 | + } |
3708 | + |
3709 | + # Define a table we'll use to calculate the score based on the time |
3710 | + # in the build queue. The table is a sorted list of (upper time |
3711 | + # limit in seconds, score) tuples. |
3712 | + queue_time_scores = [ |
3713 | + (14400, 100), |
3714 | + (7200, 50), |
3715 | + (3600, 20), |
3716 | + (1800, 15), |
3717 | + (900, 10), |
3718 | + (300, 5), |
3719 | + ] |
3720 | + |
3721 | + private_archive_increment = 10000 |
3722 | + |
3723 | + # For build jobs in rebuild archives a score value of -10
3724 | + # was chosen because their priority is lower than build retries |
3725 | + # or language-packs. They should be built only when there is |
3726 | + # nothing else to build. |
3727 | + rebuild_archive_score = -10 |
3728 | + |
3729 | + score = 0 |
3730 | + |
3731 | + # Please note: the score for language packs is zero because
3732 | + # they would otherwise unduly delay the building of packages
3733 | + # in the main component.
3734 | + if self.build.sourcepackagerelease.section.name == 'translations': |
3735 | + pass |
3736 | + elif self.build.archive.purpose == ArchivePurpose.COPY: |
3737 | + score = rebuild_archive_score |
3738 | + else: |
3739 | + # Calculates the urgency-related part of the score. |
3740 | + urgency = score_urgency[self.build.sourcepackagerelease.urgency] |
3741 | + score += urgency |
3742 | + |
3743 | + # Calculates the pocket-related part of the score. |
3744 | + score_pocket = score_pocketname[self.build.pocket] |
3745 | + score += score_pocket |
3746 | + |
3747 | + # Calculates the component-related part of the score. |
3748 | + score_component = score_componentname[ |
3749 | + self.build.current_component.name] |
3750 | + score += score_component |
3751 | + |
3752 | + # Calculates the build queue time component of the score. |
3753 | + right_now = datetime.now(pytz.timezone('UTC')) |
3754 | + eta = right_now - self.job.date_created |
3755 | + for limit, dep_score in queue_time_scores: |
3756 | + if eta.seconds > limit: |
3757 | + score += dep_score |
3758 | + break |
3759 | + |
3760 | + # Private builds get uber score. |
3761 | + if self.build.archive.private: |
3762 | + score += private_archive_increment |
3763 | + |
3764 | + # Lastly, apply the archive score delta. This is to boost |
3765 | + # or retard build scores for any build in a particular |
3766 | + # archive. |
3767 | + score += self.build.archive.relative_build_score |
3768 | + |
3769 | + return score |
3770 | + |
3771 | + def getLogFileName(self): |
3772 | + """See `IBuildPackageJob`.""" |
3773 | + sourcename = self.build.sourcepackagerelease.name |
3774 | + version = self.build.sourcepackagerelease.version |
3775 | + # We rely on the state handling methods having already
3776 | + # stored the current buildstate.
3777 | + state = self.build.buildstate.name |
3778 | + |
3779 | + dar = self.build.distroarchseries |
3780 | + distroname = dar.distroseries.distribution.name |
3781 | + distroseriesname = dar.distroseries.name |
3782 | + archname = dar.architecturetag |
3783 | + |
3784 | + # logfilename format: |
3785 | + # buildlog_<DISTRIBUTION>_<DISTROSeries>_<ARCHITECTURE>_\ |
3786 | + # <SOURCENAME>_<SOURCEVERSION>_<BUILDSTATE>.txt |
3787 | + # as: |
3788 | + # buildlog_ubuntu_dapper_i386_foo_1.0-ubuntu0_FULLYBUILT.txt |
3789 | + # This fixes the request from bug #30617.
3790 | + return ('buildlog_%s-%s-%s.%s_%s_%s.txt' % ( |
3791 | + distroname, distroseriesname, archname, sourcename, version, state |
3792 | + )) |
3793 | + |
3794 | + def getName(self): |
3795 | + """See `IBuildPackageJob`.""" |
3796 | + return self.build.sourcepackagerelease.name |
3797 | + |
3798 | + def jobStarted(self): |
3799 | + """See `IBuildPackageJob`.""" |
3800 | + self.build.buildstate = BuildStatus.BUILDING |
3801 | + # The build started, set the start time if not set already. |
3802 | + if self.build.date_first_dispatched is None: |
3803 | + self.build.date_first_dispatched = UTC_NOW |
3804 | + |
3805 | + def jobReset(self): |
3806 | + """See `IBuildPackageJob`.""" |
3807 | + self.build.buildstate = BuildStatus.NEEDSBUILD |
3808 | + |
3809 | + def jobAborted(self): |
3810 | + """See `IBuildPackageJob`.""" |
3811 | + # XXX, al-maisan, Thu, 12 Nov 2009 16:38:52 +0100 |
3812 | + # The setting below was "inherited" from the previous code. We |
3813 | + # need to investigate whether and why this is really needed and |
3814 | + # fix it. |
3815 | + self.build.buildstate = BuildStatus.BUILDING |
3816 | + |
3817 | |
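The `score()` method moved into `BuildPackageJob` above is purely additive over a handful of lookup tables. The arithmetic can be sketched standalone like this (illustrative function and parameter names; pockets, components, and urgencies are passed as plain strings here instead of the Launchpad enum items):

```python
SCORE_POCKET = {
    'BACKPORTS': 0, 'RELEASE': 1500, 'PROPOSED': 3000,
    'UPDATES': 3000, 'SECURITY': 4500,
}
SCORE_COMPONENT = {
    'multiverse': 0, 'universe': 250, 'restricted': 750,
    'main': 1000, 'partner': 1250,
}
SCORE_URGENCY = {'low': 5, 'medium': 10, 'high': 15, 'emergency': 20}
# Sorted (upper time limit in seconds, bonus) pairs; first match wins.
QUEUE_TIME_SCORES = [
    (14400, 100), (7200, 50), (3600, 20), (1800, 15), (900, 10), (300, 5),
]
PRIVATE_ARCHIVE_INCREMENT = 10000

def package_build_score(pocket, component, urgency, queued_seconds,
                        private=False, relative_build_score=0):
    """Combine the additive score components the way the model class does."""
    score = (SCORE_URGENCY[urgency] + SCORE_POCKET[pocket]
             + SCORE_COMPONENT[component])
    # Oldest matching bracket only: one queue-time bonus is applied.
    for limit, bonus in QUEUE_TIME_SCORES:
        if queued_seconds > limit:
            score += bonus
            break
    if private:
        score += PRIVATE_ARCHIVE_INCREMENT
    return score + relative_build_score
```

The real method additionally short-circuits to zero for `translations`-section uploads and to the rebuild-archive score for COPY archives, as the diff shows.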
3818 | === modified file 'lib/lp/soyuz/model/buildqueue.py' |
3819 | --- lib/lp/soyuz/model/buildqueue.py 2009-08-28 06:39:38 +0000 |
3820 | +++ lib/lp/soyuz/model/buildqueue.py 2009-11-16 23:27:14 +0000 |
3821 | @@ -10,27 +10,25 @@ |
3822 | 'BuildQueueSet' |
3823 | ] |
3824 | |
3825 | -from datetime import datetime |
3826 | import logging |
3827 | -import pytz |
3828 | |
3829 | from zope.component import getUtility |
3830 | from zope.interface import implements |
3831 | |
3832 | from sqlobject import ( |
3833 | StringCol, ForeignKey, BoolCol, IntCol, SQLObjectNotFound) |
3834 | -from storm.expr import In, LeftJoin |
3835 | +from storm.expr import In, Join, LeftJoin |
3836 | |
3837 | from canonical import encoding |
3838 | -from canonical.database.constants import UTC_NOW |
3839 | -from canonical.database.datetimecol import UtcDateTimeCol |
3840 | +from canonical.database.enumcol import EnumCol |
3841 | from canonical.database.sqlbase import SQLBase, sqlvalues |
3842 | from canonical.launchpad.webapp.interfaces import NotFoundError |
3843 | -from lp.registry.interfaces.sourcepackage import SourcePackageUrgency |
3844 | -from lp.soyuz.interfaces.archive import ArchivePurpose |
3845 | -from lp.soyuz.interfaces.build import BuildStatus |
3846 | +from lp.buildmaster.interfaces.buildfarmjob import BuildFarmJobType |
3847 | +from lp.services.job.interfaces.job import JobStatus |
3848 | +from lp.services.job.model.job import Job |
3849 | +from lp.soyuz.interfaces.build import BuildStatus, IBuildSet |
3850 | from lp.soyuz.interfaces.buildqueue import IBuildQueue, IBuildQueueSet |
3851 | -from lp.registry.interfaces.pocket import PackagePublishingPocket |
3852 | +from lp.soyuz.model.buildpackagejob import BuildPackageJob |
3853 | from canonical.launchpad.webapp.interfaces import ( |
3854 | IStoreSelector, MAIN_STORE, DEFAULT_FLAVOR) |
3855 | |
3856 | @@ -40,229 +38,85 @@ |
3857 | _table = "BuildQueue" |
3858 | _defaultOrder = "id" |
3859 | |
3860 | - build = ForeignKey(dbName='build', foreignKey='Build', notNull=True) |
3861 | + job = ForeignKey(dbName='job', foreignKey='Job', notNull=True) |
3862 | + job_type = EnumCol( |
3863 | + enum=BuildFarmJobType, notNull=True, |
3864 | + default=BuildFarmJobType.PACKAGEBUILD, dbName='job_type') |
3865 | builder = ForeignKey(dbName='builder', foreignKey='Builder', default=None) |
3866 | - created = UtcDateTimeCol(dbName='created', default=UTC_NOW) |
3867 | - buildstart = UtcDateTimeCol(dbName='buildstart', default= None) |
3868 | logtail = StringCol(dbName='logtail', default=None) |
3869 | lastscore = IntCol(dbName='lastscore', default=0) |
3870 | manual = BoolCol(dbName='manual', default=False) |
3871 | |
3872 | + @property |
3873 | + def specific_job(self): |
3874 | + """See `IBuildQueue`.""" |
3875 | + store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
3876 | + result_set = store.find( |
3877 | + BuildPackageJob, BuildPackageJob.job == self.job) |
3878 | + return result_set.one() |
3879 | + |
3880 | + @property |
3881 | + def date_started(self): |
3882 | + """See `IBuildQueue`.""" |
3883 | + return self.job.date_started |
3884 | + |
3885 | def manualScore(self, value): |
3886 | """See `IBuildQueue`.""" |
3887 | self.lastscore = value |
3888 | self.manual = True |
3889 | |
3890 | - @property |
3891 | - def archseries(self): |
3892 | - """See `IBuildQueue`.""" |
3893 | - return self.build.distroarchseries |
3894 | - |
3895 | - @property |
3896 | - def urgency(self): |
3897 | - """See `IBuildQueue`.""" |
3898 | - return self.build.sourcepackagerelease.urgency |
3899 | - |
3900 | - @property |
3901 | - def archhintlist(self): |
3902 | - """See `IBuildQueue`.""" |
3903 | - return self.build.sourcepackagerelease.architecturehintlist |
3904 | - |
3905 | - @property |
3906 | - def name(self): |
3907 | - """See `IBuildQueue`.""" |
3908 | - return self.build.sourcepackagerelease.name |
3909 | - |
3910 | - @property |
3911 | - def version(self): |
3912 | - """See `IBuildQueue`.""" |
3913 | - return self.build.sourcepackagerelease.version |
3914 | - |
3915 | - @property |
3916 | - def files(self): |
3917 | - """See `IBuildQueue`.""" |
3918 | - return self.build.sourcepackagerelease.files |
3919 | - |
3920 | - @property |
3921 | - def builddependsindep(self): |
3922 | - """See `IBuildQueue`.""" |
3923 | - return self.build.sourcepackagerelease.builddependsindep |
3924 | - |
3925 | - @property |
3926 | - def buildduration(self): |
3927 | - """See `IBuildQueue`.""" |
3928 | - if self.buildstart: |
3929 | - UTC = pytz.timezone('UTC') |
3930 | - now = datetime.now(UTC) |
3931 | - return now - self.buildstart |
3932 | - return None |
3933 | - |
3934 | - @property |
3935 | - def is_virtualized(self): |
3936 | - """See `IBuildQueue`.""" |
3937 | - return self.build.is_virtualized |
3938 | - |
3939 | def score(self): |
3940 | """See `IBuildQueue`.""" |
3941 | # Grab any logger instance available. |
3942 | logger = logging.getLogger() |
3943 | + name = self.specific_job.getName() |
3944 | |
3945 | if self.manual: |
3946 | logger.debug( |
3947 | - "%s (%d) MANUALLY RESCORED" % (self.name, self.lastscore)) |
3948 | + "%s (%d) MANUALLY RESCORED" % (name, self.lastscore)) |
3949 | return |
3950 | |
3951 | - # XXX Al-maisan, 2008-05-14 (bug #230330): |
3952 | - # We keep touching the code here whenever a modification to the |
3953 | - # scoring parameters/weights is needed. Maybe the latter can be |
3954 | - # externalized? |
3955 | - |
3956 | - score_pocketname = { |
3957 | - PackagePublishingPocket.BACKPORTS: 0, |
3958 | - PackagePublishingPocket.RELEASE: 1500, |
3959 | - PackagePublishingPocket.PROPOSED: 3000, |
3960 | - PackagePublishingPocket.UPDATES: 3000, |
3961 | - PackagePublishingPocket.SECURITY: 4500, |
3962 | - } |
3963 | - |
3964 | - score_componentname = { |
3965 | - 'multiverse': 0, |
3966 | - 'universe': 250, |
3967 | - 'restricted': 750, |
3968 | - 'main': 1000, |
3969 | - 'partner' : 1250, |
3970 | - } |
3971 | - |
3972 | - score_urgency = { |
3973 | - SourcePackageUrgency.LOW: 5, |
3974 | - SourcePackageUrgency.MEDIUM: 10, |
3975 | - SourcePackageUrgency.HIGH: 15, |
3976 | - SourcePackageUrgency.EMERGENCY: 20, |
3977 | - } |
3978 | - |
3979 | - # Define a table we'll use to calculate the score based on the time |
3980 | - # in the build queue. The table is a sorted list of (upper time |
3981 | - # limit in seconds, score) tuples. |
3982 | - queue_time_scores = [ |
3983 | - (14400, 100), |
3984 | - (7200, 50), |
3985 | - (3600, 20), |
3986 | - (1800, 15), |
3987 | - (900, 10), |
3988 | - (300, 5), |
3989 | - ] |
3990 | - |
3991 | - private_archive_increment = 10000 |
3992 | - |
3993 | - # For build jobs in rebuild archives a score value of -1 |
3994 | - # was chosen because their priority is lower than build retries |
3995 | - # or language-packs. They should be built only when there is |
3996 | - # nothing else to build. |
3997 | - rebuild_archive_score = -10 |
3998 | - |
3999 | - score = 0 |
4000 | - msg = "%s (%d) -> " % (self.build.title, self.lastscore) |
4001 | - |
4002 | - # Please note: the score for language packs is to be zero because |
4003 | - # they unduly delay the building of packages in the main component |
4004 | - # otherwise. |
4005 | - if self.build.sourcepackagerelease.section.name == 'translations': |
4006 | - msg += "LPack => score zero" |
4007 | - elif self.build.archive.purpose == ArchivePurpose.COPY: |
4008 | - score = rebuild_archive_score |
4009 | - msg += "Rebuild archive => -10" |
4010 | - else: |
4011 | - # Calculates the urgency-related part of the score. |
4012 | - urgency = score_urgency[self.urgency] |
4013 | - score += urgency |
4014 | - msg += "U+%d " % urgency |
4015 | - |
4016 | - # Calculates the pocket-related part of the score. |
4017 | - score_pocket = score_pocketname[self.build.pocket] |
4018 | - score += score_pocket |
4019 | - msg += "P+%d " % score_pocket |
4020 | - |
4021 | - # Calculates the component-related part of the score. |
4022 | - score_component = score_componentname[ |
4023 | - self.build.current_component.name] |
4024 | - score += score_component |
4025 | - msg += "C+%d " % score_component |
4026 | - |
4027 | - # Calculates the build queue time component of the score. |
4028 | - right_now = datetime.now(pytz.timezone('UTC')) |
4029 | - eta = right_now - self.created |
4030 | - for limit, dep_score in queue_time_scores: |
4031 | - if eta.seconds > limit: |
4032 | - score += dep_score |
4033 | - msg += "T+%d " % dep_score |
4034 | - break |
4035 | - else: |
4036 | - msg += "T+0 " |
4037 | - |
4038 | - # Private builds get uber score. |
4039 | - if self.build.archive.private: |
4040 | - score += private_archive_increment |
4041 | - |
4042 | - # Lastly, apply the archive score delta. This is to boost |
4043 | - # or retard build scores for any build in a particular |
4044 | - # archive. |
4045 | - score += self.build.archive.relative_build_score |
4046 | - |
4047 | - # Store current score value. |
4048 | - self.lastscore = score |
4049 | - |
4050 | - logger.debug("%s= %d" % (msg, self.lastscore)) |
4051 | + # Allow the `IBuildFarmJob` instance with the data/logic specific to |
4052 | + # the job at hand to calculate the score as appropriate. |
4053 | + self.lastscore = self.specific_job.score() |
4054 | |
4055 | def getLogFileName(self): |
4056 | """See `IBuildQueue`.""" |
4057 | - sourcename = self.build.sourcepackagerelease.name |
4058 | - version = self.build.sourcepackagerelease.version |
4059 | - # we rely on previous storage of current buildstate |
4060 | - # in the state handling methods. |
4061 | - state = self.build.buildstate.name |
4062 | - |
4063 | - dar = self.build.distroarchseries |
4064 | - distroname = dar.distroseries.distribution.name |
4065 | - distroseriesname = dar.distroseries.name |
4066 | - archname = dar.architecturetag |
4067 | - |
4068 | - # logfilename format: |
4069 | - # buildlog_<DISTRIBUTION>_<DISTROSeries>_<ARCHITECTURE>_\ |
4070 | - # <SOURCENAME>_<SOURCEVERSION>_<BUILDSTATE>.txt |
4071 | - # as: |
4072 | - # buildlog_ubuntu_dapper_i386_foo_1.0-ubuntu0_FULLYBUILT.txt |
4073 | - # it fix request from bug # 30617 |
4074 | - return ('buildlog_%s-%s-%s.%s_%s_%s.txt' % ( |
4075 | - distroname, distroseriesname, archname, sourcename, version, state |
4076 | - )) |
4077 | + # Allow the `IBuildFarmJob` instance with the data/logic specific to |
4078 | + # the job at hand to calculate the log file name as appropriate. |
4079 | + return self.specific_job.getLogFileName() |
4080 | |
4081 | def markAsBuilding(self, builder): |
4082 | """See `IBuildQueue`.""" |
4083 | self.builder = builder |
4084 | - self.buildstart = UTC_NOW |
4085 | - self.build.buildstate = BuildStatus.BUILDING |
4086 | - # The build started, set the start time if not set already. |
4087 | - if self.build.date_first_dispatched is None: |
4088 | - self.build.date_first_dispatched = UTC_NOW |
4089 | + if self.job.status != JobStatus.RUNNING: |
4090 | + self.job.start() |
4091 | + self.specific_job.jobStarted() |
4092 | |
4093 | def reset(self): |
4094 | """See `IBuildQueue`.""" |
4095 | self.builder = None |
4096 | - self.buildstart = None |
4097 | + if self.job.status != JobStatus.WAITING: |
4098 | + self.job.queue() |
4099 | + self.job.date_started = None |
4100 | + self.job.date_finished = None |
4101 | self.logtail = None |
4102 | - self.build.buildstate = BuildStatus.NEEDSBUILD |
4103 | + self.specific_job.jobReset() |
4104 | |
4105 | def updateBuild_IDLE(self, build_id, build_status, logtail, |
4106 | filemap, dependencies, logger): |
4107 | """See `IBuildQueue`.""" |
4108 | + build = getUtility(IBuildSet).getByQueueEntry(self) |
4109 | logger.warn( |
4110 | "Builder %s forgot about build %s -- resetting buildqueue record" |
4111 | - % (self.builder.url, self.build.title)) |
4112 | + % (self.builder.url, build.title)) |
4113 | self.reset() |
4114 | |
4115 | def updateBuild_BUILDING(self, build_id, build_status, |
4116 | logtail, filemap, dependencies, logger): |
4117 | """See `IBuildQueue`.""" |
4118 | + if self.job.status != JobStatus.RUNNING: |
4119 | + self.job.start() |
4120 | self.logtail = encoding.guess(str(logtail)) |
4121 | |
4122 | def updateBuild_ABORTING(self, buildid, build_status, |
4123 | @@ -275,8 +129,15 @@ |
4124 | """See `IBuildQueue`.""" |
4125 | self.builder.cleanSlave() |
4126 | self.builder = None |
4127 | - self.buildstart = None |
4128 | - self.build.buildstate = BuildStatus.BUILDING |
4129 | + if self.job.status != JobStatus.FAILED: |
4130 | + self.job.fail() |
4131 | + self.job.date_started = None |
4132 | + self.job.date_finished = None |
4133 | + self.specific_job.jobAborted() |
4134 | + |
4135 | + def setDateStarted(self, timestamp): |
4136 | + """See `IBuildQueue`.""" |
4137 | + self.job.date_started = timestamp |
4138 | |
4139 | |
4140 | class BuildQueueSet(object): |
4141 | @@ -311,7 +172,12 @@ |
4142 | |
4143 | def getActiveBuildJobs(self): |
4144 | """See `IBuildQueueSet`.""" |
4145 | - return BuildQueue.select('buildstart is not null') |
4146 | + store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
4147 | + result_set = store.find( |
4148 | + BuildQueue, |
4149 | + BuildQueue.job == Job.id, |
4150 | + Job.date_started != None) |
4151 | + return result_set |
4152 | |
4153 | def calculateCandidates(self, archseries): |
4154 | """See `IBuildQueueSet`.""" |
4155 | @@ -323,30 +189,37 @@ |
4156 | query = """ |
4157 | Build.distroarchseries IN %s AND |
4158 | Build.buildstate = %s AND |
4159 | - BuildQueue.build = build.id AND |
4160 | + BuildQueue.job_type = %s AND |
4161 | + BuildQueue.job = BuildPackageJob.job AND |
4162 | + BuildPackageJob.build = build.id AND |
4163 | BuildQueue.builder IS NULL |
4164 | - """ % sqlvalues(arch_ids, BuildStatus.NEEDSBUILD) |
4165 | + """ % sqlvalues( |
4166 | + arch_ids, BuildStatus.NEEDSBUILD, BuildFarmJobType.PACKAGEBUILD) |
4167 | |
4168 | candidates = BuildQueue.select( |
4169 | - query, clauseTables=['Build'], orderBy=['-BuildQueue.lastscore']) |
4170 | + query, clauseTables=['Build', 'BuildPackageJob'], |
4171 | + orderBy=['-BuildQueue.lastscore']) |
4172 | |
4173 | return candidates |
4174 | |
4175 | def getForBuilds(self, build_ids): |
4176 | """See `IBuildQueueSet`.""" |
4177 | # Avoid circular import problem. |
4178 | + from lp.soyuz.model.build import Build |
4179 | from lp.soyuz.model.builder import Builder |
4180 | |
4181 | store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
4182 | |
4183 | origin = ( |
4184 | - BuildQueue, |
4185 | + BuildPackageJob, |
4186 | + Join(BuildQueue, BuildPackageJob.job == BuildQueue.jobID), |
4187 | + Join(Build, BuildPackageJob.build == Build.id), |
4188 | LeftJoin( |
4189 | Builder, |
4190 | BuildQueue.builderID == Builder.id), |
4191 | ) |
4192 | result_set = store.using(*origin).find( |
4193 | - (BuildQueue, Builder), |
4194 | - In(BuildQueue.buildID, build_ids)) |
4195 | + (BuildQueue, Builder, BuildPackageJob), |
4196 | + In(Build.id, build_ids)) |
4197 | |
4198 | return result_set |
4199 | |
4200 | === modified file 'lib/lp/soyuz/model/queue.py' |
4201 | --- lib/lp/soyuz/model/queue.py 2009-10-26 11:51:40 +0000 |
4202 | +++ lib/lp/soyuz/model/queue.py 2009-11-16 23:27:14 +0000 |
4203 | @@ -1468,7 +1468,8 @@ |
4204 | published_sha1 = published_file.content.sha1 |
4205 | |
4206 | # Multiple orig(s) with the same content are fine. |
4207 | - if source_file.filetype == SourcePackageFileType.ORIG: |
4208 | + if source_file.filetype == ( |
4209 | + SourcePackageFileType.ORIG_TARBALL): |
4210 | if proposed_sha1 == published_sha1: |
4211 | continue |
4212 | raise QueueInconsistentStateError( |
4213 | |
4214 | === added file 'lib/lp/soyuz/model/sourcepackageformat.py' |
4215 | --- lib/lp/soyuz/model/sourcepackageformat.py 1970-01-01 00:00:00 +0000 |
4216 | +++ lib/lp/soyuz/model/sourcepackageformat.py 2009-11-16 23:27:14 +0000 |
4217 | @@ -0,0 +1,56 @@ |
4218 | +# Copyright 2009 Canonical Ltd. This software is licensed under the |
4219 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
4220 | + |
4221 | +__metaclass__ = type |
4222 | + |
4223 | +__all__ = [ |
4224 | + 'SourcePackageFormatSelection', |
4225 | + 'SourcePackageFormatSelectionSet', |
4226 | + ] |
4227 | + |
4228 | +from storm.locals import Storm, Int, Reference |
4229 | +from zope.component import getUtility |
4230 | +from zope.interface import implements |
4231 | + |
4232 | +from canonical.launchpad.webapp.interfaces import ( |
4233 | + IStoreSelector, MAIN_STORE, DEFAULT_FLAVOR, MASTER_FLAVOR) |
4234 | +from canonical.database.enumcol import DBEnum |
4235 | +from lp.soyuz.interfaces.sourcepackageformat import ( |
4236 | + ISourcePackageFormatSelection, ISourcePackageFormatSelectionSet, |
4237 | + SourcePackageFormat) |
4238 | + |
4239 | + |
4240 | +class SourcePackageFormatSelection(Storm): |
4241 | + """See ISourcePackageFormatSelection.""" |
4242 | + |
4243 | + implements(ISourcePackageFormatSelection) |
4244 | + |
4245 | + __storm_table__ = 'sourcepackageformatselection' |
4246 | + |
4247 | + id = Int(primary=True) |
4248 | + |
4249 | + distroseries_id = Int(name="distroseries") |
4250 | + distroseries = Reference(distroseries_id, 'DistroSeries.id') |
4251 | + |
4252 | + format = DBEnum(enum=SourcePackageFormat) |
4253 | + |
4254 | + |
4255 | +class SourcePackageFormatSelectionSet: |
4256 | + """See ISourcePackageFormatSelectionSet.""" |
4257 | + |
4258 | + implements(ISourcePackageFormatSelectionSet) |
4259 | + |
4260 | + def getBySeriesAndFormat(self, distroseries, format): |
4261 | + """See `ISourcePackageFormatSelection`.""" |
4262 | + return getUtility(IStoreSelector).get( |
4263 | + MAIN_STORE, DEFAULT_FLAVOR).find( |
4264 | + SourcePackageFormatSelection, distroseries=distroseries, |
4265 | + format=format).one() |
4266 | + |
4267 | + def add(self, distroseries, format): |
4268 | + """See `ISourcePackageFormatSelection`.""" |
4269 | + spfs = SourcePackageFormatSelection() |
4270 | + spfs.distroseries = distroseries |
4271 | + spfs.format = format |
4272 | + return getUtility(IStoreSelector).get(MAIN_STORE, MASTER_FLAVOR).add( |
4273 | + spfs) |
4274 | |
4275 | === modified file 'lib/lp/soyuz/model/sourcepackagerelease.py' |
4276 | --- lib/lp/soyuz/model/sourcepackagerelease.py 2009-11-07 09:50:37 +0000 |
4277 | +++ lib/lp/soyuz/model/sourcepackagerelease.py 2009-11-16 23:27:14 +0000 |
4278 | @@ -36,6 +36,7 @@ |
4279 | from lp.translations.interfaces.translationimportqueue import ( |
4280 | ITranslationImportQueue) |
4281 | from canonical.launchpad.webapp.interfaces import NotFoundError |
4282 | +from lp.archiveuploader.utils import determine_source_file_type |
4283 | from lp.soyuz.interfaces.archive import ( |
4284 | ArchivePurpose, IArchiveSet, MAIN_ARCHIVE_PURPOSES) |
4285 | from lp.soyuz.interfaces.build import BuildStatus |
4286 | @@ -52,7 +53,7 @@ |
4287 | from lp.soyuz.scripts.queue import QueueActionError |
4288 | from lp.registry.interfaces.person import validate_public_person |
4289 | from lp.registry.interfaces.sourcepackage import ( |
4290 | - SourcePackageFileType, SourcePackageType, SourcePackageUrgency) |
4291 | + SourcePackageType, SourcePackageUrgency) |
4292 | |
4293 | |
4294 | def _filter_ubuntu_translation_file(filename): |
4295 | @@ -271,19 +272,10 @@ |
4296 | |
4297 | def addFile(self, file): |
4298 | """See ISourcePackageRelease.""" |
4299 | - determined_filetype = None |
4300 | - if file.filename.endswith(".dsc"): |
4301 | - determined_filetype = SourcePackageFileType.DSC |
4302 | - elif file.filename.endswith(".orig.tar.gz"): |
4303 | - determined_filetype = SourcePackageFileType.ORIG |
4304 | - elif file.filename.endswith(".diff.gz"): |
4305 | - determined_filetype = SourcePackageFileType.DIFF |
4306 | - elif file.filename.endswith(".tar.gz"): |
4307 | - determined_filetype = SourcePackageFileType.TARBALL |
4308 | - |
4309 | - return SourcePackageReleaseFile(sourcepackagerelease=self, |
4310 | - filetype=determined_filetype, |
4311 | - libraryfile=file) |
4312 | + return SourcePackageReleaseFile( |
4313 | + sourcepackagerelease=self, |
4314 | + filetype=determine_source_file_type(file.filename), |
4315 | + libraryfile=file) |
4316 | |
4317 | def _getPackageSize(self): |
4318 | """Get the size total (in KB) of files comprising this package. |
4319 | |
4320 | === modified file 'lib/lp/soyuz/scripts/packagecopier.py' |
4321 | --- lib/lp/soyuz/scripts/packagecopier.py 2009-10-29 12:18:05 +0000 |
4322 | +++ lib/lp/soyuz/scripts/packagecopier.py 2009-11-16 23:27:14 +0000 |
4323 | @@ -36,6 +36,7 @@ |
4324 | ISourcePackagePublishingHistory, active_publishing_status) |
4325 | from lp.soyuz.interfaces.queue import ( |
4326 | IPackageUpload, IPackageUploadSet) |
4327 | +from lp.soyuz.interfaces.sourcepackageformat import SourcePackageFormat |
4328 | from lp.soyuz.scripts.ftpmasterbase import ( |
4329 | SoyuzScript, SoyuzScriptError) |
4330 | from lp.soyuz.scripts.processaccepted import ( |
4331 | @@ -356,6 +357,14 @@ |
4332 | "Cannot copy to an unsupported distribution: %s." % |
4333 | source.distroseries.distribution.name) |
4334 | |
4335 | + format = SourcePackageFormat.getTermByToken( |
4336 | + source.sourcepackagerelease.dsc_format).value |
4337 | + |
4338 | + if not series.isSourcePackageFormatPermitted(format): |
4339 | + raise CannotCopy( |
4340 | + "Source format '%s' not supported by target series %s." % |
4341 | + (source.sourcepackagerelease.dsc_format, series.name)) |
4342 | + |
4343 | if self.include_binaries: |
4344 | built_binaries = source.getBuiltBinaries() |
4345 | if len(built_binaries) == 0: |
4346 | |
4347 | === modified file 'lib/lp/soyuz/scripts/tests/test_copypackage.py' |
4348 | --- lib/lp/soyuz/scripts/tests/test_copypackage.py 2009-10-05 18:29:12 +0000 |
4349 | +++ lib/lp/soyuz/scripts/tests/test_copypackage.py 2009-11-16 23:27:14 +0000 |
4350 | @@ -37,6 +37,8 @@ |
4351 | PackagePublishingStatus, active_publishing_status) |
4352 | from lp.soyuz.interfaces.queue import ( |
4353 | PackageUploadCustomFormat, PackageUploadStatus) |
4354 | +from lp.soyuz.interfaces.sourcepackageformat import ( |
4355 | + ISourcePackageFormatSelectionSet, SourcePackageFormat) |
4356 | from lp.soyuz.model.publishing import ( |
4357 | SecureSourcePackagePublishingHistory, |
4358 | SecureBinaryPackagePublishingHistory) |
4359 | @@ -720,6 +722,32 @@ |
4360 | 'Cannot copy to an unsupported distribution: ubuntu.', |
4361 | copy_checker.checkCopy, source, series, pocket) |
4362 | |
4363 | + def test_checkCopy_respects_sourceformatselection(self): |
4364 | + # A source copy should be denied if the source's dsc_format is |
4365 | + # not permitted in the target series. |
4366 | + |
4367 | + # Get hoary, and configure it to accept 3.0 (quilt) uploads. |
4368 | + ubuntu = getUtility(IDistributionSet).getByName('ubuntu') |
4369 | + hoary = ubuntu.getSeries('hoary') |
4370 | + getUtility(ISourcePackageFormatSelectionSet).add( |
4371 | + hoary, SourcePackageFormat.FORMAT_3_0_QUILT) |
4372 | + |
4373 | + # Create a 3.0 (quilt) source. |
4374 | + source = self.test_publisher.getPubSource( |
4375 | + distroseries=hoary, dsc_format='3.0 (quilt)') |
4376 | + |
4377 | + archive = source.archive |
4378 | + series = ubuntu.getSeries('warty') |
4379 | + pocket = source.pocket |
4380 | + |
4381 | + # An attempt to copy the source to warty, which only supports |
4382 | + # 1.0 sources, is rejected. |
4383 | + copy_checker = CopyChecker(archive, include_binaries=True) |
4384 | + self.assertRaisesWithContent( |
4385 | + CannotCopy, |
4386 | + "Source format '3.0 (quilt)' not supported by target series " |
4387 | + "warty.", copy_checker.checkCopy, source, series, pocket) |
4388 | + |
4389 | def test_checkCopy_identifies_conflicting_copy_candidates(self): |
4390 | # checkCopy() is able to identify conflicting candidates within |
4391 | # the copy batch. |
4392 | |
4393 | === modified file 'lib/lp/soyuz/stories/soyuz/xx-build-record.txt' |
4394 | --- lib/lp/soyuz/stories/soyuz/xx-build-record.txt 2009-09-23 16:38:10 +0000 |
4395 | +++ lib/lp/soyuz/stories/soyuz/xx-build-record.txt 2009-11-16 23:27:14 +0000 |
4396 | @@ -23,8 +23,9 @@ |
4397 | >>> bob_builder.builderok = True |
4398 | |
4399 | # Set a known duration for the current job. |
4400 | - >>> in_progress_build = removeSecurityProxy( |
4401 | - ... bob_builder.currentjob.build) |
4402 | + >>> from lp.soyuz.interfaces.build import IBuildSet |
4403 | + >>> build2 = getUtility(IBuildSet).getByQueueEntry(bob_builder.currentjob) |
4404 | + >>> in_progress_build = removeSecurityProxy(build2) |
4405 | >>> one_minute = datetime.timedelta(seconds=60) |
4406 | >>> in_progress_build.estimated_build_duration = one_minute |
4407 | |
4408 | @@ -122,12 +123,8 @@ |
4409 | >>> login('foo.bar@canonical.com') |
4410 | >>> from canonical.database.constants import UTC_NOW |
4411 | >>> from lp.soyuz.interfaces.build import BuildStatus |
4412 | - >>> in_progress_build.buildstate = BuildStatus.NEEDSBUILD |
4413 | - >>> in_progress_build.buildqueue_record.buildstart = None |
4414 | - >>> in_progress_build.buildqueue_record.builder = None |
4415 | - >>> build.buildstate = BuildStatus.BUILDING |
4416 | - >>> build.buildqueue_record.buildstart = UTC_NOW |
4417 | - >>> build.buildqueue_record.builder = bob_builder |
4418 | + >>> in_progress_build.buildqueue_record.reset() |
4419 | + >>> build.buildqueue_record.markAsBuilding(bob_builder) |
4420 | >>> build.buildqueue_record.logtail = 'one line\nanother line' |
4421 | >>> logout() |
4422 | |
4423 | |
4424 | === modified file 'lib/lp/soyuz/templates/build-index.pt' |
4425 | --- lib/lp/soyuz/templates/build-index.pt 2009-09-17 12:08:45 +0000 |
4426 | +++ lib/lp/soyuz/templates/build-index.pt 2009-11-16 23:27:14 +0000 |
4427 | @@ -161,8 +161,8 @@ |
4428 | <tal:building condition="context/buildstate/enumvalue:BUILDING"> |
4429 | <li> |
4430 | Started |
4431 | - <span tal:attributes="title view/buildqueue/buildstart/fmt:datetime" |
4432 | - tal:content="view/buildqueue/buildstart/fmt:approximatedate" |
4433 | + <span tal:attributes="title view/buildqueue/job/date_started/fmt:datetime" |
4434 | + tal:content="view/buildqueue/job/date_started/fmt:approximatedate" |
4435 | >5 minutes ago</span> |
4436 | </li> |
4437 | </tal:building> |
4438 | |
4439 | === modified file 'lib/lp/soyuz/templates/builder-index.pt' |
4440 | --- lib/lp/soyuz/templates/builder-index.pt 2009-09-16 19:06:48 +0000 |
4441 | +++ lib/lp/soyuz/templates/builder-index.pt 2009-11-16 23:27:14 +0000 |
4442 | @@ -104,9 +104,14 @@ |
4443 | </tal:buildernok> |
4444 | </tal:no_job> |
4445 | |
4446 | + <tal:comment replace="nothing"> |
4447 | + In the very near future, 'job' will not just be a Build job. |
4448 | + The template needs to cope with that as and when new job types are |
4449 | + added. |
4450 | + </tal:comment> |
4451 | <tal:job condition="job"> |
4452 | <span class="sortkey" tal:content="job/id" /> |
4453 | - <tal:build define="build job/build"> |
4454 | + <tal:build define="build job/specific_job/build"> |
4455 | <tal:visible condition="build/required:launchpad.View"> |
4456 | <tal:icon replace="structure build/image:icon" /> |
4457 | Building |
4458 | @@ -153,10 +158,12 @@ |
4459 | |
4460 | <tal:job condition="job"> |
4461 | <p class="sprite">Started |
4462 | - <span tal:attributes="title job/buildstart/fmt:datetime" |
4463 | + <span tal:attributes="title job/job/date_started/fmt:datetime" |
4464 | tal:content="view/current_build_duration/fmt:exactduration" |
4465 | /> ago.</p> |
4466 | - <tal:visible condition="job/build/required:launchpad.View"> |
4467 | + <tal:visible |
4468 | + define="build job/specific_job/build" |
4469 | + condition="build/required:launchpad.View"> |
4470 | <tal:logtail condition="job/logtail"> |
4471 | <h3>Buildlog</h3> |
4472 | <div tal:content="structure job/logtail/fmt:text-to-html" |
4473 | |
4474 | === modified file 'lib/lp/soyuz/templates/builds-list.pt' |
4475 | --- lib/lp/soyuz/templates/builds-list.pt 2009-08-18 12:33:37 +0000 |
4476 | +++ lib/lp/soyuz/templates/builds-list.pt 2009-11-16 23:27:14 +0000 |
4477 | @@ -101,8 +101,8 @@ |
4478 | <tal:building condition="bq/builder"> |
4479 | Build started |
4480 | <span |
4481 | - tal:attributes="title bq/buildstart/fmt:datetime" |
4482 | - tal:content="bq/buildstart/fmt:displaydate" /> |
4483 | + tal:attributes="title bq/job/date_started/fmt:datetime" |
4484 | + tal:content="bq/job/date_started/fmt:displaydate" /> |
4485 | on |
4486 | <a tal:content="bq/builder/title" |
4487 | tal:attributes="href bq/builder/fmt:url"/> |
4488 | |
4489 | === modified file 'lib/lp/soyuz/tests/test_builder.py' |
4490 | --- lib/lp/soyuz/tests/test_builder.py 2009-11-11 10:43:07 +0000 |
4491 | +++ lib/lp/soyuz/tests/test_builder.py 2009-11-16 23:27:14 +0000 |
4492 | @@ -9,8 +9,8 @@ |
4493 | |
4494 | from canonical.testing import LaunchpadZopelessLayer |
4495 | from lp.soyuz.interfaces.archive import ArchivePurpose |
4496 | +from lp.soyuz.interfaces.build import BuildStatus, IBuildSet |
4497 | from lp.soyuz.interfaces.builder import IBuilderSet |
4498 | -from lp.soyuz.interfaces.build import BuildStatus |
4499 | from lp.soyuz.interfaces.publishing import PackagePublishingStatus |
4500 | from lp.soyuz.tests.test_publishing import SoyuzTestPublisher |
4501 | from lp.testing import TestCaseWithFactory |
4502 | @@ -77,14 +77,16 @@ |
4503 | |
4504 | # Asking frog to find a candidate should give us the joesppa build. |
4505 | next_job = self.frog_builder.findBuildCandidate() |
4506 | - self.assertEqual('joesppa', next_job.build.archive.name) |
4507 | + build = getUtility(IBuildSet).getByQueueEntry(next_job) |
4508 | + self.assertEqual('joesppa', build.archive.name) |
4509 | |
4510 | # If bob is in a failed state the joesppa build is still |
4511 | # returned. |
4512 | self.bob_builder.builderok = False |
4513 | self.bob_builder.manual = False |
4514 | next_job = self.frog_builder.findBuildCandidate() |
4515 | - self.assertEqual('joesppa', next_job.build.archive.name) |
4516 | + build = getUtility(IBuildSet).getByQueueEntry(next_job) |
4517 | + self.assertEqual('joesppa', build.archive.name) |
4518 | |
4519 | |
4520 | class TestFindBuildCandidatePPA(TestFindBuildCandidateBase): |
4521 | @@ -156,14 +158,16 @@ |
4522 | # A PPA cannot start a build if it would use 80% or more of the |
4523 | # builders. |
4524 | next_job = self.builder4.findBuildCandidate() |
4525 | - self.failIfEqual('joesppa', next_job.build.archive.name) |
4526 | + build = getUtility(IBuildSet).getByQueueEntry(next_job) |
4527 | + self.failIfEqual('joesppa', build.archive.name) |
4528 | |
4529 | def test_findBuildCandidate_first_build_finished(self): |
4530 | # When joe's first ppa build finishes, his fourth i386 build |
4531 | # will be the next build candidate. |
4532 | self.joe_builds[0].buildstate = BuildStatus.FAILEDTOBUILD |
4533 | next_job = self.builder4.findBuildCandidate() |
4534 | - self.failUnlessEqual('joesppa', next_job.build.archive.name) |
4535 | + build = getUtility(IBuildSet).getByQueueEntry(next_job) |
4536 | + self.failUnlessEqual('joesppa', build.archive.name) |
4537 | |
4538 | def test_findBuildCandidate_for_private_ppa(self): |
4539 | # If a ppa is private it will be able to have parallel builds |
4540 | @@ -171,7 +175,8 @@ |
4541 | self.ppa_joe.private = True |
4542 | self.ppa_joe.buildd_secret = 'sekrit' |
4543 | next_job = self.builder4.findBuildCandidate() |
4544 | - self.failUnlessEqual('joesppa', next_job.build.archive.name) |
4545 | + build = getUtility(IBuildSet).getByQueueEntry(next_job) |
4546 | + self.failUnlessEqual('joesppa', build.archive.name) |
4547 | |
4548 | |
4549 | class TestFindBuildCandidateDistroArchive(TestFindBuildCandidateBase): |
4550 | @@ -196,18 +201,18 @@ |
4551 | # arch. |
4552 | |
4553 | next_job = self.builder2.findBuildCandidate() |
4554 | - self.failUnlessEqual('primary', next_job.build.archive.name) |
4555 | - self.failUnlessEqual( |
4556 | - 'gedit', next_job.build.sourcepackagerelease.name) |
4557 | + build = getUtility(IBuildSet).getByQueueEntry(next_job) |
4558 | + self.failUnlessEqual('primary', build.archive.name) |
4559 | + self.failUnlessEqual('gedit', build.sourcepackagerelease.name) |
4560 | |
4561 | # Now even if we set the build building, we'll still get the |
4562 | # second non-ppa build for the same archive as the next candidate. |
4563 | - next_job.build.buildstate = BuildStatus.BUILDING |
4564 | - next_job.build.builder = self.builder2 |
4565 | + build.buildstate = BuildStatus.BUILDING |
4566 | + build.builder = self.builder2 |
4567 | next_job = self.builder2.findBuildCandidate() |
4568 | - self.failUnlessEqual('primary', next_job.build.archive.name) |
4569 | - self.failUnlessEqual( |
4570 | - 'firefox', next_job.build.sourcepackagerelease.name) |
4571 | + build = getUtility(IBuildSet).getByQueueEntry(next_job) |
4572 | + self.failUnlessEqual('primary', build.archive.name) |
4573 | + self.failUnlessEqual('firefox', build.sourcepackagerelease.name) |
4574 | |
4575 | def test_suite(): |
4576 | return unittest.TestLoader().loadTestsFromName(__name__) |
4577 | |
4578 | === modified file 'lib/lp/translations/tests/test_translations_to_review.py' |
4579 | --- lib/lp/translations/tests/test_translations_to_review.py 2009-08-19 16:48:17 +0000 |
4580 | +++ lib/lp/translations/tests/test_translations_to_review.py 2009-11-16 23:27:14 +0000 |
4581 | @@ -78,7 +78,7 @@ |
4582 | translator=self.person, translations=['bi'], |
4583 | date_updated=self.base_time) |
4584 | |
4585 | - later_time = self.base_time + timedelta(0, 0, 1) |
4586 | + later_time = self.base_time + timedelta(0, 3600) |
4587 | self.suggestion = removeSecurityProxy( |
4588 | self.factory.makeTranslationMessage( |
4589 | potmsgset=self.potmsgset, pofile=self.pofile, |
This is hopefully the second last of my Debian source format 3.0 branches. It has minimal 3.0-specific changes, as it is mainly refactoring to support the new file types and multiple formats.
Note that the sample data changes are all from schema changes in previous branches, with the exception of the 'INSERT INTO sourcepackageformatselection' lines which are new (they permit the 1.0 format in all series in the sample data).
The main changes are:
Making way for new files (new compression algorithms and completely new file types):
- Renamed SourcePackageFileType.ORIG to ORIG_TARBALL, and SPFT.TARBALL to NATIVE_TARBALL, as there will be other kinds of orig and tarball soon.
- Isolated extension matching on source files into determine_source_file_type.
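As a minimal sketch of what centralising the extension matching looks like: the suffix checks below mirror the ones previously inlined in SourcePackageRelease.addFile (visible in the sourcepackagerelease.py hunk above), with the renamed ORIG_TARBALL/NATIVE_TARBALL values. The enum here is a plain stand-in for the real SourcePackageFileType DBEnum, not the actual Launchpad implementation.

```python
from enum import Enum

class SourcePackageFileType(Enum):
    # Stand-in for the real DBEnum; values are illustrative only.
    DSC = 'dsc'
    ORIG_TARBALL = 'orig_tarball'
    DIFF = 'diff'
    NATIVE_TARBALL = 'native_tarball'

def determine_source_file_type(filename):
    """Return the file type for filename, or None if unrecognised."""
    # Ordering matters: '.orig.tar.gz' must be tested before the
    # generic '.tar.gz' suffix, or every orig would look native.
    if filename.endswith('.dsc'):
        return SourcePackageFileType.DSC
    elif filename.endswith('.orig.tar.gz'):
        return SourcePackageFileType.ORIG_TARBALL
    elif filename.endswith('.diff.gz'):
        return SourcePackageFileType.DIFF
    elif filename.endswith('.tar.gz'):
        return SourcePackageFileType.NATIVE_TARBALL
    return None
```

Keeping this in one helper means new suffixes (e.g. the bzip2/lzma tarballs that 3.0 formats allow) need only one new branch rather than edits in every call site.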
Adding multiple source format support to archiveuploader:
- Removed most of the checks from NU._check_binaryful_consistency, since we cannot make many assertions until we know the DSC format.
- Extended DSCFile.checkFiles to perform the checks that NU._check_binaryful_consistency used to have, but in a more extensible manner.
Forbidding uploads and copies of sources to series without appropriate format support:
- Added the SourcePackageFormat enum with 1.0, 3.0 (quilt) and 3.0 (native) formats.
- Added (I)SourcePackageFormatSelection, which uses a DB table added late in 3.1.10.
- Added DB permissions for SPFS.
- Added (I)DistroSeries.{permitSourcePackageFormat, isSourcePackageFormatPermitted} to allow manipulation and verification of allowed formats.
- Extended DS._copy_component_and_section_selections to also copy SourcePackageFormatSelections during initialiseFromParent.
- In CopyChecker.checkCopy, reject a copy if the target does not support the source format.
- Replace the self.format != "1.0" check in archiveuploader with a check that the format is supported by the target series.
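The copy-side gate can be sketched as follows. This is a simplified model of the check added to CopyChecker.checkCopy (see the packagecopier.py hunk above): the DistroSeries and CannotCopy classes here are bare stand-ins, and format strings replace the real SourcePackageFormat enum; the actual isSourcePackageFormatPermitted consults the SourcePackageFormatSelection table.

```python
class CannotCopy(Exception):
    """Raised when a copy candidate is rejected."""

class DistroSeries:
    # Stand-in: the real series looks its permitted formats up in the
    # SourcePackageFormatSelection table rather than an in-memory set.
    def __init__(self, name, permitted_formats):
        self.name = name
        self._permitted = set(permitted_formats)

    def isSourcePackageFormatPermitted(self, format):
        return format in self._permitted

def check_copy_format(series, dsc_format):
    """Reject a copy whose source format the target series forbids."""
    if not series.isSourcePackageFormatPermitted(dsc_format):
        raise CannotCopy(
            "Source format '%s' not supported by target series %s." %
            (dsc_format, series.name))

# Mirrors the test_checkCopy_respects_sourceformatselection scenario:
# hoary accepts 3.0 (quilt), warty only accepts 1.0.
hoary = DistroSeries('hoary', ['1.0', '3.0 (quilt)'])
warty = DistroSeries('warty', ['1.0'])
check_copy_format(hoary, '3.0 (quilt)')  # permitted: no exception
```

The same predicate backs the upload path, replacing archiveuploader's old hard-coded `self.format != "1.0"` rejection with a per-series decision.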