Merge lp:~jelmer/launchpad/506256-remove-popen-2 into lp:launchpad
Status: Merged
Approved by: Jelmer Vernooij
Approved revision: no longer in the source branch
Merged at revision: 11579
Proposed branch: lp:~jelmer/launchpad/506256-remove-popen-2
Merge into: lp:launchpad
Prerequisite: lp:~jelmer/launchpad/506256-remove-popen
Diff against target: 1585 lines (+374/-273), 27 files modified:
  database/schema/security.cfg (+2/-0)
  lib/lp/archiveuploader/dscfile.py (+0/-29)
  lib/lp/archiveuploader/nascentupload.py (+31/-16)
  lib/lp/archiveuploader/nascentuploadfile.py (+59/-35)
  lib/lp/archiveuploader/tests/__init__.py (+5/-7)
  lib/lp/archiveuploader/tests/nascentupload.txt (+4/-5)
  lib/lp/archiveuploader/tests/test_buildduploads.py (+7/-8)
  lib/lp/archiveuploader/tests/test_nascentuploadfile.py (+61/-0)
  lib/lp/archiveuploader/tests/test_ppauploadprocessor.py (+11/-12)
  lib/lp/archiveuploader/tests/test_recipeuploads.py (+8/-12)
  lib/lp/archiveuploader/tests/test_uploadprocessor.py (+99/-30)
  lib/lp/archiveuploader/tests/uploadpolicy.txt (+1/-8)
  lib/lp/archiveuploader/uploadpolicy.py (+12/-14)
  lib/lp/archiveuploader/uploadprocessor.py (+16/-19)
  lib/lp/buildmaster/interfaces/packagebuild.py (+8/-4)
  lib/lp/buildmaster/model/packagebuild.py (+11/-2)
  lib/lp/buildmaster/tests/test_packagebuild.py (+1/-1)
  lib/lp/code/configure.zcml (+1/-5)
  lib/lp/code/model/sourcepackagerecipebuild.py (+4/-29)
  lib/lp/code/model/tests/test_sourcepackagerecipebuild.py (+6/-0)
  lib/lp/soyuz/doc/build-failedtoupload-workflow.txt (+2/-3)
  lib/lp/soyuz/doc/buildd-slavescanner.txt (+0/-3)
  lib/lp/soyuz/doc/distroseriesqueue-translations.txt (+3/-5)
  lib/lp/soyuz/doc/soyuz-set-of-uploads.txt (+3/-20)
  lib/lp/soyuz/model/binarypackagebuild.py (+4/-0)
  lib/lp/soyuz/scripts/soyuz_process_upload.py (+6/-6)
  lib/lp/soyuz/tests/test_binarypackagebuild.py (+9/-0)
To merge this branch: bzr merge lp:~jelmer/launchpad/506256-remove-popen-2
Related bugs:
Reviewer: Michael Nelson (community), review type: code, status: Approve
Review via email: mp+35412@code.launchpad.net
This proposal supersedes a proposal from 2010-09-14.
Commit message
Fix handling of source recipe builds when processing build uploads asynchronously.
Description of the change
This MP actually has two prerequisites, both of which have been approved but not yet landed: lp:~jelmer/launchpad/506256-remove-popen and lp:~jelmer/launchpad/archiveuploader-build-handling. Since I can only set one as the prerequisite, I've set the first, as it is the bigger of the two.
A bit of background: this branch is a follow-up to earlier work I did to make it possible for the builddmaster to no longer popen("
This branch fixes source package recipe build processing in the separate upload processor.
It does the following things:
* The separate upload policy for source package recipe builds has been merged into the overall buildd upload policy.
* Relatedly, getUploader() now lives on the build class rather than on the upload policy (it differs between binary package builds and recipe builds).
* The buildqueue record is cleaned up earlier so we don't keep the builder busy.
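The second bullet can be sketched roughly as follows. This is an illustrative stand-in, not the real Launchpad models: the `Changes` and `uploader_for` helpers are invented for the sketch, and the assumption that a recipe build's uploader is its requester is mine (the branch's diff only shows that `build.getUploader(changes)` is called).

```python
class Changes:
    """Stand-in for a parsed .changes file."""

    def __init__(self, signer):
        self.signer = signer


class BinaryPackageBuild:
    # For ordinary binary builds the uploader is the changes file signer.
    def getUploader(self, changes):
        return changes.signer


class SourcePackageRecipeBuild:
    # Recipe build uploads are unsigned; assume the recipe requester
    # counts as the uploader (an assumption for this sketch).
    def __init__(self, requester):
        self.requester = requester

    def getUploader(self, changes):
        return self.requester


def uploader_for(build, changes):
    # The upload processor no longer needs to know which kind of build
    # it is handling; each build class answers for itself.
    if build is not None:
        return build.getUploader(changes)
    return changes.signer
```

The point of the move is that the buildd policy stays generic while the per-build-type difference lives next to the build model it describes.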
To test:
./bin/test lp.buildmaster
./bin/test lp.archiveuploader
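The third bullet above (cleaning up the buildqueue record earlier) can be illustrated with a toy model. None of these classes are Launchpad's actual BuildQueue or Builder implementations; only the `destroySelf` name is borrowed from the diff, and the timing argument is the sketch's point.

```python
class Builder:
    """Toy builder that is busy while a queue record points at it."""

    def __init__(self):
        self.current_job = None

    @property
    def busy(self):
        return self.current_job is not None


class BuildQueueRecord:
    """Toy queue entry tying a build to a builder."""

    def __init__(self, builder):
        self.builder = builder
        builder.current_job = self

    def destroySelf(self):
        # Deleting the record releases the builder immediately.
        self.builder.current_job = None


def handle_upload(queue_record, process_upload):
    # Previously the record survived until the whole upload had been
    # processed; destroying it first frees the builder sooner.
    queue_record.destroySelf()
    process_upload()
```

With upload processing now asynchronous, the builder would otherwise sit idle but "busy" for the duration of the upload run.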
Jelmer Vernooij (jelmer) wrote:
Michael Nelson (michael.nelson) wrote:
Hi Jelmer,
r=me, assuming you check the following:
* To check: the DB permissions (can we get rid of DELETE perms elsewhere?)
* To fix: I don't see why we need to use removeSecurityProxy in your test instead of a small update to the factory method.
* To check: regarding the actual diff, my local db-devel seems to already have some of these changes.
Thanks!
> === modified file 'database/
> --- database/
> +++ database/
> @@ -1130,9 +1130,12 @@
> public.packagebuild = SELECT, INSERT, UPDATE
> public.
> public.
> -public.buildqueue = SELECT, INSERT, UPDATE
> -public.job = SELECT, INSERT, UPDATE
> -public.
> +public.
> +public.
> +public.buildqueue = SELECT, INSERT, UPDATE, DELETE
> +public.job = SELECT, INSERT, UPDATE, DELETE
> +public.
Right - so this is so the buildqueue record can be cleaned up earlier by the
uploader. Nice.
Should we be able to remove some DELETE perms elsewhere?
> +public.builder = SELECT
>
> # Thusly the librarian
> public.
>
> === modified file 'lib/lp/
> --- lib/lp/
> +++ lib/lp/
> @@ -498,10 +498,13 @@
> if self.binaryful:
> return
>
> - # Set up some convenient shortcut variables.
> -
> - uploader = self.policy.
> - archive = self.policy.archive
> + # The build can have an explicit uploader, which may be different
> + # from the changes file signer. (i.e in case of daily source package
> + # builds)
> + if build is not None:
> + uploader = build.getUpload
> + else:
> + uploader = self.changes.signer
This seems strange. Do we not need archive any more? In db-devel it's used
straight afterwards, but I'm assuming you've changed that code in a previous
branch. Checking your full diff shows that is the case (you're using
policy.archive).
>
> # If we have no signer, there's no ACL we can apply.
> if uploader is None:
>
> === modified file 'lib/lp/
> --- lib/lp/
> +++ lib/lp/
> @@ -1884,7 +1884,7 @@
> self.assertLogC
> "Unable to find package build job with id 42. Skipping.")
>
> - def testNoFiles(self):
> + def testBinaryPacka
>...
Jelmer Vernooij (jelmer) wrote:
=== modified file 'database/
--- database/
+++ database/
@@ -1130,6 +1130,8 @@
public.
public.
public.
+public.
+public.
public.buildqueue = SELECT, INSERT, UPDATE
public.job = SELECT, INSERT, UPDATE
public.
=== modified file 'lib/lp/
--- lib/lp/
+++ lib/lp/
@@ -498,10 +498,13 @@
if self.binaryful:
return
- # Set up some convenient shortcut variables.
-
- uploader = self.policy.
- archive = self.policy.archive
+ # The build can have an explicit uploader, which may be different
+ # from the changes file signer. (i.e in case of daily source package
+ # builds)
+ if build is not None:
+ uploader = build.getUpload
+ else:
+ uploader = self.changes.signer
# If we have no signer, there's no ACL we can apply.
if uploader is None:
=== modified file 'lib/lp/
--- lib/lp/
+++ lib/lp/
@@ -42,7 +42,7 @@
- self.options.
+ self.options.
=== modified file 'lib/lp/
--- lib/lp/
+++ lib/lp/
@@ -18,6 +18,7 @@
import tempfile
import traceback
+from storm.locals import Store
from zope.component import (
getGlobalS
getUtility,
@@ -1884,7 +1885,7 @@
- def testNoFiles(self):
+ def testBinaryPacka
# If the upload directory is empty, the upload
# will fail.
@@ -1908,6 +1909,8 @@
# Upload and accept a binary for the primary archive source.
+
+ # Commit so the build cookie has the right ids.
leaf_name = build.getUpload
@@ -1928,7 +1931,7 @@
in log_contents)
- def t...
Preview Diff
=== modified file 'database/schema/security.cfg'
--- database/schema/security.cfg	2010-09-10 02:46:28 +0000
+++ database/schema/security.cfg	2010-09-17 06:08:57 +0000
@@ -1130,6 +1130,8 @@
 public.packagebuild = SELECT, INSERT, UPDATE
 public.binarypackagebuild = SELECT, INSERT, UPDATE
 public.sourcepackagerecipebuild = SELECT, UPDATE
+public.sourcepackagerecipebuildjob = SELECT, UPDATE
+public.sourcepackagerecipe = SELECT, UPDATE
 public.buildqueue = SELECT, INSERT, UPDATE
 public.job = SELECT, INSERT, UPDATE
 public.buildpackagejob = SELECT, INSERT, UPDATE
 
=== modified file 'lib/lp/archiveuploader/dscfile.py'
--- lib/lp/archiveuploader/dscfile.py	2010-09-09 17:02:33 +0000
+++ lib/lp/archiveuploader/dscfile.py	2010-09-17 06:08:57 +0000
@@ -630,35 +630,6 @@
             cleanup_unpacked_dir(unpacked_dir)
         self.logger.debug("Done")
 
-    def findBuild(self):
-        """Find and return the SourcePackageRecipeBuild, if one is specified.
-
-        If by any chance an inconsistent build was found this method will
-        raise UploadError resulting in a upload rejection.
-        """
-        build_id = getattr(self.policy.options, 'buildid', None)
-        if build_id is None:
-            return None
-
-        build = getUtility(ISourcePackageRecipeBuildSource).getById(build_id)
-
-        # The master verifies the status to confirm successful upload.
-        build.status = BuildStatus.FULLYBUILT
-        # If this upload is successful, any existing log is wrong and
-        # unuseful.
-        build.upload_log = None
-
-        # Sanity check; raise an error if the build we've been
-        # told to link to makes no sense.
-        if (build.pocket != self.policy.pocket or
-            build.distroseries != self.policy.distroseries or
-            build.archive != self.policy.archive):
-            raise UploadError(
-                "Attempt to upload source specifying "
-                "recipe build %s, where it doesn't fit." % build.id)
-
-        return build
-
     def storeInDatabase(self, build):
         """Store DSC information as a SourcePackageRelease record.
 
=== modified file 'lib/lp/archiveuploader/nascentupload.py'
--- lib/lp/archiveuploader/nascentupload.py	2010-08-27 14:27:22 +0000
+++ lib/lp/archiveuploader/nascentupload.py	2010-09-17 06:08:57 +0000
@@ -137,7 +137,7 @@
             raise FatalUploadError(str(e))
         return cls(changesfile, policy, logger)
 
-    def process(self):
+    def process(self, build=None):
        """Process this upload, checking it against policy, loading it into
        the database if it seems okay.
 
@@ -200,7 +200,7 @@
         self.overrideArchive()
 
         # Check upload rights for the signer of the upload.
-        self.verify_acl()
+        self.verify_acl(build)
 
         # Perform policy checks.
         policy.checkUpload(self)
@@ -483,7 +483,7 @@
     #
     # Signature and ACL stuff
     #
-    def verify_acl(self):
+    def verify_acl(self, build=None):
         """Check the signer's upload rights.
 
        The signer must have permission to upload to either the component
@@ -498,10 +498,13 @@
         if self.binaryful:
             return
 
-        # Set up some convenient shortcut variables.
-
-        uploader = self.policy.getUploader(self.changes)
-        archive = self.policy.archive
+        # The build can have an explicit uploader, which may be different
+        # from the changes file signer. (i.e in case of daily source package
+        # builds)
+        if build is not None:
+            uploader = build.getUploader(self.changes)
+        else:
+            uploader = self.changes.signer
 
         # If we have no signer, there's no ACL we can apply.
         if uploader is None:
@@ -511,7 +514,7 @@
         source_name = getUtility(
             ISourcePackageNameSet).queryByName(self.changes.dsc.package)
 
-        rejection_reason = archive.checkUpload(
+        rejection_reason = self.policy.archive.checkUpload(
             uploader, self.policy.distroseries, source_name,
             self.changes.dsc.component, self.policy.pocket, not self.is_new)
 
@@ -824,7 +827,7 @@
     #
     # Actually processing accepted or rejected uploads -- and mailing people
     #
-    def do_accept(self, notify=True):
+    def do_accept(self, notify=True, build=None):
         """Accept the upload into the queue.
 
        This *MAY* in extreme cases cause a database error and thus
@@ -834,13 +837,14 @@
        constraint.
 
        :param notify: True to send an email, False to not send one.
+       :param build: The build associated with this upload.
        """
         if self.is_rejected:
             self.reject("Alas, someone called do_accept when we're rejected")
             self.do_reject(notify)
             return False
         try:
-            self.storeObjectsInDatabase()
+            self.storeObjectsInDatabase(build=build)
 
             # Send the email.
             # There is also a small corner case here where the DB transaction
@@ -923,7 +927,7 @@
     #
     # Inserting stuff in the database
     #
-    def storeObjectsInDatabase(self):
+    def storeObjectsInDatabase(self, build=None):
         """Insert this nascent upload into the database."""
 
         # Queue entries are created in the NEW state by default; at the
@@ -939,7 +943,8 @@
         sourcepackagerelease = None
         if self.sourceful:
             assert self.changes.dsc, "Sourceful upload lacks DSC."
-            build = self.changes.dsc.findBuild()
+            if build is not None:
+                self.changes.dsc.checkBuild(build)
             sourcepackagerelease = self.changes.dsc.storeInDatabase(build)
             package_upload_source = self.queue_root.addSource(
                 sourcepackagerelease)
@@ -980,11 +985,21 @@
             sourcepackagerelease = (
                 binary_package_file.findSourcePackageRelease())
 
-            build = binary_package_file.findBuild(sourcepackagerelease)
-            assert self.queue_root.pocket == build.pocket, (
+            # Find the build for this particular binary package file.
+            if build is None:
+                bpf_build = binary_package_file.findBuild(
+                    sourcepackagerelease)
+            else:
+                bpf_build = build
+            if bpf_build.source_package_release != sourcepackagerelease:
+                raise AssertionError(
+                    "Attempt to upload binaries specifying build %s, "
+                    "where they don't fit." % bpf_build.id)
+            binary_package_file.checkBuild(bpf_build)
+            assert self.queue_root.pocket == bpf_build.pocket, (
                 "Binary was not build for the claimed pocket.")
-            binary_package_file.storeInDatabase(build)
-            processed_builds.append(build)
+            binary_package_file.storeInDatabase(bpf_build)
+            processed_builds.append(bpf_build)
 
             # Store the related builds after verifying they were built
             # from the same source.
 
=== modified file 'lib/lp/archiveuploader/nascentuploadfile.py'
--- lib/lp/archiveuploader/nascentuploadfile.py	2010-09-02 16:28:50 +0000
+++ lib/lp/archiveuploader/nascentuploadfile.py	2010-09-17 06:08:57 +0000
@@ -33,6 +33,7 @@
 from canonical.encoding import guess as guess_encoding
 from canonical.launchpad.interfaces.librarian import ILibraryFileAliasSet
 from canonical.librarian.utils import filechunks
+from lp.app.errors import NotFoundError
 from lp.archiveuploader.utils import (
     determine_source_file_type,
     prefix_multi_line_string,
@@ -52,7 +53,6 @@
     PackageUploadCustomFormat,
     PackageUploadStatus,
     )
-from lp.soyuz.interfaces.binarypackagebuild import IBinaryPackageBuildSet
 from lp.soyuz.interfaces.binarypackagename import IBinaryPackageNameSet
 from lp.soyuz.interfaces.component import IComponentSet
 from lp.soyuz.interfaces.section import ISectionSet
@@ -338,6 +338,13 @@
         """Return an ISection for self.section_name."""
         return getUtility(ISectionSet)[self.section_name]
 
+    def checkBuild(self, build):
+        """Check the status of the build this file is part of.
+
+        :param build: an `IPackageBuild` instance
+        """
+        raise NotImplementedError(self.checkBuild)
+
     def extractUserDefinedFields(self, control):
         """Extract the user defined fields out of a control file list.
         """
@@ -381,6 +388,23 @@
             yield UploadError("%s: should be %s according to changes file."
                 % (filename_version, version_chopped))
 
+    def checkBuild(self, build):
+        """See PackageUploadFile."""
+        # The master verifies the status to confirm successful upload.
+        build.status = BuildStatus.FULLYBUILT
+        # If this upload is successful, any existing log is wrong and
+        # unuseful.
+        build.upload_log = None
+
+        # Sanity check; raise an error if the build we've been
+        # told to link to makes no sense.
+        if (build.pocket != self.policy.pocket or
+            build.distroseries != self.policy.distroseries or
+            build.archive != self.policy.archive):
+            raise UploadError(
+                "Attempt to upload source specifying "
+                "recipe build %s, where it doesn't fit." % build.id)
+
 
 class BaseBinaryUploadFile(PackageUploadFile):
     """Base methods for binary upload modeling."""
@@ -834,52 +858,52 @@
        in this case, change this build to be FULLYBUILT.
        - Create a new build in FULLYBUILT status.
 
-       If by any chance an inconsistent build was found this method will
-       raise UploadError resulting in a upload rejection.
        """
-        build_id = getattr(self.policy.options, 'buildid', None)
         dar = self.policy.distroseries[self.archtag]
 
-        if build_id is None:
-            # Check if there's a suitable existing build.
-            build = sourcepackagerelease.getBuildByArch(
-                dar, self.policy.archive)
-            if build is not None:
-                build.status = BuildStatus.FULLYBUILT
-                self.logger.debug("Updating build for %s: %s" % (
-                    dar.architecturetag, build.id))
-            else:
-                # No luck. Make one.
-                # Usually happen for security binary uploads.
-                build = sourcepackagerelease.createBuild(
-                    dar, self.policy.pocket, self.policy.archive,
-                    status=BuildStatus.FULLYBUILT)
-                self.logger.debug("Build %s created" % build.id)
-        else:
-            build = getUtility(IBinaryPackageBuildSet).getByBuildID(build_id)
-            self.logger.debug("Build %s found" % build.id)
-            # Ensure gathered binary is related to a FULLYBUILT build
-            # record. It will be check in slave-scanner procedure to
-            # certify that the build was processed correctly.
-            build.status = BuildStatus.FULLYBUILT
-            # Also purge any previous failed upload_log stored, so its
-            # content can be garbage-collected since it's not useful
-            # anymore.
-            build.upload_log = None
+        # Check if there's a suitable existing build.
+        build = sourcepackagerelease.getBuildByArch(
+            dar, self.policy.archive)
+        if build is not None:
+            build.status = BuildStatus.FULLYBUILT
+            self.logger.debug("Updating build for %s: %s" % (
+                dar.architecturetag, build.id))
+        else:
+            # No luck. Make one.
+            # Usually happen for security binary uploads.
+            build = sourcepackagerelease.createBuild(
+                dar, self.policy.pocket, self.policy.archive,
+                status=BuildStatus.FULLYBUILT)
+            self.logger.debug("Build %s created" % build.id)
+        return build
+
+    def checkBuild(self, build):
+        """See PackageUploadFile."""
+        try:
+            dar = self.policy.distroseries[self.archtag]
+        except NotFoundError:
+            raise UploadError(
+                "Upload to unknown architecture %s for distroseries %s" %
+                (self.archtag, self.policy.distroseries))
+
+        # Ensure gathered binary is related to a FULLYBUILT build
+        # record. It will be check in slave-scanner procedure to
+        # certify that the build was processed correctly.
+        build.status = BuildStatus.FULLYBUILT
+        # Also purge any previous failed upload_log stored, so its
+        # content can be garbage-collected since it's not useful
+        # anymore.
+        build.upload_log = None
 
         # Sanity check; raise an error if the build we've been
-        # told to link to makes no sense (ie. is not for the right
-        # source package).
-        if (build.source_package_release != sourcepackagerelease or
-            build.pocket != self.policy.pocket or
+        # told to link to makes no sense.
+        if (build.pocket != self.policy.pocket or
             build.distro_arch_series != dar or
             build.archive != self.policy.archive):
             raise UploadError(
                 "Attempt to upload binaries specifying "
                 "build %s, where they don't fit." % build.id)
 
-        return build
-
     def storeInDatabase(self, build):
         """Insert this binary release and build into the database."""
         # Reencode everything we are supplying, because old packages
 
=== modified file 'lib/lp/archiveuploader/tests/__init__.py'
--- lib/lp/archiveuploader/tests/__init__.py	2010-08-26 20:08:43 +0000
+++ lib/lp/archiveuploader/tests/__init__.py	2010-09-17 06:08:57 +0000
@@ -64,17 +64,15 @@
 class MockUploadOptions:
     """Mock upload policy options helper"""
 
-    def __init__(self, distro='ubuntutest', distroseries=None, buildid=None):
+    def __init__(self, distro='ubuntutest', distroseries=None):
         self.distro = distro
         self.distroseries = distroseries
-        self.buildid = buildid
 
 
-def getPolicy(name='anything', distro='ubuntu', distroseries=None,
-              buildid=None):
+def getPolicy(name='anything', distro='ubuntu', distroseries=None):
     """Build and return an Upload Policy for the given context."""
     policy = findPolicyByName(name)
-    options = MockUploadOptions(distro, distroseries, buildid)
+    options = MockUploadOptions(distro, distroseries)
     policy.setOptions(options)
     return policy
 
=== modified file 'lib/lp/archiveuploader/tests/nascentupload.txt'
--- lib/lp/archiveuploader/tests/nascentupload.txt	2010-08-26 15:28:34 +0000
+++ lib/lp/archiveuploader/tests/nascentupload.txt	2010-09-17 06:08:57 +0000
@@ -27,7 +27,7 @@
     ...     datadir, getPolicy, mock_logger, mock_logger_quiet)
 
     >>> buildd_policy = getPolicy(
-    ...     name='buildd', distro='ubuntu', distroseries='hoary', buildid=1)
+    ...     name='buildd', distro='ubuntu', distroseries='hoary')
 
     >>> sync_policy = getPolicy(
     ...     name='sync', distro='ubuntu', distroseries='hoary')
@@ -216,7 +216,7 @@
     # Use the buildd policy as it accepts unsigned changes files and binary
     # uploads.
     >>> modified_buildd_policy = getPolicy(
-    ...     name='buildd', distro='ubuntu', distroseries='hoary', buildid=1)
+    ...     name='buildd', distro='ubuntu', distroseries='hoary')
 
     >>> ed_mismatched_upload = NascentUpload.from_changesfile_path(
     ...     datadir("ed_0.2-20_i386.changes.mismatched-arch-unsigned"),
@@ -640,13 +640,12 @@
 the 'buildd' upload policy and the build record id.
 
     >>> buildd_policy = getPolicy(
-    ...     name='buildd', distro='ubuntu', distroseries='hoary',
-    ...     buildid=multibar_build.id)
+    ...     name='buildd', distro='ubuntu', distroseries='hoary')
 
     >>> multibar_bin_upload = NascentUpload.from_changesfile_path(
     ...     datadir('suite/multibar_1.0-1/multibar_1.0-1_i386.changes'),
     ...     buildd_policy, mock_logger_quiet)
-    >>> multibar_bin_upload.process()
+    >>> multibar_bin_upload.process(build=multibar_build)
     >>> success = multibar_bin_upload.do_accept()
 
 Now that we have successfully processed the binaries coming from a
 
=== modified file 'lib/lp/archiveuploader/tests/test_buildduploads.py'
--- lib/lp/archiveuploader/tests/test_buildduploads.py	2010-08-26 15:28:34 +0000
+++ lib/lp/archiveuploader/tests/test_buildduploads.py	2010-09-17 06:08:57 +0000
@@ -112,7 +112,7 @@
         # Store source queue item for future use.
         self.source_queue = queue_item
 
-    def _uploadBinary(self, archtag):
+    def _uploadBinary(self, archtag, build):
         """Upload the base binary.
 
         Ensure it got processed and has a respective queue record.
@@ -121,7 +121,7 @@
         self._prepareUpload(self.binary_dir)
         self.uploadprocessor.processChangesFile(
             os.path.join(self.queue_folder, "incoming", self.binary_dir),
-            self.getBinaryChangesfileFor(archtag))
+            self.getBinaryChangesfileFor(archtag), build=build)
         queue_item = self.uploadprocessor.last_processed_upload.queue_root
         self.assertTrue(
             queue_item is not None,
@@ -205,10 +205,9 @@
         pubrec.datepublished = UTC_NOW
         queue_item.setDone()
 
-    def _setupUploadProcessorForBuild(self, build_candidate):
+    def _setupUploadProcessorForBuild(self):
         """Setup an UploadProcessor instance for a given buildd context."""
         self.options.context = self.policy
-        self.options.buildid = str(build_candidate.id)
         self.uploadprocessor = self.getUploadProcessor(
             self.layer.txn)
 
@@ -223,8 +222,8 @@
         """
         # Upload i386 binary.
         build_candidate = self._createBuild('i386')
-        self._setupUploadProcessorForBuild(build_candidate)
-        build_used = self._uploadBinary('i386')
+        self._setupUploadProcessorForBuild()
+        build_used = self._uploadBinary('i386', build_candidate)
 
         self.assertEqual(build_used.id, build_candidate.id)
         self.assertBuildsCreated(1)
@@ -239,8 +238,8 @@
 
         # Upload powerpc binary
         build_candidate = self._createBuild('powerpc')
-        self._setupUploadProcessorForBuild(build_candidate)
-        build_used = self._uploadBinary('powerpc')
+        self._setupUploadProcessorForBuild()
+        build_used = self._uploadBinary('powerpc', build_candidate)
 
         self.assertEqual(build_used.id, build_candidate.id)
         self.assertBuildsCreated(2)
 
=== modified file 'lib/lp/archiveuploader/tests/test_nascentuploadfile.py'
--- lib/lp/archiveuploader/tests/test_nascentuploadfile.py	2010-09-03 06:06:40 +0000
+++ lib/lp/archiveuploader/tests/test_nascentuploadfile.py	2010-09-17 06:08:57 +0000
@@ -20,8 +20,11 @@
 from lp.archiveuploader.nascentuploadfile import (
     CustomUploadFile,
     DebBinaryUploadFile,
+    UploadError,
     )
+from lp.registry.interfaces.pocket import PackagePublishingPocket
 from lp.archiveuploader.tests import AbsolutelyAnythingGoesUploadPolicy
+from lp.buildmaster.enums import BuildStatus
 from lp.soyuz.enums import PackageUploadCustomFormat
 from lp.testing import TestCaseWithFactory
 
@@ -34,6 +37,7 @@
         self.logger = BufferLogger()
         self.policy = AbsolutelyAnythingGoesUploadPolicy()
         self.distro = self.factory.makeDistribution()
+        self.policy.pocket = PackagePublishingPocket.RELEASE
         self.policy.archive = self.factory.makeArchive(
             distribution=self.distro)
 
@@ -217,6 +221,34 @@
         release = uploadfile.storeInDatabase(None)
         self.assertEquals(u"http://samba.org/~jelmer/bzr", release.homepage)
 
+    def test_checkBuild(self):
+        # checkBuild() verifies consistency with a build.
+        build = self.factory.makeSourcePackageRecipeBuild(
+            pocket=self.policy.pocket, distroseries=self.policy.distroseries,
+            archive=self.policy.archive)
+        dsc = self.getBaseDsc()
+        uploadfile = self.createDSCFile(
+            "foo.dsc", dsc, "main/net", "extra", "dulwich", "0.42",
+            self.createChangesFile("foo.changes", self.getBaseChanges()))
+        uploadfile.checkBuild(build)
+        # checkBuild() sets the build status to FULLYBUILT and
+        # removes the upload log.
+        self.assertEquals(BuildStatus.FULLYBUILT, build.status)
+        self.assertIs(None, build.upload_log)
+
+    def test_checkBuild_inconsistent(self):
+        # checkBuild() raises UploadError if inconsistencies between build
+        # and upload file are found.
+        build = self.factory.makeSourcePackageRecipeBuild(
+            pocket=self.policy.pocket,
+            distroseries=self.factory.makeDistroSeries(),
+            archive=self.policy.archive)
+        dsc = self.getBaseDsc()
+        uploadfile = self.createDSCFile(
+            "foo.dsc", dsc, "main/net", "extra", "dulwich", "0.42",
+            self.createChangesFile("foo.changes", self.getBaseChanges()))
+        self.assertRaises(UploadError, uploadfile.checkBuild, build)
+
 
 class DebBinaryUploadFileTests(PackageUploadFileTestCase):
     """Tests for DebBinaryUploadFile."""
@@ -326,3 +358,32 @@
         bpr = uploadfile.storeInDatabase(build)
         self.assertEquals(
             u"http://samba.org/~jelmer/dulwich", bpr.homepage)
+
+    def test_checkBuild(self):
+        # checkBuild() verifies consistency with a build.
+        das = self.factory.makeDistroArchSeries(
+            distroseries=self.policy.distroseries, architecturetag="i386")
+        build = self.factory.makeBinaryPackageBuild(
+            distroarchseries=das,
+            archive=self.policy.archive)
+        uploadfile = self.createDebBinaryUploadFile(
+            "foo_0.42_i386.deb", "main/python", "unknown", "mypkg", "0.42",
+            None)
+        uploadfile.checkBuild(build)
+        # checkBuild() sets the build status to FULLYBUILT and
+        # removes the upload log.
+        self.assertEquals(BuildStatus.FULLYBUILT, build.status)
+        self.assertIs(None, build.upload_log)
+
+    def test_checkBuild_inconsistent(self):
+        # checkBuild() raises UploadError if inconsistencies between build
+        # and upload file are found.
+        das = self.factory.makeDistroArchSeries(
+            distroseries=self.policy.distroseries, architecturetag="amd64")
+        build = self.factory.makeBinaryPackageBuild(
+            distroarchseries=das,
+            archive=self.policy.archive)
+        uploadfile = self.createDebBinaryUploadFile(
+            "foo_0.42_i386.deb", "main/python", "unknown", "mypkg", "0.42",
+            None)
+        self.assertRaises(UploadError, uploadfile.checkBuild, build)
 
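The `checkBuild()` contract these tests exercise — reject an upload whose target build disagrees with the upload's own context, otherwise mark the build done and drop any stale upload log — can be sketched as follows (simplified stand-in classes and a plain-dict policy; not Launchpad's real implementation):

```python
class UploadError(Exception):
    """Raised when an upload is inconsistent with its build."""


class FakeBuild:
    """Minimal stand-in for a build record."""

    def __init__(self, archive, distroseries, pocket):
        self.archive = archive
        self.distro_series = distroseries
        self.pocket = pocket
        self.status = "UPLOADING"
        self.upload_log = "stale log from an earlier failed upload"


class UploadFile:
    """Minimal stand-in for a nascent upload file."""

    def __init__(self, policy):
        self.policy = policy

    def check_build(self, build):
        # Any mismatch between the build's context and the upload
        # policy's context is an inconsistency.
        if (build.archive != self.policy["archive"]
                or build.distro_series != self.policy["distroseries"]
                or build.pocket != self.policy["pocket"]):
            raise UploadError("upload does not match build context")
        # On success: build is fully built, stale upload log removed.
        build.status = "FULLYBUILT"
        build.upload_log = None


policy = {"archive": "ppa", "distroseries": "hoary", "pocket": "RELEASE"}
good = FakeBuild("ppa", "hoary", "RELEASE")
UploadFile(policy).check_build(good)
```

A build created against a different distroseries, as in `test_checkBuild_inconsistent`, would raise `UploadError` from the same check.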
=== modified file 'lib/lp/archiveuploader/tests/test_ppauploadprocessor.py'
--- lib/lp/archiveuploader/tests/test_ppauploadprocessor.py	2010-08-31 11:11:09 +0000
+++ lib/lp/archiveuploader/tests/test_ppauploadprocessor.py	2010-09-17 06:08:57 +0000
@@ -355,10 +355,10 @@
         builds = self.name16.archive.getBuildRecords(name="bar")
         [build] = builds
         self.options.context = 'buildd'
-        self.options.buildid = build.id
         upload_dir = self.queueUpload(
             "bar_1.0-1_binary_universe", "~name16/ubuntu")
-        self.processUpload(self.uploadprocessor, upload_dir)
+        self.processUpload(
+            self.uploadprocessor, upload_dir, build=build)
 
         # No mails are sent for successful binary uploads.
         self.assertEqual(len(stub.test_emails), 0,
@@ -405,9 +405,9 @@
 
         # Binary upload to the just-created build record.
         self.options.context = 'buildd'
-        self.options.buildid = build.id
         upload_dir = self.queueUpload("bar_1.0-1_binary", "~name16/ubuntu")
-        self.processUpload(self.uploadprocessor, upload_dir)
+        self.processUpload(
+            self.uploadprocessor, upload_dir, build=build)
 
         # The binary upload was accepted and it's waiting in the queue.
         queue_items = self.breezy.getQueueItems(
@@ -459,9 +459,9 @@
 
         # Binary upload to the just-created build record.
         self.options.context = 'buildd'
-        self.options.buildid = build_bar_i386.id
         upload_dir = self.queueUpload("bar_1.0-1_binary", "~cprov/ubuntu")
-        self.processUpload(self.uploadprocessor, upload_dir)
+        self.processUpload(
+            self.uploadprocessor, upload_dir, build=build_bar_i386)
 
         # The binary upload was accepted and it's waiting in the queue.
         queue_items = self.breezy.getQueueItems(
@@ -760,9 +760,9 @@
         builds = self.name16.archive.getBuildRecords(name='bar')
         [build] = builds
         self.options.context = 'buildd'
-        self.options.buildid = build.id
         upload_dir = self.queueUpload("bar_1.0-1_binary", "~name16/ubuntu")
-        self.processUpload(self.uploadprocessor, upload_dir)
+        self.processUpload(
+            self.uploadprocessor, upload_dir, build=build)
 
         # The binary upload was accepted and it's waiting in the queue.
         queue_items = self.breezy.getQueueItems(
@@ -804,10 +804,9 @@
         # Binary uploads should exhibit the same behaviour:
         [build] = self.name16.archive.getBuildRecords(name="bar")
         self.options.context = 'buildd'
-        self.options.buildid = build.id
         upload_dir = self.queueUpload(
             "bar_1.0-1_contrib_binary", "~name16/ubuntu")
-        self.processUpload(self.uploadprocessor, upload_dir)
+        self.processUpload(self.uploadprocessor, upload_dir, build=build)
         queue_items = self.breezy.getQueueItems(
             status=PackageUploadStatus.ACCEPTED, name="bar",
             version="1.0-1", exact_match=True, archive=self.name16.archive)
@@ -1306,14 +1305,14 @@
         builds = self.name16.archive.getBuildRecords(name='bar')
         [build] = builds
         self.options.context = 'buildd'
-        self.options.buildid = build.id
 
         # Stuff 1024 MiB in name16 PPA, so anything will be above the
         # default quota limit, 1024 MiB.
         self._fillArchive(self.name16.archive, 1024 * (2 ** 20))
 
         upload_dir = self.queueUpload("bar_1.0-1_binary", "~name16/ubuntu")
-        self.processUpload(self.uploadprocessor, upload_dir)
+        self.processUpload(
+            self.uploadprocessor, upload_dir, build=build)
 
         # The binary upload was accepted, and it's waiting in the queue.
         queue_items = self.breezy.getQueueItems(
 
=== modified file 'lib/lp/archiveuploader/tests/test_recipeuploads.py'
--- lib/lp/archiveuploader/tests/test_recipeuploads.py	2010-08-27 11:19:54 +0000
+++ lib/lp/archiveuploader/tests/test_recipeuploads.py	2010-09-17 06:08:57 +0000
@@ -10,6 +10,9 @@
 from storm.store import Store
 from zope.component import getUtility
 
+from lp.archiveuploader.uploadprocessor import (
+    UploadStatusEnum,
+    )
 from lp.archiveuploader.tests.test_uploadprocessor import (
     TestUploadProcessorBase,
     )
@@ -17,7 +20,6 @@
 from lp.code.interfaces.sourcepackagerecipebuild import (
     ISourcePackageRecipeBuildSource,
     )
-from lp.soyuz.enums import PackageUploadStatus
 
 
 class TestSourcePackageRecipeBuildUploads(TestUploadProcessorBase):
@@ -40,8 +42,7 @@
             requester=self.recipe.owner)
 
         Store.of(self.build).flush()
-        self.options.context = 'recipe'
-        self.options.buildid = self.build.id
+        self.options.context = 'buildd'
 
         self.uploadprocessor = self.getUploadProcessor(
             self.layer.txn)
@@ -54,19 +55,14 @@
         self.assertIs(None, self.build.source_package_release)
         self.assertEqual(False, self.build.verifySuccessfulUpload())
         self.queueUpload('bar_1.0-1', '%d/ubuntu' % self.build.archive.id)
-        self.uploadprocessor.processChangesFile(
+        result = self.uploadprocessor.processChangesFile(
             os.path.join(self.queue_folder, "incoming", 'bar_1.0-1'),
-            '%d/ubuntu/bar_1.0-1_source.changes' % self.build.archive.id)
+            '%d/ubuntu/bar_1.0-1_source.changes' % self.build.archive.id,
+            build=self.build)
         self.layer.txn.commit()
 
-        queue_item = self.uploadprocessor.last_processed_upload.queue_root
-        self.assertTrue(
-            queue_item is not None,
+        self.assertEquals(UploadStatusEnum.ACCEPTED, result,
             "Source upload failed\nGot: %s" % "\n".join(self.log.lines))
 
-        self.assertEqual(PackageUploadStatus.DONE, queue_item.status)
-        spr = queue_item.sources[0].sourcepackagerelease
-        self.assertEqual(self.build, spr.source_package_recipe_build)
-        self.assertEqual(spr, self.build.source_package_release)
         self.assertEqual(BuildStatus.FULLYBUILT, self.build.status)
         self.assertEqual(True, self.build.verifySuccessfulUpload())
 
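The `getPolicy(distro, build)` hook that `test_uploadprocessor.py` grows in this branch derives the policy's context (distroseries, pocket, archive) from the build record when processing build uploads, rather than from command-line options. A minimal sketch of that wiring, with illustrative stand-in names rather than Launchpad's real API:

```python
class Policy:
    """Minimal stand-in for an upload policy."""

    distroseries = None
    pocket = None
    archive = None


class FakeBuild:
    """Minimal stand-in for a build record."""

    distro_series = "hoary"
    pocket = "RELEASE"
    archive = "primary"


def get_policy(build=None, builds=True):
    policy = Policy()
    if builds and build is not None:
        # Mirror the branch's getPolicy(): when handling a build
        # upload, the build record defines the upload context.
        policy.distroseries = build.distro_series
        policy.pocket = build.pocket
        policy.archive = build.archive
    return policy


policy = get_policy(FakeBuild())
```

With `builds=False` (the default the test base class now sets), the policy keeps whatever context the options supply, untouched by any build.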
=== modified file 'lib/lp/archiveuploader/tests/test_uploadprocessor.py'
--- lib/lp/archiveuploader/tests/test_uploadprocessor.py	2010-09-17 06:08:54 +0000
+++ lib/lp/archiveuploader/tests/test_uploadprocessor.py	2010-09-17 06:08:57 +0000
@@ -18,6 +18,7 @@
 import tempfile
 import traceback
 
+from storm.locals import Store
 from zope.component import (
     getGlobalSiteManager,
     getUtility,
@@ -153,7 +154,7 @@
 
         self.options = MockOptions()
         self.options.base_fsroot = self.queue_folder
-        self.options.builds = True
+        self.options.builds = False
         self.options.leafname = None
         self.options.distro = "ubuntu"
         self.options.distroseries = None
@@ -172,9 +173,13 @@
         super(TestUploadProcessorBase, self).tearDown()
 
     def getUploadProcessor(self, txn):
-        def getPolicy(distro):
+        def getPolicy(distro, build):
            self.options.distro = distro.name
            policy = findPolicyByName(self.options.context)
+            if self.options.builds:
+                policy.distroseries = build.distro_series
+                policy.pocket = build.pocket
+                policy.archive = build.archive
            policy.setOptions(self.options)
            return policy
         return UploadProcessor(
@@ -288,7 +293,7 @@
         shutil.copytree(upload_dir, target_path)
         return os.path.join(self.incoming_folder, queue_entry)
 
-    def processUpload(self, processor, upload_dir):
+    def processUpload(self, processor, upload_dir, build=None):
         """Process an upload queue entry directory.
 
         There is some duplication here with logic in UploadProcessor,
@@ -298,7 +303,8 @@
         results = []
         changes_files = processor.locateChangesFiles(upload_dir)
         for changes_file in changes_files:
-            result = processor.processChangesFile(upload_dir, changes_file)
+            result = processor.processChangesFile(
+                upload_dir, changes_file, build=build)
             results.append(result)
         return results
 
@@ -693,10 +699,10 @@
         # Upload and accept a binary for the primary archive source.
         shutil.rmtree(upload_dir)
         self.options.context = 'buildd'
-        self.options.buildid = bar_original_build.id
         self.layer.txn.commit()
         upload_dir = self.queueUpload("bar_1.0-1_binary")
-        self.processUpload(uploadprocessor, upload_dir)
+        self.processUpload(uploadprocessor, upload_dir,
+            build=bar_original_build)
         self.assertEqual(
             uploadprocessor.last_processed_upload.is_rejected, False)
         bar_bin_pubs = self.publishPackage('bar', '1.0-1', source=False)
@@ -724,10 +730,10 @@
 
         shutil.rmtree(upload_dir)
         self.options.context = 'buildd'
-        self.options.buildid = bar_copied_build.id
         upload_dir = self.queueUpload(
             "bar_1.0-1_binary", "%s/ubuntu" % copy_archive.id)
-        self.processUpload(uploadprocessor, upload_dir)
+        self.processUpload(uploadprocessor, upload_dir,
+            build=bar_copied_build)
 
         # Make sure the upload succeeded.
         self.assertEqual(
@@ -796,9 +802,9 @@
         [bar_original_build] = bar_source_pub.createMissingBuilds()
 
         self.options.context = 'buildd'
-        self.options.buildid = bar_original_build.id
         upload_dir = self.queueUpload("bar_1.0-1_binary")
-        self.processUpload(uploadprocessor, upload_dir)
+        self.processUpload(
+            uploadprocessor, upload_dir, build=bar_original_build)
         [bar_binary_pub] = self.publishPackage("bar", "1.0-1", source=False)
 
         # Prepare ubuntu/breezy-autotest to build sources in i386.
@@ -818,10 +824,10 @@
         # Re-upload the same 'bar-1.0-1' binary as if it was rebuilt
         # in breezy-autotest context.
         shutil.rmtree(upload_dir)
-        self.options.buildid = bar_copied_build.id
         self.options.distroseries = breezy_autotest.name
         upload_dir = self.queueUpload("bar_1.0-1_binary")
-        self.processUpload(uploadprocessor, upload_dir)
+        self.processUpload(uploadprocessor, upload_dir,
+            build=bar_copied_build)
         [duplicated_binary_upload] = breezy_autotest.getQueueItems(
             status=PackageUploadStatus.NEW, name='bar',
             version='1.0-1', exact_match=True)
@@ -859,9 +865,9 @@
         [bar_original_build] = bar_source_pub.getBuilds()
 
         self.options.context = 'buildd'
-        self.options.buildid = bar_original_build.id
         upload_dir = self.queueUpload("bar_1.0-2_binary")
-        self.processUpload(uploadprocessor, upload_dir)
+        self.processUpload(uploadprocessor, upload_dir,
+            build=bar_original_build)
         [bar_binary_pub] = self.publishPackage("bar", "1.0-2", source=False)
 
         # Create a COPY archive for building in non-virtual builds.
@@ -878,10 +884,10 @@
         [bar_copied_build] = bar_copied_source.createMissingBuilds()
 
         shutil.rmtree(upload_dir)
-        self.options.buildid = bar_copied_build.id
         upload_dir = self.queueUpload(
             "bar_1.0-1_binary", "%s/ubuntu" % copy_archive.id)
-        self.processUpload(uploadprocessor, upload_dir)
+        self.processUpload(uploadprocessor, upload_dir,
+            build=bar_copied_build)
 
         # The binary just uploaded is accepted because it's destined for a
         # copy archive and the PRIMARY and the COPY archives are isolated
@@ -1034,9 +1040,9 @@
             self.breezy['i386'], PackagePublishingPocket.RELEASE,
             self.ubuntu.main_archive)
         self.layer.txn.commit()
-        self.options.buildid = foocomm_build.id
         upload_dir = self.queueUpload("foocomm_1.0-1_binary")
-        self.processUpload(uploadprocessor, upload_dir)
+        self.processUpload(
+            uploadprocessor, upload_dir, build=foocomm_build)
 
         contents = [
             "Subject: foocomm_1.0-1_i386.changes rejected",
@@ -1044,10 +1050,8 @@
             "where they don't fit."]
         self.assertEmail(contents)
 
-        # Reset upload queue directory for a new upload and the
-        # uploadprocessor buildid option.
+        # Reset upload queue directory for a new upload.
         shutil.rmtree(upload_dir)
-        self.options.buildid = None
 
         # Now upload a binary package of 'foocomm', letting a new build record
         # with appropriate data be created by the uploadprocessor.
@@ -1881,7 +1885,7 @@
         self.assertLogContains(
             "Unable to find package build job with id 42. Skipping.")
 
-    def testNoFiles(self):
+    def testBinaryPackageBuild_fail(self):
         # If the upload directory is empty, the upload
         # will fail.
 
@@ -1905,6 +1909,8 @@
 
         # Upload and accept a binary for the primary archive source.
         shutil.rmtree(upload_dir)
+
+        # Commit so the build cookie has the right ids.
         self.layer.txn.commit()
         leaf_name = build.getUploadDirLeaf(build.getBuildCookie())
         os.mkdir(os.path.join(self.incoming_folder, leaf_name))
@@ -1925,7 +1931,7 @@
         self.assertTrue('DEBUG: Moving upload directory '
             in log_contents)
 
-    def testSuccess(self):
+    def testBinaryPackageBuilds(self):
         # Properly uploaded binaries should result in the
         # build status changing to FULLYBUILT.
         # Upload a source package
@@ -1946,6 +1952,8 @@
 
         # Upload and accept a binary for the primary archive source.
         shutil.rmtree(upload_dir)
+
+        # Commit so the build cookie has the right ids.
         self.layer.txn.commit()
         leaf_name = build.getUploadDirLeaf(build.getBuildCookie())
         upload_dir = self.queueUpload("bar_1.0-1_binary",
@@ -1959,13 +1967,74 @@
         # No emails are sent on success
         self.assertEquals(len(stub.test_emails), last_stub_mail_count)
         self.assertEquals(BuildStatus.FULLYBUILT, build.status)
-        log_contents = build.upload_log.read()
-        log_lines = log_contents.splitlines()
+        # Upon full build the upload log is unset.
+        self.assertIs(None, build.upload_log)
881 | 1964 | self.assertTrue( | 1972 | |
882 | 1965 | 'INFO: Processing upload bar_1.0-1_i386.changes' in log_lines) | 1973 | def testSourcePackageRecipeBuild(self): |
883 | 1966 | self.assertTrue( | 1974 | # Properly uploaded source packages should result in the |
884 | 1967 | 'INFO: Committing the transaction and any mails associated with ' | 1975 | # build status changing to FULLYBUILT. |
885 | 1968 | 'this upload.' in log_lines) | 1976 | |
886 | 1977 | # Upload a source package | ||
887 | 1978 | archive = self.factory.makeArchive() | ||
888 | 1979 | archive.require_virtualized = False | ||
889 | 1980 | build = self.factory.makeSourcePackageRecipeBuild(sourcename=u"bar", | ||
890 | 1981 | distroseries=self.breezy, archive=archive, requester=archive.owner) | ||
891 | 1982 | self.assertEquals(archive.owner, build.requester) | ||
892 | 1983 | bq = self.factory.makeSourcePackageRecipeBuildJob(recipe_build=build) | ||
893 | 1984 | # Commit so the build cookie has the right ids. | ||
894 | 1985 | self.layer.txn.commit() | ||
895 | 1986 | leaf_name = build.getUploadDirLeaf(build.getBuildCookie()) | ||
896 | 1987 | relative_path = "~%s/%s/%s/%s" % ( | ||
897 | 1988 | archive.owner.name, archive.name, self.breezy.distribution.name, | ||
898 | 1989 | self.breezy.name) | ||
899 | 1990 | upload_dir = self.queueUpload( | ||
900 | 1991 | "bar_1.0-1", queue_entry=leaf_name, relative_path=relative_path) | ||
901 | 1992 | self.options.context = 'buildd' | ||
902 | 1993 | self.options.builds = True | ||
903 | 1994 | build.jobStarted() | ||
904 | 1995 | # Commit so date_started is recorded and doesn't cause constraint | ||
905 | 1996 | # violations later. | ||
906 | 1997 | build.status = BuildStatus.UPLOADING | ||
907 | 1998 | Store.of(build).flush() | ||
908 | 1999 | self.uploadprocessor.processBuildUpload( | ||
909 | 2000 | self.incoming_folder, leaf_name) | ||
910 | 2001 | self.layer.txn.commit() | ||
911 | 2002 | |||
912 | 2003 | self.assertEquals(BuildStatus.FULLYBUILT, build.status) | ||
913 | 2004 | self.assertEquals(None, build.builder) | ||
914 | 2005 | self.assertIsNot(None, build.date_finished) | ||
915 | 2006 | self.assertIsNot(None, build.duration) | ||
916 | 2007 | # Upon full build the upload log is unset. | ||
917 | 2008 | self.assertIs(None, build.upload_log) | ||
918 | 2009 | |||
919 | 2010 | def testSourcePackageRecipeBuild_fail(self): | ||
920 | 2011 | # A source package recipe build will fail if no files are present. | ||
921 | 2012 | |||
922 | 2013 | # Upload a source package | ||
923 | 2014 | archive = self.factory.makeArchive() | ||
924 | 2015 | archive.require_virtualized = False | ||
925 | 2016 | build = self.factory.makeSourcePackageRecipeBuild(sourcename=u"bar", | ||
926 | 2017 | distroseries=self.breezy, archive=archive) | ||
927 | 2018 | bq = self.factory.makeSourcePackageRecipeBuildJob(recipe_build=build) | ||
928 | 2019 | # Commit so the build cookie has the right ids. | ||
929 | 2020 | Store.of(build).flush() | ||
930 | 2021 | leaf_name = build.getUploadDirLeaf(build.getBuildCookie()) | ||
931 | 2022 | os.mkdir(os.path.join(self.incoming_folder, leaf_name)) | ||
932 | 2023 | self.options.context = 'buildd' | ||
933 | 2024 | self.options.builds = True | ||
934 | 2025 | build.jobStarted() | ||
935 | 2026 | # Commit so date_started is recorded and doesn't cause constraint | ||
936 | 2027 | # violations later. | ||
937 | 2028 | Store.of(build).flush() | ||
938 | 2029 | build.status = BuildStatus.UPLOADING | ||
939 | 2030 | self.uploadprocessor.processBuildUpload( | ||
940 | 2031 | self.incoming_folder, leaf_name) | ||
941 | 2032 | self.layer.txn.commit() | ||
942 | 2033 | self.assertEquals(BuildStatus.FAILEDTOUPLOAD, build.status) | ||
943 | 2034 | self.assertEquals(None, build.builder) | ||
944 | 2035 | self.assertIsNot(None, build.date_finished) | ||
945 | 2036 | self.assertIsNot(None, build.duration) | ||
946 | 2037 | self.assertIsNot(None, build.upload_log) | ||
947 | 1969 | 2038 | ||
948 | 1970 | 2039 | ||
949 | 1971 | class ParseBuildUploadLeafNameTests(TestCase): | 2040 | class ParseBuildUploadLeafNameTests(TestCase): |
950 | 1972 | 2041 | ||
=== modified file 'lib/lp/archiveuploader/tests/uploadpolicy.txt'
--- lib/lp/archiveuploader/tests/uploadpolicy.txt	2010-08-18 14:03:15 +0000
+++ lib/lp/archiveuploader/tests/uploadpolicy.txt	2010-09-17 06:08:57 +0000
@@ -53,23 +53,16 @@
     ...     distro = 'ubuntu'
     ...     distroseries = None
     >>> class MockOptions(MockAbstractOptions):
-    ...     buildid = 1
+    ...     builds = True

     >>> ab_opts = MockAbstractOptions()
     >>> bd_opts = MockOptions()

     >>> insecure_policy.setOptions(ab_opts)
-    >>> insecure_policy.options is ab_opts
-    True
     >>> insecure_policy.distro.name
     u'ubuntu'
     >>> buildd_policy.setOptions(ab_opts)
-    Traceback (most recent call last):
-    ...
-    UploadPolicyError: BuildID required for buildd context
     >>> buildd_policy.setOptions(bd_opts)
-    >>> buildd_policy.options is bd_opts
-    True
     >>> buildd_policy.distro.name
     u'ubuntu'


=== modified file 'lib/lp/archiveuploader/uploadpolicy.py'
--- lib/lp/archiveuploader/uploadpolicy.py	2010-08-25 13:04:14 +0000
+++ lib/lp/archiveuploader/uploadpolicy.py	2010-09-17 06:08:57 +0000
@@ -11,7 +11,6 @@
     "BuildDaemonUploadPolicy",
     "findPolicyByName",
     "IArchiveUploadPolicy",
-    "SOURCE_PACKAGE_RECIPE_UPLOAD_POLICY_NAME",
     "UploadPolicyError",
     ]

@@ -34,8 +33,6 @@
 from lazr.enum import EnumeratedType, Item


-# Defined here so that uploadpolicy.py doesn't depend on lp.code.
-SOURCE_PACKAGE_RECIPE_UPLOAD_POLICY_NAME = 'recipe'
 # Number of seconds in an hour (used later)
 HOURS = 3600

@@ -128,13 +125,8 @@
             raise AssertionError(
                 "Upload is not sourceful, binaryful or mixed.")

-    def getUploader(self, changes):
-        """Get the person who is doing the uploading."""
-        return changes.signer
-
     def setOptions(self, options):
         """Store the options for later."""
-        self.options = options
         # Extract and locate the distribution though...
         self.distro = getUtility(IDistributionSet)[options.distro]
         if options.distroseries is not None:
@@ -324,7 +316,6 @@
     """The build daemon upload policy is invoked by the slave scanner."""

     name = 'buildd'
-    accepted_type = ArchiveUploadType.BINARY_ONLY

     def __init__(self):
         super(BuildDaemonUploadPolicy, self).__init__()
@@ -333,11 +324,9 @@
         self.unsigned_dsc_ok = True

     def setOptions(self, options):
-        AbstractUploadPolicy.setOptions(self, options)
-        # We require a buildid to be provided
-        if (getattr(options, 'buildid', None) is None and
-            not getattr(options, 'builds', False)):
-            raise UploadPolicyError("BuildID required for buildd context")
+        """Store the options for later."""
+        super(BuildDaemonUploadPolicy, self).setOptions(options)
+        options.builds = True

     def policySpecificChecks(self, upload):
         """The buildd policy should enforce that the buildid matches."""
@@ -349,6 +338,15 @@
         """Buildd policy allows PPA upload."""
         return False

+    def validateUploadType(self, upload):
+        if upload.sourceful and upload.binaryful:
+            if self.accepted_type != ArchiveUploadType.MIXED_ONLY:
+                upload.reject(
+                    "Source/binary (i.e. mixed) uploads are not allowed.")
+        elif not upload.sourceful and not upload.binaryful:
+            raise AssertionError(
+                "Upload is not sourceful, binaryful or mixed.")
+

 class SyncUploadPolicy(AbstractUploadPolicy):
     """This policy is invoked when processing sync uploads."""

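To make the policy change above concrete: the buildd policy no longer raises `UploadPolicyError` when no `buildid` option is supplied; its `setOptions()` now just flags the options as build-related. The following is a minimal standalone sketch with simplified stand-in classes (`MockOptions`, the trimmed policy bodies, and the `distro_name` attribute are illustrative, not the real Launchpad implementations):

```python
class MockOptions(object):
    """Stand-in for the upload processor's option object."""
    distro = 'ubuntu'
    distroseries = None


class AbstractUploadPolicy(object):
    def setOptions(self, options):
        # The real method resolves options.distro to a Distribution object;
        # note that it no longer stores the options object on the policy.
        self.distro_name = options.distro


class BuildDaemonUploadPolicy(AbstractUploadPolicy):
    name = 'buildd'

    def setOptions(self, options):
        """Store the options for later."""
        super(BuildDaemonUploadPolicy, self).setOptions(options)
        # Previously this raised UploadPolicyError without a buildid;
        # now it simply marks the upload as coming from a build.
        options.builds = True


opts = MockOptions()
policy = BuildDaemonUploadPolicy()
policy.setOptions(opts)
print(opts.builds)         # → True
print(policy.distro_name)  # → ubuntu
```

This mirrors the updated doctest in uploadpolicy.txt, where `MockOptions` now carries `builds = True` instead of `buildid = 1`.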
=== modified file 'lib/lp/archiveuploader/uploadprocessor.py'
--- lib/lp/archiveuploader/uploadprocessor.py	2010-09-17 06:08:54 +0000
+++ lib/lp/archiveuploader/uploadprocessor.py	2010-09-17 06:08:57 +0000
@@ -71,7 +71,6 @@
     )
 from lp.archiveuploader.uploadpolicy import (
     BuildDaemonUploadPolicy,
-    SOURCE_PACKAGE_RECIPE_UPLOAD_POLICY_NAME,
     UploadPolicyError,
     )
 from lp.buildmaster.enums import (
@@ -207,6 +206,7 @@
         The name of the leaf is the build id of the build.
         Build uploads always contain a single package per leaf.
         """
+        upload_path = os.path.join(fsroot, upload)
         try:
             job_id = parse_build_upload_leaf_name(upload)
         except ValueError:
@@ -220,20 +220,20 @@
                 "Unable to find package build job with id %d. Skipping." %
                 job_id)
             return
+        logger = BufferLogger()
         build = buildfarm_job.getSpecificJob()
         if build.status != BuildStatus.UPLOADING:
             self.log.warn(
-                "Expected build status to be 'UPLOADING', was %s. Skipping.",
-                build.status.name)
+                "Expected build status to be 'UPLOADING', was %s. "
+                "Moving to failed.", build.status.name)
+            self.moveProcessedUpload(upload_path, "failed", logger)
             return
         self.log.debug("Build %s found" % build.id)
-        logger = BufferLogger()
-        upload_path = os.path.join(fsroot, upload)
         try:
             [changes_file] = self.locateChangesFiles(upload_path)
             logger.debug("Considering changefile %s" % changes_file)
             result = self.processChangesFile(
-                upload_path, changes_file, logger)
+                upload_path, changes_file, logger, build)
         except (KeyboardInterrupt, SystemExit):
             raise
         except:
@@ -251,16 +251,13 @@
             UploadStatusEnum.REJECTED: "rejected",
             UploadStatusEnum.ACCEPTED: "accepted"}[result]
         self.moveProcessedUpload(upload_path, destination, logger)
+        build.date_finished = datetime.datetime.now(pytz.UTC)
         if not (result == UploadStatusEnum.ACCEPTED and
                 build.verifySuccessfulUpload() and
                 build.status == BuildStatus.FULLYBUILT):
             build.status = BuildStatus.FAILEDTOUPLOAD
-            build.date_finished = datetime.datetime.now(pytz.UTC)
             build.notify(extra_info="Uploading build %s failed." % upload)
             build.storeUploadLog(logger.buffer.getvalue())
-
-        # Remove BuildQueue record.
-        build.buildqueue_record.destroySelf()

     def processUpload(self, fsroot, upload):
         """Process an upload's changes files, and move it to a new directory.
@@ -376,7 +373,8 @@
                 os.path.join(relative_path, filename))
         return self.orderFilenames(changes_files)

-    def processChangesFile(self, upload_path, changes_file, logger=None):
+    def processChangesFile(self, upload_path, changes_file, logger=None,
+                           build=None):
         """Process a single changes file.

         This is done by obtaining the appropriate upload policy (according
@@ -432,7 +430,7 @@
                 "https://help.launchpad.net/Packaging/PPA#Uploading "
                 "and update your configuration.")))
         logger.debug("Finding fresh policy")
-        policy = self._getPolicyForDistro(distribution)
+        policy = self._getPolicyForDistro(distribution, build)
         policy.archive = archive

         # DistroSeries overriding respect the following precedence:
@@ -450,10 +448,8 @@

         # Reject source upload to buildd upload paths.
         first_path = relative_path.split(os.path.sep)[0]
-        is_not_buildd_nor_recipe_policy = policy.name not in [
-            SOURCE_PACKAGE_RECIPE_UPLOAD_POLICY_NAME,
-            BuildDaemonUploadPolicy.name]
-        if first_path.isdigit() and is_not_buildd_nor_recipe_policy:
+        if (first_path.isdigit() and
+            policy.name != BuildDaemonUploadPolicy.name):
             error_message = (
                 "Invalid upload path (%s) for this policy (%s)" %
                 (relative_path, policy.name))
@@ -472,7 +468,7 @@
             result = UploadStatusEnum.ACCEPTED

         try:
-            upload.process()
+            upload.process(build)
         except UploadPolicyError, e:
             upload.reject("UploadPolicyError escaped upload.process: "
                           "%s " % e)
@@ -513,7 +509,8 @@
             upload.do_reject(notify)
             self.ztm.abort()
         else:
-            successful = upload.do_accept(notify=notify)
+            successful = upload.do_accept(
+                notify=notify, build=build)
         if not successful:
             result = UploadStatusEnum.REJECTED
         logger.info(

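The net effect of the reshuffled tail of `processBuildUpload()` is that `date_finished` is now stamped unconditionally once the upload has been processed, while the failure branch still notifies and stores the upload log (destroying the BuildQueue record moved into the slave-scanner status handling in packagebuild.py). A hypothetical, condensed sketch with fake names, not the real method:

```python
import datetime

ACCEPTED = 'accepted'


class FakeBuild(object):
    """Hypothetical stand-in for a package build record."""

    def __init__(self, verifies):
        self.status = 'FULLYBUILT' if verifies else 'UPLOADING'
        self._verifies = verifies
        self.date_finished = None
        self.upload_log = None
        self.notified = False

    def verifySuccessfulUpload(self):
        return self._verifies

    def notify(self, extra_info=None):
        self.notified = True

    def storeUploadLog(self, content):
        self.upload_log = content


def finish_upload(build, result, log_text):
    # Always stamp the finish time, success or failure (previously this
    # only happened on the failure path).
    build.date_finished = datetime.datetime.utcnow()
    if not (result == ACCEPTED and build.verifySuccessfulUpload()
            and build.status == 'FULLYBUILT'):
        build.status = 'FAILEDTOUPLOAD'
        build.notify(extra_info="Uploading build failed.")
        build.storeUploadLog(log_text)


good, bad = FakeBuild(True), FakeBuild(False)
finish_upload(good, ACCEPTED, "log")
finish_upload(bad, ACCEPTED, "log")
print(good.status, bad.status)  # → FULLYBUILT FAILEDTOUPLOAD
```

Note how the successful build never calls `storeUploadLog()`, which is why the new tests can assert that `build.upload_log` is `None` after a fully built upload.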
=== modified file 'lib/lp/buildmaster/interfaces/packagebuild.py'
--- lib/lp/buildmaster/interfaces/packagebuild.py	2010-09-17 06:08:54 +0000
+++ lib/lp/buildmaster/interfaces/packagebuild.py	2010-09-17 06:08:57 +0000
@@ -71,10 +71,6 @@
         title=_('Build farm job'), schema=IBuildFarmJob, required=True,
         readonly=True, description=_('The base build farm job.'))

-    policy_name = TextLine(
-        title=_("Policy name"), required=True,
-        description=_("The upload policy to use for handling these builds."))
-
     current_component = Attribute(
         'Component where the source related to this build was last '
         'published.')
@@ -149,6 +145,14 @@
         created in a suspended state.
         """

+    def getUploader(changes):
+        """Return the person responsible for the upload.
+
+        This is used when checking permissions.
+
+        :param changes: Changes file from the upload.
+        """
+

 class IPackageBuildSource(Interface):
     """A utility of this interface used to create _things_."""

=== modified file 'lib/lp/buildmaster/model/packagebuild.py'
--- lib/lp/buildmaster/model/packagebuild.py	2010-09-17 06:08:54 +0000
+++ lib/lp/buildmaster/model/packagebuild.py	2010-09-17 06:08:57 +0000
@@ -94,8 +94,6 @@
     build_farm_job_id = Int(name='build_farm_job', allow_none=False)
     build_farm_job = Reference(build_farm_job_id, 'BuildFarmJob.id')

-    policy_name = 'buildd'
-
     # The following two properties are part of the IPackageBuild
     # interface, but need to be provided by derived classes.
     distribution = None
@@ -239,6 +237,10 @@
         """See `IPackageBuild`."""
         raise NotImplementedError

+    def getUploader(self, changes):
+        """See `IPackageBuild`."""
+        raise NotImplementedError
+

 class PackageBuildDerived:
     """Setup the delegation for package build.
@@ -352,6 +354,10 @@
         if not os.path.exists(target_dir):
             os.mkdir(target_dir)

+        # Flush so there are no race conditions with archiveuploader about
+        # self.status.
+        Store.of(self).flush()
+
         # Move the directory used to grab the binaries into
         # the incoming directory so the upload processor never
         # sees half-finished uploads.
@@ -360,6 +366,9 @@
         # Release the builder for another job.
         self.buildqueue_record.builder.cleanSlave()

+        # Remove BuildQueue record.
+        self.buildqueue_record.destroySelf()
+
     def _handleStatus_PACKAGEFAIL(self, librarian, slave_status, logger):
         """Handle a package that had failed to build.

=== modified file 'lib/lp/buildmaster/tests/test_packagebuild.py'
--- lib/lp/buildmaster/tests/test_packagebuild.py	2010-09-17 06:08:54 +0000
+++ lib/lp/buildmaster/tests/test_packagebuild.py	2010-09-17 06:08:57 +0000
@@ -105,7 +105,6 @@

     def test_default_values(self):
         # PackageBuild has a number of default values.
-        self.failUnlessEqual('buildd', self.package_build.policy_name)
         self.failUnlessEqual(
             'multiverse', self.package_build.current_component.name)
         self.failUnlessEqual(None, self.package_build.distribution)
@@ -327,6 +326,7 @@
             })
         self.assertEqual(BuildStatus.FAILEDTOUPLOAD, self.build.status)
         self.assertResultCount(0, "failed")
+        self.assertIs(None, self.build.buildqueue_record)

     def test_handleStatus_OK_relative_filepath(self):
         # A filemap that tries to write to files outside of

=== modified file 'lib/lp/code/configure.zcml'
--- lib/lp/code/configure.zcml	2010-09-13 04:56:29 +0000
+++ lib/lp/code/configure.zcml	2010-09-17 06:08:57 +0000
@@ -923,7 +923,7 @@
     <require permission="launchpad.View" interface="lp.code.interfaces.sourcepackagerecipebuild.ISourcePackageRecipeBuild"/>
     <!-- This is needed for UploadProcessor to run. The permission isn't
       important; launchpad.Edit isn't actually held by anybody. -->
-    <require permission="launchpad.Edit" set_attributes="status upload_log" />
+    <require permission="launchpad.Edit" set_attributes="status upload_log date_finished requester" />
   </class>

   <securedutility
@@ -988,10 +988,6 @@
       name="RECIPEBRANCHBUILD"
       provides="lp.buildmaster.interfaces.buildfarmjob.IBuildFarmJob"/>

-  <call
-    callable="lp.code.model.sourcepackagerecipebuild.register_archive_upload_policy_adapter"
-    />
-
   <webservice:register module="lp.code.interfaces.webservice" />
   <adapter
     provides="lp.buildmaster.interfaces.buildfarmjob.ISpecificBuildFarmJob"

=== modified file 'lib/lp/code/model/sourcepackagerecipebuild.py'
--- lib/lp/code/model/sourcepackagerecipebuild.py	2010-09-09 17:02:33 +0000
+++ lib/lp/code/model/sourcepackagerecipebuild.py	2010-09-17 06:08:57 +0000
@@ -22,7 +22,6 @@
     )
 from storm.store import Store
 from zope.component import (
-    getGlobalSiteManager,
     getUtility,
     )
 from zope.interface import (
@@ -39,12 +38,6 @@
     )
 from canonical.launchpad.webapp import errorlog
 from lp.app.errors import NotFoundError
-from lp.archiveuploader.uploadpolicy import (
-    ArchiveUploadType,
-    BuildDaemonUploadPolicy,
-    IArchiveUploadPolicy,
-    SOURCE_PACKAGE_RECIPE_UPLOAD_POLICY_NAME,
-    )
 from lp.buildmaster.enums import (
     BuildFarmJobType,
     BuildStatus,
@@ -77,25 +70,10 @@
 from lp.soyuz.model.sourcepackagerelease import SourcePackageRelease


-class SourcePackageRecipeUploadPolicy(BuildDaemonUploadPolicy):
-    """Policy for uploading the results of a source package recipe build."""
-
-    name = SOURCE_PACKAGE_RECIPE_UPLOAD_POLICY_NAME
-    accepted_type = ArchiveUploadType.SOURCE_ONLY
-
-    def getUploader(self, changes):
-        """Return the person doing the upload."""
-        build_id = int(getattr(self.options, 'buildid'))
-        sprb = getUtility(ISourcePackageRecipeBuildSource).getById(build_id)
-        return sprb.requester
-
-
 class SourcePackageRecipeBuild(PackageBuildDerived, Storm):

     __storm_table__ = 'SourcePackageRecipeBuild'

-    policy_name = SourcePackageRecipeUploadPolicy.name
-
     implements(ISourcePackageRecipeBuild)
     classProvides(ISourcePackageRecipeBuildSource)

@@ -333,6 +311,10 @@
         if self.status == BuildStatus.FULLYBUILT:
             self.notify()

+    def getUploader(self, changes):
+        """See `IPackageBuild`."""
+        return self.requester
+

 class SourcePackageRecipeBuildJob(BuildFarmJobOldDerived, Storm):
     classProvides(ISourcePackageRecipeBuildJobSource)
@@ -384,13 +366,6 @@
         return 2505 + self.build.archive.relative_build_score


-def register_archive_upload_policy_adapter():
-    getGlobalSiteManager().registerUtility(
-        component=SourcePackageRecipeUploadPolicy,
-        provided=IArchiveUploadPolicy,
-        name=SourcePackageRecipeUploadPolicy.name)
-
-
 def get_recipe_build_for_build_farm_job(build_farm_job):
     """Return the SourcePackageRecipeBuild associated with a BuildFarmJob."""
     store = Store.of(build_farm_job)

=== modified file 'lib/lp/code/model/tests/test_sourcepackagerecipebuild.py'
--- lib/lp/code/model/tests/test_sourcepackagerecipebuild.py	2010-09-17 06:08:54 +0000
+++ lib/lp/code/model/tests/test_sourcepackagerecipebuild.py	2010-09-17 06:08:57 +0000
@@ -309,6 +309,12 @@
         job = sprb.build_farm_job.getSpecificJob()
         self.assertEqual(sprb, job)

+    def test_getUploader(self):
+        # For ACL purposes the uploader is the build requester.
+        build = self.makeSourcePackageRecipeBuild()
+        self.assertEquals(build.requester,
+            build.getUploader(None))
+

 class TestAsBuildmaster(TestCaseWithFactory):
=== modified file 'lib/lp/soyuz/doc/build-failedtoupload-workflow.txt'
--- lib/lp/soyuz/doc/build-failedtoupload-workflow.txt	2010-08-04 00:16:44 +0000
+++ lib/lp/soyuz/doc/build-failedtoupload-workflow.txt	2010-09-17 06:08:57 +0000
@@ -162,8 +162,7 @@
     >>> buildd_policy = getPolicy(
     ...     name='buildd',
     ...     distro=failedtoupload_candidate.distribution.name,
-    ...     distroseries=failedtoupload_candidate.distro_series.name,
-    ...     buildid=failedtoupload_candidate.id)
+    ...     distroseries=failedtoupload_candidate.distro_series.name)

     >>> cdrkit_bin_upload = NascentUpload.from_changesfile_path(
     ...     datadir('suite/cdrkit_1.0/cdrkit_1.0_i386.changes'),
@@ -171,7 +170,7 @@
     >>> cdrkit_bin_upload.process()
     >>> cdrkit_bin_upload.is_rejected
     False
-    >>> success = cdrkit_bin_upload.do_accept()
+    >>> success = cdrkit_bin_upload.do_accept(build=failedtoupload_candidate)
     >>> print cdrkit_bin_upload.queue_root.status.name
     NEW
=== modified file 'lib/lp/soyuz/doc/buildd-slavescanner.txt'
--- lib/lp/soyuz/doc/buildd-slavescanner.txt	2010-09-17 06:08:54 +0000
+++ lib/lp/soyuz/doc/buildd-slavescanner.txt	2010-09-17 06:08:57 +0000
@@ -339,8 +339,6 @@
     >>> build.status.title
     'Uploading build'

-    >>> bqItem10.destroySelf()
-
 === Successfully collected and uploaded (FULLYBUILT) ===

 Build item 6 has binary packages available in the sample data, letting us test
@@ -1062,7 +1060,6 @@
     True
     >>> print lfa.filename
     buildlog_ubuntu-hoary-i386.mozilla-firefox_0.9_BUILDING.txt.gz
-    >>> candidate.destroySelf()

 The attempt to fetch the buildlog from the common librarian will fail
 since this is a build in a private archive and the buildlog was thus
=== modified file 'lib/lp/soyuz/doc/distroseriesqueue-translations.txt'
--- lib/lp/soyuz/doc/distroseriesqueue-translations.txt	2010-08-24 15:29:01 +0000
+++ lib/lp/soyuz/doc/distroseriesqueue-translations.txt	2010-09-17 06:08:57 +0000
@@ -74,15 +74,14 @@
     ...     dapper_amd64, PackagePublishingPocket.RELEASE, dapper.main_archive)

     >>> buildd_policy = getPolicy(
-    ...     name='buildd', distro='ubuntu', distroseries='dapper',
-    ...     buildid=build.id)
+    ...     name='buildd', distro='ubuntu', distroseries='dapper')

     >>> pmount_upload = NascentUpload.from_changesfile_path(
     ...     datadir('pmount_0.9.7-2ubuntu2_amd64.changes'),
     ...     buildd_policy, mock_logger)
     DEBUG: Changes file can be unsigned.

-    >>> pmount_upload.process()
+    >>> pmount_upload.process(build=build)
     DEBUG: Beginning processing.
     DEBUG: Verifying the changes file.
     DEBUG: Verifying files in upload.
@@ -105,9 +104,8 @@
     >>> print len(dapper_pmount.getLatestTranslationsUploads())
     0

-    >>> success = pmount_upload.do_accept()
+    >>> success = pmount_upload.do_accept(build=build)
     DEBUG: Creating queue entry
-    DEBUG: Build ... found
     ...

     # And all things worked.
=== modified file 'lib/lp/soyuz/doc/soyuz-set-of-uploads.txt'
--- lib/lp/soyuz/doc/soyuz-set-of-uploads.txt	2010-08-30 02:07:38 +0000
+++ lib/lp/soyuz/doc/soyuz-set-of-uploads.txt	2010-09-17 06:08:57 +0000
@@ -119,21 +119,17 @@
     >>> from lp.soyuz.scripts.soyuz_process_upload import (
     ...     ProcessUpload)
     >>> from canonical.testing import LaunchpadZopelessLayer
-    >>> def process_uploads(upload_policy, build_id, series, loglevel):
+    >>> def process_uploads(upload_policy, series, loglevel):
     ...     """Simulate process-upload.py script run.
     ...
     ...     :param upload_policy: context in which to consider the upload
     ...         (equivalent to script's --context option).
-    ...     :param build_id: build to which to attach this upload.
-    ...         (equivalent to script's --buildid option).
     ...     :param series: distro series to give back from.
     ...         (equivalent to script's --series option).
     ...     :param loglevel: logging level (as defined in logging module). Any
     ...         log messages below this level will be suppressed.
     ...     """
     ...     args = [temp_dir, "-C", upload_policy]
-    ...     if build_id is not None:
-    ...         args.extend(["-b", build_id])
     ...     if series is not None:
     ...         args.extend(["-s", series])
     ...     # Run script under 'uploader' DB user. The dbuser argument to the
@@ -230,11 +226,11 @@
     >>> from lp.services.mail import stub

     >>> def simulate_upload(
-    ...     leafname, is_new=False, upload_policy='anything', build_id=None,
+    ...     leafname, is_new=False, upload_policy='anything',
     ...     series=None, distro="ubuntutest", loglevel=logging.WARN):
     ...     """Process upload(s). Options are as for process_uploads()."""
     ...     punt_upload_into_queue(leafname, distro=distro)
-    ...     process_uploads(upload_policy, build_id, series, loglevel)
+    ...     process_uploads(upload_policy, series, loglevel)
     ...     # We seem to be leaving a lock file behind here for some reason.
     ...     # Naturally it doesn't count as an unprocessed incoming file, which
     ...     # is what we're really looking for.
@@ -289,19 +285,6 @@

     >>> simulate_upload('bar_1.0-2')

-Check the rejection of bar_1.0-2_binary when uploaded to the wrong build id.
-
-    >>> simulate_upload(
-    ...     'bar_1.0-2_binary', upload_policy="buildd", build_id="2",
-    ...     loglevel=logging.ERROR)
-    log> Exception while accepting:
-    Attempt to upload binaries specifying build 2, where they don't fit.
-    ...
-    Rejected uploads: ['bar_1.0-2_binary']
-
-Try it again without the bogus build id. This succeeds without
-complaints.
-
     >>> simulate_upload('bar_1.0-2_binary')

 Check the rejection of a malicious version of bar package which refers
=== modified file 'lib/lp/soyuz/model/binarypackagebuild.py'
--- lib/lp/soyuz/model/binarypackagebuild.py	2010-09-17 06:08:54 +0000
+++ lib/lp/soyuz/model/binarypackagebuild.py	2010-09-17 06:08:57 +0000
@@ -760,6 +760,10 @@
         # package build, then don't hit the db.
         return self

+    def getUploader(self, changes):
+        """See `IBinaryPackageBuild`."""
+        return changes.signer
+

 class BinaryPackageBuildSet:
     implements(IBinaryPackageBuildSet)
=== modified file 'lib/lp/soyuz/scripts/soyuz_process_upload.py'
--- lib/lp/soyuz/scripts/soyuz_process_upload.py	2010-08-20 20:31:18 +0000
+++ lib/lp/soyuz/scripts/soyuz_process_upload.py	2010-09-17 06:08:57 +0000
@@ -61,11 +61,6 @@
             help="Distro series to give back from.")

         self.parser.add_option(
-            "-b", "--buildid", action="store", type="int", dest="buildid",
-            metavar="BUILD",
-            help="The build ID to which to attach this upload.")
-
-        self.parser.add_option(
             "-a", "--announce", action="store", dest="announcelist",
             metavar="ANNOUNCELIST", help="Override the announcement list")

@@ -82,10 +77,15 @@
                 "%s is not a directory" % self.options.base_fsroot)

         self.logger.debug("Initialising connection.")
-        def getPolicy(distro):
+        def getPolicy(distro, build):
             self.options.distro = distro.name
             policy = findPolicyByName(self.options.context)
             policy.setOptions(self.options)
+            if self.options.builds:
+                assert build, "--builds specified but no build"
+                policy.distroseries = build.distro_series
+                policy.pocket = build.pocket
+                policy.archive = build.archive
             return policy
         processor = UploadProcessor(self.options.base_fsroot,
             self.options.dryrun, self.options.nomails, self.options.builds,
=== modified file 'lib/lp/soyuz/tests/test_binarypackagebuild.py'
--- lib/lp/soyuz/tests/test_binarypackagebuild.py	2010-09-09 17:02:33 +0000
+++ lib/lp/soyuz/tests/test_binarypackagebuild.py	2010-09-17 06:08:57 +0000
@@ -150,6 +150,15 @@
         self.assertStatementCount(
             0, self.build.getSpecificJob)

+    def test_getUploader(self):
+        # For ACL purposes the uploader is the changes file signer.
+
+        class MockChanges:
+            signer = "Somebody <somebody@ubuntu.com>"
+
+        self.assertEquals("Somebody <somebody@ubuntu.com>",
+            self.build.getUploader(MockChanges()))
+

 class TestBuildUpdateDependencies(TestCaseWithFactory):
=== modified file 'database/schema/security.cfg'
--- database/schema/security.cfg	2010-09-16 00:33:37 +0000
+++ database/schema/security.cfg	2010-09-16 09:04:13 +0000
@@ -1130,9 +1130,12 @@
 public.packagebuild = SELECT, INSERT, UPDATE
 public.binarypackagebuild = SELECT, INSERT, UPDATE
 public.sourcepackagerecipebuild = SELECT, UPDATE
-public.buildqueue = SELECT, INSERT, UPDATE
-public.job = SELECT, INSERT, UPDATE
-public.buildpackagejob = SELECT, INSERT, UPDATE
+public.sourcepackagerecipebuildjob = SELECT, UPDATE
+public.sourcepackagerecipe = SELECT, UPDATE
+public.buildqueue = SELECT, INSERT, UPDATE, DELETE
+public.job = SELECT, INSERT, UPDATE, DELETE
+public.buildpackagejob = SELECT, INSERT, UPDATE, DELETE
+public.builder = SELECT
 # Thusly the librarian
 public.libraryfilecontent = SELECT, INSERT
=== modified file 'lib/lp/archiveuploader/nascentupload.py'
--- lib/lp/archiveuploader/nascentupload.py	2010-09-15 19:38:48 +0000
+++ lib/lp/archiveuploader/nascentupload.py	2010-09-16 09:05:43 +0000
@@ -498,10 +498,13 @@
         if self.binaryful:
             return

-        # Set up some convenient shortcut variables.
-
-        uploader = self.policy.getUploader(self.changes)
-        archive = self.policy.archive
+        # The build can have an explicit uploader, which may be different
+        # from the changes file signer. (i.e in case of daily source package
+        # builds)
+        if build is not None:
+            uploader = build.getUploader(self.changes)
+        else:
+            uploader = self.changes.signer

         # If we have no signer, there's no ACL we can apply.
         if uploader is None:
=== modified file 'lib/lp/ archiveuploader /tests/ test_recipeuplo ads.py' archiveuploader /tests/ test_recipeuplo ads.py 2010-09-16 09:02:57 +0000 archiveuploader /tests/ test_recipeuplo ads.py 2010-09-16 09:05:48 +0000
requester =self.recipe. owner)
--- lib/lp/
+++ lib/lp/
@@ -42,7 +42,7 @@
- self.options.
+ self.options.
=== modified file 'lib/lp/archiveuploader/tests/test_uploadprocessor.py'
--- lib/lp/archiveuploader/tests/test_uploadprocessor.py	2010-09-16 09:02:57 +0000
+++ lib/lp/archiveuploader/tests/test_uploadprocessor.py	2010-09-16 09:05:56 +0000
@@ -1884,7 +1884,7 @@
         self.assertLogContains(
             "Unable to find package build job with id 42. Skipping.")

-    def testNoFiles(self):
+    def testBinaryPackageBuild_fail(self):
         # If the upload directory is empty, the upload
         # will fail.
@@ -1908,6 +1908,8 @@
         # Upload and accept a binary for the primary archive source.
         shutil.rmtree(upload_dir)
+
+        # Commit so the build cookie has the right ids.
         self.layer.txn.commit()
         leaf_name = build.getUploadDirLeaf(build.getBuildCo...
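
Taken together, the branch drops the `--buildid`-driven `SourcePackageRecipeUploadPolicy` and instead asks the build object itself who the uploader is: binary package builds answer with the changes file signer, recipe builds with the build requester, and `NascentUpload` falls back to the signer when no build is given. The following is a minimal standalone sketch of that dispatch; the class names mirror the Launchpad models but everything here is simplified illustration, not the actual Launchpad code.

```python
class Changes:
    """Stand-in for a parsed .changes file with a GPG signer."""

    def __init__(self, signer):
        self.signer = signer


class BinaryPackageBuild:
    """For ACL purposes the uploader is the changes file signer."""

    def getUploader(self, changes):
        return changes.signer


class SourcePackageRecipeBuild:
    """For ACL purposes the uploader is the build requester."""

    def __init__(self, requester):
        self.requester = requester

    def getUploader(self, changes):
        return self.requester


def uploader_for(build, changes):
    # Mirrors the new NascentUpload logic: prefer the build's own
    # uploader, otherwise fall back to the changes file signer.
    if build is not None:
        return build.getUploader(changes)
    return changes.signer


changes = Changes(signer="Somebody <somebody@ubuntu.com>")
print(uploader_for(BinaryPackageBuild(), changes))
print(uploader_for(SourcePackageRecipeBuild("Recipe Owner"), changes))
print(uploader_for(None, changes))
```

The point of the design is that the upload policy no longer needs a build id on the command line; the processor hands the build object straight through, so each build type can define its own uploader rule.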