Merge lp:~michael.nelson/launchpad/443353-api-builds-from-private into lp:launchpad/db-devel
Status: Merged
Approved by: Jonathan Lange
Approved revision: not available
Merged at revision: not available
Proposed branch: lp:~michael.nelson/launchpad/443353-api-builds-from-private
Merge into: lp:launchpad/db-devel
Diff against target: 1274 lines (+453/-125), 43 files modified
lib/lp/archiveuploader/nascentuploadfile.py (+1/-1)
lib/lp/archiveuploader/tests/nascentuploadfile.txt (+71/-0)
lib/lp/bugs/templates/bug-portlet-subscribers.pt (+1/-1)
lib/lp/bugs/templates/bugtarget-filebug-submit-bug.pt (+1/-1)
lib/lp/bugs/templates/bugtarget-portlet-bugfilters.pt (+1/-1)
lib/lp/bugs/templates/bugtarget-portlet-bugtags.pt (+1/-1)
lib/lp/bugs/templates/bugtask-index.pt (+3/-3)
lib/lp/bugs/templates/bugtask-tasks-and-nominations-table-row.pt (+1/-1)
lib/lp/bugs/templates/bugtasks-and-nominations-table.pt (+1/-1)
lib/lp/bugs/templates/official-bug-target-manage-tags.pt (+1/-1)
lib/lp/code/templates/branch-import-details.pt (+1/-1)
lib/lp/code/templates/branch-index.pt (+1/-1)
lib/lp/code/templates/branch-listing.pt (+1/-1)
lib/lp/code/templates/branch-portlet-subscribers.pt (+1/-1)
lib/lp/code/templates/branch-related-bugs-specs.pt (+1/-1)
lib/lp/code/templates/branchmergeproposal-generic-listing.pt (+1/-1)
lib/lp/registry/templates/object-timeline-graph.pt (+1/-1)
lib/lp/registry/templates/person-macros.pt (+1/-1)
lib/lp/registry/templates/product-new.pt (+1/-1)
lib/lp/registry/templates/productrelease-add-from-series.pt (+1/-1)
lib/lp/registry/templates/teammembership-index.pt (+1/-1)
lib/lp/registry/templates/timeline-macros.pt (+1/-1)
lib/lp/soyuz/doc/publishing.txt (+50/-11)
lib/lp/soyuz/model/publishing.py (+58/-22)
lib/lp/soyuz/scripts/tests/test_copypackage.py (+16/-1)
lib/lp/soyuz/stories/ppa/xx-copy-packages.txt (+1/-1)
lib/lp/soyuz/stories/ppa/xx-ppa-packages.txt (+32/-10)
lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt (+5/-3)
lib/lp/soyuz/templates/archive-edit-dependencies.pt (+1/-1)
lib/lp/soyuz/templates/archive-macros.pt (+1/-1)
lib/lp/soyuz/templates/archive-packages.pt (+1/-1)
lib/lp/soyuz/templates/archive-subscribers.pt (+1/-1)
lib/lp/translations/browser/language.py (+8/-0)
lib/lp/translations/stories/distroseries/xx-distroseries-templates.txt (+56/-13)
lib/lp/translations/stories/productseries/xx-productseries-templates.txt (+27/-18)
lib/lp/translations/stories/standalone/xx-language.txt (+68/-7)
lib/lp/translations/templates/language-index.pt (+9/-4)
lib/lp/translations/templates/object-templates.pt (+20/-3)
lib/lp/translations/templates/pofile-export.pt (+1/-1)
lib/lp/translations/templates/pofile-translate.pt (+1/-1)
lib/lp/translations/templates/translation-import-queue-macros.pt (+1/-1)
lib/lp/translations/templates/translationimportqueueentry-index.pt (+1/-1)
lib/lp/translations/templates/translationmessage-translate.pt (+1/-1)
To merge this branch: bzr merge lp:~michael.nelson/launchpad/443353-api-builds-from-private
Related bugs: bug 443353
Reviewer | Review Type | Date Requested | Status
---|---|---|---
Aaron Bentley (community) | Approve | |
Jonathan Lange (community) | Approve | |

Review via email: mp+14896@code.launchpad.net
Commit message
Ensures that when source+binaries are copied from archive A to B, the corresponding builds will be returned when querying the copied source for its builds (even though they were built in the context of archive A). Fixes bug 443353.
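The shape of the fix can be sketched at the SQL level with a toy schema (the table and column names below are simplified stand-ins, not Launchpad's real schema): the builds for a source publication are those built in the same archive, unioned with builds from other archives whose binaries have since been copied into this archive.

```python
# Hedged sketch only: illustrative schema, not Launchpad's.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE build (id INTEGER PRIMARY KEY, archive INTEGER);
CREATE TABLE source_pub (id INTEGER PRIMARY KEY, archive INTEGER);
CREATE TABLE binary_pub (build INTEGER, archive INTEGER);
-- Build 1 was built directly in archive 10.
INSERT INTO build VALUES (1, 10);
-- Build 2 was built in archive 20 (e.g. a private PPA)...
INSERT INTO build VALUES (2, 20);
-- ...but its binaries were later copied into archive 10.
INSERT INTO binary_pub VALUES (2, 10);
-- The source publication lives in archive 10.
INSERT INTO source_pub VALUES (100, 10);
""")

rows = conn.execute("""
SELECT build.id FROM build, source_pub
 WHERE source_pub.id = 100 AND build.archive = source_pub.archive
UNION
SELECT build.id FROM build, source_pub, binary_pub
 WHERE source_pub.id = 100
   AND build.archive != source_pub.archive
   AND binary_pub.build = build.id
   AND binary_pub.archive = source_pub.archive
""").fetchall()
print(sorted(r[0] for r in rows))  # [1, 2]: both builds are found
```

Before the fix, only the first branch of the union was queried, so build 2 (built in the private PPA) was invisible from the copied source's archive.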
Description of the change
Michael Nelson (michael.nelson) wrote:
Jonathan Lange (jml) wrote:
On Sun, Nov 15, 2009 at 2:35 PM, Michael Nelson
<email address hidden> wrote:
> Michael Nelson has proposed merging lp:~michael.nelson/launchpad/443353-api-builds-from-private into lp:launchpad.
>
> Requested reviews:
> Canonical Launchpad Engineering (launchpad)
> Related bugs:
> #443353 API does not include Builds for sources that were sync'd from private PPAs
> https:/
>
>
> Overview
> ========
>
> This branch fixes bug 443353 by ensuring that getBuildsForSou
>
> Issues
> ======
>
> There are two main issues with this branch IMO.
>
> 1. Ensuring that the result of the union was ordered correctly is hackish, and dependent on a storm implementation detail (that columns in SQL queries for a certain table are ordered alphabetically).
>
> 2. Testing the correct ordering in the doctest isn't so readable. It could be better to print out the results instead, but on the other hand, I didn't want the test to be dependent on database ids. Previously the test simply ensured the last item was the one expected - I've updated this to instead check that the complete result is sorted as expected.
>
> Any suggestions welcome!
>
Hi Michael,
Thanks for fixing this -- on a Sunday even!
I'm generally OK with this branch, but would like another opportunity
to have a look at it and maybe to talk with you face-to-face.
jml
> Testing/QA
> ==========
>
> To test, run:
> bin/test -vvt doc/publishing.txt
>
> To QA:
>
> Visit:
> https:/
>
> and expand the intrepid 6b12-0ubuntu6.6 security release. Currently this only displays two builds, whereas it should display all the builds listed at:
>
> https:/
>
>
>
>
> --
> https:/
> Your team Launchpad code reviewers from Canonical is subscribed to branch lp:launchpad.
>
> === modified file 'lib/lp/
> --- lib/lp/
> +++ lib/lp/
> @@ -1055,22 +1055,53 @@
> each build found.
>
> >>> cprov_builds.
> - 7
> + 8
>
> The `ResultSet` is ordered by ascending
> `SourcePackageP
> `DistroArchseri
>
> - >>> source_pub, build, arch = cprov_builds.last()
> -
> - >>> print source_
> - foo 666 in breezy-autotest
> -
> - >>> print build.title
> - i386 build of foo 666 in ubuntutest breezy-autotest RELEASE
> -
> - >>> print arch.displayname
> - ubuntutest Breezy Badger Autotest i386
> + # The easiest thing we can do here (without printing ids)
> + # is to show that sorting a list of the resulting ids+tags does not
> + # modify the list.
> + >>> from copy import copy
> + >>> ids_and_tags = [(pub.id, arch.architectu
> + ... for pub, build, arch in cprov_builds]
> + >>> ids...
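The "check it is already sorted" idea discussed in the doctest above can be shown in plain Python; the tuples below are made-up stand-ins for (publication id, architecture tag) pairs, not real database values.

```python
# Instead of printing database-dependent ids, assert that the result
# sequence is already in the expected (id, architecture tag) order.
ids_and_tags = [(1, "hppa"), (1, "i386"), (2, "i386")]
print(ids_and_tags == sorted(ids_and_tags))  # True: already ordered

# A mis-ordered result would fail the same check.
shuffled = [(2, "i386"), (1, "hppa"), (1, "i386")]
print(shuffled == sorted(shuffled))  # False
```

This keeps the doctest independent of concrete database ids while still exercising the full ordering, not just the last element.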
Michael Nelson (michael.nelson) wrote:
On Sun, Nov 15, 2009 at 3:24 PM, Jonathan Lange <email address hidden> wrote:
> On Sun, Nov 15, 2009 at 2:35 PM, Michael Nelson
> <email address hidden> wrote:
> > Michael Nelson has proposed merging
> lp:~michael.nelson/launchpad/443353-api-builds-from-private into
> lp:launchpad.
> >
> > Requested reviews:
> > Canonical Launchpad Engineering (launchpad)
> > Related bugs:
> > #443353 API does not include Builds for sources that were sync'd from
> private PPAs
> > https:/
> >
> >
> > Overview
> > ========
> >
> > This branch fixes bug 443353 by ensuring that getBuildsForSou
> include builds that were originally built in a different archive context,
> but have since had binaries copied into the source archive context.
> >
> > Issues
> > ======
> >
> > There are two main issues with this branch IMO.
> >
> > 1. Ensuring that the result of the union was ordered correctly is
> hackish, and dependent on a storm implementation detail (that columns in SQL
> queries for a certain table are ordered alphabetically).
> >
> > 2. Testing the correct ordering in the doctest isn't so readable. It
> could be better to print out the results instead, but on the other hand, I
> didn't want the test to be dependent on database ids. Previously the test
> simply ensured the last item was the one expected - I've updated this to
> instead assure that the complete result is sorted as expected.
> >
> > Any suggestions welcome!
> >
>
> Hi Michael,
>
> Thanks for fixing this -- on a Sunday even!
>
>
And thanks for reviewing even on a Sunday :)
> I'm generally OK with this branch, but would like another opportunity
> to have a look at it and maybe to talk with you face-to-face.
>
>
Yes, I was keen for pointers as I wasn't happy with the hack... after
talking with both Jamu and Gustavo, I think we've got a much better solution
(although it still feels it should not be necessary).
> jml
>
> > Testing/QA
> > ==========
> >
> > To test, run:
> > bin/test -vvt doc/publishing.txt
> >
> > To QA:
> >
> > Visit:
> > https:/
> >
> > and expand the intrepid 6b12-0ubuntu6.6 security release. Currently this
> only displays two builds, whereas it should display all the builds listed
> at:
> >
> > https:/
> >
> >
> >
> >
> > --
> >
> https:/
> > Your team Launchpad code reviewers from Canonical is subscribed to branch
> lp:launchpad.
> >
> > === modified file 'lib/lp/
> > --- lib/lp/
> > +++ lib/lp/
> > @@ -1055,22 +1055,53 @@
> > each build found.
> >
> > >>> cprov_builds.
> > - 7
> > + 8
> >
> > The `ResultSet` is ordered by ascending
> > `SourcePackageP
> > `DistroArchseri
> >
> > - >>> source_pub, build, arch = c...
1 | === modified file 'lib/lp/soyuz/doc/publishing.txt' | |||
2 | --- lib/lp/soyuz/doc/publishing.txt 2009-11-15 20:13:09 +0000 | |||
3 | +++ lib/lp/soyuz/doc/publishing.txt 2009-11-16 03:58:10 +0000 | |||
4 | @@ -1064,24 +1064,31 @@ | |||
5 | 1064 | # The easiest thing we can do here (without printing ids) | 1064 | # The easiest thing we can do here (without printing ids) |
6 | 1065 | # is to show that sorting a list of the resulting ids+tags does not | 1065 | # is to show that sorting a list of the resulting ids+tags does not |
7 | 1066 | # modify the list. | 1066 | # modify the list. |
8 | 1067 | >>> from copy import copy | ||
9 | 1068 | >>> ids_and_tags = [(pub.id, arch.architecturetag) | 1067 | >>> ids_and_tags = [(pub.id, arch.architecturetag) |
10 | 1069 | ... for pub, build, arch in cprov_builds] | 1068 | ... for pub, build, arch in cprov_builds] |
14 | 1070 | >>> ids_and_tags_sorted = copy(ids_and_tags) | 1069 | >>> ids_and_tags == sorted(ids_and_tags) |
12 | 1071 | >>> ids_and_tags_sorted.sort() | ||
13 | 1072 | >>> ids_and_tags == ids_and_tags_sorted | ||
15 | 1073 | True | 1070 | True |
16 | 1074 | 1071 | ||
17 | 1075 | If a source package is copied from another archive (including the | 1072 | If a source package is copied from another archive (including the |
20 | 1076 | binaries), then these builds will also be included in the result (even | 1073 | binaries), then the related builds for that source package will |
21 | 1077 | though they were build in a different archive context). | 1074 | also be retrievable via the copied source publication. |
22 | 1075 | For example, if a package is built in a private security PPA, and then | ||
23 | 1076 | later copied out into the primary archive, the builds will then | ||
24 | 1077 | be available when looking at the copied source package in the primary | ||
25 | 1078 | archive. | ||
26 | 1078 | 1079 | ||
27 | 1079 | # Create a new PPA and publish a source with some builds | 1080 | # Create a new PPA and publish a source with some builds |
28 | 1080 | # and binaries. | 1081 | # and binaries. |
30 | 1081 | >>> other_ppa = factory.makeArchive() | 1082 | >>> other_ppa = factory.makeArchive(name="otherppa") |
31 | 1082 | >>> binaries = test_publisher.getPubBinaries(archive=other_ppa) | 1083 | >>> binaries = test_publisher.getPubBinaries(archive=other_ppa) |
32 | 1084 | |||
33 | 1085 | The associated builds and binaries will be created in the context of the | ||
34 | 1086 | other PPA. | ||
35 | 1087 | |||
36 | 1083 | >>> build = binaries[0].binarypackagerelease.build | 1088 | >>> build = binaries[0].binarypackagerelease.build |
37 | 1084 | >>> source_pub = build.sourcepackagerelease.publishings[0] | 1089 | >>> source_pub = build.sourcepackagerelease.publishings[0] |
38 | 1090 | >>> print build.archive.name | ||
39 | 1091 | otherppa | ||
40 | 1085 | 1092 | ||
41 | 1086 | # Copy the source into Celso's PPA, ensuring that the binaries | 1093 | # Copy the source into Celso's PPA, ensuring that the binaries |
42 | 1087 | # are alse published there. | 1094 | # are alse published there. |
43 | @@ -1092,7 +1099,8 @@ | |||
44 | 1092 | ... binaries[0].binarypackagerelease, cprov.archive) | 1099 | ... binaries[0].binarypackagerelease, cprov.archive) |
45 | 1093 | 1100 | ||
46 | 1094 | Now we will see an extra source in Celso's PPA as well as an extra | 1101 | Now we will see an extra source in Celso's PPA as well as an extra |
48 | 1095 | build - even though the build's context is not Celso's PPA. | 1102 | build - even though the build's context is not Celso's PPA. Previously |
49 | 1103 | there were 8 sources and builds. | ||
50 | 1096 | 1104 | ||
51 | 1097 | >>> cprov_sources_new = cprov.archive.getPublishedSources() | 1105 | >>> cprov_sources_new = cprov.archive.getPublishedSources() |
52 | 1098 | >>> cprov_sources_new.count() | 1106 | >>> cprov_sources_new.count() |
53 | 1099 | 1107 | ||
54 | === modified file 'lib/lp/soyuz/model/publishing.py' | |||
55 | --- lib/lp/soyuz/model/publishing.py 2009-11-15 20:41:46 +0000 | |||
56 | +++ lib/lp/soyuz/model/publishing.py 2009-11-16 05:45:03 +0000 | |||
57 | @@ -1234,8 +1234,6 @@ | |||
58 | 1234 | Build.buildstate.is_in(build_states)) | 1234 | Build.buildstate.is_in(build_states)) |
59 | 1235 | 1235 | ||
60 | 1236 | store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) | 1236 | store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
61 | 1237 | find_spec = ( | ||
62 | 1238 | SourcePackagePublishingHistory, Build, DistroArchSeries) | ||
63 | 1239 | 1237 | ||
64 | 1240 | # We'll be looking for builds in the same distroseries as the | 1238 | # We'll be looking for builds in the same distroseries as the |
65 | 1241 | # SPPH for the same release. | 1239 | # SPPH for the same release. |
66 | @@ -1251,7 +1249,7 @@ | |||
67 | 1251 | # First, we'll find the builds that were built in the same | 1249 | # First, we'll find the builds that were built in the same |
68 | 1252 | # archive context as the published sources. | 1250 | # archive context as the published sources. |
69 | 1253 | builds_in_same_archive = store.find( | 1251 | builds_in_same_archive = store.find( |
71 | 1254 | find_spec, | 1252 | Build, |
72 | 1255 | builds_for_distroseries_expr, | 1253 | builds_for_distroseries_expr, |
73 | 1256 | SourcePackagePublishingHistory.archiveID == Build.archiveID, | 1254 | SourcePackagePublishingHistory.archiveID == Build.archiveID, |
74 | 1257 | *extra_exprs) | 1255 | *extra_exprs) |
75 | @@ -1260,7 +1258,7 @@ | |||
76 | 1260 | # same archive... even though the build was not built in | 1258 | # same archive... even though the build was not built in |
77 | 1261 | # the same context archive. | 1259 | # the same context archive. |
78 | 1262 | builds_copied_into_archive = store.find( | 1260 | builds_copied_into_archive = store.find( |
80 | 1263 | find_spec, | 1261 | Build, |
81 | 1264 | builds_for_distroseries_expr, | 1262 | builds_for_distroseries_expr, |
82 | 1265 | SourcePackagePublishingHistory.archiveID != Build.archiveID, | 1263 | SourcePackagePublishingHistory.archiveID != Build.archiveID, |
83 | 1266 | BinaryPackagePublishingHistory.archive == Build.archiveID, | 1264 | BinaryPackagePublishingHistory.archive == Build.archiveID, |
84 | @@ -1269,26 +1267,29 @@ | |||
85 | 1269 | BinaryPackageRelease.build == Build.id, | 1267 | BinaryPackageRelease.build == Build.id, |
86 | 1270 | *extra_exprs) | 1268 | *extra_exprs) |
87 | 1271 | 1269 | ||
89 | 1272 | result_set = builds_copied_into_archive.union( | 1270 | builds_union = builds_copied_into_archive.union( |
90 | 1273 | builds_in_same_archive).config(distinct=True) | 1271 | builds_in_same_archive).config(distinct=True) |
91 | 1274 | 1272 | ||
109 | 1275 | # XXX 2009-11-15 Michael Nelson bug=366043. It is not possible | 1273 | # Now that we have a result_set of all the builds, we'll use it |
110 | 1276 | # to sort by the name `SourcePackagePublishingHistory.id` as after | 1274 | # as a subquery to get the required publishing and arch to do |
111 | 1277 | # the union there are no tables. Nor can we sort by ambiguous `id` | 1275 | # the ordering. We do this in this round-about way because we |
112 | 1278 | # as in this case there are 3 id columns in the result. Specifying | 1276 | # can't sort on SourcePackagePublishingHistory.id after the |
113 | 1279 | # the column index is the only option that I can find. So we're | 1277 | # union. See bug 443353 for details. |
114 | 1280 | # relying on an implementation detail of Storm that it lists | 1278 | find_spec = ( |
115 | 1281 | # columns in alphabetical order in sql queries. | 1279 | SourcePackagePublishingHistory, Build, DistroArchSeries) |
116 | 1282 | sql_columns = SourcePackagePublishingHistory._storm_columns.values() | 1280 | |
117 | 1283 | sql_column_names = [column.name for column in sql_columns] | 1281 | # Storm doesn't let us do builds_union.values('id') - |
118 | 1284 | sql_column_names.sort() | 1282 | # ('Union' object has no attribute 'columns'). So instead |
119 | 1285 | 1283 | # we have to instantiate the objects just to get the id. | |
120 | 1286 | # SQL order by uses 1-based column numbers. | 1284 | build_ids = [build.id for build in builds_union] |
121 | 1287 | source_pub_id_col_number = sql_column_names.index('id') + 1 | 1285 | |
122 | 1288 | result_set.order_by( | 1286 | result_set = store.find( |
123 | 1289 | source_pub_id_col_number, DistroArchSeries.architecturetag) | 1287 | find_spec, builds_for_distroseries_expr, |
124 | 1290 | 1288 | Build.id.is_in(build_ids)) | |
125 | 1291 | return result_set | 1289 | |
126 | 1290 | return result_set.order_by( | ||
127 | 1291 | SourcePackagePublishingHistory.id, | ||
128 | 1292 | DistroArchSeries.architecturetag) | ||
129 | 1292 | 1293 | ||
130 | 1293 | def getByIdAndArchive(self, id, archive, source=True): | 1294 | def getByIdAndArchive(self, id, archive, source=True): |
131 | 1294 | """See `IPublishingSet`.""" | 1295 | """See `IPublishingSet`.""" |
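The workaround landed in the diff above (materialise the union's build ids, then run a fresh multi-table query, which can order by specific columns) can be sketched with stdlib sqlite3. The schema below is illustrative, not Launchpad's; it only shows why the two-stage approach sidesteps the UNION ordering problem.

```python
# Hedged sketch: a UNION result no longer exposes its component tables,
# so you cannot ORDER BY source_pub.id on it directly. The workaround is
# to use the union only as a set of build ids, then re-query with joins.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE build (id INTEGER PRIMARY KEY, pub INTEGER, arch TEXT);
CREATE TABLE source_pub (id INTEGER PRIMARY KEY);
INSERT INTO source_pub VALUES (1), (2);
INSERT INTO build VALUES (10, 2, 'i386'), (11, 1, 'i386'), (12, 1, 'hppa');
""")

# Stage 1: the union(s) select nothing but build ids.
build_ids = [r[0] for r in conn.execute(
    "SELECT id FROM build WHERE arch = 'i386' "
    "UNION SELECT id FROM build WHERE arch = 'hppa'")]

# Stage 2: a fresh join restricted to those ids, where ordering by
# specific table columns is straightforward.
placeholders = ",".join("?" * len(build_ids))
rows = conn.execute(
    "SELECT source_pub.id, build.arch FROM build, source_pub "
    "WHERE build.pub = source_pub.id "
    f"AND build.id IN ({placeholders}) "
    "ORDER BY source_pub.id, build.arch", build_ids).fetchall()
print(rows)  # [(1, 'hppa'), (1, 'i386'), (2, 'i386')]
```

This trades an extra round trip (instantiating the union just for its ids) for the ability to order on `SourcePackagePublishingHistory.id` and `DistroArchSeries.architecturetag` explicitly, replacing the earlier hack that relied on Storm's alphabetical column ordering.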
Jonathan Lange (jml) wrote:
Thanks for fixing this, Michael. Land away.
Michael Nelson (michael.nelson) wrote:
I've just gotten back to this now - when I originally tried to land this while at UDS there were a bunch of ec2 errors - one of which pointed out an error in the query itself.
There are three parts to the failures fixed by this incremental:

* First, there was an actual error in the query (details below).
* Second, a number of tests were affected by the change, as we now return
builds from other archive contexts if the build has binaries published in
the current archive context (i.e. by a copy).
* Third, we have some bogus sample data (effectively two i386 builds for
ice-weasel 1.0 in cprov's ppa - one built in that context but without any
corresponding bpr's or bpph (so pending, buildid=25), the other copied and
published there with a new bpph (buildid=23)). As this branch exposed this
data, I modified two tests to use a different package instead (yes, I
would like to have re-written the test to use STP etc., but).
So, annotated diff below, raw diff at: http://
=== modified file 'lib/lp/
--- lib/lp/
+++ lib/lp/
@@ -594,7 +594,7 @@
# not blow up because of bad data.
return None
source, packageupload, spr, changesfile, lfc = result
-
+
# Return a webapp-proxied LibraryFileAlias so that restricted
# librarian files are accessible. Non-restricted files will get
# a 302 so that webapp threads are not tied up.
@@ -1290,7 +1290,8 @@
Build,
- BinaryPackagePu
+ BinaryPackagePu
+ SourcePackagePu
### So this was the error in the original MP - we're looking for builds that
originated in other contexts that have binaries published in the SPPH
archive context - not the Build's archive context :/
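The condition error described in the annotation above can be demonstrated with a toy sqlite3 schema (simplified stand-in names, not Launchpad's): matching the copied-in binaries against the build's own archive finds nothing, while matching against the source publication's archive finds the copied build.

```python
# Hedged illustration of the query bug: the binaries of a copied build
# are published in the SPPH's archive, never in the build's own archive.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE build (id INTEGER PRIMARY KEY, archive INTEGER);
CREATE TABLE source_pub (id INTEGER PRIMARY KEY, archive INTEGER);
CREATE TABLE binary_pub (build INTEGER, archive INTEGER);
INSERT INTO build VALUES (2, 20);         -- built in archive 20
INSERT INTO source_pub VALUES (100, 10);  -- source copied to archive 10
INSERT INTO binary_pub VALUES (2, 10);    -- binaries copied to archive 10
""")

buggy = conn.execute("""
SELECT build.id FROM build, source_pub, binary_pub
 WHERE build.archive != source_pub.archive
   AND binary_pub.build = build.id
   AND binary_pub.archive = build.archive      -- wrong: build's archive
""").fetchall()

fixed = conn.execute("""
SELECT build.id FROM build, source_pub, binary_pub
 WHERE build.archive != source_pub.archive
   AND binary_pub.build = build.id
   AND binary_pub.archive = source_pub.archive -- right: SPPH's archive
""").fetchall()
print(buggy, fixed)  # [] [(2,)]
```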
=== modified file 'lib/lp/
--- lib/lp/
+++ lib/lp/
@@ -1340,6 +1340,21 @@
+ # The second copy will fail explicitly because the new BPPH
+ # records are not yet published.
+ nothing_copied = copy_helper.
+ self.assertEqua
+ self.assertEqual(
+ copy_helper.
+ 'ERROR: foo 666 in hoary (same version has unpublished binaries '
+ 'in the destination archive for Hoary, please wait for them to '
+ 'be published before copying)')
+
+ # If we ensure that the copied b...
Aaron Bentley (abentley) wrote:
I approve the incremental change. As discussed on IRC, I think it may be possible to remove the first find and use the second query to find cases where SourcePackagePu
Preview Diff
1 | === modified file 'lib/lp/archiveuploader/nascentuploadfile.py' | |||
2 | --- lib/lp/archiveuploader/nascentuploadfile.py 2009-11-10 13:09:26 +0000 | |||
3 | +++ lib/lp/archiveuploader/nascentuploadfile.py 2009-12-07 15:21:15 +0000 | |||
4 | @@ -690,7 +690,7 @@ | |||
5 | 690 | tar_checker.ancient_files[first_file]) | 690 | tar_checker.ancient_files[first_file]) |
6 | 691 | yield UploadError( | 691 | yield UploadError( |
7 | 692 | "%s: has %s file(s) with a time stamp too " | 692 | "%s: has %s file(s) with a time stamp too " |
9 | 693 | "far into the future (e.g. %s [%s])." | 693 | "far in the past (e.g. %s [%s])." |
10 | 694 | % (self.filename, len(ancient_files), first_file, | 694 | % (self.filename, len(ancient_files), first_file, |
11 | 695 | timestamp)) | 695 | timestamp)) |
12 | 696 | return | 696 | return |
13 | 697 | 697 | ||
14 | === modified file 'lib/lp/archiveuploader/tests/nascentuploadfile.txt' | |||
15 | --- lib/lp/archiveuploader/tests/nascentuploadfile.txt 2009-07-08 08:38:05 +0000 | |||
16 | +++ lib/lp/archiveuploader/tests/nascentuploadfile.txt 2009-12-07 15:21:15 +0000 | |||
17 | @@ -539,6 +539,77 @@ | |||
18 | 539 | 539 | ||
19 | 540 | == DebBinaryUploadFile == | 540 | == DebBinaryUploadFile == |
20 | 541 | 541 | ||
21 | 542 | DebBinaryUploadFile models a binary .deb file. | ||
22 | 543 | |||
23 | 544 | >>> from lp.archiveuploader.nascentuploadfile import ( | ||
24 | 545 | ... DebBinaryUploadFile) | ||
25 | 546 | >>> ed_deb_path = datadir('ed_0.2-20_i386.deb') | ||
26 | 547 | >>> ed_binary_deb = DebBinaryUploadFile(ed_deb_path, | ||
27 | 548 | ... 'e31eeb0b6b3b87e1ea79378df864ffff', | ||
28 | 549 | ... 15, 'main/editors', 'important', 'foo', '1.2', | ||
29 | 550 | ... ed_mixed_changes, modified_insecure_policy, | ||
30 | 551 | ... mock_logger_quiet) | ||
31 | 552 | |||
32 | 553 | Like the other files it can be verified: | ||
33 | 554 | |||
34 | 555 | >>> list(ed_binary_deb.verify()) | ||
35 | 556 | [] | ||
36 | 557 | |||
37 | 558 | Verification checks that the specified section matches the section in the | ||
38 | 559 | changes file: | ||
39 | 560 | |||
40 | 561 | >>> ed_binary_deb = DebBinaryUploadFile(ed_deb_path, | ||
41 | 562 | ... 'e31eeb0b6b3b87e1ea79378df864ffff', | ||
42 | 563 | ... 15, 'main/net', 'important', 'foo', '1.2', | ||
43 | 564 | ... ed_mixed_changes, modified_insecure_policy, | ||
44 | 565 | ... mock_logger_quiet) | ||
45 | 566 | >>> list(ed_binary_deb.verify()) | ||
46 | 567 | [UploadError('ed_0.2-20_i386.deb control file lists section as | ||
47 | 568 | main/editors but changes file has main/net.',)] | ||
48 | 569 | |||
49 | 570 | It also checks the priority against the changes file: | ||
50 | 571 | |||
51 | 572 | >>> ed_binary_deb = DebBinaryUploadFile(ed_deb_path, | ||
52 | 573 | ... 'e31eeb0b6b3b87e1ea79378df864ffff', | ||
53 | 574 | ... 15, 'main/editors', 'extra', 'foo', '1.2', | ||
54 | 575 | ... ed_mixed_changes, modified_insecure_policy, | ||
55 | 576 | ... mock_logger_quiet) | ||
56 | 577 | >>> list(ed_binary_deb.verify()) | ||
57 | 578 | [UploadError('ed_0.2-20_i386.deb control file lists priority as important | ||
58 | 579 | but changes file has extra.',)] | ||
59 | 580 | |||
60 | 581 | The timestamp of the files in the .deb are tested against the policy for being | ||
61 | 582 | too new: | ||
62 | 583 | |||
63 | 584 | >>> old_only_policy = getPolicy( | ||
64 | 585 | ... name='insecure', distro='ubuntu', distroseries='hoary') | ||
65 | 586 | >>> old_only_policy.can_upload_binaries = True | ||
66 | 587 | >>> old_only_policy.future_time_grace = -5 * 365 * 24 * 60 * 60 | ||
67 | 588 | |||
68 | 589 | >>> ed_binary_deb = DebBinaryUploadFile(ed_deb_path, | ||
69 | 590 | ... 'e31eeb0b6b3b87e1ea79378df864ffff', | ||
70 | 591 | ... 15, 'main/editors', 'important', 'foo', '1.2', | ||
71 | 592 | ... ed_mixed_changes, old_only_policy, | ||
72 | 593 | ... mock_logger_quiet) | ||
73 | 594 | >>> list(ed_binary_deb.verifyDebTimestamp()) | ||
74 | 595 | [UploadError('ed_0.2-20_i386.deb: has 26 file(s) with a time stamp too | ||
75 | 596 | far into the future (e.g. control [Thu Jan 3 19:29:01 2008]).',)] | ||
76 | 597 | |||
77 | 598 | ... as well as for being too old: | ||
78 | 599 | |||
79 | 600 | >>> new_only_policy = getPolicy( | ||
80 | 601 | ... name='insecure', distro='ubuntu', distroseries='hoary') | ||
81 | 602 | >>> new_only_policy.can_upload_binaries = True | ||
82 | 603 | >>> new_only_policy.earliest_year = 2010 | ||
83 | 604 | >>> ed_binary_deb = DebBinaryUploadFile(ed_deb_path, | ||
84 | 605 | ... 'e31eeb0b6b3b87e1ea79378df864ffff', | ||
85 | 606 | ... 15, 'main/editors', 'important', 'foo', '1.2', | ||
86 | 607 | ... ed_mixed_changes, new_only_policy, | ||
87 | 608 | ... mock_logger_quiet) | ||
88 | 609 | >>> list(ed_binary_deb.verify()) | ||
89 | 610 | [UploadError('ed_0.2-20_i386.deb: has 26 file(s) with a time stamp too | ||
90 | 611 | far in the past (e.g. control [Thu Jan 3 19:29:01 2008]).',)] | ||
91 | 612 | |||
92 | 542 | 613 | ||
93 | 543 | == UDebBinaryUploadFile == | 614 | == UDebBinaryUploadFile == |
94 | 544 | 615 | ||
95 | 545 | 616 | ||
96 | === modified file 'lib/lp/bugs/templates/bug-portlet-subscribers.pt' | |||
97 | --- lib/lp/bugs/templates/bug-portlet-subscribers.pt 2009-11-26 03:13:32 +0000 | |||
98 | +++ lib/lp/bugs/templates/bug-portlet-subscribers.pt 2009-12-07 15:21:15 +0000 | |||
99 | @@ -25,7 +25,7 @@ | |||
100 | 25 | <img src="/@@/spinner" /> | 25 | <img src="/@@/spinner" /> |
101 | 26 | </div> | 26 | </div> |
102 | 27 | <script type="text/javascript"> | 27 | <script type="text/javascript"> |
104 | 28 | YUI().use('io-base', 'node', 'bugs.bugtask_index', function(Y) { | 28 | LPS.use('io-base', 'node', 'bugs.bugtask_index', function(Y) { |
105 | 29 | // Must be done inline here to ensure the load event fires. | 29 | // Must be done inline here to ensure the load event fires. |
106 | 30 | // This is a work around for a YUI3 issue with event handling. | 30 | // This is a work around for a YUI3 issue with event handling. |
107 | 31 | var subscription_link = Y.one('.menu-link-subscription'); | 31 | var subscription_link = Y.one('.menu-link-subscription'); |
108 | 32 | 32 | ||
109 | === modified file 'lib/lp/bugs/templates/bugtarget-filebug-submit-bug.pt' | |||
110 | --- lib/lp/bugs/templates/bugtarget-filebug-submit-bug.pt 2009-10-01 12:09:37 +0000 | |||
111 | +++ lib/lp/bugs/templates/bugtarget-filebug-submit-bug.pt 2009-12-07 15:21:15 +0000 | |||
112 | @@ -14,7 +14,7 @@ | |||
113 | 14 | tal:define="lp_js string:${icingroot}/build" | 14 | tal:define="lp_js string:${icingroot}/build" |
114 | 15 | tal:attributes="src string:${lp_js}/bugs/filebug-dupefinder.js"></script> | 15 | tal:attributes="src string:${lp_js}/bugs/filebug-dupefinder.js"></script> |
115 | 16 | <script type="text/javascript"> | 16 | <script type="text/javascript"> |
117 | 17 | YUI().use('base', 'node', 'oop', 'event', 'bugs.dupe_finder', function(Y) { | 17 | LPS.use('base', 'node', 'oop', 'event', 'bugs.dupe_finder', function(Y) { |
118 | 18 | Y.bugs.setup_dupe_finder(); | 18 | Y.bugs.setup_dupe_finder(); |
119 | 19 | }); | 19 | }); |
120 | 20 | </script> | 20 | </script> |
121 | 21 | 21 | ||
122 | === modified file 'lib/lp/bugs/templates/bugtarget-portlet-bugfilters.pt' | |||
123 | --- lib/lp/bugs/templates/bugtarget-portlet-bugfilters.pt 2009-11-04 13:56:17 +0000 | |||
124 | +++ lib/lp/bugs/templates/bugtarget-portlet-bugfilters.pt 2009-12-07 15:21:15 +0000 | |||
125 | @@ -12,7 +12,7 @@ | |||
126 | 12 | <img src="/@@/spinner" /> | 12 | <img src="/@@/spinner" /> |
127 | 13 | </div> | 13 | </div> |
128 | 14 | <script type="text/javascript"> | 14 | <script type="text/javascript"> |
130 | 15 | YUI().use('io-base', 'node', function(Y) { | 15 | LPS.use('io-base', 'node', function(Y) { |
131 | 16 | Y.on('domready', function() { | 16 | Y.on('domready', function() { |
132 | 17 | var portlet = Y.one('#portlet-bugfilters'); | 17 | var portlet = Y.one('#portlet-bugfilters'); |
133 | 18 | Y.one('#bugfilters-portlet-spinner').setStyle('display', 'block'); | 18 | Y.one('#bugfilters-portlet-spinner').setStyle('display', 'block'); |
134 | 19 | 19 | ||
135 | === modified file 'lib/lp/bugs/templates/bugtarget-portlet-bugtags.pt' | |||
136 | --- lib/lp/bugs/templates/bugtarget-portlet-bugtags.pt 2009-11-04 13:56:17 +0000 | |||
137 | +++ lib/lp/bugs/templates/bugtarget-portlet-bugtags.pt 2009-12-07 15:21:15 +0000 | |||
138 | @@ -9,7 +9,7 @@ | |||
139 | 9 | <a id="tags-content-link" | 9 | <a id="tags-content-link" |
140 | 10 | tal:attributes="href context/fmt:url/+bugtarget-portlet-tags-content"></a> | 10 | tal:attributes="href context/fmt:url/+bugtarget-portlet-tags-content"></a> |
141 | 11 | <script type="text/javascript"> | 11 | <script type="text/javascript"> |
143 | 12 | YUI().use('io-base', 'node', function(Y) { | 12 | LPS.use('io-base', 'node', function(Y) { |
144 | 13 | Y.on('domready', function() { | 13 | Y.on('domready', function() { |
145 | 14 | Y.one('#tags-portlet-spinner').setStyle('display', 'block'); | 14 | Y.one('#tags-portlet-spinner').setStyle('display', 'block'); |
146 | 15 | 15 | ||
147 | 16 | 16 | ||
148 | === modified file 'lib/lp/bugs/templates/bugtask-index.pt' | |||
149 | --- lib/lp/bugs/templates/bugtask-index.pt 2009-11-30 17:57:15 +0000 | |||
150 | +++ lib/lp/bugs/templates/bugtask-index.pt 2009-12-07 15:21:15 +0000 | |||
151 | @@ -37,7 +37,7 @@ | |||
152 | 37 | </script> | 37 | </script> |
153 | 38 | </tal:devmode> | 38 | </tal:devmode> |
154 | 39 | <script type="text/javascript"> | 39 | <script type="text/javascript"> |
156 | 40 | YUI().use('base', 'node', 'oop', 'event', 'bugs.bugtask_index', | 40 | LPS.use('base', 'node', 'oop', 'event', 'bugs.bugtask_index', |
157 | 41 | 'code.branchmergeproposal.popupdiff', function(Y) { | 41 | 'code.branchmergeproposal.popupdiff', function(Y) { |
158 | 42 | Y.bugs.setup_bugtask_index(); | 42 | Y.bugs.setup_bugtask_index(); |
159 | 43 | Y.on('load', function(e) { | 43 | Y.on('load', function(e) { |
160 | @@ -155,7 +155,7 @@ | |||
161 | 155 | <img src="/@@/spinner" id="tags-edit-spinner" style="display: none" /> | 155 | <img src="/@@/spinner" id="tags-edit-spinner" style="display: none" /> |
162 | 156 | <a href="+edit" title="Edit tags" id="edit-tags-trigger" class="sprite edit"></a> | 156 | <a href="+edit" title="Edit tags" id="edit-tags-trigger" class="sprite edit"></a> |
163 | 157 | <script type="text/javascript"> | 157 | <script type="text/javascript"> |
165 | 158 | YUI().use('event', 'node', 'bugs.bug_tags_entry', function(Y) { | 158 | LPS.use('event', 'node', 'bugs.bug_tags_entry', function(Y) { |
166 | 159 | // XXX intellectronica 2009-04-16 bug #362309: | 159 | // XXX intellectronica 2009-04-16 bug #362309: |
167 | 160 | // The load event fires very late on bug pages that take a | 160 | // The load event fires very late on bug pages that take a |
168 | 161 | // long time to render, but we prefer to use it since the | 161 | // long time to render, but we prefer to use it since the |
169 | @@ -295,7 +295,7 @@ | |||
170 | 295 | button.style.display = 'none'; | 295 | button.style.display = 'none'; |
171 | 296 | </script> | 296 | </script> |
172 | 297 | <script type="text/javascript"> | 297 | <script type="text/javascript"> |
174 | 298 | YUI().use('lp.comment', function(Y) { | 298 | LPS.use('lp.comment', function(Y) { |
175 | 299 | var comment = new Y.lp.Comment(); | 299 | var comment = new Y.lp.Comment(); |
176 | 300 | comment.render(); | 300 | comment.render(); |
177 | 301 | }); | 301 | }); |
178 | 302 | 302 | ||
179 | === modified file 'lib/lp/bugs/templates/bugtask-tasks-and-nominations-table-row.pt' | |||
180 | --- lib/lp/bugs/templates/bugtask-tasks-and-nominations-table-row.pt 2009-11-03 15:32:31 +0000 | |||
181 | +++ lib/lp/bugs/templates/bugtask-tasks-and-nominations-table-row.pt 2009-12-07 15:21:15 +0000 | |||
182 | @@ -185,7 +185,7 @@ | |||
183 | 185 | class="bugtasks-table-row-init-script" | 185 | class="bugtasks-table-row-init-script" |
184 | 186 | tal:condition="not:view/many_bugtasks" | 186 | tal:condition="not:view/many_bugtasks" |
185 | 187 | tal:content="string: | 187 | tal:content="string: |
187 | 188 | YUI().use('event', 'bugs.bugtask_index', function(Y) { | 188 | LPS.use('event', 'bugs.bugtask_index', function(Y) { |
188 | 189 | Y.on('load', | 189 | Y.on('load', |
189 | 190 | function(e) { | 190 | function(e) { |
190 | 191 | Y.bugs.setup_bugtask_row(${view/js_config}); | 191 | Y.bugs.setup_bugtask_row(${view/js_config}); |
191 | 192 | 192 | ||
192 | === modified file 'lib/lp/bugs/templates/bugtasks-and-nominations-table.pt' | |||
193 | --- lib/lp/bugs/templates/bugtasks-and-nominations-table.pt 2009-09-02 22:13:06 +0000 | |||
194 | +++ lib/lp/bugs/templates/bugtasks-and-nominations-table.pt 2009-12-07 15:21:15 +0000 | |||
195 | @@ -88,7 +88,7 @@ | |||
196 | 88 | </span> | 88 | </span> |
197 | 89 | 89 | ||
198 | 90 | <script type="text/javascript" tal:content="string: | 90 | <script type="text/javascript" tal:content="string: |
200 | 91 | YUI().use('event', 'bugs.bugtask_index', function(Y) { | 91 | LPS.use('event', 'bugs.bugtask_index', function(Y) { |
201 | 92 | Y.on('load', function(e) { | 92 | Y.on('load', function(e) { |
202 | 93 | Y.bugs.setup_me_too(${view/current_user_affected_js_status}); | 93 | Y.bugs.setup_me_too(${view/current_user_affected_js_status}); |
203 | 94 | }, window); | 94 | }, window); |
204 | 95 | 95 | ||
205 | === modified file 'lib/lp/bugs/templates/official-bug-target-manage-tags.pt' | |||
206 | --- lib/lp/bugs/templates/official-bug-target-manage-tags.pt 2009-09-04 17:03:00 +0000 | |||
207 | +++ lib/lp/bugs/templates/official-bug-target-manage-tags.pt 2009-12-07 15:21:15 +0000 | |||
208 | @@ -31,7 +31,7 @@ | |||
209 | 31 | </script> | 31 | </script> |
210 | 32 | <script tal:replace="structure view/tags_js_data" /> | 32 | <script tal:replace="structure view/tags_js_data" /> |
211 | 33 | <script type="text/javascript"> | 33 | <script type="text/javascript"> |
213 | 34 | YUI().use('event', 'bugs.official_bug_tag_management', function(Y) { | 34 | LPS.use('event', 'bugs.official_bug_tag_management', function(Y) { |
214 | 35 | Y.on('domready', function(e) { | 35 | Y.on('domready', function(e) { |
215 | 36 | Y.bugs.setup_official_bug_tag_management(); | 36 | Y.bugs.setup_official_bug_tag_management(); |
216 | 37 | }); | 37 | }); |
217 | 38 | 38 | ||
218 | === modified file 'lib/lp/code/templates/branch-import-details.pt' | |||
219 | --- lib/lp/code/templates/branch-import-details.pt 2009-11-04 13:56:17 +0000 | |||
220 | +++ lib/lp/code/templates/branch-import-details.pt 2009-12-07 15:21:15 +0000 | |||
221 | @@ -32,7 +32,7 @@ | |||
222 | 32 | Try again | 32 | Try again |
223 | 33 | </a> | 33 | </a> |
224 | 34 | <script type="text/javascript"> | 34 | <script type="text/javascript"> |
226 | 35 | YUI().use('event', 'node', function(Y) { | 35 | LPS.use('event', 'node', function(Y) { |
227 | 36 | Y.on("domready", function () { Y.one('#tryagainlink').setStyle('display', 'inline') }); | 36 | Y.on("domready", function () { Y.one('#tryagainlink').setStyle('display', 'inline') }); |
228 | 37 | }); | 37 | }); |
229 | 38 | </script> | 38 | </script> |
230 | 39 | 39 | ||
231 | === modified file 'lib/lp/code/templates/branch-index.pt' | |||
232 | --- lib/lp/code/templates/branch-index.pt 2009-11-17 05:07:41 +0000 | |||
233 | +++ lib/lp/code/templates/branch-index.pt 2009-12-07 15:21:15 +0000 | |||
234 | @@ -47,7 +47,7 @@ | |||
235 | 47 | </tal:devmode> | 47 | </tal:devmode> |
236 | 48 | <script type="text/javascript" | 48 | <script type="text/javascript" |
237 | 49 | tal:content="string: | 49 | tal:content="string: |
239 | 50 | YUI().use('node', 'event', 'widget', 'plugin', 'overlay', | 50 | LPS.use('node', 'event', 'widget', 'plugin', 'overlay', |
240 | 51 | 'lazr.choiceedit', 'code.branchstatus', | 51 | 'lazr.choiceedit', 'code.branchstatus', |
241 | 52 | 'code.branchmergeproposal.popupdiff', | 52 | 'code.branchmergeproposal.popupdiff', |
242 | 53 | function(Y) { | 53 | function(Y) { |
243 | 54 | 54 | ||
244 | === modified file 'lib/lp/code/templates/branch-listing.pt' | |||
245 | --- lib/lp/code/templates/branch-listing.pt 2009-11-04 13:56:17 +0000 | |||
246 | +++ lib/lp/code/templates/branch-listing.pt 2009-12-07 15:21:15 +0000 | |||
247 | @@ -41,7 +41,7 @@ | |||
248 | 41 | } | 41 | } |
249 | 42 | registerLaunchpadFunction(hookUpFilterSubmission); | 42 | registerLaunchpadFunction(hookUpFilterSubmission); |
250 | 43 | 43 | ||
252 | 44 | YUI().use('io-base', 'node', 'json-parse', function(Y) { | 44 | LPS.use('io-base', 'node', 'json-parse', function(Y) { |
253 | 45 | 45 | ||
254 | 46 | function doUpdate(transaction_id, response, args) { | 46 | function doUpdate(transaction_id, response, args) { |
255 | 47 | json_values = Y.JSON.parse(response.responseText); | 47 | json_values = Y.JSON.parse(response.responseText); |
256 | 48 | 48 | ||
257 | === modified file 'lib/lp/code/templates/branch-portlet-subscribers.pt' | |||
258 | --- lib/lp/code/templates/branch-portlet-subscribers.pt 2009-11-04 13:56:17 +0000 | |||
259 | +++ lib/lp/code/templates/branch-portlet-subscribers.pt 2009-12-07 15:21:15 +0000 | |||
260 | @@ -41,7 +41,7 @@ | |||
261 | 41 | string:<script id='milestone-script' type='text/javascript'>" /> | 41 | string:<script id='milestone-script' type='text/javascript'>" /> |
262 | 42 | <!-- | 42 | <!-- |
263 | 43 | 43 | ||
265 | 44 | YUI().use('io-base', 'node', 'code.branchsubscription', function(Y) { | 44 | LPS.use('io-base', 'node', 'code.branchsubscription', function(Y) { |
266 | 45 | 45 | ||
267 | 46 | if(Y.UA.ie) { | 46 | if(Y.UA.ie) { |
268 | 47 | Y.one('#subscriber-list').set('innerHTML', | 47 | Y.one('#subscriber-list').set('innerHTML', |
269 | 48 | 48 | ||
270 | === modified file 'lib/lp/code/templates/branch-related-bugs-specs.pt' | |||
271 | --- lib/lp/code/templates/branch-related-bugs-specs.pt 2009-09-08 21:42:45 +0000 | |||
272 | +++ lib/lp/code/templates/branch-related-bugs-specs.pt 2009-12-07 15:21:15 +0000 | |||
273 | @@ -42,7 +42,7 @@ | |||
274 | 42 | string:<script id='branchlink-script' type='text/javascript'>" /> | 42 | string:<script id='branchlink-script' type='text/javascript'>" /> |
275 | 43 | <!-- | 43 | <!-- |
276 | 44 | 44 | ||
278 | 45 | YUI().use('io-base', 'code.branchlinks', function(Y) { | 45 | LPS.use('io-base', 'code.branchlinks', function(Y) { |
279 | 46 | 46 | ||
280 | 47 | if(Y.UA.ie) { | 47 | if(Y.UA.ie) { |
281 | 48 | return; | 48 | return; |
282 | 49 | 49 | ||
283 | === modified file 'lib/lp/code/templates/branchmergeproposal-generic-listing.pt' | |||
284 | --- lib/lp/code/templates/branchmergeproposal-generic-listing.pt 2009-11-04 13:56:17 +0000 | |||
285 | +++ lib/lp/code/templates/branchmergeproposal-generic-listing.pt 2009-12-07 15:21:15 +0000 | |||
286 | @@ -24,7 +24,7 @@ | |||
287 | 24 | </form> | 24 | </form> |
288 | 25 | <script type="text/javascript"> | 25 | <script type="text/javascript"> |
289 | 26 | 26 | ||
291 | 27 | YUI().use('node', function(Y) { | 27 | LPS.use('node', function(Y) { |
292 | 28 | 28 | ||
293 | 29 | function submit_filter() { | 29 | function submit_filter() { |
294 | 30 | Y.one('#filter_form').submit(); | 30 | Y.one('#filter_form').submit(); |
295 | 31 | 31 | ||
296 | === modified file 'lib/lp/registry/templates/object-timeline-graph.pt' | |||
297 | --- lib/lp/registry/templates/object-timeline-graph.pt 2009-11-24 09:30:01 +0000 | |||
298 | +++ lib/lp/registry/templates/object-timeline-graph.pt 2009-12-07 15:21:15 +0000 | |||
299 | @@ -32,7 +32,7 @@ | |||
300 | 32 | include_inactive = false; | 32 | include_inactive = false; |
301 | 33 | } | 33 | } |
302 | 34 | 34 | ||
304 | 35 | YUI().use('registry.timeline', 'node', function(Y) { | 35 | LPS.use('registry.timeline', 'node', function(Y) { |
305 | 36 | Y.on('domready', function(e) { | 36 | Y.on('domready', function(e) { |
306 | 37 | if (Y.UA.ie) { | 37 | if (Y.UA.ie) { |
307 | 38 | return; | 38 | return; |
308 | 39 | 39 | ||
309 | === modified file 'lib/lp/registry/templates/person-macros.pt' | |||
310 | --- lib/lp/registry/templates/person-macros.pt 2009-11-04 13:56:17 +0000 | |||
311 | +++ lib/lp/registry/templates/person-macros.pt 2009-12-07 15:21:15 +0000 | |||
312 | @@ -190,7 +190,7 @@ | |||
313 | 190 | condition="private_prefix"> | 190 | condition="private_prefix"> |
314 | 191 | <script type="text/javascript" | 191 | <script type="text/javascript" |
315 | 192 | tal:content="string: | 192 | tal:content="string: |
317 | 193 | YUI().use('node', 'event', function(Y) { | 193 | LPS.use('node', 'event', function(Y) { |
318 | 194 | // Prepend/remove 'private-' from team name based on visibility | 194 | // Prepend/remove 'private-' from team name based on visibility |
319 | 195 | // setting. User can choose to edit it back out, if they wish. | 195 | // setting. User can choose to edit it back out, if they wish. |
320 | 196 | function visibility_on_change(e) { | 196 | function visibility_on_change(e) { |
321 | 197 | 197 | ||
322 | === modified file 'lib/lp/registry/templates/product-new.pt' | |||
323 | --- lib/lp/registry/templates/product-new.pt 2009-11-04 13:56:17 +0000 | |||
324 | +++ lib/lp/registry/templates/product-new.pt 2009-12-07 15:21:15 +0000 | |||
325 | @@ -14,7 +14,7 @@ | |||
326 | 14 | * details widgets until the user states that the project they are | 14 | * details widgets until the user states that the project they are |
327 | 15 | * registering is not a duplicate. | 15 | * registering is not a duplicate. |
328 | 16 | */ | 16 | */ |
330 | 17 | YUI().use('node', 'lazr.effects', function(Y) { | 17 | LPS.use('node', 'lazr.effects', function(Y) { |
331 | 18 | Y.on('domready', function() { | 18 | Y.on('domready', function() { |
332 | 19 | /* These two regexps serve slightly different purposes. The first | 19 | /* These two regexps serve slightly different purposes. The first |
333 | 20 | * finds the leftmost run of valid url characters for the autofill | 20 | * finds the leftmost run of valid url characters for the autofill |
334 | 21 | 21 | ||
335 | === modified file 'lib/lp/registry/templates/productrelease-add-from-series.pt' | |||
336 | --- lib/lp/registry/templates/productrelease-add-from-series.pt 2009-11-04 13:56:17 +0000 | |||
337 | +++ lib/lp/registry/templates/productrelease-add-from-series.pt 2009-12-07 15:21:15 +0000 | |||
338 | @@ -14,7 +14,7 @@ | |||
339 | 14 | <tal:script | 14 | <tal:script |
340 | 15 | replace="structure | 15 | replace="structure |
341 | 16 | string:<script id='milestone-script' type='text/javascript'>" /> | 16 | string:<script id='milestone-script' type='text/javascript'>" /> |
343 | 17 | YUI().use('node', 'lp.milestoneoverlay', function (Y) { | 17 | LPS.use('node', 'lp.milestoneoverlay', function (Y) { |
344 | 18 | 18 | ||
345 | 19 | // This is a value for the SELECT OPTION which is passed with | 19 | // This is a value for the SELECT OPTION which is passed with |
346 | 20 | // the SELECT's "change" event. It includes some symbols that are not | 20 | // the SELECT's "change" event. It includes some symbols that are not |
347 | 21 | 21 | ||
348 | === modified file 'lib/lp/registry/templates/teammembership-index.pt' | |||
349 | --- lib/lp/registry/templates/teammembership-index.pt 2009-11-04 13:56:17 +0000 | |||
350 | +++ lib/lp/registry/templates/teammembership-index.pt 2009-12-07 15:21:15 +0000 | |||
351 | @@ -20,7 +20,7 @@ | |||
352 | 20 | use-macro="context/@@launchpad_widget_macros/yui2calendar-dependencies" /> | 20 | use-macro="context/@@launchpad_widget_macros/yui2calendar-dependencies" /> |
353 | 21 | 21 | ||
354 | 22 | <script type="text/javascript"> | 22 | <script type="text/javascript"> |
356 | 23 | YUI().use('node', 'lp.calendar', function(Y) { | 23 | LPS.use('node', 'lp.calendar', function(Y) { |
357 | 24 | // Ensure that when the picker is used the radio button switches | 24 | // Ensure that when the picker is used the radio button switches |
358 | 25 | // from 'Never' to 'On' and the expiry field is enabled. | 25 | // from 'Never' to 'On' and the expiry field is enabled. |
359 | 26 | Y.on("available", function(e) { | 26 | Y.on("available", function(e) { |
360 | 27 | 27 | ||
361 | === modified file 'lib/lp/registry/templates/timeline-macros.pt' | |||
362 | --- lib/lp/registry/templates/timeline-macros.pt 2009-11-04 13:56:17 +0000 | |||
363 | +++ lib/lp/registry/templates/timeline-macros.pt 2009-12-07 15:21:15 +0000 | |||
364 | @@ -35,7 +35,7 @@ | |||
365 | 35 | if (auto_resize == 'true') { | 35 | if (auto_resize == 'true') { |
366 | 36 | timeline_url += 'resize_frame=timeline-iframe&'; | 36 | timeline_url += 'resize_frame=timeline-iframe&'; |
367 | 37 | } | 37 | } |
369 | 38 | YUI().use('node', function(Y) { | 38 | LPS.use('node', function(Y) { |
370 | 39 | if (Y.UA.ie) { | 39 | if (Y.UA.ie) { |
371 | 40 | return; | 40 | return; |
372 | 41 | } | 41 | } |
373 | 42 | 42 | ||
374 | === modified file 'lib/lp/soyuz/doc/publishing.txt' | |||
375 | --- lib/lp/soyuz/doc/publishing.txt 2009-11-18 23:56:26 +0000 | |||
376 | +++ lib/lp/soyuz/doc/publishing.txt 2009-12-07 15:21:15 +0000 | |||
377 | @@ -1055,22 +1055,61 @@ | |||
378 | 1055 | each build found. | 1055 | each build found. |
379 | 1056 | 1056 | ||
380 | 1057 | >>> cprov_builds.count() | 1057 | >>> cprov_builds.count() |
382 | 1058 | 7 | 1058 | 8 |
383 | 1059 | 1059 | ||
384 | 1060 | The `ResultSet` is ordered by ascending | 1060 | The `ResultSet` is ordered by ascending |
385 | 1061 | `SourcePackagePublishingHistory.id` and ascending | 1061 | `SourcePackagePublishingHistory.id` and ascending |
386 | 1062 | `DistroArchSeries.architecturetag` in this order. | 1062 | `DistroArchSeries.architecturetag` in this order. |
387 | 1063 | 1063 | ||
398 | 1064 | >>> source_pub, build, arch = cprov_builds.last() | 1064 | # The easiest thing we can do here (without printing ids) |
399 | 1065 | 1065 | # is to show that sorting a list of the resulting ids+tags does not | |
400 | 1066 | >>> print source_pub.displayname | 1066 | # modify the list. |
401 | 1067 | foo 666 in breezy-autotest | 1067 | >>> ids_and_tags = [(pub.id, arch.architecturetag) |
402 | 1068 | 1068 | ... for pub, build, arch in cprov_builds] | |
403 | 1069 | >>> print build.title | 1069 | >>> ids_and_tags == sorted(ids_and_tags) |
404 | 1070 | i386 build of foo 666 in ubuntutest breezy-autotest RELEASE | 1070 | True |
405 | 1071 | 1071 | ||
406 | 1072 | >>> print arch.displayname | 1072 | If a source package is copied from another archive (including the |
407 | 1073 | ubuntutest Breezy Badger Autotest i386 | 1073 | binaries), then the related builds for that source package will |
408 | 1074 | also be retrievable via the copied source publication. | ||
409 | 1075 | For example, if a package is built in a private security PPA, and then | ||
410 | 1076 | later copied out into the primary archive, the builds will then | ||
411 | 1077 | be available when looking at the copied source package in the primary | ||
412 | 1078 | archive. | ||
413 | 1079 | |||
414 | 1080 | # Create a new PPA and publish a source with some builds | ||
415 | 1081 | # and binaries. | ||
416 | 1082 | >>> other_ppa = factory.makeArchive(name="otherppa") | ||
417 | 1083 | >>> binaries = test_publisher.getPubBinaries(archive=other_ppa) | ||
418 | 1084 | |||
419 | 1085 | The associated builds and binaries will be created in the context of the | ||
420 | 1086 | other PPA. | ||
421 | 1087 | |||
422 | 1088 | >>> build = binaries[0].binarypackagerelease.build | ||
423 | 1089 | >>> source_pub = build.sourcepackagerelease.publishings[0] | ||
424 | 1090 | >>> print build.archive.name | ||
425 | 1091 | otherppa | ||
426 | 1092 | |||
427 | 1093 | # Copy the source into Celso's PPA, ensuring that the binaries | ||
428 | 1094 | # are also published there. | ||
429 | 1095 | >>> source_pub_cprov = source_pub.copyTo( | ||
430 | 1096 | ... source_pub.distroseries, source_pub.pocket, | ||
431 | 1097 | ... cprov.archive) | ||
432 | 1098 | >>> binaries_cprov = test_publisher.publishBinaryInArchive( | ||
433 | 1099 | ... binaries[0].binarypackagerelease, cprov.archive) | ||
434 | 1100 | |||
435 | 1101 | Now we will see an extra source in Celso's PPA as well as an extra | ||
436 | 1102 | build - even though the build's context is not Celso's PPA. Previously | ||
437 | 1103 | there were 8 sources and builds. | ||
438 | 1104 | |||
439 | 1105 | >>> cprov_sources_new = cprov.archive.getPublishedSources() | ||
440 | 1106 | >>> cprov_sources_new.count() | ||
441 | 1107 | 9 | ||
442 | 1108 | |||
443 | 1109 | >>> cprov_builds_new = publishing_set.getBuildsForSources( | ||
444 | 1110 | ... cprov_sources_new) | ||
445 | 1111 | >>> cprov_builds_new.count() | ||
446 | 1112 | 9 | ||
447 | 1074 | 1113 | ||
448 | 1075 | Next we'll create two sources with two builds each (the SoyuzTestPublisher | 1114 | Next we'll create two sources with two builds each (the SoyuzTestPublisher |
449 | 1076 | default) and show that the number of unpublished builds for these sources | 1115 | default) and show that the number of unpublished builds for these sources |
450 | 1077 | 1116 | ||
451 | === modified file 'lib/lp/soyuz/model/publishing.py' | |||
452 | --- lib/lp/soyuz/model/publishing.py 2009-11-19 00:26:13 +0000 | |||
453 | +++ lib/lp/soyuz/model/publishing.py 2009-12-07 15:21:15 +0000 | |||
454 | @@ -40,6 +40,7 @@ | |||
455 | 40 | from canonical.database.enumcol import EnumCol | 40 | from canonical.database.enumcol import EnumCol |
456 | 41 | from lp.registry.interfaces.pocket import PackagePublishingPocket | 41 | from lp.registry.interfaces.pocket import PackagePublishingPocket |
457 | 42 | from lp.soyuz.model.binarypackagename import BinaryPackageName | 42 | from lp.soyuz.model.binarypackagename import BinaryPackageName |
458 | 43 | from lp.soyuz.model.binarypackagerelease import BinaryPackageRelease | ||
459 | 43 | from lp.soyuz.model.files import ( | 44 | from lp.soyuz.model.files import ( |
460 | 44 | BinaryPackageFile, SourcePackageReleaseFile) | 45 | BinaryPackageFile, SourcePackageReleaseFile) |
461 | 45 | from canonical.launchpad.database.librarian import ( | 46 | from canonical.launchpad.database.librarian import ( |
462 | @@ -593,7 +594,7 @@ | |||
463 | 593 | # not blow up because of bad data. | 594 | # not blow up because of bad data. |
464 | 594 | return None | 595 | return None |
465 | 595 | source, packageupload, spr, changesfile, lfc = result | 596 | source, packageupload, spr, changesfile, lfc = result |
467 | 596 | 597 | ||
468 | 597 | # Return a webapp-proxied LibraryFileAlias so that restricted | 598 | # Return a webapp-proxied LibraryFileAlias so that restricted |
469 | 598 | # librarian files are accessible. Non-restricted files will get | 599 | # librarian files are accessible. Non-restricted files will get |
470 | 599 | # a 302 so that webapp threads are not tied up. | 600 | # a 302 so that webapp threads are not tied up. |
471 | @@ -1262,23 +1263,64 @@ | |||
472 | 1262 | Build.buildstate.is_in(build_states)) | 1263 | Build.buildstate.is_in(build_states)) |
473 | 1263 | 1264 | ||
474 | 1264 | store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) | 1265 | store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
477 | 1265 | result_set = store.find( | 1266 | |
478 | 1266 | (SourcePackagePublishingHistory, Build, DistroArchSeries), | 1267 | # We'll be looking for builds in the same distroseries as the |
479 | 1268 | # SPPH for the same release. | ||
480 | 1269 | builds_for_distroseries_expr = ( | ||
481 | 1267 | Build.distroarchseriesID == DistroArchSeries.id, | 1270 | Build.distroarchseriesID == DistroArchSeries.id, |
482 | 1271 | SourcePackagePublishingHistory.distroseriesID == | ||
483 | 1272 | DistroArchSeries.distroseriesID, | ||
484 | 1273 | SourcePackagePublishingHistory.sourcepackagereleaseID == | ||
485 | 1274 | Build.sourcepackagereleaseID, | ||
486 | 1275 | In(SourcePackagePublishingHistory.id, source_publication_ids) | ||
487 | 1276 | ) | ||
488 | 1277 | |||
489 | 1278 | # First, we'll find the builds that were built in the same | ||
490 | 1279 | # archive context as the published sources. | ||
491 | 1280 | builds_in_same_archive = store.find( | ||
492 | 1281 | Build, | ||
493 | 1282 | builds_for_distroseries_expr, | ||
494 | 1268 | SourcePackagePublishingHistory.archiveID == Build.archiveID, | 1283 | SourcePackagePublishingHistory.archiveID == Build.archiveID, |
503 | 1269 | SourcePackagePublishingHistory.distroseriesID == | 1284 | *extra_exprs) |
504 | 1270 | DistroArchSeries.distroseriesID, | 1285 | |
505 | 1271 | SourcePackagePublishingHistory.sourcepackagereleaseID == | 1286 | # Next get all the builds that have a binary published in the |
506 | 1272 | Build.sourcepackagereleaseID, | 1287 | # same archive... even though the build was not built in |
507 | 1273 | In(SourcePackagePublishingHistory.id, source_publication_ids), | 1288 | # the same context archive. |
508 | 1274 | *extra_exprs) | 1289 | builds_copied_into_archive = store.find( |
509 | 1275 | 1290 | Build, | |
510 | 1276 | result_set.order_by( | 1291 | builds_for_distroseries_expr, |
511 | 1292 | SourcePackagePublishingHistory.archiveID != Build.archiveID, | ||
512 | 1293 | BinaryPackagePublishingHistory.archive == | ||
513 | 1294 | SourcePackagePublishingHistory.archiveID, | ||
514 | 1295 | BinaryPackagePublishingHistory.binarypackagerelease == | ||
515 | 1296 | BinaryPackageRelease.id, | ||
516 | 1297 | BinaryPackageRelease.build == Build.id, | ||
517 | 1298 | *extra_exprs) | ||
518 | 1299 | |||
519 | 1300 | builds_union = builds_copied_into_archive.union( | ||
520 | 1301 | builds_in_same_archive).config(distinct=True) | ||
521 | 1302 | |||
522 | 1303 | # Now that we have a result_set of all the builds, we'll use it | ||
523 | 1304 | # as a subquery to get the required publishing and arch to do | ||
524 | 1305 | # the ordering. We do this in this round-about way because we | ||
525 | 1306 | # can't sort on SourcePackagePublishingHistory.id after the | ||
526 | 1307 | # union. See bug 443353 for details. | ||
527 | 1308 | find_spec = ( | ||
528 | 1309 | SourcePackagePublishingHistory, Build, DistroArchSeries) | ||
529 | 1310 | |||
530 | 1311 | # Storm doesn't let us do builds_union.values('id') - | ||
531 | 1312 | # ('Union' object has no attribute 'columns'). So instead | ||
532 | 1313 | # we have to instantiate the objects just to get the id. | ||
533 | 1314 | build_ids = [build.id for build in builds_union] | ||
534 | 1315 | |||
535 | 1316 | result_set = store.find( | ||
536 | 1317 | find_spec, builds_for_distroseries_expr, | ||
537 | 1318 | Build.id.is_in(build_ids)) | ||
538 | 1319 | |||
539 | 1320 | return result_set.order_by( | ||
540 | 1277 | SourcePackagePublishingHistory.id, | 1321 | SourcePackagePublishingHistory.id, |
541 | 1278 | DistroArchSeries.architecturetag) | 1322 | DistroArchSeries.architecturetag) |
542 | 1279 | 1323 | ||
543 | 1280 | return result_set | ||
544 | 1281 | |||
545 | 1282 | def getByIdAndArchive(self, id, archive, source=True): | 1324 | def getByIdAndArchive(self, id, archive, source=True): |
546 | 1283 | """See `IPublishingSet`.""" | 1325 | """See `IPublishingSet`.""" |
547 | 1284 | if source: | 1326 | if source: |
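The comments in the hunk above describe a Storm limitation: the result of `union()` cannot be ordered by `SourcePackagePublishingHistory.id`, so the code materialises the build ids and issues a second query that can be ordered freely. A minimal standalone sketch of that work-around, using `sqlite3` with an invented (non-Launchpad) schema to stand in for Storm stores:

```python
import sqlite3

# Illustrative schema only: builds are either "native" to an archive or
# merely have binaries copied into it; we want both sets, ordered afterwards.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE build (id INTEGER PRIMARY KEY, archive TEXT, arch_tag TEXT);
    INSERT INTO build VALUES (3, 'ppa', 'i386'), (1, 'primary', 'amd64'),
                             (2, 'ppa', 'hppa');
""")

# Step 1: combine the two result sets with UNION. At this point we cannot
# rely on any ordering, so materialise just the ids.
native = "SELECT id FROM build WHERE archive = 'primary'"
copied = "SELECT id FROM build WHERE archive = 'ppa'"
ids = [row[0] for row in conn.execute(native + " UNION " + copied)]

# Step 2: re-query by the materialised ids so ORDER BY works again,
# mirroring Build.id.is_in(build_ids) followed by order_by() above.
placeholders = ", ".join("?" for _ in ids)
rows = conn.execute(
    "SELECT id, arch_tag FROM build WHERE id IN (%s) ORDER BY arch_tag"
    % placeholders, ids).fetchall()
```

The second query pays for instantiating the union's rows once just to read their ids, but regains full ordering control, which is the trade-off the comment references via bug 443353.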
548 | @@ -1317,12 +1359,10 @@ | |||
549 | 1317 | def _getSourceBinaryJoinForSources(self, source_publication_ids, | 1359 | def _getSourceBinaryJoinForSources(self, source_publication_ids, |
550 | 1318 | active_binaries_only=True): | 1360 | active_binaries_only=True): |
551 | 1319 | """Return the join linking sources with binaries.""" | 1361 | """Return the join linking sources with binaries.""" |
553 | 1320 | # Import Build, BinaryPackageRelease and DistroArchSeries locally | 1362 | # Import Build and DistroArchSeries locally |
554 | 1321 | # to avoid circular imports, since Build uses | 1363 | # to avoid circular imports, since Build uses |
555 | 1322 | # SourcePackagePublishingHistory, BinaryPackageRelease uses Build | 1364 | # SourcePackagePublishingHistory, BinaryPackageRelease uses Build |
556 | 1323 | # and DistroArchSeries uses BinaryPackagePublishingHistory. | 1365 | # and DistroArchSeries uses BinaryPackagePublishingHistory. |
557 | 1324 | from lp.soyuz.model.binarypackagerelease import ( | ||
558 | 1325 | BinaryPackageRelease) | ||
559 | 1326 | from lp.soyuz.model.build import Build | 1366 | from lp.soyuz.model.build import Build |
560 | 1327 | from lp.soyuz.model.distroarchseries import ( | 1367 | from lp.soyuz.model.distroarchseries import ( |
561 | 1328 | DistroArchSeries) | 1368 | DistroArchSeries) |
562 | @@ -1397,12 +1437,8 @@ | |||
563 | 1397 | 1437 | ||
564 | 1398 | def getBinaryFilesForSources(self, one_or_more_source_publications): | 1438 | def getBinaryFilesForSources(self, one_or_more_source_publications): |
565 | 1399 | """See `IPublishingSet`.""" | 1439 | """See `IPublishingSet`.""" |
572 | 1400 | # Import Build and BinaryPackageRelease locally to avoid circular | 1440 | # Import Build locally to avoid circular imports, since that |
573 | 1401 | # imports, since that Build already imports | 1441 | # Build already imports SourcePackagePublishingHistory. |
568 | 1402 | # SourcePackagePublishingHistory and BinaryPackageRelease imports | ||
569 | 1403 | # Build. | ||
570 | 1404 | from lp.soyuz.model.binarypackagerelease import ( | ||
571 | 1405 | BinaryPackageRelease) | ||
574 | 1406 | from lp.soyuz.model.build import Build | 1442 | from lp.soyuz.model.build import Build |
575 | 1407 | 1443 | ||
576 | 1408 | source_publication_ids = self._extractIDs( | 1444 | source_publication_ids = self._extractIDs( |
577 | 1409 | 1445 | ||
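The comments retained in the hunks above explain why `Build` and `DistroArchSeries` are still imported inside the function bodies: a top-of-module import would complete an import cycle. A self-contained sketch of that deferred-import pattern (the module names `mod_a`/`mod_b` are invented for illustration):

```python
import os
import sys
import tempfile

# Write two interdependent modules to a temp dir. mod_a imports mod_b at
# module level; mod_b defers its import of mod_a into a function body,
# the same trick publishing.py uses for Build and DistroArchSeries.
modules = {
    "mod_a.py": (
        "import mod_b\n"  # safe: mod_b imports mod_a only inside a function
        "def greet():\n"
        "    return 'a sees ' + mod_b.name()\n"
    ),
    "mod_b.py": (
        "def name():\n"
        "    return 'b'\n"
        "def greet():\n"
        "    import mod_a\n"  # local import breaks the cycle
        "    return 'b sees ' + mod_a.greet()\n"
    ),
}
tmpdir = tempfile.mkdtemp()
for fname, body in modules.items():
    with open(os.path.join(tmpdir, fname), "w") as f:
        f.write(body)
sys.path.insert(0, tmpdir)

import mod_b
print(mod_b.greet())  # -> b sees a sees b
```

By the time `mod_b.greet()` runs, `mod_b` is fully initialised in `sys.modules`, so `mod_a`'s module-level import of it succeeds even though the two modules reference each other.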
578 | === modified file 'lib/lp/soyuz/scripts/tests/test_copypackage.py' | |||
579 | --- lib/lp/soyuz/scripts/tests/test_copypackage.py 2009-11-17 21:38:28 +0000 | |||
580 | +++ lib/lp/soyuz/scripts/tests/test_copypackage.py 2009-12-07 15:21:15 +0000 | |||
581 | @@ -1368,6 +1368,21 @@ | |||
582 | 1368 | target_archive = copy_helper.destination.archive | 1368 | target_archive = copy_helper.destination.archive |
583 | 1369 | self.checkCopies(copied, target_archive, 3) | 1369 | self.checkCopies(copied, target_archive, 3) |
584 | 1370 | 1370 | ||
585 | 1371 | # The second copy will fail explicitly because the new BPPH | ||
586 | 1372 | # records are not yet published. | ||
587 | 1373 | nothing_copied = copy_helper.mainTask() | ||
588 | 1374 | self.assertEqual(len(nothing_copied), 0) | ||
589 | 1375 | self.assertEqual( | ||
590 | 1376 | copy_helper.logger.buffer.getvalue().splitlines()[-1], | ||
591 | 1377 | 'ERROR: foo 666 in hoary (same version has unpublished binaries ' | ||
592 | 1378 | 'in the destination archive for Hoary, please wait for them to ' | ||
593 | 1379 | 'be published before copying)') | ||
594 | 1380 | |||
595 | 1381 | # If we ensure that the copied binaries are published, the | ||
596 | 1382 | # copy won't fail but will simply not copy anything. | ||
597 | 1383 | for bin_pub in copied[1:3]: | ||
598 | 1384 | bin_pub.secure_record.setPublished() | ||
599 | 1385 | |||
600 | 1371 | nothing_copied = copy_helper.mainTask() | 1386 | nothing_copied = copy_helper.mainTask() |
601 | 1372 | self.assertEqual(len(nothing_copied), 0) | 1387 | self.assertEqual(len(nothing_copied), 0) |
602 | 1373 | self.assertEqual( | 1388 | self.assertEqual( |
603 | @@ -1504,7 +1519,7 @@ | |||
604 | 1504 | name='boing') | 1519 | name='boing') |
605 | 1505 | self.assertEqual(copied_source.displayname, 'boing 1.0 in hoary') | 1520 | self.assertEqual(copied_source.displayname, 'boing 1.0 in hoary') |
606 | 1506 | self.assertEqual(len(copied_source.getPublishedBinaries()), 2) | 1521 | self.assertEqual(len(copied_source.getPublishedBinaries()), 2) |
608 | 1507 | self.assertEqual(len(copied_source.getBuilds()), 0) | 1522 | self.assertEqual(len(copied_source.getBuilds()), 1) |
609 | 1508 | 1523 | ||
610 | 1509 | def _setupArchitectureGrowingScenario(self, architecturehintlist="all"): | 1524 | def _setupArchitectureGrowingScenario(self, architecturehintlist="all"): |
611 | 1510 | """Prepare distroseries with different sets of architectures. | 1525 | """Prepare distroseries with different sets of architectures. |
612 | 1511 | 1526 | ||
613 | === modified file 'lib/lp/soyuz/stories/ppa/xx-copy-packages.txt' | |||
614 | --- lib/lp/soyuz/stories/ppa/xx-copy-packages.txt 2009-10-13 10:05:58 +0000 | |||
615 | +++ lib/lp/soyuz/stories/ppa/xx-copy-packages.txt 2009-12-07 15:21:15 +0000 | |||
616 | @@ -1062,7 +1062,7 @@ | |||
617 | 1062 | >>> print_ppa_packages(jblack_browser.contents) | 1062 | >>> print_ppa_packages(jblack_browser.contents) |
618 | 1063 | Source Published Status Series Section Build | 1063 | Source Published Status Series Section Build |
619 | 1064 | Status | 1064 | Status |
621 | 1065 | foo - 2.0 (changesfile) Pending Hoary Base | 1065 | foo - 2.0 (changesfile) Pending Hoary Base i386 |
622 | 1066 | foo - 1.1 (changesfile) Pending Warty Base | 1066 | foo - 1.1 (changesfile) Pending Warty Base |
623 | 1067 | pmount - 0.1-1 Pending Hoary Editors | 1067 | pmount - 0.1-1 Pending Hoary Editors |
624 | 1068 | pmount - 0.1-1 Pending Warty Editors | 1068 | pmount - 0.1-1 Pending Warty Editors |
625 | 1069 | 1069 | ||
626 | === modified file 'lib/lp/soyuz/stories/ppa/xx-ppa-packages.txt' | |||
627 | --- lib/lp/soyuz/stories/ppa/xx-ppa-packages.txt 2009-11-05 10:51:36 +0000 | |||
628 | +++ lib/lp/soyuz/stories/ppa/xx-ppa-packages.txt 2009-12-07 15:21:15 +0000 | |||
629 | @@ -129,29 +129,51 @@ | |||
630 | 129 | If the binaries for a package are fully built, but have not yet been | 129 | If the binaries for a package are fully built, but have not yet been |
631 | 130 | published, this will be indicated to the viewer: | 130 | published, this will be indicated to the viewer: |
632 | 131 | 131 | ||
636 | 132 | >>> anon_browser.open( | 132 | # First, we'll update the binary publishing history for the i386 |
637 | 133 | ... "http://launchpad.dev/~cprov/+archive/ppa/+packages") | 133 | # record so that it is pending publication. |
638 | 134 | >>> expander_url = anon_browser.getLink(id='pub28-expander').url | 134 | >>> login('foo.bar@canonical.com') |
639 | 135 | >>> from zope.component import getUtility | ||
640 | 136 | >>> from lp.registry.interfaces.person import IPersonSet | ||
641 | 137 | >>> cprov_ppa = getUtility(IPersonSet).getByName('cprov').archive | ||
642 | 138 | >>> pmount_i386_pub = cprov_ppa.getAllPublishedBinaries( | ||
643 | 139 | ... name='pmount', version='0.1-1')[1] | ||
644 | 140 | >>> print pmount_i386_pub.displayname | ||
645 | 141 | pmount 0.1-1 in warty i386 | ||
646 | 142 | >>> from lp.soyuz.interfaces.publishing import PackagePublishingStatus | ||
647 | 143 | >>> pmount_i386_pub.secure_record.status = PackagePublishingStatus.PENDING | ||
648 | 144 | >>> pmount_i386_pub.secure_record.datepublished = None | ||
649 | 145 | >>> transaction.commit() | ||
650 | 146 | >>> logout() | ||
651 | 147 | |||
652 | 148 | # Now, to re-display the pmount expanded section: | ||
653 | 135 | >>> anon_browser.open(expander_url) | 149 | >>> anon_browser.open(expander_url) |
654 | 136 | >>> print extract_text(anon_browser.contents) | 150 | >>> print extract_text(anon_browser.contents) |
655 | 137 | Note: Some binary packages for this source are not yet published in the | 151 | Note: Some binary packages for this source are not yet published in the |
656 | 138 | repository. | 152 | repository. |
657 | 139 | Publishing details | 153 | Publishing details |
658 | 140 | Published on 2007-07-09 | 154 | Published on 2007-07-09 |
660 | 141 | Copied from ubuntu warty in PPA for Mark Shuttleworth | 155 | Copied from ubuntu hoary in Primary Archive for Ubuntu Linux |
661 | 142 | Changelog | 156 | Changelog |
662 | 157 | pmount (0.1-1) hoary; urgency=low | ||
663 | 158 | * Fix description (Malone #1) | ||
664 | 159 | * Fix debian (Debian #2000) | ||
665 | 160 | * Fix warty (Warty Ubuntu #1) | ||
666 | 161 | -- Sample Person... | ||
667 | 143 | Builds | 162 | Builds |
668 | 144 | i386 - Pending publication | 163 | i386 - Pending publication |
669 | 145 | Built packages | 164 | Built packages |
671 | 146 | mozilla-firefox ff from iceweasel | 165 | pmount |
672 | 166 | pmount shortdesc | ||
673 | 147 | Package files | 167 | Package files |
677 | 148 | firefox_0.9.2.orig.tar.gz (9.5 MiB) | 168 | No files published for this package. |
675 | 149 | iceweasel-1.0.dsc (123 bytes) | ||
676 | 150 | mozilla-firefox_0.9_i386.deb (3 bytes) | ||
678 | 151 | 169 | ||
680 | 152 | The package was copied from a PPA. The archive title will hence link | 170 | When the package is copied from a PPA, the archive title will link |
681 | 153 | back to the source PPA. | 171 | back to the source PPA. |
682 | 154 | 172 | ||
683 | 173 | >>> anon_browser.open( | ||
684 | 174 | ... "http://launchpad.dev/~cprov/+archive/ppa/+packages") | ||
685 | 175 | >>> expander_url = anon_browser.getLink(id='pub28-expander').url | ||
686 | 176 | >>> anon_browser.open(expander_url) | ||
687 | 155 | >>> anon_browser.getLink("PPA for Mark Shuttleworth").url | 177 | >>> anon_browser.getLink("PPA for Mark Shuttleworth").url |
688 | 156 | 'http://launchpad.dev/~mark/+archive/ppa' | 178 | 'http://launchpad.dev/~mark/+archive/ppa' |
689 | 157 | 179 | ||
690 | @@ -164,7 +186,7 @@ | |||
691 | 164 | >>> admin_browser.getControl(name="field.buildd_secret").value = "secret" | 186 | >>> admin_browser.getControl(name="field.buildd_secret").value = "secret" |
692 | 165 | >>> admin_browser.getControl("Save").click() | 187 | >>> admin_browser.getControl("Save").click() |
693 | 166 | 188 | ||
695 | 167 | >>> anon_browser.open("http://launchpad.dev/~cprov/+archive/ppa") | 189 | >>> anon_browser.open(expander_url) |
696 | 168 | >>> anon_browser.getLink("PPA for Mark Shuttleworth") | 190 | >>> anon_browser.getLink("PPA for Mark Shuttleworth") |
697 | 169 | Traceback (most recent call last): | 191 | Traceback (most recent call last): |
698 | 170 | ... | 192 | ... |
699 | 171 | 193 | ||
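The xx-ppa-packages.txt hunk above exercises archive privacy: after an admin sets `buildd_secret` (making Mark's PPA private), the anonymous browser no longer finds the "PPA for Mark Shuttleworth" link on the publishing-details page. The access rule being tested can be sketched as a plain predicate; this is a hypothetical model, not Launchpad's actual security machinery:

```python
# Sketch of the behaviour tested above: once the source archive is made
# private, anonymous viewers no longer get a link back to it from the
# "Copied from ..." line. The dict-based archive model is hypothetical.

def visible_archive_link(archive, viewer=None):
    """Return the archive's URL, or None if the viewer may not see it."""
    if archive["private"] and viewer is None:
        return None  # anonymous users cannot see private archives
    return archive["url"]
```

Under this sketch, the same expander URL yields a link for a logged-in subscriber but a `LinkNotFoundError` for the anonymous browser, which is exactly what the doctest asserts.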
700 | === modified file 'lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt' | |||
701 | --- lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt 2009-11-18 23:56:26 +0000 | |||
702 | +++ lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt 2009-12-07 15:21:15 +0000 | |||
703 | @@ -207,18 +207,20 @@ | |||
704 | 207 | ====================== | 207 | ====================== |
705 | 208 | 208 | ||
706 | 209 | The source publication object has a custom operation called 'getBuilds' and | 209 | The source publication object has a custom operation called 'getBuilds' and |
708 | 210 | it returns the build records in the context of that publication. | 210 | it returns the build records for builds that were built in the same context |
709 | 211 | archive as the publication, or builds from other archives but where the | ||
710 | 212 | binaries have been copied and published in the same context archive. | ||
711 | 211 | 213 | ||
712 | 212 | >>> pubs = webservice.named_get( | 214 | >>> pubs = webservice.named_get( |
713 | 213 | ... cprov_archive['self_link'], 'getPublishedSources', | 215 | ... cprov_archive['self_link'], 'getPublishedSources', |
715 | 214 | ... source_name="iceweasel", version="1.0", | 216 | ... source_name="pmount", version="0.1-1", |
716 | 215 | ... exact_match=True).jsonBody() | 217 | ... exact_match=True).jsonBody() |
717 | 216 | >>> source_pub = pubs['entries'][0] | 218 | >>> source_pub = pubs['entries'][0] |
718 | 217 | >>> builds = webservice.named_get( | 219 | >>> builds = webservice.named_get( |
719 | 218 | ... source_pub['self_link'], 'getBuilds').jsonBody() | 220 | ... source_pub['self_link'], 'getBuilds').jsonBody() |
720 | 219 | >>> for entry in sorted(builds['entries']): | 221 | >>> for entry in sorted(builds['entries']): |
721 | 220 | ... print entry['title'] | 222 | ... print entry['title'] |
723 | 221 | i386 build of iceweasel 1.0 in ubuntu warty RELEASE | 223 | i386 build of pmount 0.1-1 in ubuntu warty RELEASE |
724 | 222 | 224 | ||
725 | 223 | 225 | ||
726 | 224 | Finding related Binary publications | 226 | Finding related Binary publications |
727 | 225 | 227 | ||
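The updated `getBuilds` description above states the rule this branch implements: a source publication returns builds done in its own context archive, plus builds from other archives whose binaries have been copied and published into that archive. That filter can be sketched with a simplified data model (not Launchpad's real schema):

```python
# Simplified sketch of the getBuilds rule: a build is returned for a
# source publication if it ran in the publication's context archive, or
# if it ran elsewhere but at least one of its binaries has been copied
# and published into that archive. Hypothetical data model.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Build:
    title: str
    archive: str  # archive the build ran in
    binary_archives: List[str] = field(default_factory=list)  # where its binaries are published

def get_builds(builds: List[Build], context_archive: str) -> List[Build]:
    """Return builds visible from a publication in context_archive."""
    return [
        b for b in builds
        if b.archive == context_archive
        or context_archive in b.binary_archives
    ]
```

This is why the doctest now finds the pmount build for cprov's PPA publication: the build ran in the Ubuntu primary archive, but its binaries were copied into the PPA.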
728 | === modified file 'lib/lp/soyuz/templates/archive-edit-dependencies.pt' | |||
729 | --- lib/lp/soyuz/templates/archive-edit-dependencies.pt 2009-11-12 17:26:17 +0000 | |||
730 | +++ lib/lp/soyuz/templates/archive-edit-dependencies.pt 2009-12-07 15:21:15 +0000 | |||
731 | @@ -62,7 +62,7 @@ | |||
732 | 62 | </div> <!-- launchpad_form --> | 62 | </div> <!-- launchpad_form --> |
733 | 63 | 63 | ||
734 | 64 | <script type="text/javascript"> | 64 | <script type="text/javascript"> |
736 | 65 | YUI().use("node", function(Y) { | 65 | LPS.use("node", function(Y) { |
737 | 66 | 66 | ||
738 | 67 | // Highlight (setting bold font-weight) the label for the | 67 | // Highlight (setting bold font-weight) the label for the |
739 | 68 | // selected option in a given NodesList. Assumes the input is | 68 | // selected option in a given NodesList. Assumes the input is |
740 | 69 | 69 | ||
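The template hunks in this branch repeatedly replace `YUI().use(...)` with `LPS.use(...)`: `YUI()` constructs a fresh sandbox per call, while `LPS` is a single preconfigured loader instance shared by every inline script on the page. The shape of that change can be sketched in Python (names here are hypothetical; the real `LPS` object is provided by Launchpad's JavaScript setup):

```python
# Python sketch of the JavaScript migration above: instead of each inline
# script building its own sandbox, one shared loader resolves modules from
# a single page-wide registry and hands the sandbox to a callback, as
# LPS.use('node', ..., function(Y) {...}) does.

class SharedLoader:
    def __init__(self, registry):
        self._registry = registry  # one module registry for the whole page

    def use(self, *args):
        *modules, callback = args  # last argument is the callback
        # Resolve each requested module from the shared registry and hand
        # the resulting sandbox to the page script.
        sandbox = {name: self._registry[name] for name in modules}
        callback(sandbox)
```

Each template's callback keeps its original body; only the entry point changes.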
741 | === modified file 'lib/lp/soyuz/templates/archive-macros.pt' | |||
742 | --- lib/lp/soyuz/templates/archive-macros.pt 2009-11-04 19:59:16 +0000 | |||
743 | +++ lib/lp/soyuz/templates/archive-macros.pt 2009-12-07 15:21:15 +0000 | |||
744 | @@ -10,7 +10,7 @@ | |||
745 | 10 | </tal:comment> | 10 | </tal:comment> |
746 | 11 | 11 | ||
747 | 12 | <script type="text/javascript"> | 12 | <script type="text/javascript"> |
749 | 13 | YUI().use('node', 'io-base', 'lazr.anim', 'soyuz-base', function(Y) { | 13 | LPS.use('node', 'io-base', 'lazr.anim', 'soyuz-base', function(Y) { |
750 | 14 | 14 | ||
751 | 15 | 15 | ||
752 | 16 | /* | 16 | /* |
753 | 17 | 17 | ||
754 | === modified file 'lib/lp/soyuz/templates/archive-packages.pt' | |||
755 | --- lib/lp/soyuz/templates/archive-packages.pt 2009-11-04 13:56:17 +0000 | |||
756 | +++ lib/lp/soyuz/templates/archive-packages.pt 2009-12-07 15:21:15 +0000 | |||
757 | @@ -23,7 +23,7 @@ | |||
758 | 23 | </tal:devmode> | 23 | </tal:devmode> |
759 | 24 | <script type="text/javascript" id="repository-size-update" | 24 | <script type="text/javascript" id="repository-size-update" |
760 | 25 | tal:condition="view/archive_url"> | 25 | tal:condition="view/archive_url"> |
762 | 26 | YUI().use('io-base', 'lazr.anim', 'node', 'soyuz-base', | 26 | LPS.use('io-base', 'lazr.anim', 'node', 'soyuz-base', |
763 | 27 | 'soyuz.update_archive_build_statuses', function(Y) { | 27 | 'soyuz.update_archive_build_statuses', function(Y) { |
764 | 28 | 28 | ||
765 | 29 | 29 | ||
766 | 30 | 30 | ||
767 | === modified file 'lib/lp/soyuz/templates/archive-subscribers.pt' | |||
768 | --- lib/lp/soyuz/templates/archive-subscribers.pt 2009-09-29 07:21:40 +0000 | |||
769 | +++ lib/lp/soyuz/templates/archive-subscribers.pt 2009-12-07 15:21:15 +0000 | |||
770 | @@ -98,7 +98,7 @@ | |||
771 | 98 | </form> | 98 | </form> |
772 | 99 | </div><!-- class="portlet" --> | 99 | </div><!-- class="portlet" --> |
773 | 100 | <script type="text/javascript" id="setup-archivesubscribers-index"> | 100 | <script type="text/javascript" id="setup-archivesubscribers-index"> |
775 | 101 | YUI().use('soyuz.archivesubscribers_index', function(Y) { | 101 | LPS.use('soyuz.archivesubscribers_index', function(Y) { |
776 | 102 | Y.soyuz.setup_archivesubscribers_index(); | 102 | Y.soyuz.setup_archivesubscribers_index(); |
777 | 103 | }); | 103 | }); |
778 | 104 | </script> | 104 | </script> |
779 | 105 | 105 | ||
780 | === modified file 'lib/lp/translations/browser/language.py' | |||
781 | --- lib/lp/translations/browser/language.py 2009-10-31 11:06:44 +0000 | |||
782 | +++ lib/lp/translations/browser/language.py 2009-12-07 15:21:15 +0000 | |||
783 | @@ -29,6 +29,7 @@ | |||
784 | 29 | enabled_with_permission, GetitemNavigation, LaunchpadEditFormView, | 29 | enabled_with_permission, GetitemNavigation, LaunchpadEditFormView, |
785 | 30 | LaunchpadFormView, LaunchpadView, Link, NavigationMenu) | 30 | LaunchpadFormView, LaunchpadView, Link, NavigationMenu) |
786 | 31 | from lp.translations.utilities.pluralforms import make_friendly_plural_forms | 31 | from lp.translations.utilities.pluralforms import make_friendly_plural_forms |
787 | 32 | from canonical.launchpad.interfaces.launchpad import ILaunchpadCelebrities | ||
788 | 32 | 33 | ||
789 | 33 | from canonical.widgets import LabeledMultiCheckBoxWidget | 34 | from canonical.widgets import LabeledMultiCheckBoxWidget |
790 | 34 | 35 | ||
791 | @@ -202,6 +203,13 @@ | |||
792 | 202 | 203 | ||
793 | 203 | return pluralforms_list | 204 | return pluralforms_list |
794 | 204 | 205 | ||
795 | 206 | @property | ||
796 | 207 | def add_question_url(self): | ||
797 | 208 | rosetta = getUtility(ILaunchpadCelebrities).lp_translations | ||
798 | 209 | return canonical_url( | ||
799 | 210 | rosetta, | ||
800 | 211 | view_name='+addquestion', | ||
801 | 212 | rootsite='answers') | ||
802 | 205 | 213 | ||
803 | 206 | class LanguageAdminView(LaunchpadEditFormView): | 214 | class LanguageAdminView(LaunchpadEditFormView): |
804 | 207 | """Handle an admin form submission.""" | 215 | """Handle an admin form submission.""" |
805 | 208 | 216 | ||
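The new `add_question_url` property above asks `canonical_url` for the Launchpad Translations celebrity's `+addquestion` view on the `answers` rootsite. A minimal stand-in for the URL assembly (a hypothetical helper, not Launchpad's `canonical_url` implementation, which resolves targets via the component architecture) looks like:

```python
# Hypothetical sketch of how a canonical_url-style helper assembles the
# '+addquestion' link that add_question_url returns; the real helper
# traverses the object graph rather than taking a target name string.

def canonical_url_sketch(target_name, view_name=None, rootsite=None,
                         base_domain="launchpad.dev"):
    host = f"{rootsite}.{base_domain}" if rootsite else base_domain
    url = f"http://{host}/{target_name}"
    if view_name:
        url += f"/{view_name}"
    return url
```

With `rootsite='answers'` this produces the URL the xx-language.txt doctest later asserts for both `plural_question` and `country_question` links.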
806 | === modified file 'lib/lp/translations/stories/distroseries/xx-distroseries-templates.txt' | |||
807 | --- lib/lp/translations/stories/distroseries/xx-distroseries-templates.txt 2009-10-30 10:09:17 +0000 | |||
808 | +++ lib/lp/translations/stories/distroseries/xx-distroseries-templates.txt 2009-12-07 15:21:15 +0000 | |||
809 | @@ -1,11 +1,15 @@ | |||
811 | 1 | = Templates view for DistroSeries = | 1 | |
812 | 2 | |||
813 | 3 | Templates view for DistroSeries | ||
814 | 4 | =============================== | ||
815 | 2 | 5 | ||
816 | 3 | The +templates view for DistroSeries gives an overview of the translation | 6 | The +templates view for DistroSeries gives an overview of the translation |
817 | 4 | templates in this series and provides easy access to the various subpages of | 7 | templates in this series and provides easy access to the various subpages of |
818 | 5 | each template. | 8 | each template. |
819 | 6 | 9 | ||
820 | 7 | 10 | ||
822 | 8 | == Getting there == | 11 | Getting there |
823 | 12 | ------------- | ||
824 | 9 | 13 | ||
825 | 10 | To get to the listing of all templates, one needs to use the link | 14 | To get to the listing of all templates, one needs to use the link |
826 | 11 | from the distribution series translations page. | 15 | from the distribution series translations page. |
827 | @@ -16,7 +20,45 @@ | |||
828 | 16 | >>> print user_browser.url | 20 | >>> print user_browser.url |
829 | 17 | http://translations.launchpad.dev/ubuntu/hoary/+templates | 21 | http://translations.launchpad.dev/ubuntu/hoary/+templates |
830 | 18 | 22 | ||
832 | 19 | == The templates table == | 23 | The templates table |
833 | 24 | ------------------- | ||
834 | 25 | |||
835 | 26 | Full template listing for a distribution series is reached by following | ||
836 | 27 | a link from the distribution series translations page. | ||
837 | 28 | |||
838 | 29 | >>> anon_browser.open( | ||
839 | 30 | ... 'http://translations.launchpad.dev/ubuntu/hoary') | ||
840 | 31 | >>> anon_browser.getLink('full list of templates').click() | ||
841 | 32 | |||
842 | 33 | Full listing of templates shows source package name, template name and | ||
843 | 34 | the date of last update for this distribution series. | ||
844 | 35 | |||
845 | 36 | >>> table = find_tag_by_id(anon_browser.contents, 'templates_table') | ||
846 | 37 | >>> print extract_text(table) | ||
847 | 38 | Source package Template name Last update | ||
848 | 39 | evolution disabled-template 2007-01-05 | ||
849 | 40 | evolution evolution-2.2 2005-05-06 | ||
850 | 41 | evolution man 2006-08-14 | ||
851 | 42 | mozilla pkgconf-mozilla 2005-05-06 | ||
852 | 43 | pmount man 2006-08-14 | ||
853 | 44 | pmount pmount 2005-05-06 | ||
854 | 45 | |||
855 | 46 | |||
856 | 47 | Logged-in users will see a link from distro series | ||
857 | 48 | >>> user_browser.open( | ||
858 | 49 | ... 'http://translations.launchpad.dev/ubuntu/hoary') | ||
859 | 50 | >>> user_browser.getLink('full list of templates').click() | ||
860 | 51 | |||
861 | 52 | Logged-in users can also choose to download all translations for each | ||
862 | 53 | of the templates. | ||
863 | 54 | |||
864 | 55 | >>> table = find_tag_by_id(user_browser.contents, 'templates_table') | ||
865 | 56 | >>> print extract_text(table) | ||
866 | 57 | Source package Template name Last update Actions | ||
867 | 58 | evolution disabled-template 2007-01-05 Download | ||
868 | 59 | ... | ||
869 | 60 | mozilla pkgconf-mozilla 2005-05-06 Download | ||
870 | 61 | ... | ||
871 | 20 | 62 | ||
872 | 21 | Administrator can see all editing options. | 63 | Administrator can see all editing options. |
873 | 22 | 64 | ||
874 | @@ -28,16 +70,17 @@ | |||
875 | 28 | 70 | ||
876 | 29 | >>> table = find_tag_by_id(admin_browser.contents, 'templates_table') | 71 | >>> table = find_tag_by_id(admin_browser.contents, 'templates_table') |
877 | 30 | >>> print extract_text(table) | 72 | >>> print extract_text(table) |
888 | 31 | Source package Template name Actions | 73 | Source package Template name Last update Actions |
889 | 32 | evolution disabled-template Edit Upload Download Administer | 74 | evolution disabled-template 2007-01-05 Edit Upload Download Administer |
890 | 33 | evolution evolution-2.2 Edit Upload Download Administer | 75 | evolution evolution-2.2 2005-05-06 Edit Upload Download Administer |
891 | 34 | evolution man Edit Upload Download Administer | 76 | evolution man 2006-08-14 Edit Upload Download Administer |
892 | 35 | mozilla pkgconf-mozilla Edit Upload Download Administer | 77 | mozilla pkgconf-mozilla 2005-05-06 Edit Upload Download Administer |
893 | 36 | pmount man Edit Upload Download Administer | 78 | pmount man 2006-08-14 Edit Upload Download Administer |
894 | 37 | pmount pmount Edit Upload Download Administer | 79 | pmount pmount 2005-05-06 Edit Upload Download Administer |
895 | 38 | 80 | ||
896 | 39 | 81 | ||
897 | 40 | == Links to the templates == | 82 | Links to the templates |
898 | 83 | ---------------------- | ||
899 | 41 | 84 | ||
900 | 42 | Clicking on a template name will take the user to that template's overview | 85 | Clicking on a template name will take the user to that template's overview |
901 | 43 | page. | 86 | page. |
902 | 44 | 87 | ||
903 | === modified file 'lib/lp/translations/stories/productseries/xx-productseries-templates.txt' | |||
904 | --- lib/lp/translations/stories/productseries/xx-productseries-templates.txt 2009-10-30 10:09:17 +0000 | |||
905 | +++ lib/lp/translations/stories/productseries/xx-productseries-templates.txt 2009-12-07 15:21:15 +0000 | |||
906 | @@ -1,13 +1,18 @@ | |||
908 | 1 | = Templates view for ProductSeries = | 1 | |
909 | 2 | |||
910 | 3 | Templates view for ProductSeries | ||
911 | 4 | ================================ | ||
912 | 2 | 5 | ||
913 | 3 | The +templates view for ProductSeries gives an overview of the translation | 6 | The +templates view for ProductSeries gives an overview of the translation |
914 | 4 | templates in this series and provides easy access to the various subpages of | 7 | templates in this series and provides easy access to the various subpages of |
915 | 5 | each template. | 8 | each template. |
916 | 6 | 9 | ||
921 | 7 | == Preparation == | 10 | |
922 | 8 | 11 | Preparation | |
923 | 9 | To test the ordering of templates in the listing, we need another template | 12 | ----------- |
924 | 10 | that is new but must appear at the top of the list. | 13 | |
925 | 14 | To test the ordering of templates in the listing, we need another | ||
926 | 15 | template that is new but must appear at the top of the list. | ||
927 | 11 | 16 | ||
928 | 12 | >>> login('foo.bar@canonical.com') | 17 | >>> login('foo.bar@canonical.com') |
929 | 13 | >>> from zope.component import getUtility | 18 | >>> from zope.component import getUtility |
930 | @@ -18,7 +23,9 @@ | |||
931 | 18 | ... name='at-the-top') | 23 | ... name='at-the-top') |
932 | 19 | >>> logout() | 24 | >>> logout() |
933 | 20 | 25 | ||
935 | 21 | == Getting there == | 26 | |
936 | 27 | Getting there | ||
937 | 28 | ------------- | ||
938 | 22 | 29 | ||
939 | 23 | To get to the listing of all templates, one needs to use the link | 30 | To get to the listing of all templates, one needs to use the link |
940 | 24 | from the product series translations page. | 31 | from the product series translations page. |
941 | @@ -30,16 +37,17 @@ | |||
942 | 30 | http://translations.launchpad.dev/evolution/trunk/+templates | 37 | http://translations.launchpad.dev/evolution/trunk/+templates |
943 | 31 | 38 | ||
944 | 32 | 39 | ||
946 | 33 | == The templates table == | 40 | The templates table |
947 | 41 | ------------------- | ||
948 | 34 | 42 | ||
949 | 35 | The page shows a table of all templates and links to their subpages. | 43 | The page shows a table of all templates and links to their subpages. |
950 | 36 | 44 | ||
951 | 37 | >>> table = find_tag_by_id(user_browser.contents, 'templates_table') | 45 | >>> table = find_tag_by_id(user_browser.contents, 'templates_table') |
952 | 38 | >>> print extract_text(table) | 46 | >>> print extract_text(table) |
957 | 39 | Template name Actions | 47 | Template name Last update Actions |
958 | 40 | at-the-top Download | 48 | at-the-top ... Download |
959 | 41 | evolution-2.2 Download | 49 | evolution-2.2 2005-08-25 Download |
960 | 42 | evolution-2.2-test Download | 50 | evolution-2.2-test 2006-12-13 Download |
961 | 43 | 51 | ||
962 | 44 | If an administrator views this page, links to the templates admin page are | 52 | If an administrator views this page, links to the templates admin page are |
963 | 45 | shown, too. | 53 | shown, too. |
964 | @@ -48,13 +56,14 @@ | |||
965 | 48 | ... 'http://translations.launchpad.dev/evolution/trunk/+templates') | 56 | ... 'http://translations.launchpad.dev/evolution/trunk/+templates') |
966 | 49 | >>> table = find_tag_by_id(admin_browser.contents, 'templates_table') | 57 | >>> table = find_tag_by_id(admin_browser.contents, 'templates_table') |
967 | 50 | >>> print extract_text(table) | 58 | >>> print extract_text(table) |
975 | 51 | Template name Actions | 59 | Template name Last update Actions |
976 | 52 | at-the-top Edit Upload Download Administer | 60 | at-the-top ... Edit Upload Download Administer |
977 | 53 | evolution-2.2 Edit Upload Download Administer | 61 | evolution-2.2 2005-08-25 Edit Upload Download Administer |
978 | 54 | evolution-2.2-test Edit Upload Download Administer | 62 | evolution-2.2-test 2006-12-13 Edit Upload Download Administer |
979 | 55 | 63 | ||
980 | 56 | 64 | ||
981 | 57 | == Links to the templates == | 65 | Links to the templates |
982 | 66 | ---------------------- | ||
983 | 58 | 67 | ||
984 | 59 | Clicking on a template name will take the user to that template's overview | 68 | Clicking on a template name will take the user to that template's overview |
985 | 60 | page. | 69 | page. |
986 | 61 | 70 | ||
987 | === modified file 'lib/lp/translations/stories/standalone/xx-language.txt' | |||
988 | --- lib/lp/translations/stories/standalone/xx-language.txt 2009-10-31 11:06:44 +0000 | |||
989 | +++ lib/lp/translations/stories/standalone/xx-language.txt 2009-12-07 15:21:15 +0000 | |||
990 | @@ -1,6 +1,15 @@ | |||
991 | 1 | |||
992 | 2 | |||
993 | 3 | Languages view | ||
994 | 4 | ============== | ||
995 | 5 | |||
996 | 1 | Here is the tale of languages. We will see how to create, find and edit | 6 | Here is the tale of languages. We will see how to create, find and edit |
997 | 2 | them. | 7 | them. |
998 | 3 | 8 | ||
999 | 9 | |||
1000 | 10 | Getting there | ||
1001 | 11 | ------------- | ||
1002 | 12 | |||
1003 | 4 | Launchpad Translations has a main page. | 13 | Launchpad Translations has a main page. |
1004 | 5 | 14 | ||
1005 | 6 | >>> admin_browser.open('http://translations.launchpad.dev/') | 15 | >>> admin_browser.open('http://translations.launchpad.dev/') |
1006 | @@ -11,7 +20,12 @@ | |||
1007 | 11 | >>> print admin_browser.url | 20 | >>> print admin_browser.url |
1008 | 12 | http://translations.launchpad.dev/+languages | 21 | http://translations.launchpad.dev/+languages |
1009 | 13 | 22 | ||
1011 | 14 | Following the link, there is a form to add new languages. | 23 | |
1012 | 24 | Adding new languages | ||
1013 | 25 | -------------------- | ||
1014 | 26 | |||
1015 | 27 | Following the link from the translations main page, there is a form to | ||
1016 | 28 | add new languages. | ||
1017 | 15 | 29 | ||
1018 | 16 | >>> admin_browser.getLink('Add new language').click() | 30 | >>> admin_browser.getLink('Add new language').click() |
1019 | 17 | >>> print admin_browser.url | 31 | >>> print admin_browser.url |
1020 | @@ -65,11 +79,16 @@ | |||
1021 | 65 | ... | 79 | ... |
1022 | 66 | LinkNotFoundError | 80 | LinkNotFoundError |
1023 | 67 | 81 | ||
1025 | 68 | >>> user_browser.open('http://translations.launchpad.dev/+languages/+add') | 82 | >>> user_browser.open( |
1026 | 83 | ... 'http://translations.launchpad.dev/+languages/+add') | ||
1027 | 69 | Traceback (most recent call last): | 84 | Traceback (most recent call last): |
1028 | 70 | ... | 85 | ... |
1029 | 71 | Unauthorized:... | 86 | Unauthorized:... |
1030 | 72 | 87 | ||
1031 | 88 | |||
1032 | 89 | Searching for a language | ||
1033 | 90 | ------------------------ | ||
1034 | 91 | |||
1035 | 73 | From the top languages page, anyone can find languages. | 92 | From the top languages page, anyone can find languages. |
1036 | 74 | 93 | ||
1037 | 75 | >>> browser.open('http://translations.launchpad.dev/+languages') | 94 | >>> browser.open('http://translations.launchpad.dev/+languages') |
1038 | @@ -82,7 +101,11 @@ | |||
1039 | 82 | >>> print browser.url | 101 | >>> print browser.url |
1040 | 83 | http://translations.launchpad.dev/+languages/+index?find=Spanish | 102 | http://translations.launchpad.dev/+languages/+index?find=Spanish |
1041 | 84 | 103 | ||
1043 | 85 | And following one of the found languages, we can see a brief information | 104 | |
1044 | 105 | Read language information | ||
1045 | 106 | ------------------------- | ||
1046 | 107 | |||
1047 | 108 | Following one of the found languages, we can see a brief information | ||
1048 | 86 | about the selected language. | 109 | about the selected language. |
1049 | 87 | 110 | ||
1050 | 88 | >>> browser.getLink('Spanish').click() | 111 | >>> browser.getLink('Spanish').click() |
1051 | @@ -128,14 +151,50 @@ | |||
1052 | 128 | ...Uruguay... | 151 | ...Uruguay... |
1053 | 129 | ...Venezuela... | 152 | ...Venezuela... |
1054 | 130 | 153 | ||
1056 | 131 | >>> topcontributors_portlet = find_portlet(browser.contents, 'Top contributors') | 154 | >>> topcontributors_portlet = find_portlet( |
1057 | 155 | ... browser.contents, 'Top contributors') | ||
1058 | 132 | >>> print topcontributors_portlet | 156 | >>> print topcontributors_portlet |
1059 | 133 | <... | 157 | <... |
1060 | 134 | ...Carlos Perelló Marín... | 158 | ...Carlos Perelló Marín... |
1061 | 135 | 159 | ||
1062 | 160 | Our test sample data does not know about plural forms of | ||
1063 | 161 | Abkhazian and about countries where this language is spoken. | ||
1064 | 162 | |||
1065 | 163 | We will see a note about missing plural forms and a link to Rosetta | ||
1066 | 164 | add question page for informing Rosetta admin about the right plural | ||
1067 | 165 | form. | ||
1068 | 166 | |||
1069 | 167 | >>> browser.open('http://translations.launchpad.dev/+languages/ab') | ||
1070 | 168 | >>> print extract_text(find_portlet(browser.contents, 'Plural forms' | ||
1071 | 169 | ... ).renderContents()) | ||
1072 | 170 | Plural forms | ||
1073 | 171 | Unfortunately, Launchpad doesn't know the plural | ||
1074 | 172 | form information for this language... | ||
1075 | 173 | |||
1076 | 174 | >>> print browser.getLink(id='plural_question').url | ||
1077 | 175 | http://answers.launchpad.dev/rosetta/+addquestion | ||
1078 | 176 | |||
1079 | 177 | We will see a note that Launchpad does not know in which countries | ||
1080 | 178 | this language is spoken and a link to add question page for informing | ||
1081 | 179 | Rosetta admin about the countries where this page is officially spoken. | ||
1082 | 180 | |||
1083 | 181 | >>> countries_portlet = find_portlet(browser.contents, 'Countries') | ||
1084 | 182 | >>> print countries_portlet | ||
1085 | 183 | <... | ||
1086 | 184 | Abkhazian is not registered as being spoken in any | ||
1087 | 185 | country... | ||
1088 | 186 | |||
1089 | 187 | >>> print browser.getLink(id='country_question').url | ||
1090 | 188 | http://answers.launchpad.dev/rosetta/+addquestion | ||
1091 | 189 | |||
1092 | 190 | |||
1093 | 191 | Edit language information | ||
1094 | 192 | ------------------------- | ||
1095 | 193 | |||
1096 | 136 | Finally, there is the edit form to change language basic information. | 194 | Finally, there is the edit form to change language basic information. |
1097 | 137 | 195 | ||
1099 | 138 | >>> user_browser.open('http://translations.launchpad.dev/+languages/es') | 196 | >>> user_browser.open( |
1100 | 197 | ... 'http://translations.launchpad.dev/+languages/es') | ||
1101 | 139 | >>> print user_browser.url | 198 | >>> print user_browser.url |
1102 | 140 | http://translations.launchpad.dev/+languages/es | 199 | http://translations.launchpad.dev/+languages/es |
1103 | 141 | 200 | ||
1104 | @@ -146,7 +205,8 @@ | |||
1105 | 146 | ... | 205 | ... |
1106 | 147 | LinkNotFoundError | 206 | LinkNotFoundError |
1107 | 148 | 207 | ||
1109 | 149 | >>> user_browser.open('http://translations.launchpad.dev/+languages/es/+admin') | 208 | >>> user_browser.open( |
1110 | 209 | ... 'http://translations.launchpad.dev/+languages/es/+admin') | ||
1111 | 150 | Traceback (most recent call last): | 210 | Traceback (most recent call last): |
1112 | 151 | ... | 211 | ... |
1113 | 152 | Unauthorized:... | 212 | Unauthorized:... |
1114 | @@ -155,7 +215,8 @@ | |||
1115 | 155 | 215 | ||
1116 | 156 | >>> from canonical.launchpad.testing.pages import strip_label | 216 | >>> from canonical.launchpad.testing.pages import strip_label |
1117 | 157 | 217 | ||
1119 | 158 | >>> admin_browser.open('http://translations.launchpad.dev/+languages/es') | 218 | >>> admin_browser.open( |
1120 | 219 | ... 'http://translations.launchpad.dev/+languages/es') | ||
1121 | 159 | >>> print admin_browser.url | 220 | >>> print admin_browser.url |
1122 | 160 | http://translations.launchpad.dev/+languages/es | 221 | http://translations.launchpad.dev/+languages/es |
1123 | 161 | 222 | ||
1124 | 162 | 223 | ||
1125 | === modified file 'lib/lp/translations/templates/language-index.pt' | |||
1126 | --- lib/lp/translations/templates/language-index.pt 2009-09-17 14:45:59 +0000 | |||
1127 | +++ lib/lp/translations/templates/language-index.pt 2009-12-07 15:21:15 +0000 | |||
1128 | @@ -43,8 +43,10 @@ | |||
1129 | 43 | <p class="helpwanted"> | 43 | <p class="helpwanted"> |
1130 | 44 | Unfortunately, Launchpad doesn't know the plural form | 44 | Unfortunately, Launchpad doesn't know the plural form |
1131 | 45 | information for this language. If you know it, please open a | 45 | information for this language. If you know it, please open a |
1134 | 46 | <a href="/rosetta/+addticket">ticket</a> with that information, | 46 | <a id='plural_question' |
1135 | 47 | so we can add it to Launchpad. | 47 | tal:attributes="href view/add_question_url" |
1136 | 48 | >question</a> | ||
1137 | 49 | with that information, so we can add it to Launchpad. | ||
1138 | 48 | </p> | 50 | </p> |
1139 | 49 | </tal:has_not_pluralforms> | 51 | </tal:has_not_pluralforms> |
1140 | 50 | </div> | 52 | </div> |
1141 | @@ -124,8 +126,11 @@ | |||
1142 | 124 | </tal:language> | 126 | </tal:language> |
1143 | 125 | is not registered as being spoken in any country. If you know | 127 | is not registered as being spoken in any country. If you know |
1144 | 126 | about a country that officially speaks this language, please | 128 | about a country that officially speaks this language, please |
1147 | 127 | open a <a href="/rosetta/+addticket">ticket</a> with that | 129 | open a |
1148 | 128 | information, so we can add it to Launchpad. | 130 | <a id='country_question' |
1149 | 131 | tal:attributes="href view/add_question_url" | ||
1150 | 132 | >question</a> | ||
1151 | 133 | with that information, so we can add it to Launchpad. | ||
1152 | 129 | </p> | 134 | </p> |
1153 | 130 | </tal:has_not_countries> | 135 | </tal:has_not_countries> |
1154 | 131 | </div> | 136 | </div> |
1155 | 132 | 137 | ||
1156 | === modified file 'lib/lp/translations/templates/object-templates.pt' | |||
1157 | --- lib/lp/translations/templates/object-templates.pt 2009-11-24 19:23:52 +0000 | |||
1158 | +++ lib/lp/translations/templates/object-templates.pt 2009-12-07 15:21:15 +0000 | |||
1159 | @@ -26,16 +26,16 @@ | |||
1160 | 26 | </style> | 26 | </style> |
1161 | 27 | <style tal:condition="view/is_distroseries" type="text/css"> | 27 | <style tal:condition="view/is_distroseries" type="text/css"> |
1162 | 28 | #templates_table { | 28 | #templates_table { |
1164 | 29 | width: 72em; | 29 | width: 79em; |
1165 | 30 | } | 30 | } |
1166 | 31 | </style> | 31 | </style> |
1167 | 32 | <style tal:condition="not:view/is_distroseries" type="text/css"> | 32 | <style tal:condition="not:view/is_distroseries" type="text/css"> |
1168 | 33 | #templates_table { | 33 | #templates_table { |
1170 | 34 | width: 50em; | 34 | width: 58em; |
1171 | 35 | } | 35 | } |
1172 | 36 | </style> | 36 | </style> |
1173 | 37 | <script language="JavaScript" type="text/javascript"> | 37 | <script language="JavaScript" type="text/javascript"> |
1175 | 38 | YUI().use('node-base', 'event-delegate', function(Y) { | 38 | LPS.use('node-base', 'event-delegate', function(Y) { |
1176 | 39 | Y.on('domready', function(e) { | 39 | Y.on('domready', function(e) { |
     Y.all('#templates_table .template_links').addClass(
         'inactive_links');
@@ -75,6 +75,7 @@
     <th tal:condition="view/is_distroseries"
         class="sourcepackage_column">Source package</th>
     <th class="template_column">Template name</th>
+    <th class="lastupdate_column">Last update</th>
     <th class="actions_column"
         tal:condition="context/required:launchpad.AnyPerson">
       Actions</th>
@@ -88,6 +89,22 @@
       </td>
       <td class="template_column"><a tal:attributes="href template/fmt:url"
          tal:content="template/name">Template name</a></td>
+      <td class="lastupdate_column">
+        <span class="sortkey"
+              tal:condition="template/date_last_updated"
+              tal:content="template/date_last_updated/fmt:datetime">
+          time sort key
+        </span>
+        <span class="lastupdate_column"
+              tal:condition="template/date_last_updated"
+              tal:attributes="
+                title template/date_last_updated/fmt:datetime"
+              tal:content="
+                template/date_last_updated/fmt:approximatedate"
+              >
+          2009-09-23
+        </span>
+      </td>
       <td class="actions_column"
           tal:condition="context/required:launchpad.AnyPerson">
         <div class="template_links">

=== modified file 'lib/lp/translations/templates/pofile-export.pt'
--- lib/lp/translations/templates/pofile-export.pt	2009-11-10 21:04:19 +0000
+++ lib/lp/translations/templates/pofile-export.pt	2009-12-07 15:21:15 +0000
@@ -13,7 +13,7 @@
     }
   </style>
   <script type="text/javascript">
-    YUI().use('node', 'event', function(Y){
+    LPS.use('node', 'event', function(Y){
       Y.on('domready', function(){
         // The pochanged option is only available for the PO format.
         var formatlist = Y.one('#div_format select');

=== modified file 'lib/lp/translations/templates/pofile-translate.pt'
--- lib/lp/translations/templates/pofile-translate.pt	2009-11-04 19:59:16 +0000
+++ lib/lp/translations/templates/pofile-translate.pt	2009-12-07 15:21:15 +0000
@@ -20,7 +20,7 @@
   <script type="text/javascript">
     registerLaunchpadFunction(insertAllExpansionButtons);

-    YUI().use('node', 'cookie', 'anim', 'lp.pofile', function(Y) {
+    LPS.use('node', 'cookie', 'anim', 'lp.pofile', function(Y) {

     var hide_notification = function(node) {
       var hide_anim = new Y.Anim({

=== modified file 'lib/lp/translations/templates/translation-import-queue-macros.pt'
--- lib/lp/translations/templates/translation-import-queue-macros.pt	2009-11-20 14:15:34 +0000
+++ lib/lp/translations/templates/translation-import-queue-macros.pt	2009-12-07 15:21:15 +0000
@@ -18,7 +18,7 @@
   </script>

   <script type="text/javascript">
-    YUI().use( 'translations', 'event', function(Y) {
+    LPS.use( 'translations', 'event', function(Y) {
       Y.on('domready', function(e) {
         Y.translations.initialize_import_queue_page(Y);
       });

=== modified file 'lib/lp/translations/templates/translationimportqueueentry-index.pt'
--- lib/lp/translations/templates/translationimportqueueentry-index.pt	2009-11-04 13:56:17 +0000
+++ lib/lp/translations/templates/translationimportqueueentry-index.pt	2009-12-07 15:21:15 +0000
@@ -14,7 +14,7 @@
     }
   </style>
   <script type="text/javascript">
-    YUI().use('node', 'lazr.anim', function(Y) {
+    LPS.use('node', 'lazr.anim', function(Y) {
       var fields = {'POT':
         ['field.name', 'field.translation_domain',
          'field.languagepack'],

=== modified file 'lib/lp/translations/templates/translationmessage-translate.pt'
--- lib/lp/translations/templates/translationmessage-translate.pt	2009-09-17 07:28:30 +0000
+++ lib/lp/translations/templates/translationmessage-translate.pt	2009-12-07 15:21:15 +0000
@@ -18,7 +18,7 @@
   tal:define="lp_js string:${icingroot}/build"
   tal:attributes="src string:${lp_js}/translations/pofile.js"></script>
 <script type="text/javascript">
-  YUI().use('node', 'lp.pofile', function(Y) {
+  LPS.use('node', 'lp.pofile', function(Y) {
     Y.on('domready', Y.lp.pofile.setupSuggestionDismissal);
   });
 </script>
Overview
========
This branch fixes bug 443353 by ensuring that getBuildsForSources() includes builds that were originally built in a different archive context but have since had binaries copied into the source archive context.
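The idea can be sketched with plain SQL. This is illustrative only: the real change lives in getBuildsForSources() in lib/lp/soyuz/model/publishing.py, and the table and column names below are made up for the example.

```python
import sqlite3

# Hypothetical schema, for illustration only: builds created directly
# in an archive, plus binaries copied into an archive from elsewhere.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE build (id INTEGER, archive TEXT);
    CREATE TABLE copied_binary (build_id INTEGER, target_archive TEXT);
    INSERT INTO build VALUES (1, 'ppa'), (2, 'ppa'), (3, 'primary');
    INSERT INTO copied_binary VALUES (3, 'ppa');
""")

# Builds made in the 'ppa' archive, UNIONed with builds whose
# binaries were later copied into it.  The ORDER BY can only
# reference the union's output columns, which is where the ordering
# fragility noted under Issues comes in.
rows = conn.execute("""
    SELECT id FROM build WHERE archive = 'ppa'
    UNION
    SELECT b.id FROM build AS b
      JOIN copied_binary AS c ON c.build_id = b.id
     WHERE c.target_archive = 'ppa'
    ORDER BY 1
""").fetchall()
print(rows)  # [(1,), (2,), (3,)]
```

Build 3 was made in the primary archive, but its binaries were copied into the PPA, so it now shows up alongside the PPA's own builds.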
Issues
======
There are two main issues with this branch IMO.
1. Ensuring that the result of the union is ordered correctly is hackish, and depends on a Storm implementation detail (that columns in SQL queries for a given table are ordered alphabetically).
2. Testing the correct ordering in the doctest isn't very readable. It might be better to print the results instead, but I didn't want the test to depend on database ids. Previously the test simply ensured the last item was the one expected; I've updated it to instead ensure that the complete result is sorted as expected.
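The check described in point 2 amounts to something like the following sketch. The helper and build tuples here are hypothetical; the actual assertion lives in lib/lp/soyuz/doc/publishing.txt.

```python
def is_sorted(items, key=lambda item: item):
    """True if items are in non-decreasing order by key."""
    return all(key(a) <= key(b) for a, b in zip(items, items[1:]))

# Rather than checking only the last build, assert the whole result
# set is ordered as expected, with no hard-coded database ids.
builds = [("hppa", "2009-09-21"), ("i386", "2009-09-22"),
          ("lpia", "2009-09-23")]
print(is_sorted(builds, key=lambda build: build[1]))  # True
```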
Any suggestions welcome!
Testing/QA
==========
To test, run:
bin/test -vvt doc/publishing.txt
To QA:
Visit:
https://edge.launchpad.net/ubuntu/+source/openjdk-6
and expand the intrepid 6b12-0ubuntu6.6 security release. Currently this only displays two builds, whereas it should display all the builds listed at:
https://edge.launchpad.net/ubuntu/+source/openjdk-6/6b12-0ubuntu6.6