Merge lp:~michael.nelson/launchpad/443353-api-builds-from-private into lp:launchpad/db-devel
Status: | Merged |
---|---|
Approved by: | Jonathan Lange |
Approved revision: | not available |
Merged at revision: | not available |
Proposed branch: | lp:~michael.nelson/launchpad/443353-api-builds-from-private |
Merge into: | lp:launchpad/db-devel |
Diff against target: |
1274 lines (+453/-125) 43 files modified
lib/lp/archiveuploader/nascentuploadfile.py (+1/-1) lib/lp/archiveuploader/tests/nascentuploadfile.txt (+71/-0) lib/lp/bugs/templates/bug-portlet-subscribers.pt (+1/-1) lib/lp/bugs/templates/bugtarget-filebug-submit-bug.pt (+1/-1) lib/lp/bugs/templates/bugtarget-portlet-bugfilters.pt (+1/-1) lib/lp/bugs/templates/bugtarget-portlet-bugtags.pt (+1/-1) lib/lp/bugs/templates/bugtask-index.pt (+3/-3) lib/lp/bugs/templates/bugtask-tasks-and-nominations-table-row.pt (+1/-1) lib/lp/bugs/templates/bugtasks-and-nominations-table.pt (+1/-1) lib/lp/bugs/templates/official-bug-target-manage-tags.pt (+1/-1) lib/lp/code/templates/branch-import-details.pt (+1/-1) lib/lp/code/templates/branch-index.pt (+1/-1) lib/lp/code/templates/branch-listing.pt (+1/-1) lib/lp/code/templates/branch-portlet-subscribers.pt (+1/-1) lib/lp/code/templates/branch-related-bugs-specs.pt (+1/-1) lib/lp/code/templates/branchmergeproposal-generic-listing.pt (+1/-1) lib/lp/registry/templates/object-timeline-graph.pt (+1/-1) lib/lp/registry/templates/person-macros.pt (+1/-1) lib/lp/registry/templates/product-new.pt (+1/-1) lib/lp/registry/templates/productrelease-add-from-series.pt (+1/-1) lib/lp/registry/templates/teammembership-index.pt (+1/-1) lib/lp/registry/templates/timeline-macros.pt (+1/-1) lib/lp/soyuz/doc/publishing.txt (+50/-11) lib/lp/soyuz/model/publishing.py (+58/-22) lib/lp/soyuz/scripts/tests/test_copypackage.py (+16/-1) lib/lp/soyuz/stories/ppa/xx-copy-packages.txt (+1/-1) lib/lp/soyuz/stories/ppa/xx-ppa-packages.txt (+32/-10) lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt (+5/-3) lib/lp/soyuz/templates/archive-edit-dependencies.pt (+1/-1) lib/lp/soyuz/templates/archive-macros.pt (+1/-1) lib/lp/soyuz/templates/archive-packages.pt (+1/-1) lib/lp/soyuz/templates/archive-subscribers.pt (+1/-1) lib/lp/translations/browser/language.py (+8/-0) lib/lp/translations/stories/distroseries/xx-distroseries-templates.txt (+56/-13) 
lib/lp/translations/stories/productseries/xx-productseries-templates.txt (+27/-18) lib/lp/translations/stories/standalone/xx-language.txt (+68/-7) lib/lp/translations/templates/language-index.pt (+9/-4) lib/lp/translations/templates/object-templates.pt (+20/-3) lib/lp/translations/templates/pofile-export.pt (+1/-1) lib/lp/translations/templates/pofile-translate.pt (+1/-1) lib/lp/translations/templates/translation-import-queue-macros.pt (+1/-1) lib/lp/translations/templates/translationimportqueueentry-index.pt (+1/-1) lib/lp/translations/templates/translationmessage-translate.pt (+1/-1) |
To merge this branch: | bzr merge lp:~michael.nelson/launchpad/443353-api-builds-from-private |
Related bugs: |
Reviewer | Review Type | Date Requested | Status |
---|---|---|---|
Aaron Bentley (community) | Approve | ||
Jonathan Lange (community) | Approve | ||
Review via email: mp+14896@code.launchpad.net |
Commit message
Ensures that when source+binaries are copied from archive A to B, the corresponding builds will be returned when querying the copied source for its builds (even though they were built in the context of archive A). Fixes bug 443353.
Description of the change
Michael Nelson (michael.nelson) wrote:
Jonathan Lange (jml) wrote:
On Sun, Nov 15, 2009 at 2:35 PM, Michael Nelson
<email address hidden> wrote:
> Michael Nelson has proposed merging lp:~michael.nelson/launchpad/443353-api-builds-from-private into lp:launchpad.
>
> Requested reviews:
> Canonical Launchpad Engineering (launchpad)
> Related bugs:
> #443353 API does not include Builds for sources that were sync'd from private PPAs
> https:/
>
>
> Overview
> ========
>
> This branch fixes bug 443353 by ensuring that getBuildsForSou include builds that were originally built in a different archive context, but have since had binaries copied into the source archive context.
>
> Issues
> ======
>
> There are two main issues with this branch IMO.
>
> 1. Ensuring that the result of the union was ordered correctly is hackish, and dependent on a storm implementation detail (that columns in SQL queries for a certain table are ordered alphabetically).
>
> 2. Testing the correct ordering in the doctest isn't so readable. It could be better to print out the results instead, but on the other hand, I didn't want the test to be dependent on database ids. Previously the test simply ensured the last item was the one expected - I've updated this to instead ensure that the complete result is sorted as expected.
>
> Any suggestions welcome!
>
Hi Michael,
Thanks for fixing this -- on a Sunday even!
I'm generally OK with this branch, but would like another opportunity
to have a look at it and maybe to talk with you face-to-face.
jml
> Testing/QA
> ==========
>
> To test, run:
> bin/test -vvt doc/publishing.txt
>
> To QA:
>
> Visit:
> https:/
>
> and expand the intrepid 6b12-0ubuntu6.6 security release. Currently this only displays two builds, whereas it should display all the builds listed at:
>
> https:/
>
>
>
>
> --
> https:/
> Your team Launchpad code reviewers from Canonical is subscribed to branch lp:launchpad.
>
> === modified file 'lib/lp/
> --- lib/lp/
> +++ lib/lp/
> @@ -1055,22 +1055,53 @@
> each build found.
>
> >>> cprov_builds.
> - 7
> + 8
>
> The `ResultSet` is ordered by ascending
> `SourcePackageP
> `DistroArchseri
>
> - >>> source_pub, build, arch = cprov_builds.last()
> -
> - >>> print source_
> - foo 666 in breezy-autotest
> -
> - >>> print build.title
> - i386 build of foo 666 in ubuntutest breezy-autotest RELEASE
> -
> - >>> print arch.displayname
> - ubuntutest Breezy Badger Autotest i386
> + # The easiest thing we can do here (without printing ids)
> + # is to show that sorting a list of the resulting ids+tags does not
> + # modify the list.
> + >>> from copy import copy
> + >>> ids_and_tags = [(pub.id, arch.architectu
> + ... for pub, build, arch in cprov_builds]
> + >>> ids...
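As a side note on issue 2 above: asserting that a result set is ordered, without depending on concrete database ids, reduces to comparing the materialised list against its sorted copy. A minimal sketch, using made-up (publication id, architecture tag) tuples in place of the values pulled out of cprov_builds:

```python
# Hypothetical (publication id, architecture tag) pairs, standing in
# for the values the doctest extracts from cprov_builds.
ids_and_tags = [(1, "amd64"), (1, "i386"), (2, "i386"), (3, "hppa")]

# sorted() builds a new list and leaves the original untouched, so
# equality with the original proves the input was already ordered.
assert ids_and_tags == sorted(ids_and_tags)

# An out-of-order result fails the same check.
out_of_order = [(2, "i386"), (1, "amd64")]
assert out_of_order != sorted(out_of_order)
```

This avoids both the `copy` import and any dependence on which ids the sample data happens to produce.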
Michael Nelson (michael.nelson) wrote:
On Sun, Nov 15, 2009 at 3:24 PM, Jonathan Lange <email address hidden> wrote:
>
> Hi Michael,
>
> Thanks for fixing this -- on a Sunday even!
>
>
And thanks for reviewing even on a Sunday :)
> I'm generally OK with this branch, but would like another opportunity
> to have a look at it and maybe to talk with you face-to-face.
>
>
Yes, I was keen for pointers as I wasn't happy with the hack... after
talking with both Jamu and Gustavo, I think we've got a much better solution
(although it still feels like it should not be necessary).
> jml
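The "much better solution" referred to here is visible in the updated publishing.py in the preview diff: run the union only to collect the matching build ids, then issue a second, plain query restricted to those ids, so that ORDER BY can reference the joined tables normally. A rough sketch of the idea using sqlite3 and invented tables (the real code uses Storm against the Launchpad schema):

```python
import sqlite3

# Invented two-table schema purely for illustration: builds and the
# architecture tag each build belongs to.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE build (id INTEGER PRIMARY KEY, archive TEXT);
    CREATE TABLE arch (build INTEGER, tag TEXT);
    INSERT INTO build VALUES (1, 'ppa-a'), (2, 'ppa-b'), (3, 'ppa-a');
    INSERT INTO arch VALUES (1, 'i386'), (2, 'amd64'), (3, 'hppa');
""")

# Step 1: the union (two trivial halves here) is used only to
# collect the ids of the matching builds.
rows = conn.execute("""
    SELECT id FROM build WHERE archive = 'ppa-a'
    UNION
    SELECT id FROM build WHERE archive = 'ppa-b'
""").fetchall()
build_ids = [r[0] for r in rows]

# Step 2: re-query with IN (...) so the ORDER BY can name a column
# from a joined table, which is not possible on the bare union.
placeholders = ",".join("?" for _ in build_ids)
ordered = conn.execute(
    "SELECT build.id, arch.tag FROM build "
    "JOIN arch ON arch.build = build.id "
    "WHERE build.id IN (%s) ORDER BY arch.tag" % placeholders,
    build_ids).fetchall()
print(ordered)  # → [(2, 'amd64'), (3, 'hppa'), (1, 'i386')]
```

The cost is materialising the intermediate id list (as the diff's comment about `builds_union.values('id')` notes), but it removes the dependence on Storm's column-ordering implementation detail.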
1 | === modified file 'lib/lp/soyuz/doc/publishing.txt' |
2 | --- lib/lp/soyuz/doc/publishing.txt 2009-11-15 20:13:09 +0000 |
3 | +++ lib/lp/soyuz/doc/publishing.txt 2009-11-16 03:58:10 +0000 |
4 | @@ -1064,24 +1064,31 @@ |
5 | # The easiest thing we can do here (without printing ids) |
6 | # is to show that sorting a list of the resulting ids+tags does not |
7 | # modify the list. |
8 | - >>> from copy import copy |
9 | >>> ids_and_tags = [(pub.id, arch.architecturetag) |
10 | ... for pub, build, arch in cprov_builds] |
11 | - >>> ids_and_tags_sorted = copy(ids_and_tags) |
12 | - >>> ids_and_tags_sorted.sort() |
13 | - >>> ids_and_tags == ids_and_tags_sorted |
14 | + >>> ids_and_tags == sorted(ids_and_tags) |
15 | True |
16 | |
17 | If a source package is copied from another archive (including the |
18 | -binaries), then these builds will also be included in the result (even |
19 | -though they were build in a different archive context). |
20 | +binaries), then the related builds for that source package will |
21 | +also be retrievable via the copied source publication. |
22 | +For example, if a package is built in a private security PPA, and then |
23 | +later copied out into the primary archive, the builds will then |
24 | +be available when looking at the copied source package in the primary |
25 | +archive. |
26 | |
27 | # Create a new PPA and publish a source with some builds |
28 | # and binaries. |
29 | - >>> other_ppa = factory.makeArchive() |
30 | + >>> other_ppa = factory.makeArchive(name="otherppa") |
31 | >>> binaries = test_publisher.getPubBinaries(archive=other_ppa) |
32 | + |
33 | +The associated builds and binaries will be created in the context of the |
34 | +other PPA. |
35 | + |
36 | >>> build = binaries[0].binarypackagerelease.build |
37 | >>> source_pub = build.sourcepackagerelease.publishings[0] |
38 | + >>> print build.archive.name |
39 | + otherppa |
40 | |
41 | # Copy the source into Celso's PPA, ensuring that the binaries |
42 | # are alse published there. |
43 | @@ -1092,7 +1099,8 @@ |
44 | ... binaries[0].binarypackagerelease, cprov.archive) |
45 | |
46 | Now we will see an extra source in Celso's PPA as well as an extra |
47 | -build - even though the build's context is not Celso's PPA. |
48 | +build - even though the build's context is not Celso's PPA. Previously |
49 | +there were 8 sources and builds. |
50 | |
51 | >>> cprov_sources_new = cprov.archive.getPublishedSources() |
52 | >>> cprov_sources_new.count() |
53 | |
54 | === modified file 'lib/lp/soyuz/model/publishing.py' |
55 | --- lib/lp/soyuz/model/publishing.py 2009-11-15 20:41:46 +0000 |
56 | +++ lib/lp/soyuz/model/publishing.py 2009-11-16 05:45:03 +0000 |
57 | @@ -1234,8 +1234,6 @@ |
58 | Build.buildstate.is_in(build_states)) |
59 | |
60 | store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
61 | - find_spec = ( |
62 | - SourcePackagePublishingHistory, Build, DistroArchSeries) |
63 | |
64 | # We'll be looking for builds in the same distroseries as the |
65 | # SPPH for the same release. |
66 | @@ -1251,7 +1249,7 @@ |
67 | # First, we'll find the builds that were built in the same |
68 | # archive context as the published sources. |
69 | builds_in_same_archive = store.find( |
70 | - find_spec, |
71 | + Build, |
72 | builds_for_distroseries_expr, |
73 | SourcePackagePublishingHistory.archiveID == Build.archiveID, |
74 | *extra_exprs) |
75 | @@ -1260,7 +1258,7 @@ |
76 | # same archive... even though the build was not built in |
77 | # the same context archive. |
78 | builds_copied_into_archive = store.find( |
79 | - find_spec, |
80 | + Build, |
81 | builds_for_distroseries_expr, |
82 | SourcePackagePublishingHistory.archiveID != Build.archiveID, |
83 | BinaryPackagePublishingHistory.archive == Build.archiveID, |
84 | @@ -1269,26 +1267,29 @@ |
85 | BinaryPackageRelease.build == Build.id, |
86 | *extra_exprs) |
87 | |
88 | - result_set = builds_copied_into_archive.union( |
89 | + builds_union = builds_copied_into_archive.union( |
90 | builds_in_same_archive).config(distinct=True) |
91 | |
92 | - # XXX 2009-11-15 Michael Nelson bug=366043. It is not possible |
93 | - # to sort by the name `SourcePackagePublishingHistory.id` as after |
94 | - # the union there are no tables. Nor can we sort by ambiguous `id` |
95 | - # as in this case there are 3 id columns in the result. Specifying |
96 | - # the column index is the only option that I can find. So we're |
97 | - # relying on an implementation detail of Storm that it lists |
98 | - # columns in alphabetical order in sql queries. |
99 | - sql_columns = SourcePackagePublishingHistory._storm_columns.values() |
100 | - sql_column_names = [column.name for column in sql_columns] |
101 | - sql_column_names.sort() |
102 | - |
103 | - # SQL order by uses 1-based column numbers. |
104 | - source_pub_id_col_number = sql_column_names.index('id') + 1 |
105 | - result_set.order_by( |
106 | - source_pub_id_col_number, DistroArchSeries.architecturetag) |
107 | - |
108 | - return result_set |
109 | + # Now that we have a result_set of all the builds, we'll use it |
110 | + # as a subquery to get the required publishing and arch to do |
111 | + # the ordering. We do this in this round-about way because we |
112 | + # can't sort on SourcePackagePublishingHistory.id after the |
113 | + # union. See bug 443353 for details. |
114 | + find_spec = ( |
115 | + SourcePackagePublishingHistory, Build, DistroArchSeries) |
116 | + |
117 | + # Storm doesn't let us do builds_union.values('id') - |
118 | + # ('Union' object has no attribute 'columns'). So instead |
119 | + # we have to instantiate the objects just to get the id. |
120 | + build_ids = [build.id for build in builds_union] |
121 | + |
122 | + result_set = store.find( |
123 | + find_spec, builds_for_distroseries_expr, |
124 | + Build.id.is_in(build_ids)) |
125 | + |
126 | + return result_set.order_by( |
127 | + SourcePackagePublishingHistory.id, |
128 | + DistroArchSeries.architecturetag) |
129 | |
130 | def getByIdAndArchive(self, id, archive, source=True): |
131 | """See `IPublishingSet`.""" |
Jonathan Lange (jml) wrote:
Thanks for fixing this, Michael. Land away.
Michael Nelson (michael.nelson) wrote:
I've just gotten back to this now - when I originally tried to land this while at UDS there were a bunch of ec2 errors - one of which pointed out an error in the query itself.
There are three parts to the failures fixed by this incremental:
* First, there was an actual error in the query (details below),
* second, a number of tests were affected by the change as we now return
builds from other archive contexts if the build has binaries published in
the current archive context (i.e. by a copy)
* third, we have some bogus sample data (effectively two i386 builds for
ice-weasel 1.0 in cprov's ppa - one built in that context but without any
corresponding bpr's or bpph (so pending, buildid=25), the other copied and
published there with a new bpph (buildid=23)). As this branch exposed this
data, I modified two tests to use a different package instead (yes, I
would like to have re-written the test to use STP etc. etc., but).
So, annotated diff below, raw diff at: http://
=== modified file 'lib/lp/
--- lib/lp/
+++ lib/lp/
@@ -594,7 +594,7 @@
# not blow up because of bad data.
return None
source, packageupload, spr, changesfile, lfc = result
-
+
# Return a webapp-proxied LibraryFileAlias so that restricted
# librarian files are accessible. Non-restricted files will get
# a 302 so that webapp threads are not tied up.
@@ -1290,7 +1290,8 @@
Build,
- BinaryPackagePu
+ BinaryPackagePu
+ SourcePackagePu
### So this was the error in the original MP - we're looking for builds that
originated in other contexts that have binaries published in the SPPH
archive context - not the Build's archive context :/
=== modified file 'lib/lp/
--- lib/lp/
+++ lib/lp/
@@ -1340,6 +1340,21 @@
+ # The second copy will fail explicitly because the new BPPH
+ # records are not yet published.
+ nothing_copied = copy_helper.
+ self.assertEqua
+ self.assertEqual(
+ copy_helper.
+ 'ERROR: foo 666 in hoary (same version has unpublished binaries '
+ 'in the destination archive for Hoary, please wait for them to '
+ 'be published before copying)')
+
+ # If we ensure that the copied b...
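The query error annotated above ("not the Build's archive context") comes down to joining the binary publications against the wrong archive column. A contrived sqlite3 illustration with invented tables, standing in for the Storm query: a build made in a private PPA whose source and binaries were copied into the primary archive is only found when the binary publications are matched against the source publication's archive:

```python
import sqlite3

# Invented, heavily simplified stand-ins for Build,
# SourcePackagePublishingHistory (spph) and
# BinaryPackagePublishingHistory (bpph).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE build (id INTEGER PRIMARY KEY, archive TEXT);
    CREATE TABLE spph (id INTEGER PRIMARY KEY, archive TEXT, build INTEGER);
    CREATE TABLE bpph (build INTEGER, archive TEXT);
    -- Built in the private PPA; source and binaries copied to primary.
    INSERT INTO build VALUES (1, 'private-ppa');
    INSERT INTO spph VALUES (1, 'primary', 1);
    INSERT INTO bpph VALUES (1, 'primary');
""")

# Wrong: look for binary publications in the *build's* archive.
wrong = conn.execute("""
    SELECT build.id FROM build
    JOIN spph ON spph.build = build.id AND spph.archive != build.archive
    JOIN bpph ON bpph.build = build.id AND bpph.archive = build.archive
""").fetchall()

# Right: look for them in the *source publication's* archive,
# i.e. where the copy actually published them.
right = conn.execute("""
    SELECT build.id FROM build
    JOIN spph ON spph.build = build.id AND spph.archive != build.archive
    JOIN bpph ON bpph.build = build.id AND bpph.archive = spph.archive
""").fetchall()
print(wrong, right)  # → [] [(1,)]
```

The schema here is made up for the sake of a self-contained example; the real join goes through BinaryPackageRelease and the rest of the Soyuz schema as shown in the diffs.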
Aaron Bentley (abentley) wrote:
I approve the incremental change. As discussed on IRC, I think it may be possible to remove the first find and use the second query to find cases where SourcePackagePu
Preview Diff
1 | === modified file 'lib/lp/archiveuploader/nascentuploadfile.py' |
2 | --- lib/lp/archiveuploader/nascentuploadfile.py 2009-11-10 13:09:26 +0000 |
3 | +++ lib/lp/archiveuploader/nascentuploadfile.py 2009-12-07 15:21:15 +0000 |
4 | @@ -690,7 +690,7 @@ |
5 | tar_checker.ancient_files[first_file]) |
6 | yield UploadError( |
7 | "%s: has %s file(s) with a time stamp too " |
8 | - "far into the future (e.g. %s [%s])." |
9 | + "far in the past (e.g. %s [%s])." |
10 | % (self.filename, len(ancient_files), first_file, |
11 | timestamp)) |
12 | return |
13 | |
14 | === modified file 'lib/lp/archiveuploader/tests/nascentuploadfile.txt' |
15 | --- lib/lp/archiveuploader/tests/nascentuploadfile.txt 2009-07-08 08:38:05 +0000 |
16 | +++ lib/lp/archiveuploader/tests/nascentuploadfile.txt 2009-12-07 15:21:15 +0000 |
17 | @@ -539,6 +539,77 @@ |
18 | |
19 | == DebBinaryUploadFile == |
20 | |
21 | +DebBinaryUploadFile models a binary .deb file. |
22 | + |
23 | + >>> from lp.archiveuploader.nascentuploadfile import ( |
24 | + ... DebBinaryUploadFile) |
25 | + >>> ed_deb_path = datadir('ed_0.2-20_i386.deb') |
26 | + >>> ed_binary_deb = DebBinaryUploadFile(ed_deb_path, |
27 | + ... 'e31eeb0b6b3b87e1ea79378df864ffff', |
28 | + ... 15, 'main/editors', 'important', 'foo', '1.2', |
29 | + ... ed_mixed_changes, modified_insecure_policy, |
30 | + ... mock_logger_quiet) |
31 | + |
32 | +Like the other files it can be verified: |
33 | + |
34 | + >>> list(ed_binary_deb.verify()) |
35 | + [] |
36 | + |
37 | +Verification checks that the specified section matches the section in the |
38 | +changes file: |
39 | + |
40 | + >>> ed_binary_deb = DebBinaryUploadFile(ed_deb_path, |
41 | + ... 'e31eeb0b6b3b87e1ea79378df864ffff', |
42 | + ... 15, 'main/net', 'important', 'foo', '1.2', |
43 | + ... ed_mixed_changes, modified_insecure_policy, |
44 | + ... mock_logger_quiet) |
45 | + >>> list(ed_binary_deb.verify()) |
46 | + [UploadError('ed_0.2-20_i386.deb control file lists section as |
47 | + main/editors but changes file has main/net.',)] |
48 | + |
49 | +It also checks the priority against the changes file: |
50 | + |
51 | + >>> ed_binary_deb = DebBinaryUploadFile(ed_deb_path, |
52 | + ... 'e31eeb0b6b3b87e1ea79378df864ffff', |
53 | + ... 15, 'main/editors', 'extra', 'foo', '1.2', |
54 | + ... ed_mixed_changes, modified_insecure_policy, |
55 | + ... mock_logger_quiet) |
56 | + >>> list(ed_binary_deb.verify()) |
57 | + [UploadError('ed_0.2-20_i386.deb control file lists priority as important |
58 | + but changes file has extra.',)] |
59 | + |
60 | +The timestamp of the files in the .deb are tested against the policy for being |
61 | +too new: |
62 | + |
63 | + >>> old_only_policy = getPolicy( |
64 | + ... name='insecure', distro='ubuntu', distroseries='hoary') |
65 | + >>> old_only_policy.can_upload_binaries = True |
66 | + >>> old_only_policy.future_time_grace = -5 * 365 * 24 * 60 * 60 |
67 | + |
68 | + >>> ed_binary_deb = DebBinaryUploadFile(ed_deb_path, |
69 | + ... 'e31eeb0b6b3b87e1ea79378df864ffff', |
70 | + ... 15, 'main/editors', 'important', 'foo', '1.2', |
71 | + ... ed_mixed_changes, old_only_policy, |
72 | + ... mock_logger_quiet) |
73 | + >>> list(ed_binary_deb.verifyDebTimestamp()) |
74 | + [UploadError('ed_0.2-20_i386.deb: has 26 file(s) with a time stamp too |
75 | + far into the future (e.g. control [Thu Jan 3 19:29:01 2008]).',)] |
76 | + |
77 | +... as well as for being too old: |
78 | + |
79 | + >>> new_only_policy = getPolicy( |
80 | + ... name='insecure', distro='ubuntu', distroseries='hoary') |
81 | + >>> new_only_policy.can_upload_binaries = True |
82 | + >>> new_only_policy.earliest_year = 2010 |
83 | + >>> ed_binary_deb = DebBinaryUploadFile(ed_deb_path, |
84 | + ... 'e31eeb0b6b3b87e1ea79378df864ffff', |
85 | + ... 15, 'main/editors', 'important', 'foo', '1.2', |
86 | + ... ed_mixed_changes, new_only_policy, |
87 | + ... mock_logger_quiet) |
88 | + >>> list(ed_binary_deb.verify()) |
89 | + [UploadError('ed_0.2-20_i386.deb: has 26 file(s) with a time stamp too |
90 | + far in the past (e.g. control [Thu Jan 3 19:29:01 2008]).',)] |
91 | + |
92 | |
93 | == UDebBinaryUploadFile == |
94 | |
95 | |
96 | === modified file 'lib/lp/bugs/templates/bug-portlet-subscribers.pt' |
97 | --- lib/lp/bugs/templates/bug-portlet-subscribers.pt 2009-11-26 03:13:32 +0000 |
98 | +++ lib/lp/bugs/templates/bug-portlet-subscribers.pt 2009-12-07 15:21:15 +0000 |
99 | @@ -25,7 +25,7 @@ |
100 | <img src="/@@/spinner" /> |
101 | </div> |
102 | <script type="text/javascript"> |
103 | - YUI().use('io-base', 'node', 'bugs.bugtask_index', function(Y) { |
104 | + LPS.use('io-base', 'node', 'bugs.bugtask_index', function(Y) { |
105 | // Must be done inline here to ensure the load event fires. |
106 | // This is a work around for a YUI3 issue with event handling. |
107 | var subscription_link = Y.one('.menu-link-subscription'); |
108 | |
109 | === modified file 'lib/lp/bugs/templates/bugtarget-filebug-submit-bug.pt' |
110 | --- lib/lp/bugs/templates/bugtarget-filebug-submit-bug.pt 2009-10-01 12:09:37 +0000 |
111 | +++ lib/lp/bugs/templates/bugtarget-filebug-submit-bug.pt 2009-12-07 15:21:15 +0000 |
112 | @@ -14,7 +14,7 @@ |
113 | tal:define="lp_js string:${icingroot}/build" |
114 | tal:attributes="src string:${lp_js}/bugs/filebug-dupefinder.js"></script> |
115 | <script type="text/javascript"> |
116 | - YUI().use('base', 'node', 'oop', 'event', 'bugs.dupe_finder', function(Y) { |
117 | + LPS.use('base', 'node', 'oop', 'event', 'bugs.dupe_finder', function(Y) { |
118 | Y.bugs.setup_dupe_finder(); |
119 | }); |
120 | </script> |
121 | |
122 | === modified file 'lib/lp/bugs/templates/bugtarget-portlet-bugfilters.pt' |
123 | --- lib/lp/bugs/templates/bugtarget-portlet-bugfilters.pt 2009-11-04 13:56:17 +0000 |
124 | +++ lib/lp/bugs/templates/bugtarget-portlet-bugfilters.pt 2009-12-07 15:21:15 +0000 |
125 | @@ -12,7 +12,7 @@ |
126 | <img src="/@@/spinner" /> |
127 | </div> |
128 | <script type="text/javascript"> |
129 | - YUI().use('io-base', 'node', function(Y) { |
130 | + LPS.use('io-base', 'node', function(Y) { |
131 | Y.on('domready', function() { |
132 | var portlet = Y.one('#portlet-bugfilters'); |
133 | Y.one('#bugfilters-portlet-spinner').setStyle('display', 'block'); |
134 | |
135 | === modified file 'lib/lp/bugs/templates/bugtarget-portlet-bugtags.pt' |
136 | --- lib/lp/bugs/templates/bugtarget-portlet-bugtags.pt 2009-11-04 13:56:17 +0000 |
137 | +++ lib/lp/bugs/templates/bugtarget-portlet-bugtags.pt 2009-12-07 15:21:15 +0000 |
138 | @@ -9,7 +9,7 @@ |
139 | <a id="tags-content-link" |
140 | tal:attributes="href context/fmt:url/+bugtarget-portlet-tags-content"></a> |
141 | <script type="text/javascript"> |
142 | - YUI().use('io-base', 'node', function(Y) { |
143 | + LPS.use('io-base', 'node', function(Y) { |
144 | Y.on('domready', function() { |
145 | Y.one('#tags-portlet-spinner').setStyle('display', 'block'); |
146 | |
147 | |
148 | === modified file 'lib/lp/bugs/templates/bugtask-index.pt' |
149 | --- lib/lp/bugs/templates/bugtask-index.pt 2009-11-30 17:57:15 +0000 |
150 | +++ lib/lp/bugs/templates/bugtask-index.pt 2009-12-07 15:21:15 +0000 |
151 | @@ -37,7 +37,7 @@ |
152 | </script> |
153 | </tal:devmode> |
154 | <script type="text/javascript"> |
155 | - YUI().use('base', 'node', 'oop', 'event', 'bugs.bugtask_index', |
156 | + LPS.use('base', 'node', 'oop', 'event', 'bugs.bugtask_index', |
157 | 'code.branchmergeproposal.popupdiff', function(Y) { |
158 | Y.bugs.setup_bugtask_index(); |
159 | Y.on('load', function(e) { |
160 | @@ -155,7 +155,7 @@ |
161 | <img src="/@@/spinner" id="tags-edit-spinner" style="display: none" /> |
162 | <a href="+edit" title="Edit tags" id="edit-tags-trigger" class="sprite edit"></a> |
163 | <script type="text/javascript"> |
164 | - YUI().use('event', 'node', 'bugs.bug_tags_entry', function(Y) { |
165 | + LPS.use('event', 'node', 'bugs.bug_tags_entry', function(Y) { |
166 | // XXX intellectronica 2009-04-16 bug #362309: |
167 | // The load event fires very late on bug pages that take a |
168 | // long time to render, but we prefer to use it since the |
169 | @@ -295,7 +295,7 @@ |
170 | button.style.display = 'none'; |
171 | </script> |
172 | <script type="text/javascript"> |
173 | - YUI().use('lp.comment', function(Y) { |
174 | + LPS.use('lp.comment', function(Y) { |
175 | var comment = new Y.lp.Comment(); |
176 | comment.render(); |
177 | }); |
178 | |
179 | === modified file 'lib/lp/bugs/templates/bugtask-tasks-and-nominations-table-row.pt' |
180 | --- lib/lp/bugs/templates/bugtask-tasks-and-nominations-table-row.pt 2009-11-03 15:32:31 +0000 |
181 | +++ lib/lp/bugs/templates/bugtask-tasks-and-nominations-table-row.pt 2009-12-07 15:21:15 +0000 |
182 | @@ -185,7 +185,7 @@ |
183 | class="bugtasks-table-row-init-script" |
184 | tal:condition="not:view/many_bugtasks" |
185 | tal:content="string: |
186 | - YUI().use('event', 'bugs.bugtask_index', function(Y) { |
187 | + LPS.use('event', 'bugs.bugtask_index', function(Y) { |
188 | Y.on('load', |
189 | function(e) { |
190 | Y.bugs.setup_bugtask_row(${view/js_config}); |
191 | |
192 | === modified file 'lib/lp/bugs/templates/bugtasks-and-nominations-table.pt' |
193 | --- lib/lp/bugs/templates/bugtasks-and-nominations-table.pt 2009-09-02 22:13:06 +0000 |
194 | +++ lib/lp/bugs/templates/bugtasks-and-nominations-table.pt 2009-12-07 15:21:15 +0000 |
195 | @@ -88,7 +88,7 @@ |
196 | </span> |
197 | |
198 | <script type="text/javascript" tal:content="string: |
199 | - YUI().use('event', 'bugs.bugtask_index', function(Y) { |
200 | + LPS.use('event', 'bugs.bugtask_index', function(Y) { |
201 | Y.on('load', function(e) { |
202 | Y.bugs.setup_me_too(${view/current_user_affected_js_status}); |
203 | }, window); |
204 | |
205 | === modified file 'lib/lp/bugs/templates/official-bug-target-manage-tags.pt' |
206 | --- lib/lp/bugs/templates/official-bug-target-manage-tags.pt 2009-09-04 17:03:00 +0000 |
207 | +++ lib/lp/bugs/templates/official-bug-target-manage-tags.pt 2009-12-07 15:21:15 +0000 |
208 | @@ -31,7 +31,7 @@ |
209 | </script> |
210 | <script tal:replace="structure view/tags_js_data" /> |
211 | <script type="text/javascript"> |
212 | - YUI().use('event', 'bugs.official_bug_tag_management', function(Y) { |
213 | + LPS.use('event', 'bugs.official_bug_tag_management', function(Y) { |
214 | Y.on('domready', function(e) { |
215 | Y.bugs.setup_official_bug_tag_management(); |
216 | }); |
217 | |
218 | === modified file 'lib/lp/code/templates/branch-import-details.pt' |
219 | --- lib/lp/code/templates/branch-import-details.pt 2009-11-04 13:56:17 +0000 |
220 | +++ lib/lp/code/templates/branch-import-details.pt 2009-12-07 15:21:15 +0000 |
221 | @@ -32,7 +32,7 @@ |
222 | Try again |
223 | </a> |
224 | <script type="text/javascript"> |
225 | - YUI().use('event', 'node', function(Y) { |
226 | + LPS.use('event', 'node', function(Y) { |
227 | Y.on("domready", function () { Y.one('#tryagainlink').setStyle('display', 'inline') }); |
228 | }); |
229 | </script> |
230 | |
231 | === modified file 'lib/lp/code/templates/branch-index.pt' |
232 | --- lib/lp/code/templates/branch-index.pt 2009-11-17 05:07:41 +0000 |
233 | +++ lib/lp/code/templates/branch-index.pt 2009-12-07 15:21:15 +0000 |
234 | @@ -47,7 +47,7 @@ |
235 | </tal:devmode> |
236 | <script type="text/javascript" |
237 | tal:content="string: |
238 | - YUI().use('node', 'event', 'widget', 'plugin', 'overlay', |
239 | + LPS.use('node', 'event', 'widget', 'plugin', 'overlay', |
240 | 'lazr.choiceedit', 'code.branchstatus', |
241 | 'code.branchmergeproposal.popupdiff', |
242 | function(Y) { |
243 | |
244 | === modified file 'lib/lp/code/templates/branch-listing.pt' |
245 | --- lib/lp/code/templates/branch-listing.pt 2009-11-04 13:56:17 +0000 |
246 | +++ lib/lp/code/templates/branch-listing.pt 2009-12-07 15:21:15 +0000 |
247 | @@ -41,7 +41,7 @@ |
248 | } |
249 | registerLaunchpadFunction(hookUpFilterSubmission); |
250 | |
251 | -YUI().use('io-base', 'node', 'json-parse', function(Y) { |
252 | +LPS.use('io-base', 'node', 'json-parse', function(Y) { |
253 | |
254 | function doUpdate(transaction_id, response, args) { |
255 | json_values = Y.JSON.parse(response.responseText); |
256 | |
257 | === modified file 'lib/lp/code/templates/branch-portlet-subscribers.pt' |
258 | --- lib/lp/code/templates/branch-portlet-subscribers.pt 2009-11-04 13:56:17 +0000 |
259 | +++ lib/lp/code/templates/branch-portlet-subscribers.pt 2009-12-07 15:21:15 +0000 |
260 | @@ -41,7 +41,7 @@ |
261 | string:<script id='milestone-script' type='text/javascript'>" /> |
262 | <!-- |
263 | |
264 | - YUI().use('io-base', 'node', 'code.branchsubscription', function(Y) { |
265 | + LPS.use('io-base', 'node', 'code.branchsubscription', function(Y) { |
266 | |
267 | if(Y.UA.ie) { |
268 | Y.one('#subscriber-list').set('innerHTML', |
269 | |
270 | === modified file 'lib/lp/code/templates/branch-related-bugs-specs.pt' |
271 | --- lib/lp/code/templates/branch-related-bugs-specs.pt 2009-09-08 21:42:45 +0000 |
272 | +++ lib/lp/code/templates/branch-related-bugs-specs.pt 2009-12-07 15:21:15 +0000 |
273 | @@ -42,7 +42,7 @@ |
274 | string:<script id='branchlink-script' type='text/javascript'>" /> |
275 | <!-- |
276 | |
277 | - YUI().use('io-base', 'code.branchlinks', function(Y) { |
278 | + LPS.use('io-base', 'code.branchlinks', function(Y) { |
279 | |
280 | if(Y.UA.ie) { |
281 | return; |
282 | |
283 | === modified file 'lib/lp/code/templates/branchmergeproposal-generic-listing.pt' |
284 | --- lib/lp/code/templates/branchmergeproposal-generic-listing.pt 2009-11-04 13:56:17 +0000 |
285 | +++ lib/lp/code/templates/branchmergeproposal-generic-listing.pt 2009-12-07 15:21:15 +0000 |
286 | @@ -24,7 +24,7 @@ |
287 | </form> |
288 | <script type="text/javascript"> |
289 | |
290 | -YUI().use('node', function(Y) { |
291 | +LPS.use('node', function(Y) { |
292 | |
293 | function submit_filter() { |
294 | Y.one('#filter_form').submit(); |
295 | |
296 | === modified file 'lib/lp/registry/templates/object-timeline-graph.pt' |
297 | --- lib/lp/registry/templates/object-timeline-graph.pt 2009-11-24 09:30:01 +0000 |
298 | +++ lib/lp/registry/templates/object-timeline-graph.pt 2009-12-07 15:21:15 +0000 |
299 | @@ -32,7 +32,7 @@ |
300 | include_inactive = false; |
301 | } |
302 | |
303 | - YUI().use('registry.timeline', 'node', function(Y) { |
304 | + LPS.use('registry.timeline', 'node', function(Y) { |
305 | Y.on('domready', function(e) { |
306 | if (Y.UA.ie) { |
307 | return; |
308 | |
309 | === modified file 'lib/lp/registry/templates/person-macros.pt' |
310 | --- lib/lp/registry/templates/person-macros.pt 2009-11-04 13:56:17 +0000 |
311 | +++ lib/lp/registry/templates/person-macros.pt 2009-12-07 15:21:15 +0000 |
312 | @@ -190,7 +190,7 @@ |
313 | condition="private_prefix"> |
314 | <script type="text/javascript" |
315 | tal:content="string: |
316 | - YUI().use('node', 'event', function(Y) { |
317 | + LPS.use('node', 'event', function(Y) { |
318 | // Prepend/remove 'private-' from team name based on visibility |
319 | // setting. User can choose to edit it back out, if they wish. |
320 | function visibility_on_change(e) { |
321 | |
322 | === modified file 'lib/lp/registry/templates/product-new.pt' |
323 | --- lib/lp/registry/templates/product-new.pt 2009-11-04 13:56:17 +0000 |
324 | +++ lib/lp/registry/templates/product-new.pt 2009-12-07 15:21:15 +0000 |
325 | @@ -14,7 +14,7 @@ |
326 | * details widgets until the user states that the project they are |
327 | * registering is not a duplicate. |
328 | */ |
329 | -YUI().use('node', 'lazr.effects', function(Y) { |
330 | +LPS.use('node', 'lazr.effects', function(Y) { |
331 | Y.on('domready', function() { |
332 | /* These two regexps serve slightly different purposes. The first |
333 | * finds the leftmost run of valid url characters for the autofill |
334 | |
335 | === modified file 'lib/lp/registry/templates/productrelease-add-from-series.pt' |
336 | --- lib/lp/registry/templates/productrelease-add-from-series.pt 2009-11-04 13:56:17 +0000 |
337 | +++ lib/lp/registry/templates/productrelease-add-from-series.pt 2009-12-07 15:21:15 +0000 |
338 | @@ -14,7 +14,7 @@ |
339 | <tal:script |
340 | replace="structure |
341 | string:<script id='milestone-script' type='text/javascript'>" /> |
342 | - YUI().use('node', 'lp.milestoneoverlay', function (Y) { |
343 | + LPS.use('node', 'lp.milestoneoverlay', function (Y) { |
344 | |
345 | // This is a value for the SELECT OPTION which is passed with |
346 | // the SELECT's "change" event. It includes some symbols that are not |
347 | |
348 | === modified file 'lib/lp/registry/templates/teammembership-index.pt' |
349 | --- lib/lp/registry/templates/teammembership-index.pt 2009-11-04 13:56:17 +0000 |
350 | +++ lib/lp/registry/templates/teammembership-index.pt 2009-12-07 15:21:15 +0000 |
351 | @@ -20,7 +20,7 @@ |
352 | use-macro="context/@@launchpad_widget_macros/yui2calendar-dependencies" /> |
353 | |
354 | <script type="text/javascript"> |
355 | - YUI().use('node', 'lp.calendar', function(Y) { |
356 | + LPS.use('node', 'lp.calendar', function(Y) { |
357 | // Ensure that when the picker is used the radio button switches |
358 | // from 'Never' to 'On' and the expiry field is enabled. |
359 | Y.on("available", function(e) { |
360 | |
361 | === modified file 'lib/lp/registry/templates/timeline-macros.pt' |
362 | --- lib/lp/registry/templates/timeline-macros.pt 2009-11-04 13:56:17 +0000 |
363 | +++ lib/lp/registry/templates/timeline-macros.pt 2009-12-07 15:21:15 +0000 |
364 | @@ -35,7 +35,7 @@ |
365 | if (auto_resize == 'true') { |
366 | timeline_url += 'resize_frame=timeline-iframe&'; |
367 | } |
368 | - YUI().use('node', function(Y) { |
369 | + LPS.use('node', function(Y) { |
370 | if (Y.UA.ie) { |
371 | return; |
372 | } |
373 | |
374 | === modified file 'lib/lp/soyuz/doc/publishing.txt' |
375 | --- lib/lp/soyuz/doc/publishing.txt 2009-11-18 23:56:26 +0000 |
376 | +++ lib/lp/soyuz/doc/publishing.txt 2009-12-07 15:21:15 +0000 |
377 | @@ -1055,22 +1055,61 @@ |
378 | each build found. |
379 | |
380 | >>> cprov_builds.count() |
381 | - 7 |
382 | + 8 |
383 | |
384 | The `ResultSet` is ordered by ascending |
385 | `SourcePackagePublishingHistory.id` and ascending |
386 | `DistroArchseries.architecturetag` in this order. |
387 | |
388 | - >>> source_pub, build, arch = cprov_builds.last() |
389 | - |
390 | - >>> print source_pub.displayname |
391 | - foo 666 in breezy-autotest |
392 | - |
393 | - >>> print build.title |
394 | - i386 build of foo 666 in ubuntutest breezy-autotest RELEASE |
395 | - |
396 | - >>> print arch.displayname |
397 | - ubuntutest Breezy Badger Autotest i386 |
398 | + # The easiest thing we can do here (without printing ids) |
399 | + # is to show that sorting a list of the resulting ids+tags does not |
400 | + # modify the list. |
401 | + >>> ids_and_tags = [(pub.id, arch.architecturetag) |
402 | + ... for pub, build, arch in cprov_builds] |
403 | + >>> ids_and_tags == sorted(ids_and_tags) |
404 | + True |
405 | + |
406 | +If a source package is copied from another archive (including the |
407 | +binaries), then the related builds for that source package will |
408 | +also be retrievable via the copied source publication. |
409 | +For example, if a package is built in a private security PPA, and then |
410 | +later copied out into the primary archive, the builds will then |
411 | +be available when looking at the copied source package in the primary |
412 | +archive. |
413 | + |
414 | + # Create a new PPA and publish a source with some builds |
415 | + # and binaries. |
416 | + >>> other_ppa = factory.makeArchive(name="otherppa") |
417 | + >>> binaries = test_publisher.getPubBinaries(archive=other_ppa) |
418 | + |
419 | +The associated builds and binaries will be created in the context of the |
420 | +other PPA. |
421 | + |
422 | + >>> build = binaries[0].binarypackagerelease.build |
423 | + >>> source_pub = build.sourcepackagerelease.publishings[0] |
424 | + >>> print build.archive.name |
425 | + otherppa |
426 | + |
427 | + # Copy the source into Celso's PPA, ensuring that the binaries |
428 | + # are also published there. |
429 | + >>> source_pub_cprov = source_pub.copyTo( |
430 | + ... source_pub.distroseries, source_pub.pocket, |
431 | + ... cprov.archive) |
432 | + >>> binaries_cprov = test_publisher.publishBinaryInArchive( |
433 | + ... binaries[0].binarypackagerelease, cprov.archive) |
434 | + |
435 | +Now we will see an extra source in Celso's PPA as well as an extra |
436 | +build - even though the build's context is not Celso's PPA. Previously |
437 | +there were 8 sources and builds. |
438 | + |
439 | + >>> cprov_sources_new = cprov.archive.getPublishedSources() |
440 | + >>> cprov_sources_new.count() |
441 | + 9 |
442 | + |
443 | + >>> cprov_builds_new = publishing_set.getBuildsForSources( |
444 | + ... cprov_sources_new) |
445 | + >>> cprov_builds_new.count() |
446 | + 9 |
447 | |
448 | Next we'll create two sources with two builds each (the SoyuzTestPublisher |
449 | default) and show that the number of unpublished builds for these sources |
450 | |
451 | === modified file 'lib/lp/soyuz/model/publishing.py' |
452 | --- lib/lp/soyuz/model/publishing.py 2009-11-19 00:26:13 +0000 |
453 | +++ lib/lp/soyuz/model/publishing.py 2009-12-07 15:21:15 +0000 |
454 | @@ -40,6 +40,7 @@ |
455 | from canonical.database.enumcol import EnumCol |
456 | from lp.registry.interfaces.pocket import PackagePublishingPocket |
457 | from lp.soyuz.model.binarypackagename import BinaryPackageName |
458 | +from lp.soyuz.model.binarypackagerelease import BinaryPackageRelease |
459 | from lp.soyuz.model.files import ( |
460 | BinaryPackageFile, SourcePackageReleaseFile) |
461 | from canonical.launchpad.database.librarian import ( |
462 | @@ -593,7 +594,7 @@ |
463 | # not blow up because of bad data. |
464 | return None |
465 | source, packageupload, spr, changesfile, lfc = result |
466 | - |
467 | + |
468 | # Return a webapp-proxied LibraryFileAlias so that restricted |
469 | # librarian files are accessible. Non-restricted files will get |
470 | # a 302 so that webapp threads are not tied up. |
471 | @@ -1262,23 +1263,64 @@ |
472 | Build.buildstate.is_in(build_states)) |
473 | |
474 | store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
475 | - result_set = store.find( |
476 | - (SourcePackagePublishingHistory, Build, DistroArchSeries), |
477 | + |
478 | + # We'll be looking for builds in the same distroseries as the |
479 | + # SPPH for the same release. |
480 | + builds_for_distroseries_expr = ( |
481 | Build.distroarchseriesID == DistroArchSeries.id, |
482 | + SourcePackagePublishingHistory.distroseriesID == |
483 | + DistroArchSeries.distroseriesID, |
484 | + SourcePackagePublishingHistory.sourcepackagereleaseID == |
485 | + Build.sourcepackagereleaseID, |
486 | + In(SourcePackagePublishingHistory.id, source_publication_ids) |
487 | + ) |
488 | + |
489 | + # First, we'll find the builds that were built in the same |
490 | + # archive context as the published sources. |
491 | + builds_in_same_archive = store.find( |
492 | + Build, |
493 | + builds_for_distroseries_expr, |
494 | SourcePackagePublishingHistory.archiveID == Build.archiveID, |
495 | - SourcePackagePublishingHistory.distroseriesID == |
496 | - DistroArchSeries.distroseriesID, |
497 | - SourcePackagePublishingHistory.sourcepackagereleaseID == |
498 | - Build.sourcepackagereleaseID, |
499 | - In(SourcePackagePublishingHistory.id, source_publication_ids), |
500 | - *extra_exprs) |
501 | - |
502 | - result_set.order_by( |
503 | + *extra_exprs) |
504 | + |
505 | + # Next get all the builds that have a binary published in the |
506 | + # same archive... even though the build was not built in |
507 | + # the same context archive. |
508 | + builds_copied_into_archive = store.find( |
509 | + Build, |
510 | + builds_for_distroseries_expr, |
511 | + SourcePackagePublishingHistory.archiveID != Build.archiveID, |
512 | + BinaryPackagePublishingHistory.archive == |
513 | + SourcePackagePublishingHistory.archiveID, |
514 | + BinaryPackagePublishingHistory.binarypackagerelease == |
515 | + BinaryPackageRelease.id, |
516 | + BinaryPackageRelease.build == Build.id, |
517 | + *extra_exprs) |
518 | + |
519 | + builds_union = builds_copied_into_archive.union( |
520 | + builds_in_same_archive).config(distinct=True) |
521 | + |
522 | + # Now that we have a result_set of all the builds, we'll use it |
523 | + # as a subquery to get the required publishing and arch to do |
524 | + # the ordering. We do this in this round-about way because we |
525 | + # can't sort on SourcePackagePublishingHistory.id after the |
526 | + # union. See bug 443353 for details. |
527 | + find_spec = ( |
528 | + SourcePackagePublishingHistory, Build, DistroArchSeries) |
529 | + |
530 | + # Storm doesn't let us do builds_union.values('id') - |
531 | + # ('Union' object has no attribute 'columns'). So instead |
532 | + # we have to instantiate the objects just to get the id. |
533 | + build_ids = [build.id for build in builds_union] |
534 | + |
535 | + result_set = store.find( |
536 | + find_spec, builds_for_distroseries_expr, |
537 | + Build.id.is_in(build_ids)) |
538 | + |
539 | + return result_set.order_by( |
540 | SourcePackagePublishingHistory.id, |
541 | DistroArchSeries.architecturetag) |
542 | |
543 | - return result_set |
544 | - |
545 | def getByIdAndArchive(self, id, archive, source=True): |
546 | """See `IPublishingSet`.""" |
547 | if source: |
548 | @@ -1317,12 +1359,10 @@ |
549 | def _getSourceBinaryJoinForSources(self, source_publication_ids, |
550 | active_binaries_only=True): |
551 | """Return the join linking sources with binaries.""" |
552 | - # Import Build, BinaryPackageRelease and DistroArchSeries locally |
553 | + # Import Build and DistroArchSeries locally |
554 | # to avoid circular imports, since Build uses |
555 | # SourcePackagePublishingHistory, BinaryPackageRelease uses Build |
556 | # and DistroArchSeries uses BinaryPackagePublishingHistory. |
557 | - from lp.soyuz.model.binarypackagerelease import ( |
558 | - BinaryPackageRelease) |
559 | from lp.soyuz.model.build import Build |
560 | from lp.soyuz.model.distroarchseries import ( |
561 | DistroArchSeries) |
562 | @@ -1397,12 +1437,8 @@ |
563 | |
564 | def getBinaryFilesForSources(self, one_or_more_source_publications): |
565 | """See `IPublishingSet`.""" |
566 | - # Import Build and BinaryPackageRelease locally to avoid circular |
567 | - # imports, since that Build already imports |
568 | - # SourcePackagePublishingHistory and BinaryPackageRelease imports |
569 | - # Build. |
570 | - from lp.soyuz.model.binarypackagerelease import ( |
571 | - BinaryPackageRelease) |
572 | + # Import Build locally to avoid circular imports, since that |
573 | + # Build already imports SourcePackagePublishingHistory. |
574 | from lp.soyuz.model.build import Build |
575 | |
576 | source_publication_ids = self._extractIDs( |
577 | |
578 | === modified file 'lib/lp/soyuz/scripts/tests/test_copypackage.py' |
579 | --- lib/lp/soyuz/scripts/tests/test_copypackage.py 2009-11-17 21:38:28 +0000 |
580 | +++ lib/lp/soyuz/scripts/tests/test_copypackage.py 2009-12-07 15:21:15 +0000 |
581 | @@ -1368,6 +1368,21 @@ |
582 | target_archive = copy_helper.destination.archive |
583 | self.checkCopies(copied, target_archive, 3) |
584 | |
585 | + # The second copy will fail explicitly because the new BPPH |
586 | + # records are not yet published. |
587 | + nothing_copied = copy_helper.mainTask() |
588 | + self.assertEqual(len(nothing_copied), 0) |
589 | + self.assertEqual( |
590 | + copy_helper.logger.buffer.getvalue().splitlines()[-1], |
591 | + 'ERROR: foo 666 in hoary (same version has unpublished binaries ' |
592 | + 'in the destination archive for Hoary, please wait for them to ' |
593 | + 'be published before copying)') |
594 | + |
595 | + # If we ensure that the copied binaries are published, the |
596 | + # copy won't fail but will simply not copy anything. |
597 | + for bin_pub in copied[1:3]: |
598 | + bin_pub.secure_record.setPublished() |
599 | + |
600 | nothing_copied = copy_helper.mainTask() |
601 | self.assertEqual(len(nothing_copied), 0) |
602 | self.assertEqual( |
603 | @@ -1504,7 +1519,7 @@ |
604 | name='boing') |
605 | self.assertEqual(copied_source.displayname, 'boing 1.0 in hoary') |
606 | self.assertEqual(len(copied_source.getPublishedBinaries()), 2) |
607 | - self.assertEqual(len(copied_source.getBuilds()), 0) |
608 | + self.assertEqual(len(copied_source.getBuilds()), 1) |
609 | |
610 | def _setupArchitectureGrowingScenario(self, architecturehintlist="all"): |
611 | """Prepare distroseries with different sets of architectures. |
612 | |
613 | === modified file 'lib/lp/soyuz/stories/ppa/xx-copy-packages.txt' |
614 | --- lib/lp/soyuz/stories/ppa/xx-copy-packages.txt 2009-10-13 10:05:58 +0000 |
615 | +++ lib/lp/soyuz/stories/ppa/xx-copy-packages.txt 2009-12-07 15:21:15 +0000 |
616 | @@ -1062,7 +1062,7 @@ |
617 | >>> print_ppa_packages(jblack_browser.contents) |
618 | Source Published Status Series Section Build |
619 | Status |
620 | - foo - 2.0 (changesfile) Pending Hoary Base |
621 | + foo - 2.0 (changesfile) Pending Hoary Base i386 |
622 | foo - 1.1 (changesfile) Pending Warty Base |
623 | pmount - 0.1-1 Pending Hoary Editors |
624 | pmount - 0.1-1 Pending Warty Editors |
625 | |
626 | === modified file 'lib/lp/soyuz/stories/ppa/xx-ppa-packages.txt' |
627 | --- lib/lp/soyuz/stories/ppa/xx-ppa-packages.txt 2009-11-05 10:51:36 +0000 |
628 | +++ lib/lp/soyuz/stories/ppa/xx-ppa-packages.txt 2009-12-07 15:21:15 +0000 |
629 | @@ -129,29 +129,51 @@ |
630 | If a the binaries for a package are fully built, but have not yet been |
631 | published, this will be indicated to the viewer: |
632 | |
633 | - >>> anon_browser.open( |
634 | - ... "http://launchpad.dev/~cprov/+archive/ppa/+packages") |
635 | - >>> expander_url = anon_browser.getLink(id='pub28-expander').url |
636 | + # First, we'll update the binary publishing history for the i386 |
637 | + # record so that it is pending publication. |
638 | + >>> login('foo.bar@canonical.com') |
639 | + >>> from zope.component import getUtility |
640 | + >>> from lp.registry.interfaces.person import IPersonSet |
641 | + >>> cprov_ppa = getUtility(IPersonSet).getByName('cprov').archive |
642 | + >>> pmount_i386_pub = cprov_ppa.getAllPublishedBinaries( |
643 | + ... name='pmount', version='0.1-1')[1] |
644 | + >>> print pmount_i386_pub.displayname |
645 | + pmount 0.1-1 in warty i386 |
646 | + >>> from lp.soyuz.interfaces.publishing import PackagePublishingStatus |
647 | + >>> pmount_i386_pub.secure_record.status = PackagePublishingStatus.PENDING |
648 | + >>> pmount_i386_pub.secure_record.datepublished = None |
649 | + >>> transaction.commit() |
650 | + >>> logout() |
651 | + |
652 | + # Now, to re-display the pmount expanded section: |
653 | >>> anon_browser.open(expander_url) |
654 | >>> print extract_text(anon_browser.contents) |
655 | Note: Some binary packages for this source are not yet published in the |
656 | repository. |
657 | Publishing details |
658 | Published on 2007-07-09 |
659 | - Copied from ubuntu warty in PPA for Mark Shuttleworth |
660 | + Copied from ubuntu hoary in Primary Archive for Ubuntu Linux |
661 | Changelog |
662 | + pmount (0.1-1) hoary; urgency=low |
663 | + * Fix description (Malone #1) |
664 | + * Fix debian (Debian #2000) |
665 | + * Fix warty (Warty Ubuntu #1) |
666 | + -- Sample Person... |
667 | Builds |
668 | i386 - Pending publication |
669 | Built packages |
670 | - mozilla-firefox ff from iceweasel |
671 | + pmount |
672 | + pmount shortdesc |
673 | Package files |
674 | - firefox_0.9.2.orig.tar.gz (9.5 MiB) |
675 | - iceweasel-1.0.dsc (123 bytes) |
676 | - mozilla-firefox_0.9_i386.deb (3 bytes) |
677 | + No files published for this package. |
678 | |
679 | -The package was copied from a PPA. The archive title will hence link |
680 | +When the package is copied from a PPA, the archive title will link |
681 | back to the source PPA. |
682 | |
683 | + >>> anon_browser.open( |
684 | + ... "http://launchpad.dev/~cprov/+archive/ppa/+packages") |
685 | + >>> expander_url = anon_browser.getLink(id='pub28-expander').url |
686 | + >>> anon_browser.open(expander_url) |
687 | >>> anon_browser.getLink("PPA for Mark Shuttleworth").url |
688 | 'http://launchpad.dev/~mark/+archive/ppa' |
689 | |
690 | @@ -164,7 +186,7 @@ |
691 | >>> admin_browser.getControl(name="field.buildd_secret").value = "secret" |
692 | >>> admin_browser.getControl("Save").click() |
693 | |
694 | - >>> anon_browser.open("http://launchpad.dev/~cprov/+archive/ppa") |
695 | + >>> anon_browser.open(expander_url) |
696 | >>> anon_browser.getLink("PPA for Mark Shuttleworth") |
697 | Traceback (most recent call last): |
698 | ... |
699 | |
700 | === modified file 'lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt' |
701 | --- lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt 2009-11-18 23:56:26 +0000 |
702 | +++ lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt 2009-12-07 15:21:15 +0000 |
703 | @@ -207,18 +207,20 @@ |
704 | ====================== |
705 | |
706 | The source publication object has a custom operation called 'getBuilds' and |
707 | -it returns the build records in the context of that publication. |
708 | +it returns the build records for builds that were built in the same context |
709 | +archive as the publication, or builds from other archives whose |
710 | +binaries have been copied and published into the same context archive. |
711 | |
712 | >>> pubs = webservice.named_get( |
713 | ... cprov_archive['self_link'], 'getPublishedSources', |
714 | - ... source_name="iceweasel", version="1.0", |
715 | + ... source_name="pmount", version="0.1-1", |
716 | ... exact_match=True).jsonBody() |
717 | >>> source_pub = pubs['entries'][0] |
718 | >>> builds = webservice.named_get( |
719 | ... source_pub['self_link'], 'getBuilds').jsonBody() |
720 | >>> for entry in sorted(builds['entries']): |
721 | ... print entry['title'] |
722 | - i386 build of iceweasel 1.0 in ubuntu warty RELEASE |
723 | + i386 build of pmount 0.1-1 in ubuntu warty RELEASE |
724 | |
725 | |
726 | Finding related Binary publications |
727 | |
728 | === modified file 'lib/lp/soyuz/templates/archive-edit-dependencies.pt' |
729 | --- lib/lp/soyuz/templates/archive-edit-dependencies.pt 2009-11-12 17:26:17 +0000 |
730 | +++ lib/lp/soyuz/templates/archive-edit-dependencies.pt 2009-12-07 15:21:15 +0000 |
731 | @@ -62,7 +62,7 @@ |
732 | </div> <!-- launchpad_form --> |
733 | |
734 | <script type="text/javascript"> |
735 | - YUI().use("node", function(Y) { |
736 | + LPS.use("node", function(Y) { |
737 | |
738 | // Highlight (setting bold font-weight) the label for the |
739 | // selected option in a given NodesList. Assumes the input is |
740 | |
741 | === modified file 'lib/lp/soyuz/templates/archive-macros.pt' |
742 | --- lib/lp/soyuz/templates/archive-macros.pt 2009-11-04 19:59:16 +0000 |
743 | +++ lib/lp/soyuz/templates/archive-macros.pt 2009-12-07 15:21:15 +0000 |
744 | @@ -10,7 +10,7 @@ |
745 | </tal:comment> |
746 | |
747 | <script type="text/javascript"> |
748 | -YUI().use('node', 'io-base', 'lazr.anim', 'soyuz-base', function(Y) { |
749 | +LPS.use('node', 'io-base', 'lazr.anim', 'soyuz-base', function(Y) { |
750 | |
751 | |
752 | /* |
753 | |
754 | === modified file 'lib/lp/soyuz/templates/archive-packages.pt' |
755 | --- lib/lp/soyuz/templates/archive-packages.pt 2009-11-04 13:56:17 +0000 |
756 | +++ lib/lp/soyuz/templates/archive-packages.pt 2009-12-07 15:21:15 +0000 |
757 | @@ -23,7 +23,7 @@ |
758 | </tal:devmode> |
759 | <script type="text/javascript" id="repository-size-update" |
760 | tal:condition="view/archive_url"> |
761 | -YUI().use('io-base', 'lazr.anim', 'node', 'soyuz-base', |
762 | +LPS.use('io-base', 'lazr.anim', 'node', 'soyuz-base', |
763 | 'soyuz.update_archive_build_statuses', function(Y) { |
764 | |
765 | |
766 | |
767 | === modified file 'lib/lp/soyuz/templates/archive-subscribers.pt' |
768 | --- lib/lp/soyuz/templates/archive-subscribers.pt 2009-09-29 07:21:40 +0000 |
769 | +++ lib/lp/soyuz/templates/archive-subscribers.pt 2009-12-07 15:21:15 +0000 |
770 | @@ -98,7 +98,7 @@ |
771 | </form> |
772 | </div><!-- class="portlet" --> |
773 | <script type="text/javascript" id="setup-archivesubscribers-index"> |
774 | - YUI().use('soyuz.archivesubscribers_index', function(Y) { |
775 | + LPS.use('soyuz.archivesubscribers_index', function(Y) { |
776 | Y.soyuz.setup_archivesubscribers_index(); |
777 | }); |
778 | </script> |
779 | |
780 | === modified file 'lib/lp/translations/browser/language.py' |
781 | --- lib/lp/translations/browser/language.py 2009-10-31 11:06:44 +0000 |
782 | +++ lib/lp/translations/browser/language.py 2009-12-07 15:21:15 +0000 |
783 | @@ -29,6 +29,7 @@ |
784 | enabled_with_permission, GetitemNavigation, LaunchpadEditFormView, |
785 | LaunchpadFormView, LaunchpadView, Link, NavigationMenu) |
786 | from lp.translations.utilities.pluralforms import make_friendly_plural_forms |
787 | +from canonical.launchpad.interfaces.launchpad import ILaunchpadCelebrities |
788 | |
789 | from canonical.widgets import LabeledMultiCheckBoxWidget |
790 | |
791 | @@ -202,6 +203,13 @@ |
792 | |
793 | return pluralforms_list |
794 | |
795 | + @property |
796 | + def add_question_url(self): |
797 | + rosetta = getUtility(ILaunchpadCelebrities).lp_translations |
798 | + return canonical_url( |
799 | + rosetta, |
800 | + view_name='+addquestion', |
801 | + rootsite='answers') |
802 | |
803 | class LanguageAdminView(LaunchpadEditFormView): |
804 | """Handle an admin form submission.""" |
805 | |
806 | === modified file 'lib/lp/translations/stories/distroseries/xx-distroseries-templates.txt' |
807 | --- lib/lp/translations/stories/distroseries/xx-distroseries-templates.txt 2009-10-30 10:09:17 +0000 |
808 | +++ lib/lp/translations/stories/distroseries/xx-distroseries-templates.txt 2009-12-07 15:21:15 +0000 |
809 | @@ -1,11 +1,15 @@ |
810 | -= Templates view for DistroSeries = |
811 | + |
812 | + |
813 | +Templates view for DistroSeries |
814 | +=============================== |
815 | |
816 | The +templates view for DistroSeries gives an overview of the translation |
817 | templates in this series and provides easy access to the various subpages of |
818 | each template. |
819 | |
820 | |
821 | -== Getting there == |
822 | +Getting there |
823 | +------------- |
824 | |
825 | To get to the listing of all templates, one needs to use the link |
826 | from the distribution series translations page. |
827 | @@ -16,7 +20,45 @@ |
828 | >>> print user_browser.url |
829 | http://translations.launchpad.dev/ubuntu/hoary/+templates |
830 | |
831 | -== The templates table == |
832 | +The templates table |
833 | +------------------- |
834 | + |
835 | +Full template listing for a distribution series is reached by following |
836 | +a link from the distribution series translations page. |
837 | + |
838 | + >>> anon_browser.open( |
839 | + ... 'http://translations.launchpad.dev/ubuntu/hoary') |
840 | + >>> anon_browser.getLink('full list of templates').click() |
841 | + |
842 | +Full listing of templates shows source package name, template name and |
843 | +the date of last update for this distribution series. |
844 | + |
845 | + >>> table = find_tag_by_id(anon_browser.contents, 'templates_table') |
846 | + >>> print extract_text(table) |
847 | + Source package Template name Last update |
848 | + evolution disabled-template 2007-01-05 |
849 | + evolution evolution-2.2 2005-05-06 |
850 | + evolution man 2006-08-14 |
851 | + mozilla pkgconf-mozilla 2005-05-06 |
852 | + pmount man 2006-08-14 |
853 | + pmount pmount 2005-05-06 |
854 | + |
855 | + |
856 | +Logged-in users will see the same link from the distro series page: |
857 | + >>> user_browser.open( |
858 | + ... 'http://translations.launchpad.dev/ubuntu/hoary') |
859 | + >>> user_browser.getLink('full list of templates').click() |
860 | + |
861 | +Logged-in users can also choose to download all translations for each |
862 | +of the templates. |
863 | + |
864 | + >>> table = find_tag_by_id(user_browser.contents, 'templates_table') |
865 | + >>> print extract_text(table) |
866 | + Source package Template name Last update Actions |
867 | + evolution disabled-template 2007-01-05 Download |
868 | + ... |
869 | + mozilla pkgconf-mozilla 2005-05-06 Download |
870 | + ... |
871 | |
872 | Administrator can see all editing options. |
873 | |
874 | @@ -28,16 +70,17 @@ |
875 | |
876 | >>> table = find_tag_by_id(admin_browser.contents, 'templates_table') |
877 | >>> print extract_text(table) |
878 | - Source package Template name Actions |
879 | - evolution disabled-template Edit Upload Download Administer |
880 | - evolution evolution-2.2 Edit Upload Download Administer |
881 | - evolution man Edit Upload Download Administer |
882 | - mozilla pkgconf-mozilla Edit Upload Download Administer |
883 | - pmount man Edit Upload Download Administer |
884 | - pmount pmount Edit Upload Download Administer |
885 | - |
886 | - |
887 | -== Links to the templates == |
888 | + Source package Template name Last update Actions |
889 | + evolution disabled-template 2007-01-05 Edit Upload Download Administer |
890 | + evolution evolution-2.2 2005-05-06 Edit Upload Download Administer |
891 | + evolution man 2006-08-14 Edit Upload Download Administer |
892 | + mozilla pkgconf-mozilla 2005-05-06 Edit Upload Download Administer |
893 | + pmount man 2006-08-14 Edit Upload Download Administer |
894 | + pmount pmount 2005-05-06 Edit Upload Download Administer |
895 | + |
896 | + |
897 | +Links to the templates |
898 | +---------------------- |
899 | |
900 | Clicking on a template name will take the user to that template's overview |
901 | page. |
902 | |
903 | === modified file 'lib/lp/translations/stories/productseries/xx-productseries-templates.txt' |
904 | --- lib/lp/translations/stories/productseries/xx-productseries-templates.txt 2009-10-30 10:09:17 +0000 |
905 | +++ lib/lp/translations/stories/productseries/xx-productseries-templates.txt 2009-12-07 15:21:15 +0000 |
906 | @@ -1,13 +1,18 @@ |
907 | -= Templates view for ProductSeries = |
908 | + |
909 | + |
910 | +Templates view for ProductSeries |
911 | +================================ |
912 | |
913 | The +templates view for ProductSeries gives an overview of the translation |
914 | templates in this series and provides easy access to the various subpages of |
915 | each template. |
916 | |
917 | -== Preparation == |
918 | - |
919 | -To test the ordering of templates in the listing, we need another template |
920 | -that is new but must appear at the top of the list. |
921 | + |
922 | +Preparation |
923 | +----------- |
924 | + |
925 | +To test the ordering of templates in the listing, we need another |
926 | +template that is new but must appear at the top of the list. |
927 | |
928 | >>> login('foo.bar@canonical.com') |
929 | >>> from zope.component import getUtility |
930 | @@ -18,7 +23,9 @@ |
931 | ... name='at-the-top') |
932 | >>> logout() |
933 | |
934 | -== Getting there == |
935 | + |
936 | +Getting there |
937 | +------------- |
938 | |
939 | To get to the listing of all templates, one needs to use the link |
940 | from the product series translations page. |
941 | @@ -30,16 +37,17 @@ |
942 | http://translations.launchpad.dev/evolution/trunk/+templates |
943 | |
944 | |
945 | -== The templates table == |
946 | +The templates table |
947 | +------------------- |
948 | |
949 | The page shows a table of all templates and links to their subpages. |
950 | |
951 | >>> table = find_tag_by_id(user_browser.contents, 'templates_table') |
952 | >>> print extract_text(table) |
953 | - Template name Actions |
954 | - at-the-top Download |
955 | - evolution-2.2 Download |
956 | - evolution-2.2-test Download |
957 | + Template name Last update Actions |
958 | + at-the-top ... Download |
959 | + evolution-2.2 2005-08-25 Download |
960 | + evolution-2.2-test 2006-12-13 Download |
961 | |
962 | If an administrator views this page, links to the templates admin page are |
963 | shown, too. |
964 | @@ -48,13 +56,14 @@ |
965 | ... 'http://translations.launchpad.dev/evolution/trunk/+templates') |
966 | >>> table = find_tag_by_id(admin_browser.contents, 'templates_table') |
967 | >>> print extract_text(table) |
968 | - Template name Actions |
969 | - at-the-top Edit Upload Download Administer |
970 | - evolution-2.2 Edit Upload Download Administer |
971 | - evolution-2.2-test Edit Upload Download Administer |
972 | - |
973 | - |
974 | -== Links to the templates == |
975 | + Template name Last update Actions |
976 | + at-the-top ... Edit Upload Download Administer |
977 | + evolution-2.2 2005-08-25 Edit Upload Download Administer |
978 | + evolution-2.2-test 2006-12-13 Edit Upload Download Administer |
979 | + |
980 | + |
981 | +Links to the templates |
982 | +---------------------- |
983 | |
984 | Clicking on a template name will take the user to that template's overview |
985 | page. |
986 | |
987 | === modified file 'lib/lp/translations/stories/standalone/xx-language.txt' |
988 | --- lib/lp/translations/stories/standalone/xx-language.txt 2009-10-31 11:06:44 +0000 |
989 | +++ lib/lp/translations/stories/standalone/xx-language.txt 2009-12-07 15:21:15 +0000 |
990 | @@ -1,6 +1,15 @@ |
991 | + |
992 | + |
993 | +Languages view |
994 | +============== |
995 | + |
996 | Here is the tale of languages. We will see how to create, find and edit |
997 | them. |
998 | |
999 | + |
1000 | +Getting there |
1001 | +------------- |
1002 | + |
1003 | Launchpad Translations has a main page. |
1004 | |
1005 | >>> admin_browser.open('http://translations.launchpad.dev/') |
1006 | @@ -11,7 +20,12 @@ |
1007 | >>> print admin_browser.url |
1008 | http://translations.launchpad.dev/+languages |
1009 | |
1010 | -Following the link, there is a form to add new languages. |
1011 | + |
1012 | +Adding new languages |
1013 | +-------------------- |
1014 | + |
1015 | +Following the link from the translations main page, there is a form to |
1016 | +add new languages. |
1017 | |
1018 | >>> admin_browser.getLink('Add new language').click() |
1019 | >>> print admin_browser.url |
1020 | @@ -65,11 +79,16 @@ |
1021 | ... |
1022 | LinkNotFoundError |
1023 | |
1024 | - >>> user_browser.open('http://translations.launchpad.dev/+languages/+add') |
1025 | + >>> user_browser.open( |
1026 | + ... 'http://translations.launchpad.dev/+languages/+add') |
1027 | Traceback (most recent call last): |
1028 | ... |
1029 | Unauthorized:... |
1030 | |
1031 | + |
1032 | +Searching for a language |
1033 | +------------------------ |
1034 | + |
1035 | From the top languages page, anyone can find languages. |
1036 | |
1037 | >>> browser.open('http://translations.launchpad.dev/+languages') |
1038 | @@ -82,7 +101,11 @@ |
1039 | >>> print browser.url |
1040 | http://translations.launchpad.dev/+languages/+index?find=Spanish |
1041 | |
1042 | -And following one of the found languages, we can see a brief information |
1043 | + |
1044 | +Read language information |
1045 | +------------------------- |
1046 | + |
1047 | +Following one of the found languages, we can see a brief information |
1048 | about the selected language. |
1049 | |
1050 | >>> browser.getLink('Spanish').click() |
1051 | @@ -128,14 +151,50 @@ |
1052 | ...Uruguay... |
1053 | ...Venezuela... |
1054 | |
1055 | - >>> topcontributors_portlet = find_portlet(browser.contents, 'Top contributors') |
1056 | + >>> topcontributors_portlet = find_portlet( |
1057 | + ... browser.contents, 'Top contributors') |
1058 | >>> print topcontributors_portlet |
1059 | <... |
1060 | ...Carlos Perelló Marín... |
1061 | |
1062 | +Our test sample data does not know about plural forms of |
1063 | +Abkhazian and about countries where this language is spoken. |
1064 | + |
1065 | +We will see a note about missing plural forms and a link to Rosetta |
1066 | +add question page for informing Rosetta admin about the right plural |
1067 | +form. |
1068 | + |
1069 | + >>> browser.open('http://translations.launchpad.dev/+languages/ab') |
1070 | + >>> print extract_text(find_portlet(browser.contents, 'Plural forms' |
1071 | + ... ).renderContents()) |
1072 | + Plural forms |
1073 | + Unfortunately, Launchpad doesn't know the plural |
1074 | + form information for this language... |
1075 | + |
1076 | + >>> print browser.getLink(id='plural_question').url |
1077 | + http://answers.launchpad.dev/rosetta/+addquestion |
1078 | + |
1079 | +We will see a note that Launchpad does not know in which countries |
1080 | +this language is spoken and a link to add question page for informing |
1081 | +Rosetta admin about the countries where this page is officially spoken. |
1082 | + |
1083 | + >>> countries_portlet = find_portlet(browser.contents, 'Countries') |
1084 | + >>> print countries_portlet |
1085 | + <... |
1086 | + Abkhazian is not registered as being spoken in any |
1087 | + country... |
1088 | + |
1089 | + >>> print browser.getLink(id='country_question').url |
1090 | + http://answers.launchpad.dev/rosetta/+addquestion |
1091 | + |
1092 | + |
1093 | +Edit language information |
1094 | +------------------------- |
1095 | + |
1096 | Finally, there is the edit form to change language basic information. |
1097 | |
1098 | - >>> user_browser.open('http://translations.launchpad.dev/+languages/es') |
1099 | + >>> user_browser.open( |
1100 | + ... 'http://translations.launchpad.dev/+languages/es') |
1101 | >>> print user_browser.url |
1102 | http://translations.launchpad.dev/+languages/es |
1103 | |
1104 | @@ -146,7 +205,8 @@ |
1105 | ... |
1106 | LinkNotFoundError |
1107 | |
1108 | - >>> user_browser.open('http://translations.launchpad.dev/+languages/es/+admin') |
1109 | + >>> user_browser.open( |
1110 | + ... 'http://translations.launchpad.dev/+languages/es/+admin') |
1111 | Traceback (most recent call last): |
1112 | ... |
1113 | Unauthorized:... |
1114 | @@ -155,7 +215,8 @@ |
1115 | |
1116 | >>> from canonical.launchpad.testing.pages import strip_label |
1117 | |
1118 | - >>> admin_browser.open('http://translations.launchpad.dev/+languages/es') |
1119 | + >>> admin_browser.open( |
1120 | + ... 'http://translations.launchpad.dev/+languages/es') |
1121 | >>> print admin_browser.url |
1122 | http://translations.launchpad.dev/+languages/es |
1123 | |
1124 | |
1125 | === modified file 'lib/lp/translations/templates/language-index.pt' |
1126 | --- lib/lp/translations/templates/language-index.pt 2009-09-17 14:45:59 +0000 |
1127 | +++ lib/lp/translations/templates/language-index.pt 2009-12-07 15:21:15 +0000 |
1128 | @@ -43,8 +43,10 @@ |
1129 | <p class="helpwanted"> |
1130 | Unfortunately, Launchpad doesn't know the plural form |
1131 | information for this language. If you know it, please open a |
1132 | - <a href="/rosetta/+addticket">ticket</a> with that information, |
1133 | - so we can add it to Launchpad. |
1134 | + <a id='plural_question' |
1135 | + tal:attributes="href view/add_question_url" |
1136 | + >question</a> |
1137 | + with that information, so we can add it to Launchpad. |
1138 | </p> |
1139 | </tal:has_not_pluralforms> |
1140 | </div> |
1141 | @@ -124,8 +126,11 @@ |
1142 | </tal:language> |
1143 | is not registered as being spoken in any country. If you know |
1144 | about a country that officially speaks this language, please |
1145 | - open a <a href="/rosetta/+addticket">ticket</a> with that |
1146 | - information, so we can add it to Launchpad. |
1147 | + open a |
1148 | + <a id='country_question' |
1149 | + tal:attributes="href view/add_question_url" |
1150 | + >question</a> |
1151 | + with that information, so we can add it to Launchpad. |
1152 | </p> |
1153 | </tal:has_not_countries> |
1154 | </div> |
1155 | |
1156 | === modified file 'lib/lp/translations/templates/object-templates.pt' |
1157 | --- lib/lp/translations/templates/object-templates.pt 2009-11-24 19:23:52 +0000 |
1158 | +++ lib/lp/translations/templates/object-templates.pt 2009-12-07 15:21:15 +0000 |
1159 | @@ -26,16 +26,16 @@ |
1160 | </style> |
1161 | <style tal:condition="view/is_distroseries" type="text/css"> |
1162 | #templates_table { |
1163 | - width: 72em; |
1164 | + width: 79em; |
1165 | } |
1166 | </style> |
1167 | <style tal:condition="not:view/is_distroseries" type="text/css"> |
1168 | #templates_table { |
1169 | - width: 50em; |
1170 | + width: 58em; |
1171 | } |
1172 | </style> |
1173 | <script language="JavaScript" type="text/javascript"> |
1174 | - YUI().use('node-base', 'event-delegate', function(Y) { |
1175 | + LPS.use('node-base', 'event-delegate', function(Y) { |
1176 | Y.on('domready', function(e) { |
1177 | Y.all('#templates_table .template_links').addClass( |
1178 | 'inactive_links'); |
1179 | @@ -75,6 +75,7 @@ |
1180 | <th tal:condition="view/is_distroseries" |
1181 | class="sourcepackage_column">Source package</th> |
1182 | <th class="template_column">Template name</th> |
1183 | + <th class="lastupdate_column">Last update</th> |
1184 | <th class="actions_column" |
1185 | tal:condition="context/required:launchpad.AnyPerson"> |
1186 | Actions</th> |
1187 | @@ -88,6 +89,22 @@ |
1188 | </td> |
1189 | <td class="template_column"><a tal:attributes="href template/fmt:url" |
1190 | tal:content="template/name">Template name</a></td> |
1191 | + <td class="lastupdate_column"> |
1192 | + <span class="sortkey" |
1193 | + tal:condition="template/date_last_updated" |
1194 | + tal:content="template/date_last_updated/fmt:datetime"> |
1195 | + time sort key |
1196 | + </span> |
1197 | + <span class="lastupdate_column" |
1198 | + tal:condition="template/date_last_updated" |
1199 | + tal:attributes=" |
1200 | + title template/date_last_updated/fmt:datetime" |
1201 | + tal:content=" |
1202 | + template/date_last_updated/fmt:approximatedate" |
1203 | + > |
1204 | + 2009-09-23 |
1205 | + </span> |
1206 | + </td> |
1207 | <td class="actions_column" |
1208 | tal:condition="context/required:launchpad.AnyPerson"> |
1209 | <div class="template_links"> |
1210 | |
1211 | === modified file 'lib/lp/translations/templates/pofile-export.pt' |
1212 | --- lib/lp/translations/templates/pofile-export.pt 2009-11-10 21:04:19 +0000 |
1213 | +++ lib/lp/translations/templates/pofile-export.pt 2009-12-07 15:21:15 +0000 |
1214 | @@ -13,7 +13,7 @@ |
1215 | } |
1216 | </style> |
1217 | <script type="text/javascript"> |
1218 | - YUI().use('node', 'event', function(Y){ |
1219 | + LPS.use('node', 'event', function(Y){ |
1220 | Y.on('domready', function(){ |
1221 | // The pochanged option is only available for the PO format. |
1222 | var formatlist = Y.one('#div_format select'); |
1223 | |
1224 | === modified file 'lib/lp/translations/templates/pofile-translate.pt' |
1225 | --- lib/lp/translations/templates/pofile-translate.pt 2009-11-04 19:59:16 +0000 |
1226 | +++ lib/lp/translations/templates/pofile-translate.pt 2009-12-07 15:21:15 +0000 |
1227 | @@ -20,7 +20,7 @@ |
1228 | <script type="text/javascript"> |
1229 | registerLaunchpadFunction(insertAllExpansionButtons); |
1230 | |
1231 | - YUI().use('node', 'cookie', 'anim', 'lp.pofile', function(Y) { |
1232 | + LPS.use('node', 'cookie', 'anim', 'lp.pofile', function(Y) { |
1233 | |
1234 | var hide_notification = function(node) { |
1235 | var hide_anim = new Y.Anim({ |
1236 | |
1237 | === modified file 'lib/lp/translations/templates/translation-import-queue-macros.pt' |
1238 | --- lib/lp/translations/templates/translation-import-queue-macros.pt 2009-11-20 14:15:34 +0000 |
1239 | +++ lib/lp/translations/templates/translation-import-queue-macros.pt 2009-12-07 15:21:15 +0000 |
1240 | @@ -18,7 +18,7 @@ |
1241 | </script> |
1242 | |
1243 | <script type="text/javascript"> |
1244 | - YUI().use( 'translations', 'event', function(Y) { |
1245 | + LPS.use( 'translations', 'event', function(Y) { |
1246 | Y.on('domready', function(e) { |
1247 | Y.translations.initialize_import_queue_page(Y); |
1248 | }); |
1249 | |
1250 | === modified file 'lib/lp/translations/templates/translationimportqueueentry-index.pt' |
1251 | --- lib/lp/translations/templates/translationimportqueueentry-index.pt 2009-11-04 13:56:17 +0000 |
1252 | +++ lib/lp/translations/templates/translationimportqueueentry-index.pt 2009-12-07 15:21:15 +0000 |
1253 | @@ -14,7 +14,7 @@ |
1254 | } |
1255 | </style> |
1256 | <script type="text/javascript"> |
1257 | - YUI().use('node', 'lazr.anim', function(Y) { |
1258 | + LPS.use('node', 'lazr.anim', function(Y) { |
1259 | var fields = {'POT': |
1260 | ['field.name', 'field.translation_domain', |
1261 | 'field.languagepack'], |
1262 | |
1263 | === modified file 'lib/lp/translations/templates/translationmessage-translate.pt' |
1264 | --- lib/lp/translations/templates/translationmessage-translate.pt 2009-09-17 07:28:30 +0000 |
1265 | +++ lib/lp/translations/templates/translationmessage-translate.pt 2009-12-07 15:21:15 +0000 |
1266 | @@ -18,7 +18,7 @@ |
1267 | tal:define="lp_js string:${icingroot}/build" |
1268 | tal:attributes="src string:${lp_js}/translations/pofile.js"></script> |
1269 | <script type="text/javascript"> |
1270 | - YUI().use('node', 'lp.pofile', function(Y) { |
1271 | + LPS.use('node', 'lp.pofile', function(Y) { |
1272 | Y.on('domready', Y.lp.pofile.setupSuggestionDismissal); |
1273 | }); |
1274 | </script> |
Overview
========
This branch fixes bug 443353 by ensuring that getBuildsForSources() will include builds that were originally built in a different archive context, but have since had binaries copied into the source archive context.
Issues
======
There are two main issues with this branch IMO.
1. Ensuring that the result of the union is ordered correctly is hackish, and depends on a Storm implementation detail (that columns in SQL queries for a given table are ordered alphabetically).
2. Testing the correct ordering in the doctest isn't very readable. It could be better to print out the results instead, but on the other hand, I didn't want the test to depend on database ids. Previously the test simply ensured that the last item was the one expected - I've updated this to instead ensure that the complete result is sorted as expected.
Any suggestions welcome!
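For illustration, the sortedness check mentioned in issue 2 can be sketched in plain Python. This is a hypothetical standalone example (the names build_dates and is_sorted_descending are not from the branch); it only shows the idea of asserting that an entire result set is ordered, rather than checking the last item or hard-coding database ids:

```python
from datetime import datetime, timedelta

# Hypothetical sample: creation dates of builds returned by a query,
# expected to be ordered newest-first.
build_dates = [
    datetime(2009, 12, 7) - timedelta(days=n) for n in range(5)
]

def is_sorted_descending(values):
    """True if each value is >= the one that follows it."""
    return all(a >= b for a, b in zip(values, values[1:]))

print(is_sorted_descending(build_dates))  # True
```

A check like this stays stable across sample-data changes because it depends only on the relative order of the results.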
Testing/QA
==========
To test, run:
bin/test -vvt doc/publishing.txt
To QA:
Visit:
https://edge.launchpad.net/ubuntu/+source/openjdk-6
and expand the intrepid 6b12-0ubuntu6.6 security release. Currently this only displays two builds, whereas it should display all the builds listed at:
https://edge.launchpad.net/ubuntu/+source/openjdk-6/6b12-0ubuntu6.6