Merge lp:~barry/gwibber/bug-990145 into lp:gwibber
Status: Merged
Merged at revision: 1354
Proposed branch: lp:~barry/gwibber/bug-990145
Merge into: lp:gwibber
Diff against target: 697 lines (+286/-63), 16 files modified
  README (+50/-0)
  gwibber/microblog/dispatcher.py (+2/-2)
  gwibber/microblog/plugins/facebook/__init__.py (+27/-8)
  gwibber/microblog/plugins/flickr/__init__.py (+3/-6)
  gwibber/microblog/plugins/friendfeed/__init__.py (+3/-2)
  gwibber/microblog/plugins/identica/__init__.py (+7/-5)
  gwibber/microblog/plugins/qaiku/__init__.py (+2/-1)
  gwibber/microblog/plugins/statusnet/__init__.py (+7/-6)
  gwibber/microblog/plugins/twitter/__init__.py (+7/-5)
  gwibber/microblog/util/__init__.py (+3/-9)
  gwibber/microblog/util/resources.py (+2/-2)
  gwibber/time.py (+96/-0)
  gwibber/util.py (+3/-3)
  tests/plugins/test/__init__.py (+3/-5)
  tests/python/unittests/test_time.py (+53/-0)
  tests/python/utils/__init__.py (+18/-9)
To merge this branch: bzr merge lp:~barry/gwibber/bug-990145
Reviewer: Ken VanDine (Approve)
Review via email: mp+103956@code.launchpad.net
Commit message
Description of the change
It's a release goal for Quantal to ship only Python 3 on the desktop CD, so we
need to port Gwibber to Python 3. This branch doesn't do that. Instead, it
removes a dependency which we know will not be ported upstream any time soon.
I've heard from the mx project that they have no plans to port to Python 3.
This branch changes the uses of mx.DateTime to the built-in datetime module.
A 'make check' seems to pass locally, but it's entirely possible that I'm not
running the full suite correctly. I'm happy to make any additional changes
that might be necessary.
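The mechanical part of the port maps mx.DateTime calls onto the standard library. A minimal sketch of the typical replacements (illustrative only, not the branch's exact code):

```python
import time
from datetime import datetime

# mx.DateTime.now() becomes datetime.now().
now = datetime.now()

# mx.DateTime.DateTimeFrom('...') has no free-form stdlib equivalent;
# datetime.strptime() needs an explicit format string.
dt = datetime.strptime('2012-05-10T13:36:45', '%Y-%m-%dT%H:%M:%S')

# int(mx_datetime) yielded epoch seconds; with the stdlib we go through
# the time tuple and time.mktime() (which applies local-time semantics).
epoch = int(time.mktime(dt.timetuple()))
```

The need for an explicit format string is exactly what surfaces the per-service parsing problems discussed below.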
Barry Warsaw (barry) wrote:
Ken VanDine (ken-vandine) wrote:
Thanks for the branch; the review looks good and the unit tests pass. However, it does fail with Twitter; unfortunately we don't have tests that use real data from the services. It looks like it needs special handling like you did for Facebook. Here is the traceback:
Dispatcher Thread-1 : ERROR <twitter:receive> Operation failed
Dispatcher Thread-1 : DEBUG Traceback:
Traceback (most recent call last):
File "/home/
message_data = PROTOCOLS[
File "/home/
return getattr(self, opname)(**args)
File "/home/
return self._get(
File "/home/
if parse: return [getattr(self, "_%s" % parse)(m) for m in data]
File "/home/
n["time"] = util.parsetime(
File "/home/
dt = datetime.
File "/usr/lib/
(data_string, format))
ValueError: time data 'Thu May 10 17:52:38 +0000 2012' does not match format '%Y-%m-%dT%H:%M:%S'
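The failure is a format mismatch: Twitter's created_at string is not ISO 8601, so it needs its own strptime format. A sketch of the two-format situation, using the string from the traceback:

```python
from datetime import datetime

twitter_string = 'Thu May 10 17:52:38 +0000 2012'

# The ISO 8601 format from the traceback rejects Twitter's layout outright.
try:
    datetime.strptime(twitter_string, '%Y-%m-%dT%H:%M:%S')
    iso_parsed = True
except ValueError:
    iso_parsed = False

# Twitter uses weekday, month, day, time, offset, year. Dropping the
# offset (assumed UTC here) lets a Twitter-specific format parse it.
naive_string = twitter_string.replace(' +0000', '')
dt = datetime.strptime(naive_string, '%a %b %d %H:%M:%S %Y')
```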
Ken VanDine (ken-vandine) wrote:
Actually my last test didn't download any updates for facebook, so I purged all my facebook messages from the database. Now facebook refresh gives me this traceback:
Dispatcher Thread-6 : ERROR <facebook:receive> Operation failed
Dispatcher Thread-6 : DEBUG Traceback:
Traceback (most recent call last):
File "/home/
message_data = PROTOCOLS[
File "/home/
return getattr(self, opname)(**args)
File "/home/
return [self._
File "/home/
m["time"] = convert_time(data)
File "/home/
as_datetime = datetime.
File "/usr/lib/
data_
ValueError: unconverted data remains: +0000
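"unconverted data remains: +0000" means Facebook appends a UTC offset to an otherwise ISO 8601 string: the plain format consumes everything up to the offset and then chokes on the leftover. A sketch of the trim-then-parse fix (assuming, per Facebook's API docs, that the offset is always +0000):

```python
import re
from datetime import datetime

facebook_string = '2012-05-10T17:52:38+0000'

# '%Y-%m-%dT%H:%M:%S' matches everything up to the seconds, then raises
# "ValueError: unconverted data remains: +0000". Trim the trailing
# offset before parsing.
naive_string = re.sub(r'[-+]\d{4}$', '', facebook_string)
dt = datetime.strptime(naive_string, '%Y-%m-%dT%H:%M:%S')
```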
1345. By Barry Warsaw

- Add the most minimal of unittests, with instructions for running them in
  the README file.
- Add unittests for parsetime().
- Support more formats in parsetime(), e.g. timezone aware strings (but only
  for UTC, i.e. +0000), and also for alternative ISO 8601 (space instead of
  'T').
Barry Warsaw (barry) wrote:
I've pushed an update which should fix the parsing failure in the last comment. I also added the most minimal of unittests, based on our out-of-band discussion. See the README for details. It's ugly and not tied into `make check`, but at least it's a start. ;)
1346. By Barry Warsaw

- trunk merge
Barry Warsaw (barry) wrote:
I think the last update may not completely fix this problem. I'm working on an update.
1347. By Barry Warsaw

- Remove some unused imports, and other import cleanups.
- Avoid circular imports by moving parsetime() to new module gwibber.time.
- Refactor convert_time() to use parsetime() in order to fix the bugs
  everywhere.
Barry Warsaw (barry) wrote:
Latest update pushed, which should work (it seems to locally, anyway ;). You'll see the diff got bigger because I had to fiddle with the imports to prevent some circular import problems.
1348. By Barry Warsaw

- Handle Twitter and Facebook nonconformity to standards.
- Refactor the code to be more readable.
- Improve the comments.
Ken VanDine (ken-vandine) wrote:
Twitter and Facebook are confirmed to work; however, I tested with identica and status.net, and they aren't using UTC times, so they give the same ValueError. Here is an example time string:
Wed Mar 21 02:30:31 -0400 2012
There is also a typo in the flickr plugin:
- timetup = datetime.
+ timetup = datetime.
The good news is that those are the last of the supported plugins, so once identica and statusnet are working this is good to go!
Thanks!
Barry Warsaw (barry) wrote:
On May 11, 2012, at 11:42 PM, Ken VanDine wrote:
>Review: Needs Fixing
>
>twitter and facebook confirmed to work, however I tested with identica and
>status.net and they aren't using a UTC time and give the same ValueError.
>Here is an example time string:
>
>Wed Mar 21 02:30:31 -0400 2012
Madness!
What does Gwibber expect us to do with non-UTC timezones? Just dropping the
tz doesn't seem right. OTOH, we have to convert them to naive datetimes. So
I'm guessing we parse them into non-UTC tz-aware datetimes, then convert them
to UTC, then make them tz-naive. Does that sound right?
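That pipeline (parse into a tz-aware datetime, convert to UTC, strip the tzinfo) can be sketched with Python 3's datetime.timezone class; that class doesn't exist in Python 2, which is why the branch ends up doing the offset arithmetic by hand instead:

```python
from datetime import datetime, timedelta, timezone

# 'Wed Mar 21 02:30:31 -0400 2012': parse the naive part, attach the
# offset, convert to UTC, then drop tzinfo to get a naive UTC datetime.
naive = datetime.strptime('Wed Mar 21 02:30:31 2012', '%a %b %d %H:%M:%S %Y')
aware = naive.replace(tzinfo=timezone(timedelta(hours=-4)))
utc_naive = aware.astimezone(timezone.utc).replace(tzinfo=None)
```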
>There is also a typo in the flickr plugin:
>- timetup = datetime.
>+ timetup = datetime.
>
>The good news is that will be the last of the supported plugins, so once
>identica and statusnet are working this is good to go!
Yay! :)
Cheers,
-Barry
Ken VanDine (ken-vandine) wrote:
On Fri, 2012-05-11 at 16:52 -0700, Barry Warsaw wrote:
> On May 11, 2012, at 11:42 PM, Ken VanDine wrote:
>
> >Review: Needs Fixing
> >
> >twitter and facebook confirmed to work, however I tested with identica and
> >status.net and they aren't using a UTC time and give the same ValueError.
> >Here is an example time string:
> >
> >Wed Mar 21 02:30:31 -0400 2012
>
> Madness!
>
> What does Gwibber expect us to do with non-UTC timezones? Just dropping the
> tz doesn't seem right. OTOH, we have to convert them to naive datetimes. So
> I'm guessing we parse them into non-UTC tz-aware datetimes, then convert them
> to UTC, then make them tz-naive. Does that sound right?
Yeah, I think that is the only thing we can do. Sure wish they were
consistent :(
Thanks!
>
> >There is also a typo in the flickr plugin:
> >- timetup = datetime.
> >+ timetup = datetime.
> >
> >The good news is that will be the last of the supported plugins, so once
> >identica and statusnet are working this is good to go!
>
> Yay! :)
>
> Cheers,
> -Barry
>
--
Ken VanDine
Ubuntu Desktop Integration Engineer
Canonical, Ltd.
1349. By Barry Warsaw

- Change the flickr plugin to use the parsetime() utility.
- Remove some unused imports from the flickr plugin.
- parsetime(): Elaborate so that non-UTC timezones are handled properly,
  which is used by identica and status.net. We have to use regexps here
  though since %z isn't universally supported until Python 3.
Barry Warsaw (barry) wrote:
On May 11, 2012, at 11:42 PM, Ken VanDine wrote:
>twitter and facebook confirmed to work, however I tested with identica and
>status.net and they aren't using a UTC time and give the same ValueError.
>Here is an example time string:
>
>Wed Mar 21 02:30:31 -0400 2012
Okay, I think I'm handling this case now. Manual timezone math is always fun
(you have to subtract the value from your local time to get you to UTC).
Please do check my math! (And the test cases. :)
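The offset subtraction can be checked against the example string from the review (02:30:31 at -0400):

```python
from datetime import datetime, timedelta

# The wall clock at offset -0400 reads four hours behind UTC, so
# subtracting the (negative) offset adds the four hours back.
local = datetime(2012, 3, 21, 2, 30, 31)
offset = timedelta(hours=-400 / 100)   # '-0400' taken as -4.0 hours
utc = local - offset
```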
>There is also a typo in the flickr plugin:
>- timetup = datetime.
>+ timetup = datetime.
I changed this to use gwibber.
>The good news is that will be the last of the supported plugins, so once
>identica and statusnet are working this is good to go!
Great! Hopefully this time <wink> will do it.
Update pushed.
Ken VanDine (ken-vandine) wrote:
Looks great, thanks!
Preview Diff
1 | === modified file 'README' |
2 | --- README 2012-05-10 17:20:35 +0000 |
3 | +++ README 2012-05-12 02:14:17 +0000 |
4 | @@ -0,0 +1,50 @@ |
5 | +Installing Gwibber |
6 | +================== |
7 | + |
8 | + Requirements |
9 | + ------------ |
10 | + |
11 | + Please note that the version numbers listed below only reflect my test |
12 | + environment. Gwibber is known to work with those specific versions, but |
13 | + will probably work fine on most current desktop distributions that include |
14 | + Python's WebKit GTK+ bindings. |
15 | + |
16 | + * python (2.5) |
17 | + * python-dbus (0.80.2) |
18 | + * python-gtk2 (2.10.4) |
19 | + * python-gconf (2.18.0) |
20 | + * python-imaging (1.1.6) |
21 | + * python-notify (0.1.1) |
22 | + * python-webkitgtk (1.0.1) |
23 | + * python-simplejson (1.9.1) |
24 | + * python-distutils-extra |
25 | + * python-feedparser (4.1) |
26 | + * python-xdg (0.15) |
27 | + * python-mako (0.2.2) |
28 | + * python-pycurl |
29 | + |
30 | + Installation |
31 | + ------------ |
32 | + |
33 | + Gwibber uses Python's distutils framework for installation. In order to |
34 | + install Gwibber, you will need root access. To install Gwibber, perform |
35 | + the following command as root: |
36 | + |
37 | + $ python setup.py install |
38 | + |
39 | + Run Gwibber |
40 | + ----------- |
41 | + |
42 | + If you installed Gwibber using the setup.py script, you can launch the |
43 | + program by typing "gwibber" at the command line. If you want to run |
44 | + Gwibber without installing it, start "bin/gwibber" from within the |
45 | + Gwibber directory. |
46 | + |
47 | + Testing |
48 | + ------- |
49 | + |
50 | + You can run the dbus isolated tests by cd'ing into the tests directory |
51 | + and running `make check`. |
52 | + |
53 | + You can run the Python unittests by cd'ing into tests/python/unittests |
54 | + and running `PYTHONPATH=../../.. python -m unittest discover -v` |
55 | |
56 | === modified file 'gwibber/microblog/dispatcher.py' |
57 | --- gwibber/microblog/dispatcher.py 2012-03-27 15:48:44 +0000 |
58 | +++ gwibber/microblog/dispatcher.py 2012-05-12 02:14:17 +0000 |
59 | @@ -3,7 +3,7 @@ |
60 | |
61 | import traceback, json, random, string |
62 | import dbus, dbus.service |
63 | -import sqlite3, mx.DateTime, re, uuid |
64 | +import sqlite3, re, uuid |
65 | import urlshorter, storage, network, util, uploader |
66 | from gettext import lgettext as _ |
67 | |
68 | @@ -943,7 +943,7 @@ |
69 | operations.append(o) |
70 | |
71 | if operations: |
72 | - logger.debug("** Starting Refresh - %s **", mx.DateTime.now()) |
73 | + logger.debug("** Starting Refresh - %s **", datetime.now()) |
74 | self.LoadingStarted() |
75 | self.perform_async_operation(operations) |
76 | |
77 | |
78 | === modified file 'gwibber/microblog/plugins/facebook/__init__.py' |
79 | --- gwibber/microblog/plugins/facebook/__init__.py 2012-03-09 19:34:33 +0000 |
80 | +++ gwibber/microblog/plugins/facebook/__init__.py 2012-05-12 02:14:17 +0000 |
81 | @@ -2,8 +2,7 @@ |
82 | |
83 | from gwibber.microblog import network, util |
84 | from gwibber.microblog.util import resources |
85 | -import hashlib, mx.DateTime, time |
86 | -from os.path import join, getmtime, exists |
87 | +import hashlib, time |
88 | from gettext import lgettext as _ |
89 | from gwibber.microblog.util.const import * |
90 | # Try to import * from custom, install custom.py to include packaging |
91 | @@ -12,11 +11,15 @@ |
92 | from gwibber.microblog.util.custom import * |
93 | except: |
94 | pass |
95 | +from gwibber.time import parsetime |
96 | + |
97 | +from datetime import datetime, timedelta |
98 | |
99 | import logging |
100 | logger = logging.getLogger("Facebook") |
101 | logger.debug("Initializing.") |
102 | |
103 | + |
104 | PROTOCOL_INFO = { |
105 | "name": "Facebook", |
106 | "version": "1.1", |
107 | @@ -56,6 +59,21 @@ |
108 | URL_PREFIX = "https://graph.facebook.com/" |
109 | POST_URL = "http://www.facebook.com/profile.php?id=%s&v=feed&story_fbid=%s&ref=mf" |
110 | |
111 | + |
112 | +def convert_time(data): |
113 | + """Extract and convert the time to a timestamp (seconds since Epoch). |
114 | + |
115 | + First, look for the 'updated_time' key in the data, falling back to |
116 | + 'created_time' if the former is missing. |
117 | + """ |
118 | + # Convert from an ISO 8601 format time string to a Epoch timestamp. |
119 | + # Assume standard ISO 8601 'T' separator, no microseconds, and naive |
120 | + # time zone as per: |
121 | + # https://developers.facebook.com/docs/reference/api/event/ |
122 | + time_string = data.get('updated_time', data['created_time']) |
123 | + return parsetime(time_string) |
124 | + |
125 | + |
126 | class Client: |
127 | def __init__(self, acct): |
128 | self.account = acct |
129 | @@ -122,7 +140,7 @@ |
130 | m["service"] = "facebook" |
131 | m["account"] = self.account["id"] |
132 | |
133 | - m["time"] = int(mx.DateTime.DateTimeFrom(str(data.get("updated_time", data["created_time"])))) |
134 | + m["time"] = convert_time(data) |
135 | m["url"] = "https://facebook.com/" + data["id"].split("_")[0] + "/posts/" + data["id"].split("_")[1] |
136 | |
137 | |
138 | @@ -233,7 +251,7 @@ |
139 | for item in data["comments"]["data"]: |
140 | m["comments"].append({ |
141 | "text": item["message"], |
142 | - "time": int(mx.DateTime.DateTimeFrom(str(data.get("updated_time", data["created_time"])))), |
143 | + "time": convert_time(data), |
144 | "sender": self._sender(item["from"]), |
145 | }) |
146 | |
147 | @@ -250,15 +268,16 @@ |
148 | return getattr(self, opname)(**args) |
149 | |
150 | def receive(self, since=None): |
151 | - if not since: |
152 | - since = int(mx.DateTime.DateTimeFromTicks(mx.DateTime.localtime()) - mx.DateTime.TimeDelta(hours=240.0)) |
153 | + if since is None: |
154 | + past = datetime.now() - timedelta(hours=240.0) |
155 | else: |
156 | - since = int(mx.DateTime.DateTimeFromTicks(since).localtime()) |
157 | + past = datetime.fromtimestamp(since) |
158 | |
159 | + since = int(time.mktime(past.timetuple())) |
160 | data = self._get("me/home", since=since, limit=100) |
161 | |
162 | logger.debug("<STATS> facebook:receive account:%s since:%s size:%s", |
163 | - self.account["id"], mx.DateTime.DateTimeFromTicks(since), len(str(data))) |
164 | + self.account["id"], datetime.fromtimestamp(since), len(str(data))) |
165 | |
166 | if not self._check_error(data): |
167 | try: |
168 | |
169 | === modified file 'gwibber/microblog/plugins/flickr/__init__.py' |
170 | --- gwibber/microblog/plugins/flickr/__init__.py 2012-02-10 10:06:59 +0000 |
171 | +++ gwibber/microblog/plugins/flickr/__init__.py 2012-05-12 02:14:17 +0000 |
172 | @@ -1,11 +1,10 @@ |
173 | -from gwibber.microblog import network, util |
174 | -from gwibber.microblog.util import resources |
175 | +from gwibber.microblog import network |
176 | +from gwibber.time import parsetime |
177 | |
178 | import logging |
179 | logger = logging.getLogger("Flickr") |
180 | logger.debug("Initializing.") |
181 | |
182 | -import re, mx.DateTime |
183 | from gettext import lgettext as _ |
184 | |
185 | PROTOCOL_INFO = { |
186 | @@ -37,8 +36,6 @@ |
187 | IMAGE_URL = "http://farm%s.static.flickr.com/%s/%s_%s_%s.jpg" |
188 | IMAGE_PAGE_URL = "http://www.flickr.com/photos/%s/%s" |
189 | |
190 | -def parse_time(t): |
191 | - return mx.DateTime.DateTimeFromTicks(int(t)).gmtime().ticks() |
192 | |
193 | class Client: |
194 | def __init__(self, acct): |
195 | @@ -49,7 +46,7 @@ |
196 | m["mid"] = str(data["id"]) |
197 | m["service"] = "flickr" |
198 | m["account"] = self.account["id"] |
199 | - m["time"] = parse_time(data["dateupload"]) |
200 | + m["time"] = parsetime(data["dateupload"]) |
201 | m["source"] = False |
202 | m["text"] = data["title"] |
203 | |
204 | |
205 | === modified file 'gwibber/microblog/plugins/friendfeed/__init__.py' |
206 | --- gwibber/microblog/plugins/friendfeed/__init__.py 2012-02-10 10:06:59 +0000 |
207 | +++ gwibber/microblog/plugins/friendfeed/__init__.py 2012-05-12 02:14:17 +0000 |
208 | @@ -1,5 +1,6 @@ |
209 | from gwibber.microblog import network, util |
210 | from gwibber.microblog.util import resources |
211 | +from gwibber.time import parsetime |
212 | |
213 | import logging |
214 | logger = logging.getLogger("FriendFeed") |
215 | @@ -61,7 +62,7 @@ |
216 | "mid": data["id"], |
217 | "service": "friendfeed", |
218 | "account": self.account["id"], |
219 | - "time": util.parsetime(data["published"]), |
220 | + "time": parsetime(data["published"]), |
221 | "source": data.get("via", {}).get("name", None), |
222 | "text": data["title"], |
223 | "html": util.linkify(data["title"]), |
224 | @@ -86,7 +87,7 @@ |
225 | for item in data["comments"][-3:]: |
226 | m["comments"].append({ |
227 | "text": item["body"], |
228 | - "time": util.parsetime(item["date"]), |
229 | + "time": parsetime(item["date"]), |
230 | "sender": self._sender(item["user"]), |
231 | }) |
232 | |
233 | |
234 | === modified file 'gwibber/microblog/plugins/identica/__init__.py' |
235 | --- gwibber/microblog/plugins/identica/__init__.py 2012-02-13 20:39:02 +0000 |
236 | +++ gwibber/microblog/plugins/identica/__init__.py 2012-05-12 02:14:17 +0000 |
237 | @@ -1,7 +1,9 @@ |
238 | +from gettext import lgettext as _ |
239 | +from oauth import oauth |
240 | + |
241 | from gwibber.microblog import network, util |
242 | -from oauth import oauth |
243 | from gwibber.microblog.util import resources |
244 | -from gettext import lgettext as _ |
245 | +from gwibber.time import parsetime |
246 | |
247 | import logging |
248 | logger = logging.getLogger("Identica") |
249 | @@ -71,7 +73,7 @@ |
250 | m["service"] = "identica" |
251 | m["account"] = self.account["id"] |
252 | if data.has_key("created_at"): |
253 | - m["time"] = util.parsetime(data["created_at"]) |
254 | + m["time"] = parsetime(data["created_at"]) |
255 | m["source"] = data.get("source", False) |
256 | m["text"] = util.unescape(data["text"]) |
257 | m["to_me"] = ("@%s" % self.account["username"]) in data["text"] |
258 | @@ -130,12 +132,12 @@ |
259 | if data.has_key("retweeted_status"): |
260 | n["retweeted_by"] = self._user(data["user"] if "user" in data else data["sender"]) |
261 | if data.has_key("created_at"): |
262 | - n["time"] = util.parsetime(data["created_at"]) |
263 | + n["time"] = parsetime(data["created_at"]) |
264 | data = data["retweeted_status"] |
265 | else: |
266 | n["retweeted_by"] = None |
267 | if data.has_key("created_at"): |
268 | - n["time"] = util.parsetime(data["created_at"]) |
269 | + n["time"] = parsetime(data["created_at"]) |
270 | |
271 | m = self._common(data) |
272 | |
273 | |
274 | === modified file 'gwibber/microblog/plugins/qaiku/__init__.py' |
275 | --- gwibber/microblog/plugins/qaiku/__init__.py 2012-02-10 10:06:59 +0000 |
276 | +++ gwibber/microblog/plugins/qaiku/__init__.py 2012-05-12 02:14:17 +0000 |
277 | @@ -1,5 +1,6 @@ |
278 | from gwibber.microblog import network, util |
279 | from gwibber.microblog.util import resources |
280 | +from gwibber.time import parsetime |
281 | |
282 | import logging |
283 | logger = logging.getLogger("Qaiku") |
284 | @@ -46,7 +47,7 @@ |
285 | m["mid"] = str(data["id"]) |
286 | m["service"] = "qaiku" |
287 | m["account"] = self.account["id"] |
288 | - m["time"] = util.parsetime(data["created_at"]) |
289 | + m["time"] = parsetime(data["created_at"]) |
290 | m["text"] = data["text"] |
291 | m["to_me"] = ("@%s" % self.account["username"]) in data["text"] |
292 | |
293 | |
294 | === modified file 'gwibber/microblog/plugins/statusnet/__init__.py' |
295 | --- gwibber/microblog/plugins/statusnet/__init__.py 2012-02-13 20:39:02 +0000 |
296 | +++ gwibber/microblog/plugins/statusnet/__init__.py 2012-05-12 02:14:17 +0000 |
297 | @@ -1,8 +1,9 @@ |
298 | -import re |
299 | +from gettext import lgettext as _ |
300 | +from oauth import oauth |
301 | + |
302 | from gwibber.microblog import network, util |
303 | -from oauth import oauth |
304 | from gwibber.microblog.util import resources |
305 | -from gettext import lgettext as _ |
306 | +from gwibber.time import parsetime |
307 | |
308 | import logging |
309 | logger = logging.getLogger("StatusNet") |
310 | @@ -71,7 +72,7 @@ |
311 | m["service"] = "statusnet" |
312 | m["account"] = self.account["id"] |
313 | if data.has_key("created_at"): |
314 | - m["time"] = util.parsetime(data["created_at"]) |
315 | + m["time"] = parsetime(data["created_at"]) |
316 | m["source"] = data.get("source", False) |
317 | m["text"] = util.unescape(data["text"]) |
318 | m["to_me"] = ("@%s" % self.account["username"]) in data["text"] |
319 | @@ -130,12 +131,12 @@ |
320 | if data.has_key("retweeted_status"): |
321 | n["retweeted_by"] = self._user(data["user"] if "user" in data else data["sender"]) |
322 | if data.has_key("created_at"): |
323 | - n["time"] = util.parsetime(data["created_at"]) |
324 | + n["time"] = parsetime(data["created_at"]) |
325 | data = data["retweeted_status"] |
326 | else: |
327 | n["retweeted_by"] = None |
328 | if data.has_key("created_at"): |
329 | - n["time"] = util.parsetime(data["created_at"]) |
330 | + n["time"] = parsetime(data["created_at"]) |
331 | |
332 | m = self._common(data) |
333 | for k in n: |
334 | |
335 | === modified file 'gwibber/microblog/plugins/twitter/__init__.py' |
336 | --- gwibber/microblog/plugins/twitter/__init__.py 2012-03-30 14:32:20 +0000 |
337 | +++ gwibber/microblog/plugins/twitter/__init__.py 2012-05-12 02:14:17 +0000 |
338 | @@ -1,8 +1,10 @@ |
339 | -from gwibber.microblog import network, util |
340 | import cgi |
341 | +from gettext import lgettext as _ |
342 | from oauth import oauth |
343 | + |
344 | +from gwibber.microblog import network, util |
345 | from gwibber.microblog.util import resources |
346 | -from gettext import lgettext as _ |
347 | +from gwibber.time import parsetime |
348 | |
349 | import logging |
350 | logger = logging.getLogger("Twitter") |
351 | @@ -79,7 +81,7 @@ |
352 | m["service"] = "twitter" |
353 | m["account"] = self.account["id"] |
354 | if data.has_key("created_at"): |
355 | - m["time"] = util.parsetime(data["created_at"]) |
356 | + m["time"] = parsetime(data["created_at"]) |
357 | m["text"] = util.unescape(data["text"]) |
358 | m["text"] = cgi.escape(m["text"]) |
359 | m["content"] = m["text"] |
360 | @@ -227,12 +229,12 @@ |
361 | if data.has_key("retweeted_status"): |
362 | n["retweeted_by"] = self._user(data["user"] if "user" in data else data["sender"]) |
363 | if data.has_key("created_at"): |
364 | - n["time"] = util.parsetime(data["created_at"]) |
365 | + n["time"] = parsetime(data["created_at"]) |
366 | data = data["retweeted_status"] |
367 | else: |
368 | n["retweeted_by"] = None |
369 | if data.has_key("created_at"): |
370 | - n["time"] = util.parsetime(data["created_at"]) |
371 | + n["time"] = parsetime(data["created_at"]) |
372 | |
373 | m = self._common(data) |
374 | for k in n: |
375 | |
376 | === modified file 'gwibber/microblog/util/__init__.py' |
377 | --- gwibber/microblog/util/__init__.py 2012-04-20 21:16:37 +0000 |
378 | +++ gwibber/microblog/util/__init__.py 2012-05-12 02:14:17 +0000 |
379 | @@ -1,7 +1,7 @@ |
380 | -import os, locale, re, mx.DateTime, cgi, httplib2 |
381 | -import resources |
382 | +import re, cgi, httplib2 |
383 | +from gwibber.microblog.util import resources |
384 | import dbus |
385 | -from const import * |
386 | +from gwibber.microblog.util.const import * |
387 | from htmlentitydefs import name2codepoint |
388 | |
389 | import logging |
390 | @@ -16,12 +16,6 @@ |
391 | |
392 | COUNT = 200 |
393 | |
394 | -def parsetime(t): |
395 | - locale.setlocale(locale.LC_TIME, 'C') |
396 | - result = mx.DateTime.Parser.DateTimeFromString(t) |
397 | - locale.setlocale(locale.LC_TIME, '') |
398 | - return result.ticks() |
399 | - |
400 | URL_SCHEMES = ('http', 'https', 'ftp', 'mailto', 'news', 'gopher', |
401 | 'nntp', 'telnet', 'wais', 'prospero', 'aim', 'webcal') |
402 | |
403 | |
404 | === modified file 'gwibber/microblog/util/resources.py' |
405 | --- gwibber/microblog/util/resources.py 2012-03-20 16:31:03 +0000 |
406 | +++ gwibber/microblog/util/resources.py 2012-05-12 02:14:17 +0000 |
407 | @@ -9,7 +9,7 @@ |
408 | from os import makedirs, remove, environ |
409 | from os.path import join, isdir, realpath, exists |
410 | import Image |
411 | -import mx.DateTime |
412 | +from datetime import datetime |
413 | from gwibber.microblog import network |
414 | from gwibber.microblog.util.const import * |
415 | import inspect |
416 | @@ -211,7 +211,7 @@ |
417 | dump_cache_dir = realpath(join(CACHE_BASE_DIR, "gwibber", "dump", service)) |
418 | if not isdir(dump_cache_dir): |
419 | makedirs(dump_cache_dir) |
420 | - dump_cache_file = join(dump_cache_dir, (aid + "." + str(mx.DateTime.now()) + "." + operation)) |
421 | + dump_cache_file = join(dump_cache_dir, (aid + "." + str(datetime.now()) + "." + operation)) |
422 | |
423 | if not exists(dump_cache_file) or len(open(dump_cache_file, "r").read()) < 1: |
424 | logger.debug("Dumping test data %s - %s - %s", service, aid, operation) |
425 | |
426 | === added file 'gwibber/time.py' |
427 | --- gwibber/time.py 1970-01-01 00:00:00 +0000 |
428 | +++ gwibber/time.py 2012-05-12 02:14:17 +0000 |
429 | @@ -0,0 +1,96 @@ |
430 | +from __future__ import absolute_import |
431 | + |
432 | +import re |
433 | +import time |
434 | +import locale |
435 | + |
436 | +from contextlib import contextmanager |
437 | +from datetime import datetime, timedelta |
438 | + |
439 | + |
440 | +# Date time formats. Assume no microseconds and no timezone. |
441 | +ISO8601_FORMAT = '%Y-%m-%dT%H:%M:%S' |
442 | +TWITTER_FORMAT = '%a %b %d %H:%M:%S %Y' |
443 | + |
444 | + |
445 | +@contextmanager |
446 | +def c_locale(): |
447 | + locale.setlocale(locale.LC_TIME, 'C') |
448 | + try: |
449 | + yield |
450 | + finally: |
451 | + locale.setlocale(locale.LC_TIME, '') |
452 | + |
453 | + |
454 | +def iso8601(t): |
455 | + return datetime.strptime(t, ISO8601_FORMAT) |
456 | + |
457 | +def iso8601alt(t): |
458 | + return datetime.strptime(t, ISO8601_FORMAT.replace('T', ' ')) |
459 | + |
460 | +def twitter(t): |
461 | + return datetime.strptime(t, TWITTER_FORMAT) |
462 | + |
463 | + |
464 | +def parsetime(t): |
465 | + """Parse an ISO 8601 datetime string and return seconds since epoch. |
466 | + |
467 | + This accepts either a naive (i.e. timezone-less) string or a timezone |
468 | + aware string. The timezone must start with a + or - and must be followed |
469 | + by exactly four digits. This string is parsed and converted to UTC. This |
470 | + value is then converted to an integer seconds since epoch. |
471 | + """ |
472 | + with c_locale(): |
473 | + # In order to parse the UTC timezone (e.g. +0000), you'd think we |
474 | + # could just append %z on the format, but that doesn't work |
475 | + # across Python versions. On some platforms in Python 2.7, %z is not |
476 | + # a valid format directive, and its use will raise a ValueError. |
477 | + # |
478 | + # In Python 3.2, strptime() is implemented in Python, so it *is* |
479 | + # supported everywhere, but we still can't rely on it because of the |
480 | + # non-ISO 8601 formats that some APIs use (I'm looking at you Twitter |
481 | + # and Facebook). We'll use a regular expression to tear out the |
482 | + # timezone string and do the conversion ourselves. |
483 | + # |
484 | + # tz_offset is a list for two reasons. First, so we can play games |
485 | + # with scoped access to the variable inside capture_tz() without the |
486 | + # use of `nonlocal` which doesn't exist in Python 2. Also, we'll use |
487 | + # this as a cheap way of ensuring there aren't multiple matching |
488 | + # timezone strings in a single input string. |
489 | + tz_offset = [] |
490 | + def capture_tz(match_object): |
491 | + tz_string = match_object.group('tz') |
492 | + if tz_string is not None: |
493 | + # It's possible that we'll see more than one substring |
494 | + # matching the timezone pattern. It should be highly unlikely |
495 | + # so we won't test for that here, at least not now. |
496 | + # |
497 | + # The tz_offset is positive, so it must be subtracted from the |
498 | + # naive datetime in order to return it to UTC. E.g. |
499 | + # 13:00 -0400 is 17:00 +0000 |
500 | + # or |
501 | + # 1300 - (-0400 / 100) |
502 | + tz_offset.append(timedelta(hours=int(tz_string) / 100)) |
503 | + # Return the empty string so as to remove the timezone pattern |
504 | + # from the string we're going to parse. |
505 | + return '' |
506 | + naive_t = re.sub(r'[ ]*(?P<tz>[-+]\d{4})', capture_tz, t) |
507 | + if len(tz_offset) == 0: |
508 | + tz_offset = timedelta() |
509 | + elif len(tz_offset) == 1: |
510 | + tz_offset = tz_offset[0] |
511 | + else: |
512 | + raise ValueError('Unsupported time string: {0}'.format(t)) |
513 | + for parser in (iso8601, iso8601alt, twitter): |
514 | + try: |
515 | + dt = parser(naive_t) - tz_offset |
516 | + except ValueError: |
517 | + pass |
518 | + else: |
519 | + break |
520 | + else: |
521 | + # Nothing matched. |
522 | + raise ValueError('Unsupported time string: {0}'.format(t)) |
523 | + # We must have gotten a valid datetime. Convert it to Epoch seconds. |
524 | + timetup = dt.timetuple() |
525 | + return int(time.mktime(timetup)) |
526 | |
527 | === modified file 'gwibber/util.py' |
528 | --- gwibber/util.py 2012-02-13 19:59:49 +0000 |
529 | +++ gwibber/util.py 2012-05-12 02:14:17 +0000 |
530 | @@ -1,4 +1,4 @@ |
531 | -import dbus, os, mx.DateTime, webbrowser |
532 | +import dbus, os, webbrowser |
533 | import subprocess |
534 | from gi.repository import GLib, Gdk, GdkPixbuf, Gtk |
535 | from microblog.util import resources |
536 | @@ -105,8 +105,8 @@ |
537 | |
538 | def generate_time_string(t): |
539 | if isinstance(t, str): return t |
540 | - t = mx.DateTime.TimestampFromTicks(t) |
541 | - d = mx.DateTime.gmt() - t |
542 | + t = datetime.fromtimestamp(t) |
543 | + d = datetime.utcnow() - t |
544 | |
545 | # Aliasing the function doesn't work here with intltool... |
546 | if d.days >= 365: |
547 | |
548 | === modified file 'tests/plugins/test/__init__.py' |
549 | --- tests/plugins/test/__init__.py 2012-02-10 10:06:59 +0000 |
550 | +++ tests/plugins/test/__init__.py 2012-05-12 02:14:17 +0000 |
551 | @@ -1,13 +1,11 @@ |
552 | +import json |
553 | from gwibber.microblog import util |
554 | -import json |
555 | +from gwibber.time import parsetime |
556 | |
557 | import logging |
558 | logger = logging.getLogger("Plugin Test") |
559 | logger.debug("Initializing.") |
560 | |
561 | -import re, mx.DateTime |
562 | -from gettext import lgettext as _ |
563 | - |
564 | PROTOCOL_INFO = { |
565 | "name": "Test", |
566 | "version": 0.1, |
567 | @@ -40,7 +38,7 @@ |
568 | m["service"] = "test" |
569 | m["account"] = self.account["id"] |
570 | if data.has_key("created_at"): |
571 | - m["time"] = util.parsetime(data["created_at"]) |
572 | + m["time"] = parsetime(data["created_at"]) |
573 | m["source"] = data["source"] |
574 | m["text"] = data["text"] |
575 | |
576 | |
577 | === added directory 'tests/python/unittests' |
578 | === added file 'tests/python/unittests/__init__.py' |
579 | === added file 'tests/python/unittests/test_time.py' |
580 | --- tests/python/unittests/test_time.py 1970-01-01 00:00:00 +0000 |
581 | +++ tests/python/unittests/test_time.py 2012-05-12 02:14:17 +0000 |
582 | @@ -0,0 +1,53 @@ |
583 | +import unittest |
584 | + |
585 | +from gwibber.time import parsetime |
586 | + |
587 | + |
588 | +class TimeParseTest(unittest.TestCase): |
589 | + def test_type(self): |
590 | + # parsetime() should always return int seconds since the epoch. |
591 | + self.assertTrue(isinstance(parsetime('2012-05-10T13:36:45'), int)) |
592 | + |
593 | + def test_parse_naive(self): |
594 | + # ISO 8601 standard format without timezone. |
595 | + self.assertEqual(parsetime('2012-05-10T13:36:45'), 1336682205) |
596 | + |
597 | + def test_parse_utctz(self): |
598 | + # ISO 8601 standard format with UTC timezone. |
599 | + self.assertEqual(parsetime('2012-05-10T13:36:45 +0000'), 1336682205) |
600 | + |
601 | + def test_parse_naive_altsep(self): |
602 | + # ISO 8601 alternative format without timezone. |
603 | + self.assertEqual(parsetime('2012-05-10 13:36:45'), 1336682205) |
604 | + |
605 | + def test_parse_utctz_altsep(self): |
606 | + # ISO 8601 alternative format with UTC timezone. |
607 | + self.assertEqual(parsetime('2012-05-10 13:36:45 +0000'), 1336682205) |
608 | + |
609 | + def test_bad_time_string(self): |
610 | + # Odd unsupported format. |
611 | + self.assertRaises(ValueError, parsetime, '2012/05/10 13:36:45') |
612 | + |
613 | + def test_non_utc(self): |
614 | + # Non-UTC timezones get converted to UTC before conversion to |
615 | + # epoch seconds. |
616 | + self.assertEqual(parsetime('2012-05-10T13:36:45 -0400'), 1336696605) |
617 | + |
618 | + def test_nonstandard_twitter(self): |
619 | + # Sigh. Twitter has to be different. |
620 | + self.assertEqual(parsetime('Thu May 10 13:36:45 +0000 2012'), |
621 | + 1336682205) |
622 | + |
623 | + def test_nonstandard_twitter_non_utc(self): |
624 | + # Sigh. Twitter has to be different. |
625 | + self.assertEqual(parsetime('Thu May 10 13:36:45 -0400 2012'), |
626 | + 1336696605) |
627 | + |
628 | + def test_nonstandard_facebook(self): |
629 | + # Sigh. Facebook gets close, but no cigar. |
630 | + self.assertEqual(parsetime('2012-05-10T13:36:45+0000'), 1336682205) |
631 | + |
632 | + def test_multiple_timezones(self): |
633 | + # Multiple timezone strings are not supported. |
634 | + self.assertRaises(ValueError, parsetime, |
635 | + '2012-05-10T13:36:45 +0000 -0400') |
636 | |
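The tests above pin down parsetime's contract: int seconds since the epoch, 'T' and space separators, numeric offsets (including Facebook's unspaced and Twitter's mid-string variants), and ValueError for anything else. Below is a minimal sketch of a function with that shape. It is not the branch's implementation, and it treats naive times as UTC, so the epoch constants in the trailing comments are recomputed under that assumption rather than copied from the tests.

```python
import re
from datetime import datetime, timedelta

_TZ = re.compile(r'([-+]\d{4})')
_EPOCH = datetime(1970, 1, 1)
# Separator variants seen in the tests: 'T', space, and Twitter's layout.
_FORMATS = ('%Y-%m-%dT%H:%M:%S', '%Y-%m-%d %H:%M:%S', '%a %b %d %H:%M:%S %Y')

def parsetime(t):
    """Return int seconds since the epoch for a service timestamp string."""
    zones = _TZ.findall(t)
    if len(zones) > 1:
        raise ValueError('multiple timezones in %r' % t)
    # Strip the offset (this also handles Facebook's unspaced '+0000'),
    # then collapse any doubled whitespace left behind.
    raw = ' '.join(_TZ.sub('', t).split())
    for fmt in _FORMATS:
        try:
            parsed = datetime.strptime(raw, fmt)
            break
        except ValueError:
            continue
    else:
        raise ValueError('unparseable time: %r' % t)
    if zones:
        sign = -1 if zones[0][0] == '-' else 1
        parsed -= sign * timedelta(hours=int(zones[0][1:3]),
                                   minutes=int(zones[0][3:5]))
    return int((parsed - _EPOCH).total_seconds())

parsetime('2012-05-10T13:36:45')             # 1336657005 (naive, read as UTC)
parsetime('Thu May 10 13:36:45 -0400 2012')  # 1336671405 (Twitter style)
```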
637 | === modified file 'tests/python/utils/__init__.py' |
638 | --- tests/python/utils/__init__.py 2012-03-12 23:47:12 +0000 |
639 | +++ tests/python/utils/__init__.py 2012-05-12 02:14:17 +0000 |
640 | @@ -1,4 +1,7 @@ |
641 | -import os, unittest, gettext, mx.DateTime |
642 | +import time |
643 | +from datetime import datetime, timedelta |
644 | + |
645 | +import os, unittest |
646 | from gi.repository import Gwibber |
647 | from hashlib import sha1 |
648 | |
649 | @@ -7,34 +10,40 @@ |
650 | os.environ["XDG_DATA_HOME"] = os.path.realpath (os.path.join (os.path.curdir, "..", "data")) |
651 | os.environ["XDG_CACHE_HOME"] = os.path.realpath (os.path.join (os.path.curdir, "..", "data")) |
652 | self.utils = Gwibber.Utils () |
653 | - self.now = mx.DateTime.gmt() |
654 | + self.now = datetime.utcnow() |
655 | + |
656 | + def _ago(self, **kws): |
657 | + # Convert keywords into a timedelta, subtract that from utcnow, and |
658 | + # return seconds since epoch. |
659 | + timetup = (self.now - timedelta(**kws)).timetuple() |
660 | + return time.mktime(timetup) |
661 | |
662 | def test_time_string_seconds (self): |
663 | - ts = self.utils.generate_time_string (self.now - 59 * mx.DateTime.oneSecond) |
664 | + ts = self.utils.generate_time_string(self._ago(seconds=59)) |
665 | self.assertEqual (ts, 'a few seconds ago') |
666 | |
667 | def test_time_string_minute (self): |
668 | - ts = self.utils.generate_time_string (self.now - 60 * mx.DateTime.oneSecond) |
669 | + ts = self.utils.generate_time_string(self._ago(seconds=60)) |
670 | self.assertEqual (ts, '1 minute ago') |
671 | |
672 | def test_time_string_minutes (self): |
673 | - ts = self.utils.generate_time_string (self.now - 3559 * mx.DateTime.oneSecond) |
674 | + ts = self.utils.generate_time_string(self._ago(seconds=3559)) |
675 | self.assertIn ('minutes ago', ts) |
676 | |
677 | def test_time_string_hour (self): |
678 | - ts = self.utils.generate_time_string (self.now - 3601 * mx.DateTime.oneSecond) |
679 | + ts = self.utils.generate_time_string(self._ago(seconds=3601)) |
680 | self.assertEqual (ts, '1 hour ago') |
681 | |
682 | def test_time_string_hours (self): |
683 | - ts = self.utils.generate_time_string (self.now - 7201 * mx.DateTime.oneSecond) |
684 | + ts = self.utils.generate_time_string(self._ago(seconds=7201)) |
685 | self.assertIn ('hours ago', ts) |
686 | |
687 | def test_time_string_day (self): |
688 | - ts = self.utils.generate_time_string (self.now - 86400 * mx.DateTime.oneSecond) |
689 | + ts = self.utils.generate_time_string(self._ago(seconds=86400)) |
690 | self.assertEqual (ts, '1 day ago') |
691 | |
692 | def test_time_string_days (self): |
693 | - ts = self.utils.generate_time_string (self.now - 2 * mx.DateTime.oneDay) |
694 | + ts = self.utils.generate_time_string(self._ago(days=2)) |
695 | self.assertIn ('days ago', ts) |
696 | |
697 | class AvatarPathTestCase (unittest.TestCase): |
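For readers without the GI bindings handy, here is a self-contained sketch of the bucketing these tests check (seconds, then minutes, hours, days). The English literals are placeholders, since the real strings come from intltool-translated templates, and this generate_time_string is a rough stand-in for whatever Gwibber.Utils actually provides; the ago() helper mirrors the tests' _ago() but stays all-UTC.

```python
from datetime import datetime, timedelta

_EPOCH = datetime(1970, 1, 1)

def generate_time_string(t):
    """Render epoch seconds as a coarse 'N units ago' string (sketch)."""
    if isinstance(t, str):
        return t
    d = datetime.utcnow() - datetime.utcfromtimestamp(t)
    if d.days >= 2:
        return '%d days ago' % d.days
    if d.days == 1:
        return '1 day ago'
    hours = d.seconds // 3600
    if hours > 1:
        return '%d hours ago' % hours
    if hours == 1:
        return '1 hour ago'
    minutes = d.seconds // 60
    if minutes > 1:
        return '%d minutes ago' % minutes
    if minutes == 1:
        return '1 minute ago'
    return 'a few seconds ago'

def ago(**kws):
    # Like the tests' _ago() helper, but all-UTC: epoch seconds for a
    # moment timedelta(**kws) before now.
    return ((datetime.utcnow() - timedelta(**kws)) - _EPOCH).total_seconds()

generate_time_string(ago(seconds=59))    # 'a few seconds ago'
generate_time_string(ago(seconds=7201))  # '2 hours ago'
```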
On May 10, 2012, at 06:09 PM, Ken VanDine wrote:
>Review: Needs Fixing
>
>Actually my last test didn't download any updates for facebook, so I purged
>all my facebook messages from the database. Now facebook refresh gives me
>this traceback:
Ah, so the FB data does have a timezone. Seems like a shallow bug, should be
easy to fix. I'll do that asap.
-Barry
>
>Dispatcher Thread-6 : ERROR <facebook:receive> Operation failed
>Dispatcher Thread-6 : DEBUG Traceback:
>Traceback (most recent call last):
>  File "/home/ken/src/gwibber/trunk/gwibber/microblog/dispatcher.py", line 81, in run
>    message_data = PROTOCOLS[account["service"]].Client(account)(opname, **args)
>  File "/home/ken/src/gwibber/trunk/gwibber/microblog/plugins/facebook/__init__.py", line 276, in __call__
>    return getattr(self, opname)(**args)
>  File "/home/ken/src/gwibber/trunk/gwibber/microblog/plugins/facebook/__init__.py", line 292, in receive
>    return [self._message(post) for post in data["data"]]
>  File "/home/ken/src/gwibber/trunk/gwibber/microblog/plugins/facebook/__init__.py", line 151, in _message
>    m["time"] = convert_time(data)
>  File "/home/ken/src/gwibber/trunk/gwibber/microblog/plugins/facebook/__init__.py", line 80, in convert_time
>    as_datetime = datetime.strptime(iso8601, ISO8601FORMAT)
>  File "/usr/lib/python2.7/_strptime.py", line 328, in _strptime
>    data_string[found.end():])
>ValueError: unconverted data remains: +0000
>
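The ValueError in the traceback is strptime refusing Facebook's trailing unspaced offset: a '%Y-%m-%dT%H:%M:%S' format consumes everything up to the seconds and leaves '+0000' unconverted. A hedged sketch of the "shallow bug" fix Barry describes; parse_facebook_time and the ISO8601 constant are illustrative names, not the branch's actual convert_time or ISO8601FORMAT.

```python
from datetime import datetime, timedelta

ISO8601 = '%Y-%m-%dT%H:%M:%S'  # assumed equivalent of the branch's ISO8601FORMAT

def parse_facebook_time(stamp):
    """Parse '2012-05-10T13:36:45+0000'-style stamps, normalized to UTC."""
    offset = timedelta()
    for sign in '+-':
        base, sep, zone = stamp.rpartition(sign)
        if sep and len(zone) == 4 and zone.isdigit():
            delta = timedelta(hours=int(zone[:2]), minutes=int(zone[2:]))
            offset = delta if sign == '+' else -delta
            stamp = base
            break
    return datetime.strptime(stamp, ISO8601) - offset

# Plain strptime fails exactly as in the traceback above:
try:
    datetime.strptime('2012-05-10T13:36:45+0000', ISO8601)
except ValueError as error:
    failure = str(error)  # 'unconverted data remains: +0000'

parse_facebook_time('2012-05-10T13:36:45+0000')  # datetime(2012, 5, 10, 13, 36, 45)
```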