Merge lp:~jtv/storm/profile-fetches into lp:storm

Proposed by Jeroen T. Vermeulen
Status: Needs review
Proposed branch: lp:~jtv/storm/profile-fetches
Merge into: lp:storm
Diff against target: 1109 lines (+903/-7) (has conflicts)
8 files modified
storm/database.py (+1/-0)
storm/fetch_profile.py (+255/-0)
storm/references.py (+12/-4)
storm/store.py (+97/-3)
tests/fetch_context.py (+164/-0)
tests/fetch_profile.py (+64/-0)
tests/fetch_statistics.py (+99/-0)
tests/store/base.py (+211/-0)
Text conflict in tests/store/base.py
To merge this branch: bzr merge lp:~jtv/storm/profile-fetches
Reviewer            Review Type    Date Requested    Status
Storm Developers                                     Pending
Storm Developers                                     Pending
Review via email: mp+43323@code.launchpad.net

Description of the change

= Fetch-Profiling =

Profile dependencies between object fetches from the database.

This work was raised on the launchpad-dev mailing list and, at a later stage, discussed with Jamu Kakar, Stuart Bishop, and others.

== The problem ==

Profiling will help map out and optimize data needs. For instance, consider this loop:

    def get_x_for(item):
        return item.other_object.x

    # ...

    for item in query_items():
        total += get_x_for(item)

This will fetch item.other_object individually for each item coming out of query_items(). It's a common anti-pattern in ORM performance, and easy enough to optimize: just outer-join item.other_object inside query_items, so that it will already be in cache when get_x_for needs it.
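The cost of that anti-pattern can be sketched without Storm at all. In this self-contained simulation (all names hypothetical, invented for illustration) the "database" is a dict and every cache miss counts as one query; the batched pre-fetch turns many small derived fetches into a single query:

```python
# Hypothetical sketch: N+1 derived fetches vs. one batched pre-fetch.
# The "database" is a dict; each uncached lookup counts as one query.
DEPARTMENTS = {1: "Sales", 2: "R&D"}
EMPLOYEES = [{"id": i, "department_id": (i % 2) + 1} for i in range(6)]

queries = 0

def fetch_department(dept_id, cache):
    """Fetch one department, hitting the 'database' on a cache miss."""
    global queries
    if dept_id not in cache:
        queries += 1  # one derived fetch per uncached department
        cache[dept_id] = DEPARTMENTS[dept_id]
    return cache[dept_id]

# Anti-pattern: a potential query for every item in the loop.
cache = {}
names = [fetch_department(e["department_id"], cache) for e in EMPLOYEES]
naive_queries = queries

# Fix: pre-fetch every needed department in one batched query
# (the equivalent of WHERE id IN (...), or a prejoin).
queries = 0
cache = {}
wanted = {e["department_id"] for e in EMPLOYEES}
queries += 1  # a single batched query
cache.update({d: DEPARTMENTS[d] for d in wanted})
names2 = [fetch_department(e["department_id"], cache) for e in EMPLOYEES]

print(naive_queries, queries)  # 2 1
```

Note that even the naive loop issues only two queries here, because only two distinct departments exist; that is exactly the "low percentage" situation the profiler's statistics are meant to reveal.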

Keeping track of all such dependencies is tedious, brittle, and a source of major abstraction leaks. Conventional ways of dealing with them involve profiling, analysis, matching query patterns to code paths, mapping out data requirements, and identifying downside risks of optimization. The result is a single "point-solution" fix. The effort produces experience as a side effect, but transfer of such experience from one human to others is relatively ineffective.

Then, after that's all done and the code has been optimized, it's difficult to keep track of which optimizations are still relevant. Code is alive, and the more intricate and beautiful an optimization is, the easier it is to break. Testing for such "semantically-neutral" breakage is often difficult, and monitoring the relevance of the tests themselves can be costly.

Refactorings in particular raise questions: will I need to port this optimization to the new structure? Will I still be hitting the right indexes? Could the new structure make the optimization unnecessary or less relevant? Am I prefetching a lot of objects that I don't need? And after I've made all those choices, how can I compare the new code's performance to the old code's performance as it's been tuned over time?

Fetch-profiling brings us closer to solving all those problems, but be patient. For now, it simplifies the mapping of data requirements by eliminating the tracing and the matching of query patterns to code paths. Read on for where we go next.

== Visibility improvements ==

Profiling would expose the problem pattern in the example very clearly. After running through the loop a few times, you'd inspect the store's statistics. The statistics will tell you how many item.other_object instances were loaded from the database (for "item"s returned by query_items) as well as how this number compares to the number of "item"s loaded by query_items, as a percentage.

The highest "item" numbers will identify the places most in need of pre-fetching optimizations. Among those, the highest percentages of "item.other_object" loads identify the places where simple prejoins are most likely to be beneficial. Lower percentages may indicate that many of the foreign-key references are null, or that most of the objects they refer to are already covered by other caching, or that most of the objects you might prefetch in the query would be irrelevant.
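The percentage in question is plain counter arithmetic. A minimal sketch, with hypothetical counter names (the branch's actual statistics live in FetchStatistics dicts, but any mapping of fetch keys to counts works the same way):

```python
# Hypothetical counters: how many items the original query loaded, and
# how many item.other_object instances were fetched as a consequence.
original_fetches = {"Item": 1233}                  # loaded by query_items()
derived_fetches = {("Item", "other_object"): 62}   # item.other_object loads

item_count = original_fetches["Item"]
derived = derived_fetches[("Item", "other_object")]
percentage = 100.0 * derived / item_count

print("%d derived fetches, %.1f%% of %d items"
      % (derived, percentage, item_count))
```

A high item count with a high percentage suggests a prejoin; a high item count with a low percentage (as here, about 5%) suggests a targeted batched pre-fetch instead, or no optimization at all.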

== Future improvements ==

This is phase 1 of a proposed multi-phase development. For phase 2 I'd like to automate the prefetching so that code like this (using features layered on top of the existing Storm API) will optimize itself, without requiring manual tweaking. After that come policy tuning and automated context definition (see below).

With automated prefetching, the most basic optimizations will no longer be specified in the application. They will work themselves out automatically based on feedback from a real, running production system.

It is at that point that the big problems resolve themselves: after a code change, the individual optimizations re-apply themselves as appropriate, and there is no need to track their relevance manually.

== Concepts ==

Cast of characters:
 • A "fetch" is the retrieval and de-marshalling of an object from the database. Reading just a foreign key (e.g. to check for NULL) is not a fetch from the table it refers to; neither is retrieving an object from cache.
 • A "fetch context" is some indicator of what part of the application is executing a query. To the application, this is managed as a "call stack" of strings.
 • An "original fetch" is a free-form query, as performed using Store.find or Store.execute.
 • A "fetch origin" (within a context) is a class participating in an original fetch.
 • A "derived fetch" is the retrieval of objects that are clearly derived (directly or indirectly) from an original fetch through a chain of reference.

In the example loop, query_items() would contain at least an original fetch. The reference to item.other_object inside get_x_for is a derived fetch (derived directly from the original fetch, as it happens). Derived fetches can also be tracked across stores.

There's only room for one fetch context in this example, since derived fetches are associated with the same context as their original fetches. In a web application, the most useful context would probably be the request type, but for detailed optimization you'll want more fine-grained contexts. The typical ideal granularity for automated optimization would be just one original query per context.

Contexts form a hierarchy so as to suit all these use cases, as well as "drilling down" during analysis of data requirements. A context manager helps mark regions of code as being a specific context. Another idea would be a decorator (probably at the application level though, where it's easier to find the right store) and an optional argument to find() that selects a context for just one query.
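The hierarchy-plus-context-manager idea can be sketched in a few lines. This is a simplified, self-contained re-implementation for illustration; the names mirror the branch (push_fetch_context, pop_fetch_context, fetch_context) but it is a sketch of the concept, not the actual Storm API:

```python
from contextlib import contextmanager

class FetchContext(object):
    """A node in the context hierarchy; children are created on demand."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = {}

    def get_child(self, name):
        if name not in self.children:
            self.children[name] = FetchContext(name, parent=self)
        return self.children[name]

class Store(object):  # minimal stand-in for storm.store.Store
    def __init__(self):
        self.fetch_context = FetchContext(None)  # root: profiling disabled

    def push_fetch_context(self, name):
        self.fetch_context = self.fetch_context.get_child(name)

    def pop_fetch_context(self):
        assert self.fetch_context.parent is not None
        self.fetch_context = self.fetch_context.parent

@contextmanager
def fetch_context(store, name):
    """Mark a region of code as running in the named fetch context."""
    store.push_fetch_context(name)
    try:
        yield
    finally:
        store.pop_fetch_context()

store = Store()
with fetch_context(store, "process_salaries"):
    with fetch_context(store, "get_employees"):
        print(store.fetch_context.name)         # get_employees
        print(store.fetch_context.parent.name)  # process_salaries
print(store.fetch_context.name is None)         # True: back at the root
```

Because children are keyed by name under their parent, a "get_employees" under "process_salaries" is a different node from a "get_employees" under "send_birthday_cards", which is exactly the separation described above.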

Original fetches are identified by the fetched class as well as the context. This makes it possible to associate derived fetches with individual classes in a join, and track their dependent fetches separately.

== Implementation notes ==

I'm not planning to map out full dependency chains from fetch origins to derived fetches for now; that would probably become too costly. We'd have to see how useful that information is in practice.

You may note how fetch_context is tracked in Stores, in Results, and in ObjectInfos. The reason for this is that objects may be fetched from a result set long after the store has moved on to a different context. An object fetch should be associated with the context that the result set was produced in, which in turn is the context the store was in at the time.
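The reason the context must be captured on the result is easy to demonstrate with a toy model (hypothetical names; contexts reduced to plain strings for brevity):

```python
# Sketch of why result sets capture the fetch context at query time:
# the store may move to another context before the result is consumed,
# yet the fetch must be attributed to the context that issued the query.
class Result(object):
    def __init__(self, store):
        self.fetch_context = store.fetch_context  # captured at creation

class Store(object):
    def __init__(self):
        self.fetch_context = "root"
    def find(self):
        return Result(self)

store = Store()
store.fetch_context = "render_page"
result = store.find()        # query issued while in "render_page"
store.fetch_context = "cleanup"
print(result.fetch_context)  # render_page, not cleanup
```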

All interesting analysis and optimization work will be done outside the performance-critical path of query execution. Profiling costs should be minimal, limited to simple dict lookups and counter increments.

Jeroen

Revision history for this message
Robert Collins (lifeless) wrote :

Hi Jeroen, thanks for pointing me at this.

I think this is a very interesting project. I think the stats will be useful for manual optimisation in the short term in Launchpad.

As far as the autotuning goes long term, the JIT-VM style learn-and-improve doesn't interest me for Launchpad: https://dev.launchpad.net/LEP/PersistenceLayer contains my broad plans for addressing systematic performance issues in Launchpad. I think a JIT-VM auto-tuning layer would be a fascinating project, but the warm-up time in many JITs can be substantial, and is only ameliorated by loops running hundreds or thousands of times: and still at best it only approaches the efficiency available by writing in a more efficient language. Thus my interest in providing a more efficient DSL than Storm's bound-object approach. I'd love to see Storm become radically simpler in aid of that: faster short circuits in object caching - optionally no object caching at all. Constrained and parameterised references would be awesome too.

Cheers,
Rob

Revision history for this message
Jeroen T. Vermeulen (jtv) wrote :

I'd be more careful in assuming similarity with a JIT VM. Differences in relative overhead aside, JIT compilers don't specialize method calls much. There is actually a JVM that combines JIT with inlining of all code, and from what I hear it yields fantastic results. A lot of the startup overhead would be in the inlining, which doesn't come up per se in fetch profiling.

As an example of specialization, if Launchpad used automatic optimization based on this profiler, a given query method somewhere deep down the call stack gets optimized separately when used from the web service API or when used from the web UI. The two calls have very different needs. We don't have any decent solution for that at the moment.

If you're willing to be so aggressive in fetching objects as to do it "statically," then you might as well use automatic optimization with a warm-up time of 1: generate optimization advice after a first pass through a stretch of code, then repeat periodically to cover any objects that are also needed but weren't referenced in that first run. To amortize startup cost over more requests, pickle the optimization choices and presto: profile-driven optimizations get reused across restarts.

Separately from that, the term "efficient" is treacherous. A static approach is almost guaranteed to be more efficient in terms of computing power, yes, but less efficient when it comes to the human factors: flexibility, legibility, conceptual cleanliness. (Isn't that why we're using Python in the first place?) A dynamic approach, on the other hand, can narrow the gap in computational efficiency without sacrificing any of those other efficiencies. It's also easier to deploy and fine-tune such optimizations across the entire application.

Jeroen

lp:~jtv/storm/profile-fetches updated
419. By Jeroen T. Vermeulen

Don't support derived_from without a known reference.

420. By Jeroen T. Vermeulen

Record origin, source, and reference; ignore cross-store dependencies.

421. By Jeroen T. Vermeulen

Cosmetic.

422. By Jeroen T. Vermeulen

Documentation; made is_root a @property.

Unmerged revisions

422. By Jeroen T. Vermeulen

Documentation; made is_root a @property.

421. By Jeroen T. Vermeulen

Cosmetic.

420. By Jeroen T. Vermeulen

Record origin, source, and reference; ignore cross-store dependencies.

419. By Jeroen T. Vermeulen

Don't support derived_from without a known reference.

418. By Jeroen T. Vermeulen

Aggregate stats by context name; nicer cumulate API.

417. By Jeroen T. Vermeulen

Context iteration.

416. By Jeroen T. Vermeulen

Test add_to_dict separately.

415. By Jeroen T. Vermeulen

Move profiling functions out of Store.

414. By Jeroen T. Vermeulen

Context manager.

413. By Jeroen T. Vermeulen

Track derived fetches across stores.

Preview Diff

=== modified file 'storm/database.py'
--- storm/database.py 2010-04-16 07:14:25 +0000
+++ storm/database.py 2011-06-20 13:09:26 +0000
@@ -52,6 +52,7 @@
     def __init__(self, connection, raw_cursor):
         self._connection = connection  # Ensures deallocation order.
         self._raw_cursor = raw_cursor
+        self.fetch_origin = None
         if raw_cursor.arraysize == 1:
             # Default of 1 is silly.
             self._raw_cursor.arraysize = 10
 
=== added file 'storm/fetch_profile.py'
--- storm/fetch_profile.py 1970-01-01 00:00:00 +0000
+++ storm/fetch_profile.py 2011-06-20 13:09:26 +0000
@@ -0,0 +1,255 @@
+#
+# Copyright (c) 2011 Canonical
+#
+# Written by Jeroen Vermeulen at Canonical.
+#
+# This file is part of Storm Object Relational Mapper.
+#
+# Storm is free software; you can redistribute it and/or modify
+# it under the terms of the GNU Lesser General Public License as
+# published by the Free Software Foundation; either version 2.1 of
+# the License, or (at your option) any later version.
+#
+# Storm is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU Lesser General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program.  If not, see <http://www.gnu.org/licenses/>.
+#
+
+"""Fetch-profiling support.
+
+This profiles how objects are pulled from the database into the local cache.
+Accesses to objects that are already cached are ignored.
+
+
+= Original and Derived Fetches =
+
+The profiler distinguishes two ways in which an object can be retrieved from
+the database: "original fetches" and "derived fetches."
+
+An original fetch happens during a free-form query, such as...
+
+    employees = store.find(Employee, Employee.name.startswith(name_pattern))
+
+A derived fetch happens when the program follows a reference from an object
+that needs to be retrieved from the database.  These are often a problem in
+ORM performance, because object-oriented programs can easily request large
+numbers of fetches in small, inefficient queries.  For instance, this could
+query all your company's Department records one by one in separate queries:
+
+    for emp in employees:
+        print(emp.department.name)
+
+Here, the fetch that pulls emp.department into the cache is "derived" from
+the original fetch that retrieved emp itself.  A derived fetch can also be
+derived from another derived fetch, but ultimately the chain leads back to
+either an original fetch or a new object in memory.
+
+When it comes to optimizing your application, one of the things you'll want
+to look at is reducing derived fetches.  That simple loop over employees
+might be many times faster if you loaded all of the departments you needed
+into cache in one single query:
+
+    # "Pre-fetch" the employees' departments into Storm's cache.
+    dept_ids = set([emp.department_id for emp in employees])
+    list(store.find(Department, Department.id.is_in(dept_ids)))
+    for emp in employees:
+        # Faster, because emp.department is in cache now!
+        print(emp.department.name)
+
+Sometimes it may even be most efficient to join the derived fetch into your
+original query:
+
+    emps_and_depts = store.find(
+        (Employee, Department),
+        Department.id == Employee.department_id,
+        Employee.name.startswith(name_pattern))
+
+    for emp, dept in emps_and_depts:
+        print(dept.name)
+
+You'll recognize opportunities for this optimization when you see the same
+derived fetch happen many times after the same original query.  But it's
+probably not worth doing this when the derived fetches are infrequent: the
+code path that does the derived fetch could be rare, or the reference might
+be None in most cases.  Or, more likely in this example, the query will
+return many employees but only very few different departments.  Most are
+probably already in the cache, in which case the profile won't count them.
+
+The profile tells you how many objects were fetched at each step along any
+data access path.  So for example the profiler might report that the
+original employees query pulled 1,233 objects into memory, but the
+"emp.department" in the loop only fetched 62 departments: that means you can
+save 61 queries by fetching all departments in one go.  Or you might see
+zero derived fetches because the departments are already in cache, in which
+case the loop is not worth optimizing.  You might even see about the same
+number of departments fetched as employees, in which case your best option
+may be to join Employee and Department together into one query.  Maybe
+there is too much data to hold in cache, and you need to get really
+creative.
+
+
+= Fetch Contexts =
+
+When you spot a performance problem in the profile, you'll want to know
+exactly where in your application it occurs.  Sometimes the name of the
+function is enough, but most of the time you'll want to know something more
+about the context in which that function was called.
+
+To help you keep track of this information, the profiler lets you define
+"fetch contexts," which are profiled separately.  An original fetch is
+counted in its current context, but a derived fetch is counted in the
+context of the original fetch that it is ultimately derived from.
+
+So when you look at the profile for a particular context, you see not only
+the data it queries (its original fetches) but also the "future" of that
+data: how will this data be used after my function returns it?  What can we
+start prefetching here to speed up the code that consumes this data?  It
+doesn't matter whether that other code is in the same context or not.
+
+You can name contexts whatever you like.  Contexts can also nest inside
+contexts: you can have a context "process_salaries" with a context
+"get_employees" inside it, and a different context "send_birthday_cards"
+also with a "get_employees" inside it.  Those two nested contexts are
+different ones, even though they're both called "get_employees," and even
+though they could actually be the same function.
+
+This lets you profile the same code separately in different situations, and
+you can then decide whether they need the same optimizations or not.  For
+instance, a function in a web application could have different performance
+characteristics depending on which page is being rendered or even who is
+viewing it (an administrator might see more data than someone who is not
+logged in, for example).  In that case you could incorporate that
+information in your context names, so that you'll be able to tell them
+apart in the profile.
+"""
+
+
+__all__ = ["FetchContext", "FetchStatistics", "fetch_context"]
+
+
+class fetch_context(object):
+    """Context manager to mark a region of code as a fetch context."""
+    def __init__(self, store, context_name):
+        self.store = store
+        self.context_name = context_name
+
+    def __enter__(self):
+        """Start context `context_name` for the given `Store`."""
+        self.store.push_fetch_context(self.context_name)
+
+    def __exit__(self, *args, **kwargs):
+        """Close context."""
+        self.store.pop_fetch_context()
+
+
+class FetchContext(object):
+    """A context in which database fetches are recorded for profiling.
+
+    Contexts nest in order to support detailed views and aggregation.
+    However they need not exactly match the program's call stack.
+
+    Profiling is disabled in the root context.
+
+    :ivar name: The context's name.
+    :ivar parent: The parent context.
+    :ivar children: Child contexts, mapped by name.
+    :ivar stats: `FetchStatistics` for this context.
+    """
+    def __init__(self, name, parent=None):
+        self.name = name
+        self.parent = parent
+        self.children = {}
+        self.stats = FetchStatistics()
+
+    def __iter__(self):
+        for child in self.children.itervalues():
+            yield child
+            for grandchild in child:
+                yield grandchild
+
+    @property
+    def is_root(self):
+        """Is this the root context?"""
+        return self.parent is None
+
+    def get_child(self, name):
+        """Find a child context of the given name, or create one."""
+        child = self.children.get(name)
+        if child is None:
+            child = self.children[name] = FetchContext(name, parent=self)
+        return child
+
+    def cumulate_stats(self):
+        """Add up `FetchStatistics` for this context and its children."""
+        stats = self.stats.copy()
+        for child in self:
+            stats.merge(child.stats)
+        return stats
+
+    def aggregate_stats_by_name(self):
+        """Aggregate `FetchStatistics` for self and children by context name.
+
+        Returns a dict mapping context names to aggregated statistics for
+        all `FetchContext`s of those respective names among self and its
+        children.
+        """
+        names = set(child.name for child in self).union([self.name])
+        stats = dict(
+            (name, FetchStatistics())
+            for name in names if name is not None)
+        stats[self.name].merge(self.stats)
+        for child in self:
+            stats[child.name].merge(child.stats)
+        return stats
+
+
+def add_number_to_dict(dictionary, key, value=1):
+    """Add `value` to `dictionary[key]`, defaulting to 0."""
+    dictionary.setdefault(key, 0)
+    dictionary[key] += value
+
+
+def add_dict_to_dict(dest, addition):
+    """Add the values from dict `addition` into dict `dest`."""
+    for key, value in addition.iteritems():
+        add_number_to_dict(dest, key, value)
+
+
+class FetchStatistics(object):
+    """Fetch profiling statistics.
+
+    Statistics are recorded for each context, but they can also be
+    aggregated.
+
+    :ivar original_fetches: Maps fetch origin (i.e. a class being fetched)
+        to a count of the number of original fetches on that class.
+    :ivar derived_fetches: Maps derived fetches to their respective fetch
+        counts.  A derived fetch is represented as a tuple
+        (origin, source, Reference).
+    """
+    def __init__(self):
+        self.original_fetches = {}
+        self.derived_fetches = {}
+
+    def copy(self):
+        new = FetchStatistics()
+        new.original_fetches = self.original_fetches.copy()
+        new.derived_fetches = self.derived_fetches.copy()
+        return new
+
+    def record_original_fetch(self, origin):
+        """Record an original fetch in the statistics."""
+        add_number_to_dict(self.original_fetches, origin)
+
+    def record_derived_fetch(self, origin, source, reference):
+        """Record a derived fetch in the statistics."""
+        fetch = (origin, source, reference)
+        add_number_to_dict(self.derived_fetches, fetch)
+
+    def merge(self, other_stats):
+        """For aggregation purposes: merge `other_stats` into `self`."""
+        add_dict_to_dict(self.original_fetches, other_stats.original_fetches)
+        add_dict_to_dict(self.derived_fetches, other_stats.derived_fetches)
=== modified file 'storm/references.py'
--- storm/references.py 2010-06-01 08:33:33 +0000
+++ storm/references.py 2011-06-20 13:09:26 +0000
@@ -1,5 +1,5 @@
 #
-# Copyright (c) 2006, 2007 Canonical
+# Copyright (c) 2006-2011 Canonical
 #
 # Written by Gustavo Niemeyer <gustavo@niemeyer.net>
 #
@@ -22,7 +22,8 @@
 
 from storm.exceptions import (
     ClassInfoError, FeatureError, NoStoreError, WrongStoreError)
-from storm.store import Store, get_where_for_args, LostObjectError
+from storm.store import (
+    Store, get_where_for_args, LostObjectError, record_derived_fetch)
 from storm.variables import LazyValue
 from storm.expr import (
     Select, Column, Exists, ComparableExpr, LeftJoin, Not, SQLRaw,
@@ -133,7 +134,8 @@
     def __get__(self, local, cls=None):
         if local is not None:
             # Don't use local here, as it might be security proxied.
-            local = get_obj_info(local).get_obj()
+            local_obj_info = get_obj_info(local)
+            local = local_obj_info.get_obj()
 
         if self._cls is None:
             self._cls = _find_descriptor_class(cls or local.__class__, self)
@@ -154,11 +156,17 @@
 
         if self._relation.remote_key_is_primary:
             remote = store.get(self._relation.remote_cls,
-                               self._relation.get_local_variables(local))
+                               self._relation.get_local_variables(local),
+                               derived_from=(local, self))
         else:
             where = self._relation.get_where_for_remote(local)
             result = store.find(self._relation.remote_cls, where)
+            result.fetch_context = local_obj_info["fetch_context"]
+            if not result.fetch_context.is_root:
+                result.fetch_origin = local_obj_info.get("fetch_origin")
             remote = result.one()
+            if remote is not None:
+                record_derived_fetch(self, local_obj_info, get_obj_info(remote))
 
         if remote is not None:
             self._relation.link(local, remote)
 
=== modified file 'storm/store.py'
--- storm/store.py 2011-05-16 10:45:52 +0000
+++ storm/store.py 2011-06-20 13:09:26 +0000
@@ -1,5 +1,5 @@
 #
-# Copyright (c) 2006, 2007 Canonical
+# Copyright (c) 2006-2011 Canonical
 #
 # Written by Gustavo Niemeyer <gustavo@niemeyer.net>
 #
@@ -37,6 +37,7 @@
 from storm.exceptions import (
     WrongStoreError, NotFlushedError, OrderLoopError, UnorderedError,
     NotOneError, FeatureError, CompileError, LostObjectError, ClassInfoError)
+from storm.fetch_profile import FetchContext
 from storm import Undef
 from storm.cache import Cache
 from storm.event import EventSystem
@@ -49,6 +50,48 @@
 PENDING_REMOVE = 2
 
 
+def record_unfetched_object(fetch_context, obj_info):
+    """Record creation of an object not fetched from the database."""
+    obj_info["fetch_context"] = fetch_context
+    if not fetch_context.is_root:
+        obj_info["fetch_origin"] = obj_info.cls_info.cls
+
+
+def record_original_fetch(obj_info, fetch_context, cls):
+    """Record an original fetch.
+
+    The fetch context may or may not be the store's currently active
+    context; the active context may have changed between the point where
+    the query occurs in the program and the point where it is actually
+    issued to the database.
+
+    :param obj_info: `ObjectInfo` for the object being fetched.
+    :param fetch_context: The `FetchContext` issuing the fetch.
+    :param cls: The class that's being fetched.
+    """
+    obj_info["fetch_context"] = fetch_context
+    if not fetch_context.is_root:
+        obj_info["fetch_origin"] = cls
+        fetch_context.stats.record_original_fetch(cls)
+
+
+def record_derived_fetch(reference, local, remote):
+    """Record a derived fetch.
+
+    :param reference: The `Reference` whose dereference triggers the fetch.
+    :param local: `ObjectInfo` for the object whose reference is being
+        followed.
+    :param remote: `ObjectInfo` for the object that's being fetched.
+    """
+    fetch_context = local["fetch_context"]
+    remote["fetch_context"] = fetch_context
+    if not fetch_context.is_root:
+        fetch_origin = local["fetch_origin"]
+        remote["fetch_origin"] = fetch_origin
+        fetch_context.stats.record_derived_fetch(
+            fetch_origin, local.cls_info.cls, reference)
+
+
 class Store(object):
     """The Storm Store.
 
@@ -80,6 +123,7 @@
         self._cache = cache
         self._implicit_flush_block_count = 0
         self._sequence = 0  # Advisory ordering.
+        self.fetch_context = FetchContext(None)
 
     def get_database(self):
         """Return this Store's Database object."""
@@ -137,13 +181,16 @@
         self.invalidate()
         self._connection.rollback()
 
-    def get(self, cls, key):
+    def get(self, cls, key, derived_from=None):
         """Get object of type cls with the given primary key from the database.
 
        If the object is alive the database won't be touched.
 
        @param cls: Class of the object to be retrieved.
        @param key: Primary key of object. May be a tuple for composed keys.
+       @param derived_from: For profiling purposes, an optional tuple of the
+           object that this fetch is derived from and its reference property
+           that linked to this object.
 
        @return: The object found with the given primary key, or None
            if no object is found.
@@ -176,10 +223,28 @@
             default_tables=cls_info.table, limit=1)
 
         result = self._connection.execute(select)
+
+        if derived_from is None:
+            result.fetch_context = self.fetch_context
+        else:
+            origin_obj, origin_ref = derived_from
+            origin_obj_info = get_obj_info(origin_obj)
+            result.fetch_context = origin_obj_info["fetch_context"]
+            if not result.fetch_context.is_root:
+                result.fetch_origin = origin_obj_info.get("fetch_origin")
+
         values = result.get_one()
         if values is None:
             return None
-        return self._load_object(cls_info, result, values)
+
+        obj = self._load_object(cls_info, result, values)
+        if derived_from is not None:
+            obj_info = get_obj_info(obj)
+            if origin_obj_info["store"] == self:
+                record_derived_fetch(origin_ref, origin_obj_info, obj_info)
+            else:
+                record_original_fetch(obj_info, self.fetch_context, cls)
+        return obj
 
     def find(self, cls_spec, *args, **kwargs):
         """Perform a query.
@@ -260,6 +325,7 @@
             obj_info["pending"] = PENDING_ADD
             self._set_dirty(obj_info)
             self._enable_lazy_resolving(obj_info)
+            record_unfetched_object(self.fetch_context, obj_info)
             obj_info.event.emit("added")
 
         return obj
@@ -710,6 +776,10 @@
         self._set_values(obj_info, cls_info.columns, result, values,
                          replace_unknown_lazy=True)
 
+        if result.fetch_origin is None:
+            # This is an original fetch.
+            record_original_fetch(obj_info, result.fetch_context, cls)
+
         self._add_to_alive(obj_info)
         self._enable_change_notification(obj_info)
         self._enable_lazy_resolving(obj_info)
@@ -895,6 +965,23 @@
         self._set_values(obj_info, autoreload_columns,
                          result, result.get_one())
 
+    def push_fetch_context(self, context_name):
+        """Enter a fetch context.
+
+        If no fetch context was active previously, this enables
+        profiling.
+        """
+        self.fetch_context = self.fetch_context.get_child(context_name)
+
+    def pop_fetch_context(self):
+        """Leave the current fetch context.
+
+        If the current context was the outermost one, this disables
+        profiling.
+        """
+        assert not self.fetch_context.is_root, "Popped root fetch context."
+        self.fetch_context = self.fetch_context.parent
+
 
 class ResultSet(object):
     """The representation of the results of a query.
@@ -920,6 +1007,8 @@
         self._distinct = False
         self._group_by = Undef
         self._having = Undef
+        self.fetch_context = store.fetch_context
+        self.fetch_origin = None
 
     def copy(self):
         """Return a copy of this ResultSet object, with the same configuration.
@@ -976,6 +1065,7 @@
         """Iterate the results of the query.
         """
         result = self._store._connection.execute(self._get_select())
+        result.fetch_context = self.fetch_context
         for values in result:
             yield self._load_objects(result, values)
 
@@ -1068,6 +1158,7 @@
         select.limit = 1
         select.order_by = Undef
         result = self._store._connection.execute(select)
+        result.fetch_context = self.fetch_context
        values = result.get_one()
        if values:
            return self._load_objects(result, values)
@@ -1081,6 +1172,7 @@
1081 select = self._get_select()1172 select = self._get_select()
1082 select.limit = 11173 select.limit = 1
1083 result = self._store._connection.execute(select)1174 result = self._store._connection.execute(select)
1175 result.fetch_context = self.fetch_context
1084 values = result.get_one()1176 values = result.get_one()
1085 if values:1177 if values:
1086 return self._load_objects(result, values)1178 return self._load_objects(result, values)
@@ -1122,6 +1214,7 @@
1122 else:1214 else:
1123 select.order_by.append(Desc(expr))1215 select.order_by.append(Desc(expr))
1124 result = self._store._connection.execute(select)1216 result = self._store._connection.execute(select)
1217 result.fetch_context = self.fetch_context
1125 values = result.get_one()1218 values = result.get_one()
1126 if values:1219 if values:
1127 return self._load_objects(result, values)1220 return self._load_objects(result, values)
@@ -1140,6 +1233,7 @@
1140 if select.limit is not Undef and select.limit > 2:1233 if select.limit is not Undef and select.limit > 2:
1141 select.limit = 21234 select.limit = 2
1142 result = self._store._connection.execute(select)1235 result = self._store._connection.execute(select)
1236 result.fetch_context = self.fetch_context
1143 values = result.get_one()1237 values = result.get_one()
1144 if result.get_one():1238 if result.get_one():
1145 raise NotOneError("one() used with more than one result available")1239 raise NotOneError("one() used with more than one result available")
11461240
=== added file 'tests/fetch_context.py'
--- tests/fetch_context.py	1970-01-01 00:00:00 +0000
+++ tests/fetch_context.py	2011-06-20 13:09:26 +0000
@@ -0,0 +1,164 @@
+# -*- coding: utf-8 -*-
+
+from storm.fetch_profile import FetchContext, FetchStatistics, fetch_context
+
+from tests.helper import TestHelper
+
+
+class FakeStats(object):
+    def __init__(self, contents=None):
+        if contents is None:
+            self.contents = set()
+        else:
+            self.contents = set(contents)
+
+    def merge(self, other_stats):
+        self.contents = self.contents.union(other_stats.contents)
+
+
+class FakeStore(object):
+    def __init__(self):
+        self.fetch_context = FetchContext(None)
+
+    def push_fetch_context(self, name):
+        self.fetch_context = FetchContext(name, parent=self.fetch_context)
+
+    def pop_fetch_context(self):
+        self.fetch_context = self.fetch_context.parent
+
+
+class FetchContextTest(TestHelper):
+    def get_relatives(self, context):
+        """Return a tuple of `context`'s parent and children."""
+        return (context.parent, context.children)
+
+    def test_initially_childless(self):
+        self.assertEqual({}, FetchContext("context").children)
+
+    def test_iter_childless_context_yields_nothing(self):
+        self.assertEqual([], list(FetchContext("context")))
+
+    def test_iter_does_not_yield_parent(self):
+        parent = FetchContext("parent")
+        child = parent.get_child("child")
+        self.assertEqual([], list(child))
+
+    def test_iter_context_with_children_yields_children(self):
+        root = FetchContext("root")
+        one = root.get_child("one")
+        two = root.get_child("two")
+        self.assertEqual(set([one, two]), set(root))
+
+    def test_iter_context_includes_grandchildren(self):
+        root = FetchContext("root")
+        child = root.get_child("child")
+        grandchild = child.get_child("grandchild")
+        self.assertEqual(set([child, grandchild]), set(root))
+
+    def test_iter_context_includes_grand_grandchildren(self):
+        root = FetchContext("root")
+        child = root.get_child("child")
+        grandchild = child.get_child("grandchild")
+        grand_grandchild = grandchild.get_child("grand-grandchild")
+        self.assertEqual(set([child, grandchild, grand_grandchild]), set(root))
+
+    def test_is_root_for_root(self):
+        self.assertTrue(FetchContext("root").is_root())
+
+    def test_is_root_for_child(self):
+        root = FetchContext("root")
+        self.assertFalse(root.get_child("child").is_root())
+
+    def test_get_child_creates_first_child(self):
+        parent = FetchContext("parent")
+        child = parent.get_child("child")
+
+        self.assertEqual((None, {"child": child}), self.get_relatives(parent))
+        self.assertEqual((parent, {}), self.get_relatives(child))
+
+    def test_get_child_adds_child(self):
+        parent = FetchContext("parent")
+        eldest = parent.get_child("eldest")
+        youngest = parent.get_child("youngest")
+
+        children = {
+            "eldest": eldest,
+            "youngest": youngest,
+        }
+        self.assertEqual((None, children), self.get_relatives(parent))
+        self.assertEqual((parent, {}), self.get_relatives(eldest))
+        self.assertEqual((parent, {}), self.get_relatives(youngest))
+        self.assertNotEqual(eldest, youngest)
+
+    def test_get_child_finds_child(self):
+        parent = FetchContext("parent")
+        child = parent.get_child("child")
+        self.assertEqual(child, parent.get_child("child"))
+
+    def test_cumulate_stats_on_empty_context_yields_empty(self):
+        stats = FetchContext("context").cumulate_stats()
+        self.assertEqual({}, stats.original_fetches)
+        self.assertEqual({}, stats.derived_fetches)
+
+    def test_cumulate_stats_includes_local_stats(self):
+        context = FetchContext("context")
+        context.stats.original_fetches = {("origin", "reference", "store"): 1}
+        self.assertEqual(context.stats.original_fetches,
+                         context.cumulate_stats().original_fetches)
+
+    def test_cumulate_stats_includes_child_stats(self):
+        parent = FetchContext("parent")
+        child = parent.get_child("child")
+        child.stats.original_fetches = {("origin", "reference", "store"): 1}
+        self.assertEqual(child.stats.original_fetches,
+                         parent.cumulate_stats().original_fetches)
+
+    def test_aggregate_stats_by_name_includes_local_stats(self):
+        context = FetchContext("context")
+        fetch = ("origin", "reference", "store")
+        context.stats.original_fetches[fetch] = 1
+        stats = context.aggregate_stats_by_name()
+        self.assertEqual(["context"], stats.keys())
+        self.assertEqual({fetch: 1}, stats["context"].original_fetches)
+
+    def test_aggregate_stats_by_name_includes_child_stats(self):
+        parent = FetchContext("parent")
+        child = parent.get_child("child")
+        fetch = ("origin", "reference", "store")
+        child.stats.original_fetches[fetch] = 1
+        stats = parent.aggregate_stats_by_name()
+        self.assertEqual({fetch: 1}, stats["child"].original_fetches)
+
+    def test_aggregate_stats_by_name_aggregates(self):
+        root = FetchContext("x")
+        fetch = ("origin", "reference", "store")
+        root.stats.original_fetches[fetch] = 1
+        root.get_child("x").stats.original_fetches[fetch] = 1
+        stats = root.aggregate_stats_by_name()
+        self.assertEqual(["x"], stats.keys())
+        self.assertEqual({fetch: 2}, stats["x"].original_fetches)
+
+    def test_context_manager_pushes_context(self):
+        store = FakeStore()
+        with fetch_context(store, "with"):
+            current_context = store.fetch_context.name
+        self.assertEqual("with", current_context)
+
+    def test_context_manager_pops_context_on_normal_exit(self):
+        store = FakeStore()
+        with fetch_context(store, "with"):
+            pass
+        self.assertTrue(store.fetch_context.is_root())
+
+    def test_context_manager_pops_context_on_exception(self):
+        class ArbitraryException(Exception):
+            pass
+
+        store = FakeStore()
+        try:
+            with fetch_context(store, "with"):
+                raise ArbitraryException()
+        except ArbitraryException:
+            pass
+
+        self.assertTrue(store.fetch_context.is_root())
=== added file 'tests/fetch_profile.py'
--- tests/fetch_profile.py	1970-01-01 00:00:00 +0000
+++ tests/fetch_profile.py	2011-06-20 13:09:26 +0000
@@ -0,0 +1,64 @@
+# -*- coding: utf-8 -*-
+
+from storm.store import Store, record_original_fetch, record_derived_fetch
+
+from tests.helper import TestHelper
+
+
+class DummyDatabase(object):
+
+    def connect(self, event=None):
+        return None
+
+
+class FetchProfilingTest(TestHelper):
+
+    def test_initial_context_is_root(self):
+        store = Store(DummyDatabase())
+        self.assertTrue(store.fetch_context.is_root())
+
+    def test_push_fetch_context(self):
+        store = Store(DummyDatabase())
+        store.push_fetch_context("context")
+        self.assertFalse(store.fetch_context.is_root())
+
+    def test_pop_fetch_context(self):
+        store = Store(DummyDatabase())
+        store.push_fetch_context("context")
+        store.pop_fetch_context()
+        self.assertTrue(store.fetch_context.is_root())
+
+    def test_record_original_fetch(self):
+        store = Store(DummyDatabase())
+        store.push_fetch_context("context")
+        fake_object = {"store": store}
+        record_original_fetch(fake_object, store.fetch_context, "class")
+        self.assertEqual({"class": 1},
+                         store.fetch_context.stats.original_fetches)
+        self.assertEqual(store.fetch_context, fake_object["fetch_context"])
+
+    def test_record_derived_fetch(self):
+        class FakeObjInfo(dict):
+            pass
+        class FakeClsInfo(object):
+            def __init__(self, cls):
+                self.cls = cls
+
+        store = Store(DummyDatabase())
+        store.push_fetch_context("context")
+        fake_local_object = FakeObjInfo(store=store,
+                                        fetch_context=store.fetch_context,
+                                        fetch_origin="origin")
+        fake_local_object.cls_info = FakeClsInfo("source")
+        fake_remote_object = FakeObjInfo(store=store)
+        record_derived_fetch("reference", fake_local_object, fake_remote_object)
+
+        self.assertEqual({("origin", "source", "reference"): 1},
+                         store.fetch_context.stats.derived_fetches)
+        self.assertEqual("origin", fake_remote_object["fetch_origin"])
+
+    def test_root_context_does_not_profile(self):
+        store = Store(DummyDatabase())
+        fake_object = {"store": store}
+        record_original_fetch(fake_object, store.fetch_context, "class")
+        self.assertEqual({}, store.fetch_context.stats.original_fetches)
=== added file 'tests/fetch_statistics.py'
--- tests/fetch_statistics.py	1970-01-01 00:00:00 +0000
+++ tests/fetch_statistics.py	2011-06-20 13:09:26 +0000
@@ -0,0 +1,99 @@
+# -*- coding: utf-8 -*-
+
+from storm.fetch_profile import add_to_dict, FetchStatistics
+
+from tests.helper import TestHelper
+
+
+class AddToDictTest(TestHelper):
+    def test_creates_entry(self):
+        data = {}
+        add_to_dict(data, "x", 1)
+        self.assertEqual({"x": 1}, data)
+
+    def test_adds_to_entry(self):
+        data = {"x": 1}
+        add_to_dict(data, "x", 1)
+        self.assertEqual({"x": 2}, data)
+
+
+class FetchStatisticsTest(TestHelper):
+    def test_initially_empty(self):
+        empty = FetchStatistics()
+        self.assertEqual({}, empty.original_fetches)
+        self.assertEqual({}, empty.derived_fetches)
+
+    def test_record_original_fetch(self):
+        stats = FetchStatistics()
+        stats.record_original_fetch("origin")
+        self.assertEqual({"origin": 1}, stats.original_fetches)
+
+    def test_record_derived_fetch(self):
+        stats = FetchStatistics()
+        fetch = ("origin", "source", "reference")
+        stats.record_derived_fetch(*fetch)
+        self.assertEqual({fetch: 1}, stats.derived_fetches)
+
+    def test_copy(self):
+        stats = FetchStatistics()
+        stats.record_original_fetch("origin")
+        stats.record_derived_fetch("origin", "source", "reference")
+        copy = stats.copy()
+        self.assertEqual(stats.original_fetches, copy.original_fetches)
+        self.assertEqual(stats.derived_fetches, copy.derived_fetches)
+        self.assertNotEqual(stats, copy)
+
+    def test_merge_empty_does_nothing(self):
+        stats = FetchStatistics()
+        stats.original_fetches = {"origin": 1}
+        derived_fetch = ("origin", "source", "reference")
+        stats.derived_fetches = {derived_fetch: 1}
+        empty = FetchStatistics()
+        stats.merge(empty)
+        self.assertEqual({"origin": 1}, stats.original_fetches)
+        self.assertEqual({derived_fetch: 1}, stats.derived_fetches)
+
+    def test_merge_adds_counts(self):
+        stats = FetchStatistics()
+        other_stats = FetchStatistics()
+        other_stats.original_fetches = {"other_origin": 1}
+        derived_fetch = ("other_origin", "other_source", "reference")
+        other_stats.derived_fetches = {derived_fetch: 1}
+        stats.merge(other_stats)
+        self.assertEqual(other_stats.original_fetches, stats.original_fetches)
+        self.assertEqual(other_stats.derived_fetches, stats.derived_fetches)
+
+    def test_merge_leaves_existing_counts_in_place(self):
+        stats = FetchStatistics()
+        stats.original_fetches = {"origin": 1}
+        derived_fetch = ("origin", "source", "reference")
+        stats.derived_fetches = {derived_fetch: 1}
+        other_stats = FetchStatistics()
+        other_stats.original_fetches = {"other_origin": 1}
+        other_derived_fetch = ("other_origin", "other_source", "reference")
+        other_stats.derived_fetches = {other_derived_fetch: 1}
+        stats.merge(other_stats)
+
+        cumulative_original_fetches = {
+            "origin": 1,
+            "other_origin": 1,
+        }
+        cumulative_derived_fetches = {
+            derived_fetch: 1,
+            other_derived_fetch: 1,
+        }
+        self.assertEqual(cumulative_original_fetches, stats.original_fetches)
+        self.assertEqual(cumulative_derived_fetches, stats.derived_fetches)
+
+    def test_merge_sums_counts(self):
+        stats = FetchStatistics()
+        stats.original_fetches = {"origin": 1}
+        derived_fetch = ("origin", "source", "reference")
+        stats.derived_fetches = {derived_fetch: 1}
+        other_stats = FetchStatistics()
+        other_stats.original_fetches = {"origin": 1}
+        other_stats.derived_fetches = {derived_fetch: 1}
+        stats.merge(other_stats)
+
+        self.assertEqual({"origin": 2}, stats.original_fetches)
+        self.assertEqual({derived_fetch: 2}, stats.derived_fetches)
=== modified file 'tests/store/base.py'
--- tests/store/base.py	2011-02-14 12:17:54 +0000
+++ tests/store/base.py	2011-06-20 13:09:26 +0000
@@ -29,8 +29,13 @@
 
 from storm.references import Reference, ReferenceSet, Proxy
 from storm.database import Result
+<<<<<<< TREE
 from storm.properties import (
     Int, Float, RawStr, Unicode, Property, Pickle, UUID)
+=======
+from storm.fetch_profile import fetch_context
+from storm.properties import Int, Float, RawStr, Unicode, Property, Pickle
+>>>>>>> MERGE-SOURCE
 from storm.properties import PropertyPublisherMeta, Decimal
 from storm.variables import PickleVariable
 from storm.expr import (
@@ -6004,6 +6009,212 @@
         result_to_remove = self.store.find(Foo, Foo.id <= 30)
         self.assertEquals(result_to_remove.remove(), 3)
 
+    def test_push_fetch_context(self):
+        root = self.store.fetch_context
+        self.store.push_fetch_context("child")
+        self.assertEqual(root, self.store.fetch_context.parent)
+
+    def test_pop_fetch_context(self):
+        root = self.store.fetch_context
+        self.store.push_fetch_context("child")
+        self.store.pop_fetch_context()
+        self.assertEqual(root, self.store.fetch_context)
+
+    def test_fetch_context_manager(self):
+        with fetch_context(self.store, "with-context"):
+            context_name = self.store.fetch_context.name
+        self.assertEqual("with-context", context_name)
+
+    def test_profile_find(self):
+        self.store.push_fetch_context("test")
+        obj = self.store.find(Foo).any()
+        stats = self.store.fetch_context.stats
+        self.assertEqual({Foo: 1}, stats.original_fetches)
+        self.assertEqual({}, stats.derived_fetches)
+
+    def test_profile_get(self):
+        self.store.push_fetch_context("test")
+        obj = self.store.get(Foo, 10)
+        stats = self.store.fetch_context.stats
+        self.assertEqual({Foo: 1}, stats.original_fetches)
+        self.assertEqual({}, stats.derived_fetches)
+
+    def test_profile_get_derived_from(self):
+        self.store.push_fetch_context("test")
+        bar = self.store.get(Bar, 100)
+        foo = self.store.get(Foo, bar.foo_id, derived_from=(bar, Bar.foo))
+        stats = self.store.fetch_context.stats
+        self.assertEqual({Bar: 1}, stats.original_fetches)
+        fetch = (Bar, Bar, Bar.foo)
+        self.assertEqual({fetch: 1}, stats.derived_fetches)
+
+    def test_profile_dereference(self):
+        self.store.push_fetch_context("test")
+        bar = self.store.get(Bar, 100)
+        foo = bar.foo
+        stats = self.store.fetch_context.stats
+        fetch = (Bar, Bar, Bar.foo)
+        self.assertEqual({fetch: 1}, stats.derived_fetches)
+
+    def test_profile_indirect_derived_fetch_records_origin_and_source(self):
+        self.store.execute("""
+            CREATE TEMPORARY TABLE splat (id integer, bar_id integer)
+            """)
+        self.store.execute("INSERT INTO splat (id, bar_id) VALUES (1, 100)")
+
+        class Splat(object):
+            __storm_table__ = "splat"
+            id = Int(primary=True)
+            bar_id = Int()
+            bar = Reference(bar_id, Bar.id)
+
+        with fetch_context(self.store, "test"):
+            splat = self.store.get(Splat, 1)
+            context = self.store.fetch_context
+
+        foo = splat.bar.foo
+
+        expected_fetches = {
+            (Splat, Splat, Splat.bar): 1,
+            (Splat, Bar, Bar.foo): 1,
+        }
+        self.assertEqual(expected_fetches, context.stats.derived_fetches)
+
+    def test_profile_derived_get_records_origin_and_source(self):
+        self.store.execute("""
+            CREATE TEMPORARY TABLE splat (id integer, bar_id integer)
+            """)
+        self.store.execute("INSERT INTO splat (id, bar_id) VALUES (1, 100)")
+
+        class Splat(object):
+            __storm_table__ = "splat"
+            id = Int(primary=True)
+            bar_id = Int()
+            bar = Reference(bar_id, Bar.id)
+
+        with fetch_context(self.store, "test"):
+            splat = self.store.get(Splat, 1)
+            context = self.store.fetch_context
+
+        bar = splat.bar
+        self.store.get(Foo, bar.foo_id, derived_from=(bar, Bar.foo))
+
+        expected_fetches = {
+            (Splat, Splat, Splat.bar): 1,
+            (Splat, Bar, Bar.foo): 1,
+        }
+        self.assertEqual(expected_fetches, context.stats.derived_fetches)
+
+
+    def test_profile_new_object_is_origin_but_not_fetched(self):
+        self.store.push_fetch_context("test")
+        bar = Bar()
+        bar.id = 999
+        bar.foo_id = 10
+        self.store.add(bar)
+        foo = bar.foo
+        stats = self.store.fetch_context.stats
+        self.assertEqual({}, stats.original_fetches)
+        self.assertEqual({(Bar, Bar, Bar.foo): 1}, stats.derived_fetches)
+
+    def test_profile_cached_objects_not_fetched(self):
+        foo = self.store.get(Foo, 10)
+        bar = self.store.get(Bar, 100)
+        self.store.push_fetch_context("test")
+        same_foo = bar.foo
+        self.assertEqual({}, self.store.fetch_context.stats.original_fetches)
+        self.assertEqual({}, self.store.fetch_context.stats.derived_fetches)
+
+    def test_profile_derived_fetch_uses_original_context(self):
+        with fetch_context(self.store, "original-fetch-context"):
+            original_context = self.store.fetch_context
+            bar = self.store.get(Bar, 100)
+        with fetch_context(self.store, "later-fetch-context"):
+            later_context = self.store.fetch_context
+            foo = bar.foo
+        self.assertEqual({}, later_context.stats.derived_fetches)
+        self.assertEqual({(Bar, Bar, Bar.foo): 1},
+                         original_context.stats.derived_fetches)
+
+    def test_profile_result_uses_original_context(self):
+        with fetch_context(self.store, "original-fetch-context"):
+            original_context = self.store.fetch_context
+            bar_result = self.store.find(Foo, Foo.id == 10)
+        with fetch_context(self.store, "later-fetch-context"):
+            later_context = self.store.fetch_context
+            bar = bar_result.one()
+        self.assertEqual({}, later_context.stats.original_fetches)
+        self.assertEqual({Foo: 1}, original_context.stats.original_fetches)
+
+    def test_profile_result_find_uses_original_context(self):
+        with fetch_context(self.store, "original-fetch-context"):
+            original_context = self.store.fetch_context
+            original_result = self.store.find(Foo, Foo.id == 10)
+        with fetch_context(self.store, "later-fetch-context"):
+            later_context = self.store.fetch_context
+            original_result.find(True).one()
+        self.assertEqual({}, later_context.stats.original_fetches)
+        self.assertEqual({Foo: 1}, original_context.stats.original_fetches)
+
+    def test_profile_contexts_persist(self):
+        with fetch_context(self.store, "context"):
+            foo = self.store.get(Foo, 10)
+            context = self.store.fetch_context
+        with fetch_context(self.store, "context"):
+            bar = self.store.get(Foo, 20)
+        self.assertEqual({Foo: 2}, context.stats.original_fetches)
+
+    def test_profile_does_not_count_empty_result(self):
+        self.store.push_fetch_context("context")
+        self.store.find(Foo, False).any()
+        self.assertEqual({}, self.store.fetch_context.stats.original_fetches)
+
+    def test_profile_counts_objects_fetched(self):
+        self.store.push_fetch_context("context")
+        list(self.store.find(Foo, Foo.id.is_in([10, 20])))
+        self.assertEqual({Foo: 2},
+                         self.store.fetch_context.stats.original_fetches)
+
+    def test_profile_counts_all_objects_in_join(self):
+        self.store.push_fetch_context("context")
+        list(self.store.find((Foo, Bar), Foo.id == Bar.foo_id, Foo.id == 10))
+        expected_fetches = {
+            Foo: 1,
+            Bar: 1,
+        }
+        self.assertEqual(expected_fetches,
+                         self.store.fetch_context.stats.original_fetches)
+
+    def test_profile_tracks_origin_within_join(self):
+        self.store.execute("UPDATE %s SET selfref_id = %d WHERE id = %d" % (
+            SelfRef.__storm_table__, 25, 15))
+        self.store.push_fetch_context("context")
+        query = self.store.find((Bar, SelfRef),
+                                Bar.id == 100,
+                                SelfRef.id == 15)
+        (bar, selfref) = query.one()
+        foo = bar.foo
+        expected_derived_fetches = {
+            (Bar, Bar, Bar.foo): 1,
+        }
+        self.assertEqual(expected_derived_fetches,
+                         self.store.fetch_context.stats.derived_fetches)
+        other_selfref = selfref.selfref
+        expected_derived_fetches[(SelfRef, SelfRef, SelfRef.selfref)] = 1
+        self.assertEqual(expected_derived_fetches,
+                         self.store.fetch_context.stats.derived_fetches)
+
+    def test_profile_derived_fetch_on_different_store_is_original_fetch(self):
+        self.store.push_fetch_context("context")
+        bar = self.store.get(Bar, 100)
+        other_store = self.create_store()
+        other_store.push_fetch_context("remote-context")
+        other_store.get(Foo, bar.foo_id, derived_from=(bar, Bar.foo))
+        self.assertEqual({}, self.store.fetch_context.stats.derived_fetches)
+        self.assertEqual({Foo: 1},
+                         other_store.fetch_context.stats.original_fetches)
+        self.assertEqual({}, other_store.fetch_context.stats.derived_fetches)
+
 
 class EmptyResultSetTest(object):
 
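For reviewers who want the intended call pattern at a glance, here is a standalone sketch of the context-stack semantics that the new tests exercise. The `FetchContext`, `fetch_context`, and `FakeStore` names below are re-implemented for illustration only, mirroring the behaviour shown in tests/fetch_context.py; this is not the `storm.fetch_profile` code itself:

```python
from contextlib import contextmanager


class FetchContext(object):
    """Illustrative stand-in for storm.fetch_profile.FetchContext."""

    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = {}

    def is_root(self):
        # The root context is the one with no parent; the branch's
        # Store starts out in a root context, where profiling is off.
        return self.parent is None

    def get_child(self, name):
        # Re-entering a context of the same name reuses the same child
        # node, so statistics persist across uses of one context name.
        if name not in self.children:
            self.children[name] = FetchContext(name, parent=self)
        return self.children[name]


@contextmanager
def fetch_context(store, name):
    # Push on entry; pop on both normal exit and exception, as the
    # context-manager tests above require.
    store.fetch_context = store.fetch_context.get_child(name)
    try:
        yield store.fetch_context
    finally:
        store.fetch_context = store.fetch_context.parent


class FakeStore(object):
    def __init__(self):
        self.fetch_context = FetchContext(None)


store = FakeStore()
with fetch_context(store, "request"):
    assert store.fetch_context.name == "request"
assert store.fetch_context.is_root()
```

The point of the stack shape is that nested contexts can attribute fetches to a specific code path, while popping back to the root turns profiling off again.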
