PEP: 343
Title: The "with" Statement
Version: $Revision$
Last-Modified: $Date$
Author: Guido van Rossum, Nick Coghlan
Status: Accepted
Type: Standards Track
Content-Type: text/plain
Created: 13-May-2005
Post-History: 2-Jun-2005, 16-Oct-2005, 29-Oct-2005, 23-Apr-2006
Abstract
This PEP adds a new statement "with" to the Python language to make
it possible to factor out standard uses of try/finally statements.
The PEP was approved in principle by the BDFL, but there were
still a couple of implementation details to be worked out (see the
section on Resolved Issues). It's still at Draft status until
Guido gives a final blessing to the updated PEP.
Author's Note
This PEP was originally written in first person by Guido, and
subsequently updated by Nick Coghlan to reflect later discussion
on python-dev. Any first person references are from Guido's
original.
Python's alpha release cycle revealed terminology problems in this
PEP and in the associated documentation and implementation [14].
So while the PEP is already accepted, that acceptance refers to the
implementation rather than to the exact terminology.
The current version of the PEP reflects the implementation and
documentation as at Python 2.5a2. The PEP will be updated to
reflect any changes made to the terminology prior to the final
Python 2.5 release.
Introduction
After a lot of discussion about PEP 340 and alternatives, I
decided to withdraw PEP 340 and proposed a slight variant on PEP
310. After more discussion, I have added back a mechanism for
raising an exception in a suspended generator using a throw()
method, and a close() method which throws a new GeneratorExit
exception; these additions were first proposed on python-dev in
[2] and universally approved of. I'm also changing the keyword to
'with'.
On-line discussion of this PEP should take place in the Python
Wiki [3].
If this PEP is approved, the following PEPs will be rejected due
to overlap:
- PEP 310, Reliable Acquisition/Release Pairs. This is the
original with-statement proposal.
- PEP 319, Python Synchronize/Asynchronize Block. Its use cases
can be covered by the current PEP by providing suitable
with-statement controllers: for 'synchronize' we can use the
"locking" template from example 1; for 'asynchronize' we can use
a similar "unlocking" template. I don't think having an
"anonymous" lock associated with a code block is all that
important; in fact it may be better to always be explicit about
the mutex being used.
(PEP 340 and PEP 346 have already been withdrawn.)
Motivation and Summary
PEP 340, Anonymous Block Statements, combined many powerful ideas:
using generators as block templates, adding exception handling and
finalization to generators, and more. Besides praise, it received
a lot of opposition from people who didn't like the fact that it
was, under the covers, a (potential) looping construct. This
meant that break and continue in a block-statement would break or
continue the block-statement, even if it was used as a non-looping
resource management tool.
But the final blow came when I read Raymond Chen's rant about
flow-control macros[1]. Raymond argues convincingly that hiding
flow control in macros makes your code inscrutable, and I find
that his argument applies to Python as well as to C. I realized
that PEP 340 templates can hide all sorts of control flow; for
example, its example 4 (auto_retry()) catches exceptions and
repeats the block up to three times.
However, the with-statement of PEP 310 does *not* hide control
flow, in my view: while a finally-suite temporarily suspends the
control flow, in the end, the control flow resumes as if the
finally-suite wasn't there at all.
Remember, PEP 310 proposes roughly this syntax (the "VAR =" part is
optional):
    with VAR = EXPR:
        BLOCK
which roughly translates into this:
    VAR = EXPR
    VAR.__enter__()
    try:
        BLOCK
    finally:
        VAR.__exit__()
Now consider this example:
    with f = open("/etc/passwd"):
        BLOCK1
    BLOCK2
Here, just as if the first line was "if True" instead, we know
that if BLOCK1 completes without an exception, BLOCK2 will be
reached; and if BLOCK1 raises an exception or executes a non-local
goto (a break, continue or return), BLOCK2 is *not* reached. The
magic added by the with-statement at the end doesn't affect this.
(You may ask, what if a bug in the __exit__() method causes an
exception? Then all is lost -- but this is no worse than with
other exceptions; the nature of exceptions is that they can happen
*anywhere*, and you just have to live with that. Even if you
write bug-free code, a KeyboardInterrupt exception can still cause
it to exit between any two virtual machine opcodes.)
This argument almost led me to endorse PEP 310, but I had one idea
left from the PEP 340 euphoria that I wasn't ready to drop: using
generators as "templates" for abstractions like acquiring and
releasing a lock or opening and closing a file is a powerful idea,
as can be seen by looking at the examples in that PEP.
Inspired by a counter-proposal to PEP 340 by Phillip Eby I tried
to create a decorator that would turn a suitable generator into an
object with the necessary __enter__() and __exit__() methods.
Here I ran into a snag: while it wasn't too hard for the locking
example, it was impossible to do this for the opening example.
The idea was to define the template like this:
    @contextmanager
    def opening(filename):
        f = open(filename)
        try:
            yield f
        finally:
            f.close()
and use it like this:
    with f = opening(filename):
        ...read data from f...
The problem is that in PEP 310, the result of calling EXPR is
assigned directly to VAR, and then VAR's __exit__() method is
called upon exit from BLOCK1. But here, VAR clearly needs to
receive the opened file, and that would mean that __exit__() would
have to be a method on the file.
While this can be solved using a proxy class, this is awkward and
made me realize that a slightly different translation would make
writing the desired decorator a piece of cake: let VAR receive the
result from calling the __enter__() method, and save the value of
EXPR to call its __exit__() method later. Then the decorator can
return an instance of a wrapper class whose __enter__() method
calls the generator's next() method and returns whatever next()
returns; the wrapper instance's __exit__() method calls next()
again but expects it to raise StopIteration. (Details below in
the section Optional Generator Decorator.)
So now the final hurdle was that the PEP 310 syntax:
    with VAR = EXPR:
        BLOCK1
would be deceptive, since VAR does *not* receive the value of
EXPR. Borrowing from PEP 340, it was an easy step to:
    with EXPR as VAR:
        BLOCK1
Additional discussion showed that people really liked being able
to "see" the exception in the generator, even if it was only to
log it; the generator is not allowed to yield another value, since
the with-statement should not be usable as a loop (raising a
different exception is marginally acceptable). To enable this, a
new throw() method for generators is proposed, which takes one to
three arguments representing an exception in the usual fashion
(type, value, traceback) and raises it at the point where the
generator is suspended.
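For illustration, here is a minimal sketch (using a hypothetical
generator, and assuming the throw() behaviour as it was eventually
implemented for PEP 342 in Python 2.5):
    def gen():
        try:
            yield 1
        except ValueError, err:
            print "caught inside the generator:", err
            yield 2

    g = gen()
    print g.next()                     # runs to the first yield; prints 1
    print g.throw(ValueError, "oops")  # ValueError is raised at that
                                       # yield; the generator catches it
                                       # and yields 2, so this prints 2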
Once we have this, it is a small step to proposing another
generator method, close(), which calls throw() with a special
exception, GeneratorExit. This tells the generator to exit, and
from there it's another small step to proposing that close() be
called automatically when the generator is garbage-collected.
Then, finally, we can allow a yield-statement inside a try-finally
statement, since we can now guarantee that the finally-clause will
(eventually) be executed. The usual cautions about finalization
apply -- the process may be terminated abruptly without finalizing
any objects, and objects may be kept alive forever by cycles or
memory leaks in the application (as opposed to cycles or leaks in
the Python implementation, which are taken care of by GC).
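A minimal sketch (again with a hypothetical generator) of the two
pieces together, assuming the Python 2.5 behaviour described above:
close() raises GeneratorExit at the point of suspension, so the
finally-clause runs:
    def counting():
        try:
            n = 0
            while True:
                yield n
                n += 1
        finally:
            print "finally clause executed"

    g = counting()
    print g.next()   # prints 0
    g.close()        # GeneratorExit is raised at the suspended yield,
                     # so this prints "finally clause executed"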
Note that we're not guaranteeing that the finally-clause is
executed immediately after the generator object becomes unused,
even though this is how it will work in CPython. This is similar
to auto-closing files: while a reference-counting implementation
like CPython deallocates an object as soon as the last reference
to it goes away, implementations that use other GC algorithms do
not make the same guarantee. This applies to Jython, IronPython,
and probably to Python running on Parrot.
Use Cases
See the Examples section near the end.
Specification: The 'with' Statement
A new statement is proposed with the syntax:
    with EXPR as VAR:
        BLOCK
Here, 'with' and 'as' are new keywords; EXPR is an arbitrary
expression (but not an expression-list) and VAR is a single
assignment target. It can *not* be a comma-separated sequence of
variables, but it *can* be a *parenthesized* comma-separated
sequence of variables. (This restriction makes possible a future
extension of the syntax to allow multiple comma-separated resources,
each with its own optional as-clause.)
The "as VAR" part is optional.
The translation of the above statement is:
    mgr = (EXPR).__context__()
    exit = mgr.__exit__  # Not calling it yet
    value = mgr.__enter__()
    exc = True
    try:
        try:
            VAR = value  # Only if "as VAR" is present
            BLOCK
        except:
            # The exceptional case is handled here
            exc = False
            if not exit(*sys.exc_info()):
                raise
            # The exception is swallowed if exit() returns true
    finally:
        # The normal and non-local-goto cases are handled here
        if exc:
            exit(None, None, None)
2005-05-13 20:08:20 -04:00
Here, the lowercase variables (mgr, exit, value, exc) are internal
variables and not accessible to the user; they will most likely be
implemented as special registers or stack positions.
The details of the above translation are intended to prescribe the
exact semantics. If any of the relevant methods are not found as
expected, the interpreter will raise AttributeError, in the order
that they are tried (__context__, __exit__, __enter__).
Similarly, if any of the calls raises an exception, the effect is
exactly as it would be in the above code. Finally, if BLOCK
contains a break, continue or return statement, the __exit__()
method is called with three None arguments just as if BLOCK
completed normally. (I.e. these "pseudo-exceptions" are not seen
as exceptions by __exit__().)
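To illustrate that last point, here is a minimal sketch (a
hypothetical context that merely reports its arguments); leaving
BLOCK via return still results in __exit__() being called with three
None arguments, just as for normal completion:
    class reporting(object):
        def __context__(self):
            return self
        def __enter__(self):
            return self
        def __exit__(self, type, value, tb):
            print "__exit__ called with:", (type, value, tb)

    def f():
        with reporting():
            return 42    # __exit__() still sees (None, None, None)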
The call to the __context__() method serves a similar purpose to
that of the __iter__() method of iterators and iterables. A context
specifier with simple state requirements (such as
threading.RLock) may provide its own __enter__() and __exit__()
methods, and simply return 'self' from its __context__ method. On
the other hand, a context specifier with more complex state
requirements (such as decimal.Context) may return a distinct
context manager each time its __context__ method is invoked.
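To make the distinction concrete, here is a minimal sketch (with
hypothetical classes) of the two styles under the protocol proposed
in this PEP:
    class SimpleResource(object):
        # Simple state: the object is its own context manager,
        # in the style of threading.RLock.
        def __context__(self):
            return self
        def __enter__(self):
            print "set up"
            return self
        def __exit__(self, type, value, tb):
            print "torn down"

    class SharedState(object):
        # More complex state: a distinct manager per use, in the
        # style proposed for decimal.Context (see example 9 below).
        def __init__(self, settings):
            self.settings = settings
        def __context__(self):
            return _StateManager(dict(self.settings))

    class _StateManager(object):
        def __init__(self, snapshot):
            self.snapshot = snapshot
        def __context__(self):
            return self    # context managers return self (see below)
        def __enter__(self):
            return self.snapshot
        def __exit__(self, type, value, tb):
            pass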
If the "as VAR" part of the syntax is omitted, the "VAR =" part of
the translation is omitted (but mgr.__enter__() is still called).
The calling convention for mgr.__exit__() is as follows. If the
finally-suite was reached through normal completion of BLOCK or
through a non-local goto (a break, continue or return statement in
BLOCK), mgr.__exit__() is called with three None arguments. If
the finally-suite was reached through an exception raised in
BLOCK, mgr.__exit__() is called with three arguments representing
the exception type, value, and traceback.
IMPORTANT: if mgr.__exit__() returns a "true" value, the exception
is "swallowed". That is, if it returns "true", execution
continues at the next statement after the with-statement, even if
an exception happened inside the with-statement. However, if the
with-statement was left via a non-local goto (break, continue or
return), this non-local return is resumed when mgr.__exit__()
returns regardless of the return value. The motivation for this
detail is to make it possible for mgr.__exit__() to swallow
exceptions, without making it too easy (since the default return
value, None, is false and this causes the exception to be
re-raised). The main use case for swallowing exceptions is to
make it possible to write the @contextmanager decorator so
that a try/except block in a decorated generator behaves exactly
as if the body of the generator were expanded in-line at the place
of the with-statement.
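Outside the decorator use case, here is a minimal sketch (a
hypothetical class and dictionary) of an __exit__() method that
swallows one specific exception type by returning a true value, while
returning None (false) for everything else so that other exceptions
are re-raised:
    class suppress_keyerror(object):
        def __context__(self):
            return self
        def __enter__(self):
            return None
        def __exit__(self, type, value, tb):
            # a true value only for KeyError; False otherwise, so
            # any other exception is re-raised by the with statement
            return type is not None and issubclass(type, KeyError)

    with suppress_keyerror():
        del some_dict["missing"]   # any KeyError raised here is swallowed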
The motivation for passing the exception details to __exit__(), as
opposed to the argument-less __exit__() from PEP 310, was given by
the transactional() use case, example 3 below. The template in
that example must commit or roll back the transaction depending on
whether an exception occurred or not. Rather than just having a
boolean flag indicating whether an exception occurred, we pass the
complete exception information, for the benefit of an
exception-logging facility for example. Relying on sys.exc_info()
to get at the exception information was rejected; sys.exc_info()
has very complex semantics and it is perfectly possible that it
returns the exception information for an exception that was caught
ages ago. It was also proposed to add an additional boolean to
distinguish between reaching the end of BLOCK and a non-local
goto. This was rejected as too complex and unnecessary; a
non-local goto should be considered unexceptional for the purposes
of a database transaction roll-back decision.
To facilitate chaining of contexts in Python code that directly
manipulates context specifiers and managers, __exit__() methods
should *not* re-raise the error that is passed in to them, because
it is always the responsibility of the *caller* to do any reraising
in that case.
That way, if the caller needs to tell whether the __exit__()
invocation *failed* (as opposed to successfully cleaning up before
propagating the original error), it can do so.
If __exit__() returns without an error, this can then be
interpreted as success of the __exit__() method itself (whether the
original error is to be propagated or suppressed).
However, if __exit__() propagates an exception to its caller, this
means that __exit__() *itself* has failed. Thus, __exit__()
methods should avoid raising errors unless they have actually
failed. (And allowing the original error to proceed isn't a
failure.)
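A minimal sketch of what such a chaining caller does (mgr and the
saved exception details are placeholders; the nested() example near
the end of this PEP does this for real):
    exc = sys.exc_info()              # the error being cleaned up after
    suppressed = mgr.__exit__(*exc)   # an exception here means that
                                      # __exit__() itself failed
    if not suppressed:
        raise exc[0], exc[1], exc[2]  # the *caller* re-raises the original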
Objects returned by __context__() methods should also provide a
__context__() method that returns self. This allows a program to
retrieve the context manager directly without breaking anything.
For example, the following should work just as well as the normal
case where the extra variable isn't used:
    mgr = (EXPR).__context__()
    with mgr as VAR:
        BLOCK
The with statement implementation and examples like the nested()
function require this behaviour in order to be able to deal
transparently with both context specifiers and context managers.
Transition Plan
In Python 2.5, the new syntax will only be recognized if a future
statement is present:
    from __future__ import with_statement
This will make both 'with' and 'as' keywords. Without the future
statement, using 'with' or 'as' as an identifier will cause a
Warning to be issued to stderr.
In Python 2.6, the new syntax will always be recognized; 'with'
and 'as' are always keywords.
Generator Decorator
With PEP 342 accepted, it is possible to write a decorator that
allows a generator that yields exactly once to be used to control
a with-statement. Here's a sketch of such a decorator:
    import sys     # the __exit__() method below uses sys.exc_info()

    class GeneratorContextManager(object):
        def __init__(self, gen):
            self.gen = gen
        def __context__(self):
            return self
        def __enter__(self):
            try:
                return self.gen.next()
            except StopIteration:
                raise RuntimeError("generator didn't yield")
        def __exit__(self, type, value, traceback):
            if type is None:
                try:
                    self.gen.next()
                except StopIteration:
                    return
                else:
                    raise RuntimeError("generator didn't stop")
            else:
                try:
                    self.gen.throw(type, value, traceback)
                    raise RuntimeError("generator didn't stop after throw()")
                except StopIteration:
                    return True
                except:
                    # only re-raise if it's *not* the exception that was
                    # passed to throw(), because __exit__() must not raise
                    # an exception unless __exit__() itself failed.  But
                    # throw() has to raise the exception to signal
                    # propagation, so this fixes the impedance mismatch
                    # between the throw() protocol and the __exit__()
                    # protocol.
                    if sys.exc_info()[1] is not value:
                        raise

    def contextmanager(func):
        def helper(*args, **kwds):
            return GeneratorContextManager(func(*args, **kwds))
        return helper
This decorator could be used as follows:
    @contextmanager
    def opening(filename):
        f = open(filename)  # IOError is untouched by GeneratorContext
        try:
            yield f
        finally:
            f.close()  # Ditto for errors here (however unlikely)
A robust implementation of this decorator will be made
part of the standard library.
Just as generator-iterator functions are very useful for writing
__iter__() methods for iterables, generator context functions will
be very useful for writing __context__() methods for context
specifiers. These methods will still need to be decorated using the
contextmanager decorator. To ensure an obvious error message if the
decorator is left out, generator-iterator objects will NOT be given
a native context - if you want to ensure a generator is closed
promptly, use something similar to the duck-typed "closing" context
manager in the examples.
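Here is a minimal sketch (a hypothetical class, relying on the
contextmanager decorator sketched above) of a __context__() method
written as a generator context function:
    class Tracer(object):
        def __init__(self, name):
            self.name = name
        @contextmanager
        def __context__(self):
            print "entering", self.name
            try:
                yield self
            finally:
                print "leaving", self.name
Each call to the __context__() method then returns a fresh
GeneratorContextManager providing the __enter__() and __exit__()
methods that the with statement needs.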
Optional Extensions
It would be possible to endow certain objects, like files,
sockets, and locks, with __enter__() and __exit__() methods so
that instead of writing:
    with locking(myLock):
        BLOCK
one could write simply:
    with myLock:
        BLOCK
I think we should be careful with this; it could lead to mistakes
like:
    f = open(filename)
    with f:
        BLOCK1
    with f:
        BLOCK2
which does not do what one might think (f is closed before BLOCK2
is entered).
OTOH such mistakes are easily diagnosed; for example, the
generator context decorator above raises RuntimeError when a
second with-statement calls f.__enter__() again. A similar error
can be raised if __enter__ is invoked on a closed file object.
For Python 2.5, the following candidates have been identified for
native context managers (a usage sketch follows the list):
- file
- decimal.Context
- thread.LockType
- threading.Lock
- threading.RLock
- threading.Condition
- threading.Semaphore and threading.BoundedSemaphore
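A minimal sketch of how two of these candidates (file and
threading.RLock) would then be used, assuming they gain native
context support as proposed and that the future statement from the
Transition Plan is in effect:
    from __future__ import with_statement
    import threading

    lock = threading.RLock()
    with lock:
        with open("/etc/passwd") as f:
            for line in f:
                print line.rstrip()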
Standard Terminology
Discussions about iterators and iterables are aided by the standard
terminology used to discuss them. The protocol used by the for
statement is called the iterator protocol and an iterator is any
object that properly implements that protocol. The term "iterable"
then encompasses all objects with an __iter__() method that
returns an iterator.
This PEP proposes that the protocol consisting of the __enter__()
and __exit__() methods, and a __context__() method that returns
self be known as the "context management protocol", and that
objects that implement that protocol be known as "context
managers".
The term "context specifier" then encompasses all objects with a
__context__() method that returns a context manager. The protocol
these objects implement is called the "context specification
protocol". This means that all context managers are context
specifiers, but not all context specifiers are context managers,
just as all iterators are iterables, but not all iterables are
iterators.
These terms are based on the concept that the context specifier
defines a context of execution for the code that forms the body of
the with statement. The role of the context manager is to
translate the context specifier's stored state into an active
manipulation of the runtime environment to set up and tear down the
desired runtime context for the duration of the with statement.
For example, a synchronisation lock's context manager acquires the
lock when entering the with statement, and releases the lock when
leaving it. The runtime context established within the body of the
with statement is that the synchronisation lock is currently held.
The general term "context" is unfortunately ambiguous. If necessary,
it can be made more explicit by using the terms "context specifier"
for objects providing a __context__() method and "runtime context"
for the runtime environment modifications made by the context
manager. When solely discussing use of the with statement, the
distinction between the two shouldn't matter as the context
specifier fully defines the changes made to the runtime context.
The distinction is more important when discussing the process of
implementing context specifiers and context managers.
Open Issues
1. After this PEP was originally approved, a subsequent discussion
on python-dev [4] settled on the term "context manager" for
objects which provide __enter__ and __exit__ methods, and
"context management protocol" for the protocol itself. With the
addition of the __context__ method to the protocol, the natural
adjustment is to call all objects which provide a __context__
method "context managers", and the objects with __enter__ and
__exit__ methods "contexts" (or "manageable contexts" in
situations where the general term "context" would be ambiguous).
As noted above, the Python 2.5 release cycle revealed problems
with the previously agreed terminology. The updated standard
terminology section has not yet met with consensus on
python-dev. It will be refined throughout the Python 2.5 release
cycle based on user feedback on the usability of the
documentation.
The first change made as a result of the current discussion is
replacement of the term "context object" with
"context specifier".
2. The original resolution was that the decorator used to create a
context manager from a generator would be a builtin called
"contextmanager".
The shorter term "context" was considered too ambiguous and
potentially confusing [9].
The different flavours of generators could then be described as:
- A "generator function" is an undecorated function containing
the 'yield' keyword, and the objects produced by
such functions are "generator-iterators". The term
"generator" may refer to either a generator function or a
generator-iterator depending on the situation.
- A "generator context function" is a generator function to
which the "contextmanager" decorator is applied and the
objects produced by such functions are "generator-context-
managers". The term "generator context" may refer to either
a generator context function or a generator-context-manager
depending on the situation.
In the Python 2.5 implementation, the decorator is actually part
of the standard library module contextlib. The ongoing
terminology review may lead to it being renamed
"contextlib.context" (with the existence of the underlying context
manager being an implementation detail).
Resolved Issues
The following issues were resolved either by BDFL approval,
consensus on python-dev, or a simple lack of objection to
proposals in the original version of this PEP.
1. The __exit__() method of the GeneratorContextManager class
catches StopIteration and considers it equivalent to re-raising
the exception passed to throw(). Is allowing StopIteration
right here?
This is so that a generator doing cleanup depending on the
exception thrown (like the transactional() example below) can
*catch* the exception thrown if it wants to and doesn't have to
worry about re-raising it. I find this more convenient for the
generator writer. Against this, it was argued that the generator
*appears* to suppress an exception that it cannot actually
suppress: in this view, the transactional() example would be
clearer if it re-raised the original exception after the call to
db.rollback(). I personally would find the
requirement to re-raise the exception an annoyance in a
generator used as a with-template, since all the code after
yield is used for is cleanup, and it is invoked from a
finally-clause (the one implicit in the with-statement) which
re-raises the original exception anyway.
2. What exception should GeneratorContextManager raise when the
underlying generator-iterator misbehaves? The following quote is
the reason behind Guido's choice of RuntimeError for both this
and for the generator close() method in PEP 342 (from [8]):
"I'd rather not introduce a new exception class just for this
purpose, since it's not an exception that I want people to catch:
I want it to turn into a traceback which is seen by the
programmer who then fixes the code. So now I believe they
should both raise RuntimeError.
There are some precedents for that: it's raised by the core
Python code in situations where endless recursion is detected,
and for uninitialized objects (and for a variety of
miscellaneous conditions)."
3. See item 1 in open issues :)
4. The originally approved version of this PEP did not include a
__context__ method - the method was only added to the PEP after
Jason Orendorff pointed out the difficulty of writing
appropriate __enter__ and __exit__ methods for decimal.Context
[5]. This approach allows a class to define a native context
manager using generator syntax. It also allows a class to use an
existing independent context as its native context object by
applying the independent context to 'self' in its __context__
method. It even allows a class written in C to
use a generator context manager written in Python.
The __context__ method parallels the __iter__ method which forms
part of the iterator protocol.
An earlier version of this PEP called this the __with__ method.
This was later changed to match the name of the protocol rather
than the keyword for the statement [9].
5. The suggestion was made by Jason Orendorff that the __enter__
and __exit__ methods could be removed from the context
management protocol, and the protocol instead defined directly
in terms of the enhanced generator interface described in PEP
342 [6].
Guido rejected this idea [7]. The following are some of the benefits
of keeping the __enter__ and __exit__ methods:
- it makes it easy to implement a simple context in C
without having to rely on a separate coroutine builder
- it makes it easy to provide a low-overhead implementation
for contexts that don't need to maintain any
special state between the __enter__ and __exit__ methods
(having to use a generator for these would impose
unnecessary overhead without any compensating benefit)
- it makes it possible to understand how the with statement
works without having to first understand the mechanics of
how generator context managers are implemented.
6. See item 2 in open issues :)
7. A generator function used to implement a __context__ method will
need to be decorated with the contextmanager decorator in order
to have the correct behaviour. Otherwise, you will get an
AttributeError when using the class in a with statement, as
normal generator-iterators will NOT have __enter__ or __exit__
methods.
Getting deterministic closure of generators will require a
separate context manager such as the closing example below.
As Guido put it, "too much magic is bad for your health" [10].
8. It is fine to raise AttributeError instead of TypeError if the
relevant methods aren't present on a class involved in a with
statement. The fact that the abstract object C API raises
TypeError rather than AttributeError is an accident of history,
rather than a deliberate design decision [11].
Examples
The generator based examples rely on PEP 342. Also, some of the
examples are likely to be unnecessary in practice, as the
appropriate objects, such as threading.RLock, will be able to be
used directly in with statements.
The tense used in the names of the example contexts is not
arbitrary. Past tense ("-ed") is used when the name refers to an
action which is done in the __enter__ method and undone in the
__exit__ method. Progressive tense ("-ing") is used when the name
refers to an action which is to be done in the __exit__ method.
1. A template for ensuring that a lock, acquired at the start of a
block, is released when the block is left:
    @contextmanager
    def locked(lock):
        lock.acquire()
        try:
            yield
        finally:
            lock.release()
Used as follows:
    with locked(myLock):
        # Code here executes with myLock held. The lock is
        # guaranteed to be released when the block is left (even
        # if via return or by an uncaught exception).
PEP 319 gives a use case for also having an unlocked()
context; this can be written very similarly (just swap the
acquire() and release() calls).
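A sketch of that unlocked() variant (example 11 below relies on a
context by this name):
    @contextmanager
    def unlocked(lock):
        lock.release()
        try:
            yield
        finally:
            lock.acquire()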
2. A template for opening a file that ensures the file is closed
when the block is left:
    @contextmanager
    def opened(filename, mode="r"):
        f = open(filename, mode)
        try:
            yield f
        finally:
            f.close()
Used as follows:
    with opened("/etc/passwd") as f:
        for line in f:
            print line.rstrip()
3. A template for committing or rolling back a database
transaction:
    @contextmanager
    def transaction(db):
        db.begin()
        try:
            yield None
        except:
            db.rollback()
            raise
        else:
            db.commit()
4. Example 1 rewritten without a generator:
    class locked:
        def __init__(self, lock):
            self.lock = lock
        def __context__(self):
            return self
        def __enter__(self):
            self.lock.acquire()
        def __exit__(self, type, value, tb):
            self.lock.release()
            if type is not None:
                raise type, value, tb
(This example is easily modified to implement the other
relatively stateless examples; it shows that it is easy to avoid
the need for a generator if no special state needs to be
preserved.)
5. Redirect stdout temporarily:
    @contextmanager
    def stdout_redirected(new_stdout):
        save_stdout = sys.stdout
        sys.stdout = new_stdout
        try:
            yield None
        finally:
            sys.stdout = save_stdout
Used as follows:
    with opened(filename, "w") as f:
        with stdout_redirected(f):
            print "Hello world"
This isn't thread-safe, of course, but neither is doing this
same dance manually. In single-threaded programs (for example,
in scripts) it is a popular way of doing things.
6. A variant on opened() that also returns an error condition:
    @contextmanager
    def opened_w_error(filename, mode="r"):
        try:
            f = open(filename, mode)
        except IOError, err:
            yield None, err
        else:
            try:
                yield f, None
            finally:
                f.close()
Used as follows:
    with opened_w_error("/etc/passwd", "a") as (f, err):
        if err:
            print "IOError:", err
        else:
            f.write("guido::0:0::/:/bin/sh\n")
7. Another useful example would be an operation that blocks
signals. The use could be like this:
    import signal
    with signal.blocked():
        # code executed without worrying about signals
An optional argument might be a list of signals to be blocked;
by default all signals are blocked. The implementation is left
as an exercise to the reader.
8. Another use for this feature is the Decimal context. Here's a
simple example, after one posted by Michael Chermside:
    import decimal

    @contextmanager
    def extra_precision(places=2):
        c = decimal.getcontext()
        saved_prec = c.prec
        c.prec += places
        try:
            yield None
        finally:
            c.prec = saved_prec
Sample usage (adapted from the Python Library Reference):
    def sin(x):
        "Return the sine of x as measured in radians."
        with extra_precision():
            i, lasts, s, fact, num, sign = 1, 0, x, 1, x, 1
            while s != lasts:
                lasts = s
                i += 2
                fact *= i * (i-1)
                num *= x * x
                sign *= -1
                s += num / fact * sign
        # The "+s" rounds back to the original precision,
        # so this must be outside the with-statement:
        return +s
9. Here's a proposed native context manager for decimal.Context:
    # This would be a new decimal.Context method
    @contextmanager
    def __context__(self):
        # We set the thread context to a copy of this context
        # to ensure that changes within the block are kept
        # local to the block. This also gives us thread safety
        # and supports nested usage of a given context.
        newctx = self.copy()
        oldctx = decimal.getcontext()
        decimal.setcontext(newctx)
        try:
            yield newctx
        finally:
            decimal.setcontext(oldctx)
Sample usage:
    def sin(x):
        with decimal.getcontext() as ctx:
            ctx.prec += 2
            # Rest of sin calculation algorithm
            # uses a precision 2 greater than normal
        return +s  # Convert result to normal precision

    def sin(x):
        with decimal.ExtendedContext:
            # Rest of sin calculation algorithm
            # uses the Extended Context from the
            # General Decimal Arithmetic Specification
        return +s  # Convert result to normal context
10. A generic "object-closing" template:
    @contextmanager
    def closing(obj):
        try:
            yield obj
        finally:
            try:
                close = obj.close
            except AttributeError:
                pass
            else:
                close()
This can be used to deterministically close anything with a
close method, be it file, generator, or something else. It
can even be used when the object isn't guaranteed to require
closing (e.g., a function that accepts an arbitrary
iterable):
    # emulate opening():
    with closing(open("argument.txt")) as contradiction:
        for line in contradiction:
            print line

    # deterministically finalize an iterator:
    with closing(iter(data_source)) as data:
        for datum in data:
            process(datum)
11. Native contexts for objects with acquire/release methods:
    # This would be a new method of e.g., threading.RLock
    def __context__(self):
        return locked(self)
    def released(self):
        return unlocked(self)
Sample usage:
    with my_lock:
        # Operations with the lock held
        with my_lock.released():
            # Operations without the lock
            # e.g. blocking I/O
        # Lock is held again here
12. A "nested" context manager that automatically nests the
supplied contexts from left-to-right to avoid excessive
indentation:
    @contextmanager
    def nested(*contexts):
        exits = []
        vars = []
        try:
            try:
                for context in contexts:
                    mgr = context.__context__()
                    exit = mgr.__exit__
                    enter = mgr.__enter__
                    vars.append(enter())
                    exits.append(exit)
                yield vars
            except:
                exc = sys.exc_info()
            else:
                exc = (None, None, None)
        finally:
            while exits:
                exit = exits.pop()
                try:
                    exit(*exc)
                except:
                    exc = sys.exc_info()
                else:
                    exc = (None, None, None)
            if exc != (None, None, None):
                # sys.exc_info() may have been
                # changed by one of the exit methods
                # so provide explicit exception info
                raise exc[0], exc[1], exc[2]
Sample usage:
    with nested(a, b, c) as (x, y, z):
        # Perform operation
Is equivalent to:
    with a as x:
        with b as y:
            with c as z:
                # Perform operation
Reference Implementation
This PEP was first accepted by Guido at his EuroPython
keynote, 27 June 2005.
It was accepted again later, with the __context__ method added.
The PEP was implemented for Python 2.5a1.
References
[1]  http://blogs.msdn.com/oldnewthing/archive/2005/01/06/347666.aspx
[2]  http://mail.python.org/pipermail/python-dev/2005-May/053885.html
[3]  http://wiki.python.org/moin/WithStatement
[4]  http://mail.python.org/pipermail/python-dev/2005-July/054658.html
[5]  http://mail.python.org/pipermail/python-dev/2005-October/056947.html
[6]  http://mail.python.org/pipermail/python-dev/2005-October/056969.html
[7]  http://mail.python.org/pipermail/python-dev/2005-October/057018.html
[8]  http://mail.python.org/pipermail/python-dev/2005-June/054064.html
[9]  http://mail.python.org/pipermail/python-dev/2005-October/057520.html
[10] http://mail.python.org/pipermail/python-dev/2005-October/057535.html
[11] http://mail.python.org/pipermail/python-dev/2005-October/057625.html
[12] http://sourceforge.net/tracker/index.php?func=detail&aid=1223381&group_id=5470&atid=305470
[13] http://mail.python.org/pipermail/python-dev/2006-February/061903.html
[14] http://mail.python.org/pipermail/python-dev/2006-April/063859.html
Copyright
This document has been placed in the public domain.