Updates for PEP 554. (#419)

Eric Snow 2017-09-12 12:31:24 -07:00 committed by GitHub
parent 0926eeee4c
commit aef969627a
1 changed file with 413 additions and 122 deletions


Abstract
========
CPython has supported subinterpreters, with increasing levels of
support, since version 1.5. The feature has been available via the
C-API. [c-api]_ Subinterpreters operate in
`relative isolation from one another <Interpreter Isolation_>`_, which
provides the basis for an
`alternative concurrency model <Concurrency_>`_.
This proposal introduces the stdlib ``interpreters`` module. The module
will be `provisional <Provisional Status_>`_. It exposes the basic
functionality of subinterpreters already provided by the C-API.
Proposal
========
The ``interpreters`` module will be added to the stdlib. It will
provide a high-level interface to subinterpreters and wrap the low-level
``_interpreters`` module. The proposed API is inspired by the
``threading`` module. See the `Examples`_ section for concrete usage
and use cases.
The module provides the following functions:
``get_current()``::
Return the currently running interpreter.
``create()``::
Initialize a new Python interpreter and return it. The
interpreter will be created in the current thread and will remain
idle until something is run in it. The interpreter may be used
in any thread and will run in whichever thread calls
``interp.run()``.
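As a rough, non-normative sketch of that behavior (not one of the PEP's
own examples), an interpreter created in the main thread can be driven
from a worker thread::

    import threading
    import interpreters          # the proposed stdlib module

    interp = interpreters.create()   # created in the current (main) thread

    def worker():
        # run() executes in whichever OS thread makes the call
        interp.run('print("running in a worker thread")')

    t = threading.Thread(target=worker)
    t.start()
    t.join()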
The module also provides the following classes:
``Interpreter(id)``::
is_running():
Return whether or not the interpreter is currently executing code.
Calling this on the current interpreter will always return True.
destroy():
Finalize and destroy the interpreter.
This may not be called on an already running interpreter. Doing
so results in a RuntimeError.
run(source_str):
Run the provided Python source code in the interpreter.
This may not be called on an already running interpreter. Doing
so results in a RuntimeError.
A "run()" call is quite similar to any other function call. Once
it completes, the code that called "run()" continues executing
(in the original interpreter). Likewise, if there is any uncaught
exception, it propagates into the code where "run()" was called.
The big difference is that "run()" executes the code in an
entirely different interpreter, with entirely separate state.
The state of the current interpreter in the current OS thread
is swapped out with the state of the target interpreter (the one
that will execute the code). When the target finishes executing,
the original interpreter gets swapped back in and its execution
resumes.
So calling "run()" will effectively cause the current Python
thread to pause. Sometimes you won't want that pause, in which
case you should make the "run()" call in another thread. To do
so, add a function that calls "run()" and then run that function
in a normal "threading.Thread".
Note that the interpreter's state is never reset, neither before
"run()" executes the code nor after. Thus the interpreter
state is preserved between calls to "run()". This includes
"sys.modules", the "builtins" module, and the internal state
of C extension modules.
Also note that "run()" executes in the namespace of the "__main__"
module, just like scripts, the REPL, "-m", and "-c". Just as
the interpreter's state is not ever reset, the "__main__" module
is never reset. You can imagine concatenating the code from each
"run()" call into one long script. This is the same as how the
REPL operates.
Supported code: source text.
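As a small illustrative sketch (not one of the PEP's own examples), a
name bound by one ``run()`` call is still visible to the next, since
both execute in the interpreter's ``__main__`` namespace::

    interp = interpreters.create()
    interp.run('spam = 42')      # binds a name in the interpreter's __main__
    interp.run('print(spam)')    # the binding persists across calls; prints 42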
About FIFOs
-----------
Subinterpreters are inherently isolated (with caveats explained below),
in contrast to threads. This enables `a different concurrency model
<Concurrency_>`_ than currently exists in Python.
`Communicating Sequential Processes`_ (CSP) is the prime example.
A key component of this approach to concurrency is message passing. So
providing a message/object passing mechanism alongside ``Interpreter``
minimal and initially restricting the supported types helps us avoid
further exposing any underlying complexity to Python users.
Examples
========
Run isolated code
-----------------
::
    interp = interpreters.create()
    print('before')
    interp.run('print("during")')
    print('after')
Run in a thread
---------------
::
    interp = interpreters.create()
    def run():
        interp.run('print("during")')
    t = threading.Thread(target=run)
    print('before')
    t.start()
    print('after')
Pre-populate an interpreter
---------------------------
::
    interp = interpreters.create()
    interp.run("""if True:
        import some_lib
        import an_expensive_module
        some_lib.set_up()
        """)
    wait_for_request()
    interp.run("""if True:
        some_lib.handle_request()
        """)
Handling an exception
---------------------
::
    interp = interpreters.create()
    try:
        interp.run("""if True:
            raise KeyError
            """)
    except KeyError:
        print("got the error from the subinterpreter")
Synchronize using a FIFO
------------------------
::
    interp = interpreters.create()
    writer = interp.add_recv_fifo('spam')
    def run():
        interp.run("""if True:
            import interpreters
            interp = interpreters.get_current()
            reader = interp.get_fifo('spam')
            reader.pop()
            print("during")
            """)
    t = threading.Thread(target=run)
    print('before')
    t.start()
    print('after')
    writer.push(None)
Sharing a file descriptor
-------------------------
::
    interp = interpreters.create()
    writer = interp.add_recv_fifo('spam')
    reader = interp.add_send_fifo('done')
    def run():
        interp.run("""if True:
            import os
            import interpreters
            interp = interpreters.get_current()
            reader = interp.get_fifo('spam')
            writer = interp.get_fifo('done')
            fd = reader.pop()
            for line in os.fdopen(fd):
                print(line)
            writer.push(None)
            """)
    t = threading.Thread(target=run)
    t.start()
    with open('spamspamspam') as infile:
        writer.push(infile.fileno())
        reader.pop()
Passing objects via pickle
--------------------------
::
    interp = interpreters.create()
    writer = interp.add_recv_fifo('spam')
    interp.run("""if True:
        import pickle
        import interpreters
        interp = interpreters.get_current()
        reader = interp.get_fifo('spam')
        """)
    def run():
        interp.run("""if True:
            data = reader.pop()
            while data is not None:
                obj = pickle.loads(data)
                do_something(obj)
                data = reader.pop()
            """)
    t = threading.Thread(target=run)
    t.start()
    for obj in input:
        data = pickle.dumps(obj)
        writer.push(data)
    writer.push(None)
Rationale
=========
Running code in multiple interpreters provides a useful level of
isolation within the same process. This can be leveraged in a number
of ways. Furthermore, subinterpreters provide a well-defined framework
in which such isolation may be extended.
CPython has supported subinterpreters, with increasing levels of
support, since version 1.5. While the feature has the potential
to be a powerful tool, subinterpreters have suffered from neglect
because they are not available directly from Python. Exposing the
existing functionality in the stdlib will help reverse the situation.
This proposal is focused on enabling the fundamental capability of
multiple isolated interpreters in the same Python process. This is a
new area for Python, so there is relative uncertainty about the best
tools to provide as companions to subinterpreters. Thus we minimize
the functionality we add in the proposal as much as possible.
Concerns
--------
* "subinterpreters are not worth the trouble"
Some have argued that subinterpreters do not add sufficient benefit
to justify making them an official part of Python. Adding features
to the language (or stdlib) has a cost in increasing the size of
the language. So it must pay for itself. In this case, subinterpreters
provide a novel concurrency model focused on isolated threads of
execution. Furthermore, they present an opportunity for changes in
CPython that will allow simultaneous use of multiple CPU cores (currently
prevented by the GIL).
Alternatives to subinterpreters include threading, async, and
multiprocessing. Threading is limited by the GIL and async isn't
the right solution for every problem (nor for every person).
Multiprocessing is likewise valuable in some but not all situations.
Direct IPC (rather than via the multiprocessing module) provides
similar benefits but with the same caveat.
Notably, subinterpreters are not intended as a replacement for any of
the above. Certainly they overlap in some areas, but the benefits of
subinterpreters include isolation and (potentially) performance. In
particular, subinterpreters provide a direct route to an alternate
concurrency model (e.g. CSP) which has found success elsewhere and
will appeal to some Python users. That is the core value that the
``interpreters`` module will provide.
* "stdlib support for subinterpreters adds extra burden
on C extension authors"
In the `Interpreter Isolation`_ section below we identify ways in
which isolation in CPython's subinterpreters is incomplete. Most
notable is extension modules that use C globals to store internal
state. PEP 3121 and PEP 489 provide a solution for most of the
problem, but one still remains. [petr-c-ext]_ Until that is resolved,
C extension authors will face extra difficulty supporting
subinterpreters.
Consequently, projects that publish extension modules may face an
increased maintenance burden as their users start using subinterpreters,
where their modules may break. This situation is limited to modules
that use C globals (or use libraries that use C globals) to store
internal state.
Ultimately this comes down to a question of how often it will be a
problem in practice: how many projects would be affected, how often
their users will be affected, what the additional maintenance burden
will be for projects, and what the overall benefit of subinterpreters
is to offset those costs. The position of this PEP is that the actual
extra maintenance burden will be small and well below the threshold at
which subinterpreters are worth it.
About Subinterpreters
=====================
Interpreter Isolation
---------------------
CPython's interpreters are intended to be strictly isolated from each
other. Each interpreter has its own copy of all modules, classes,
functions, and variables. The same applies to state in C, including in
extension modules. The CPython C-API docs explain more. [caveats]_
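To illustrate, here is a hedged sketch using the proposed module (the
``spam`` name is invented for the example): a name defined in one
interpreter's ``__main__`` does not exist in another's, and the uncaught
``NameError`` propagates back to the calling interpreter as described
under ``run()``::

    interp_a = interpreters.create()
    interp_b = interpreters.create()

    interp_a.run('spam = 42')        # exists only in interp_a's __main__
    try:
        interp_b.run('print(spam)')  # interp_b has its own, separate state
    except NameError:
        print('spam is not defined in interp_b')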
However, there are ways in which interpreters share some state. First
of all, some process-global state remains shared:
* file descriptors
* builtin types (e.g. dict, bytes)
* singletons (e.g. None)
* underlying static module data (e.g. functions) for
builtin/extension/frozen modules
There are no plans to change this.
Second, some isolation is faulty due to bugs or implementations that did
not take subinterpreters into account. This includes things like
extension modules that rely on C globals. [cryptography]_ In these
cases bugs should be opened (some are already):
* readline module hook functions (http://bugs.python.org/issue4202)
* memory leaks on re-init (http://bugs.python.org/issue21387)
Finally, some potential isolation is missing due to the current design
of CPython. Improvements are currently going on to address gaps in this
area:
* interpreters share the GIL
* interpreters share memory management (e.g. allocators, gc)
* GC is not run per-interpreter [global-gc]_
* at-exit handlers are not run per-interpreter [global-atexit]_
* extensions using the ``PyGILState_*`` API are incompatible [gilstate]_
Concurrency
-----------
Concurrency is a challenging area of software development. Decades of
research and practice have led to a wide variety of concurrency models,
each with different goals. Most center on correctness and usability.
One class of concurrency models focuses on isolated threads of
execution that interoperate through some message passing scheme. A
notable example is `Communicating Sequential Processes`_ (CSP), upon
which Go's concurrency is based. The isolation inherent to
subinterpreters makes them well-suited to this approach.
Existing Usage
--------------
Subinterpreters are not a widely used feature. In fact, the only
documented case of widespread usage is
`mod_wsgi <https://github.com/GrahamDumpleton/mod_wsgi>`_. On the one
hand, this case provides confidence that existing subinterpreter support
is relatively stable. On the other hand, there isn't much of a sample
size from which to judge the utility of the feature.
Provisional Status
==================
The new ``interpreters`` module will be added with "provisional" status
(see PEP 411). This allows Python users to experiment with the feature
and provide feedback while still allowing us to adjust to that feedback.
The module will be provisional in Python 3.7 and we will make a decision
before the 3.8 release whether to keep it provisional, graduate it, or
remove it.
Alternate Python Implementations
================================
TBD
Deferred Functionality
======================
timeout arg to pop() and push()
-------------------------------
Typically functions that have a ``block`` argument also have a
``timeout`` argument. We can add it later if needed.
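For reference, the stdlib convention being alluded to looks like this
with ``queue.Queue`` (shown instead of the proposed FIFOs, since a
``timeout`` parameter on them remains deferred)::

    import queue

    q = queue.Queue()
    q.put('spam')
    # Established stdlib pattern: block for up to `timeout` seconds,
    # then raise queue.Empty if nothing arrived.
    item = q.get(block=True, timeout=5.0)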
get_main()
----------
CPython has a concept of a "main" interpreter. This is the initial
interpreter created during CPython's runtime initialization. It may
be useful to identify the main interpreter. For instance, the main
interpreter should not be destroyed. However, for the basic
functionality of a high-level API a ``get_main()`` function is not
necessary. Furthermore, there is no requirement that a Python
implementation have a concept of a main interpreter. So until there's
a clear need we'll leave ``get_main()`` out.
Interpreter.run_in_thread()
---------------------------
This method would make a ``run()`` call for you in a thread. Doing this
using only ``threading.Thread`` and ``run()`` is relatively trivial so
we've left it out.
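A minimal sketch of the pattern users can already apply themselves,
which is why the helper is left out::

    import threading

    interp = interpreters.create()
    t = threading.Thread(target=lambda: interp.run('print("hello")'))
    t.start()
    t.join()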
Synchronization Primitives
--------------------------
The ``threading`` module provides a number of synchronization primitives
for coordinating concurrent operations. This is especially necessary
due to the shared-state nature of threading. In contrast,
subinterpreters do not share state. Data sharing is restricted to
FIFOs, which do away with the need for explicit synchronization. If
any sort of opt-in shared state support is added to subinterpreters in
the future, that same effort can introduce synchronization primitives
to meet that need.
CSP Library
-----------
A ``csp`` module would not be a large step away from the functionality
provided by this PEP. However, adding such a module is outside the
minimalist goals of this proposal.
Syntactic Support
-----------------
The ``Go`` language provides a concurrency model based on CSP, so
it's similar to the concurrency model that subinterpreters support.
``Go`` provides syntactic support, as well as several builtin concurrency
primitives, to make concurrency a first-class feature. Conceivably,
similar syntactic (and builtin) support could be added to Python using
subinterpreters. However, that is *way* outside the scope of this PEP!
Multiprocessing
---------------
The ``multiprocessing`` module could support subinterpreters in the same
way it supports threads and processes. In fact, the module's
maintainer, Davin Potts, has indicated this is a reasonable feature
request. However, it is outside the narrow scope of this PEP.
References
==========
.. [c-api]
   https://docs.python.org/3/c-api/init.html#sub-interpreter-support
.. _Communicating Sequential Processes:
.. [CSP]
   https://en.wikipedia.org/wiki/Communicating_sequential_processes
   https://github.com/futurecore/python-csp
.. [caveats]
   https://docs.python.org/3/c-api/init.html#bugs-and-caveats
.. [petr-c-ext]
   https://mail.python.org/pipermail/import-sig/2016-June/001062.html
   https://mail.python.org/pipermail/python-ideas/2016-April/039748.html
.. [cryptography]
   https://github.com/pyca/cryptography/issues/2299
.. [global-gc]
   http://bugs.python.org/issue24554
.. [gilstate]
   https://bugs.python.org/issue10915
   http://bugs.python.org/issue15751
.. [global-atexit]
   https://bugs.python.org/issue6531
Copyright
=========