Remove trailing spaces from many PEPs (#983)

This commit is contained in:
Serhiy Storchaka 2019-04-16 17:50:15 +03:00 committed by Guido van Rossum
parent 1b0e8221be
commit ad7f2b2f6c
25 changed files with 205 additions and 205 deletions

View File

@ -21,10 +21,10 @@ reStructuredText for PEPs
=========================
Original PEP source should be written in reStructuredText format,
which is a constrained version of plaintext, and is described in
PEP 12. Older PEPs were often written in a more mildly restricted
plaintext format, as described in PEP 9. The ``pep2html.py``
processing and installation script knows how to produce the HTML
for either PEP format.
For processing reStructuredText format PEPs, you need the docutils

View File

@ -427,7 +427,7 @@ Each PEP should have the following parts/sections:
The rationale should provide evidence of consensus within the
community and discuss important objections or concerns raised
during discussion.
5. Specification -- The technical specification should describe the
syntax and semantics of any new language feature. The
specification should be detailed enough to allow competing,
@ -440,7 +440,7 @@ Each PEP should have the following parts/sections:
author proposes to deal with these incompatibilities. PEP
submissions without a sufficient backwards compatibility treatise
may be rejected outright.
7. Security Implications -- If there are security concerns in relation
to the PEP, those concerns should be explicitly written out to make
sure reviewers of the PEP are aware of them.
@ -465,14 +465,14 @@ Each PEP should have the following parts/sections:
The final implementation must include test code and documentation
appropriate for either the Python language reference or the
standard library reference.
10. Rejected Ideas -- Throughout the discussion of a PEP, various ideas
will be proposed which are not accepted. Those rejected ideas should
be recorded along with the reasoning as to why they were rejected.
This both helps record the thought process behind the final version
of the PEP as well as preventing people from bringing up the same
rejected idea again in subsequent discussions.
In a way this section can be thought of as a breakout section of the
Rationale section that is focused specifically on why certain ideas
were not ultimately pursued.
@ -483,9 +483,9 @@ Each PEP should have the following parts/sections:
resolution. This helps make sure all issues required for the PEP to be
ready for consideration are complete and reduces people
duplicating prior discussion.
12. References -- A collection of URLs used as references through the PEP.
13. Copyright/public domain -- Each PEP must either be explicitly
labeled as placed in the public domain (see this PEP as an
example) or licensed under the `Open Publication License`_.

View File

@ -73,7 +73,7 @@ directions below.
Your email address may appear second (or it can be omitted) and if
it appears, it must appear in angle brackets. It is okay to
obfuscate your email address.
- If none of the authors are Python core developers and this PEP is
ready to be submitted to the PEP repository, include a Sponsor header
with the name of the core developer sponsoring your PEP.
@ -158,7 +158,7 @@ your PEP)::
Title: [...]
Author: [Full Name <email at example.com>]
Sponsor: *[Full Name <email at example.com>]
BDFL-Delegate:
Discussions-To: *[...]
Status: Draft
Type: [Standards Track | Informational | Process]
@ -603,8 +603,8 @@ should be avoided. For ordinary text, use ordinary 'single-quotes' or
above), use double-backquotes::
``literal text: in here, anything goes!``
Suggested Sections
==================
Various sections are found to be common across PEPs and are outlined in
@ -613,70 +613,70 @@ PEP 1 [1]_. Those sections are provided here for convenience.
Abstract
========
[A short (~200 word) description of the technical issue being addressed.]
Motivation
==========
[Clearly explain why the existing language specification is inadequate to address the problem that the PEP solves.]
Rationale
=========
[Describe why particular design decisions were made.]
Specification
=============
[Describe the syntax and semantics of any new language feature.]
Backwards Compatibility
=======================
[Describe potential impact and severity on pre-existing code.]
Security Implications
=====================
[How could a malicious user take advantage of this new feature?]
How to Teach This
=================
[How to teach users, new and experienced, how to apply the PEP to their work.]
Reference Implementation
========================
[Link to any existing implementation and details about its state, e.g. proof-of-concept.]
Rejected Ideas
==============
[Why certain ideas that were brought while discussing this PEP were not ultimately pursued.]
Open Issues
===========
[Any points that are still being decided/discussed.]
References
==========
[A collection of URLs used as references through the PEP.]
Copyright
=========

View File

@ -226,7 +226,7 @@ members must first check that they are available with the
if (PyType_HasFeature(x->ob_type, Py_TPFLAGS_HAVE_INPLACE_OPS) &&
x->ob_type->tp_as_number && x->ob_type->tp_as_number->nb_inplace_add) {
/* ... */
This check must be made even before testing the method slots for ``NULL``
values! The macro only tests whether the slots are available, not whether
they are filled with methods or not.

View File

@ -482,7 +482,7 @@ Miscellaneous issues
- Additional multiplication operators. Several forms of multiplications are
used in (multi-)linear algebra. Most can be seen as variations of
multiplication in linear algebra sense (such as Kronecker product). But two
forms appear to be more fundamental: outer product and inner product.
However, their specification includes indices, which can be either
- associated with the operator, or
@ -542,7 +542,7 @@ a given operator. Several examples are listed here:
Among them, the ``&`` ``|`` ``^`` ``~`` operators can be regarded as
elementwise versions of lattice operators applied to integers regarded as
bit strings.::
5 and 6 # 6
5 or 6 # 5
@ -556,7 +556,7 @@ a given operator. Several examples are listed here:
make ``xor`` a reserved word.
2. List arithmetics.::
[1, 2] + [3, 4] # [1, 2, 3, 4]
[1, 2] ~+ [3, 4] # [4, 6]
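For comparison, the elementwise variant can be emulated in current Python
without new operators (a sketch, not part of the proposal)::

    a, b = [1, 2], [3, 4]
    a + b                            # [1, 2, 3, 4]  (concatenation)
    [x + y for x, y in zip(a, b)]    # [4, 6]  (the proposed [1, 2] ~+ [3, 4])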

View File

@ -386,7 +386,7 @@ Then why not some other special syntax without a new keyword?
-------------------------------------------------------------
For example, one of these instead of ``yield 3``::
return 3 and continue
return and continue 3
return generating 3

View File

@ -1534,7 +1534,7 @@ the initial reference implementation was released in setuptools 8.0 and pip
This change was based on user feedback received when setuptools 8.0
started applying normalisation to the release metadata generated when
preparing packages for publication on PyPI [8]_.
* The PEP text and the ``is_canonical`` regex were updated to be explicit
that numeric components are specifically required to be represented as
sequences of ASCII digits, not arbitrary Unicode [Nd] code points. This
@ -1575,7 +1575,7 @@ justifications for needing such a standard can be found in PEP 386.
.. [9] Changing the status of PEP 440 to Provisional
https://mail.python.org/pipermail/distutils-sig/2014-December/025412.html
.. [10] PEP 440: regex should not permit Unicode [Nd] characters
https://github.com/python/peps/pull/966
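For reference, a canonical-form check along these lines can be written as
follows (a sketch based on the ``is_canonical`` regex discussed above; the
pattern in the PEP itself is authoritative)::

    import re

    def is_canonical(version):
        # [1-9][0-9]* rather than [1-9]\d* so that only ASCII digits match,
        # not arbitrary Unicode [Nd] code points
        return re.match(
            r'^([1-9][0-9]*!)?(0|[1-9][0-9]*)(\.(0|[1-9][0-9]*))*'
            r'((a|b|rc)(0|[1-9][0-9]*))?(\.post(0|[1-9][0-9]*))?'
            r'(\.dev(0|[1-9][0-9]*))?$',
            version) is not None

    assert is_canonical("1.4.0")
    assert not is_canonical("1.4.0.dev")   # dev releases need a number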

View File

@ -49,8 +49,8 @@ dictionary::
return cls.__dict__[name]
except KeyError:
raise AttributeError(name) from None
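For comparison, the same look-up-then-``AttributeError`` idiom can back a
module-level ``__getattr__`` (a Python 3.7+ feature; a minimal sketch with
hypothetical names)::

    # lib.py
    _aliases = {"old_sum": sum}

    def __getattr__(name):
        try:
            return _aliases[name]
        except KeyError:
            raise AttributeError(name) from None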
PEP Status
==========

View File

@ -170,7 +170,7 @@ short-circuited. For example, ``await a?.b(c).d?[e]`` is evaluated::
_v = _v[e]
await _v
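For illustration, the same short-circuiting written out by hand in current
Python (a rough equivalent; ``await`` omitted and single evaluation of ``a``
assumed)::

    # val = a?.b(c).d?[e]
    if a is None:
        val = None                  # ?. saw None: skip the whole trailer chain
    else:
        _v = a.b(c).d
        val = None if _v is None else _v[e]   # ?[ checks again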
.. note::
``await`` will almost certainly fail in this context, as it would in
the case where code attempts ``await None``. We are not proposing to add a
``None``-aware ``await`` keyword here, and merely include it in this

View File

@ -774,7 +774,7 @@ Optional features:
- `Bot to generate cherry-pick pull requests`_
- Write `.github/CONTRIBUTING.md`
(to prevent PRs that are inappropriate from even showing up and pointing to the devguide)
Open Issues

View File

@ -466,7 +466,7 @@ successfully configured::
(Note: this warning ended up being silenced by default. See the
Implementation Note above for more details)
As long as the current platform provides at least one of the candidate UTF-8
based environments, this locale coercion will mean that the standard
Python binary *and* locale-aware extensions should once again "just work"

View File

@ -219,17 +219,17 @@ names as unavailable for security reasons.
Intellectual property policy
----------------------------
It is the policy of the Python Software Foundation and the Package Index
maintainers to be appropriately responsive to claims of intellectual
property infringement by third parties. It is not the policy of
the Python Software Foundation nor the Package Index maintainers
to pre-screen uploaded packages for any type of intellectual property
infringement.
Possibly-infringing packages should be reported to legal@python.org
and counsel to the Python Software Foundation will determine an
appropriate response. A package can be removed or transferred to a
new owner at the sole discretion of the Python Software Foundation to
address a claim of infringement.
A project published on the Package Index meeting ANY of the following
@ -244,9 +244,9 @@ or transferral to a new owner:
the subject of a complaint; or
* project is subject to an active lawsuit.
In the event of a complaint for intellectual property infringement,
a copy of the complaint will be sent to the package owner. In some
cases, action may be taken by the Package Index maintainers before
the owner responds.

View File

@ -9,7 +9,7 @@ Type: Standards Track
Content-Type: text/x-rst
Created: 25-May-2017
Python-Version: 3.7
Post-History:
Deferral Notice
@ -161,7 +161,7 @@ importlib's ``ExtensionFileLoader`` will get an implementation of
extension module's ``PyInit_*`` function.
The ``PyInit_*`` function can return either a fully initialized module
(single-phase initialization) or a ``PyModuleDef`` (for PEP 489 multi-phase
initialization).
In the single-phase initialization case, ``_imp.exec_in_module`` will raise

View File

@ -26,7 +26,7 @@ an extension to the descriptor protocol allowing use of
the descriptor protocol for members of *instances.* This
would permit using properties in modules.
Rationale
=========
Python's descriptor protocol guides programmers towards

View File

@ -14,14 +14,14 @@ Post-History: 06-Sep-2017
Abstract
========
Sometimes, in special cases, it is desired that code can pass information down the function call chain to the callees without having to explicitly pass the information as arguments to each function in the call chain. This proposal describes a construct which allows code to explicitly switch in and out of a context where a certain context variable has a given value assigned to it. This is a modern alternative to some uses of things like global variables in traditional single-threaded (or thread-unsafe) code and of thread-local storage in traditional *concurrency-unsafe* code (single- or multi-threaded). In particular, the proposed mechanism can also be used with more modern concurrent execution mechanisms such as asynchronously executed coroutines, without the concurrently executed call chains interfering with each other's contexts.
The "call chain" can consist of normal functions, awaited coroutines, or generators. The semantics of context variable scope are equivalent in all cases, allowing code to be refactored freely into *subroutines* (which here refers to functions, sub-generators or sub-coroutines) without affecting the semantics of context variables. Regarding implementation, this proposal aims at simplicity and minimum changes to the CPython interpreter and to other Python interpreters.
Rationale
=========
Consider a modern Python *call chain* (or call tree), which in this proposal refers to any chained (nested) execution of *subroutines*, using any possible combinations of normal function calls, or expressions using ``await`` or ``yield from``. In some cases, passing necessary *information* down the call chain as arguments can substantially complicate the required function signatures, or it can even be impossible to achieve in practice. In these cases, one may search for another place to store this information. Let us look at some historical examples.
The most naive option is to assign the value to a global variable or similar, where the code down the call chain can access it. However, this immediately makes the code thread-unsafe, because with multiple threads, all threads assign to the same global variable, and another thread can interfere at any point in the call chain. Sooner or later, someone will probably find a reason to run the same code in parallel threads.
@ -29,7 +29,7 @@ A somewhat less naive option is to store the information as per-thread informati
Note that in the above two historical approaches, the stored information has the *widest* available scope without causing problems. For a third solution along the same path, one would first define an equivalent of a "thread" for asynchronous execution and concurrency. This could be seen as the largest amount of code and nested calls that is guaranteed to be executed sequentially without ambiguity in execution order. This might be referred to as concurrency-local or task-local storage. In this meaning of "task", there is no ambiguity in the order of execution of the code within one task. (This concept of a task is close to equivalent to a ``Task`` in ``asyncio``, but not exactly.) In such concurrency-locals, it is possible to pass information down the call chain to callees without another code path interfering with the value in the background.
Common to the above approaches is that they indeed use variables with a wide but just-narrow-enough scope. Thread-locals could also be called thread-wide globals---in single-threaded code, they are indeed truly global. And task-locals could be called task-wide globals, because tasks can be very big.
The issue here is that neither global variables, thread-locals nor task-locals are really meant to be used for this purpose of passing information of the execution context down the call chain. Instead of the widest possible variable scope, the scope of the variables should be controlled by the programmer, typically of a library, to have the desired scope---not wider. In other words, task-local variables (and globals and thread-locals) have nothing to do with the kind of context-bound information passing that this proposal intends to enable, even if task-locals can be used to emulate the desired semantics. Therefore, in the following, this proposal describes the semantics and the outlines of an implementation for *context-local variables* (or context variables, contextvars). In fact, as a side effect of this PEP, an async framework can use the proposed feature to implement task-local variables.
@ -40,20 +40,20 @@ Because the proposed semantics are not a direct extension to anything already av
Semantics and higher-level API
------------------------------
Core concept
''''''''''''
A context-local variable is represented by a single instance of ``contextvars.Var``, say ``cvar``. Any code that has access to the ``cvar`` object can ask for its value with respect to the current context. In the high-level API, this value is given by the ``cvar.value`` property::
cvar = contextvars.Var(default="the default value",
description="example context variable")
assert cvar.value == "the default value" # default still applies
# In code examples, all ``assert`` statements should
# succeed according to the proposed semantics.
No assignments to ``cvar`` have been applied for this context, so ``cvar.value`` gives the default value. Assigning new values to contextvars is done in a highly scope-aware manner::
@ -61,13 +61,13 @@ No assignments to ``cvar`` have been applied for this context, so ``cvar.value``
assert cvar.value is new_value
# Any code here, or down the call chain from here, sees:
# cvar.value is new_value
# unless another value has been assigned in a
# nested context
assert cvar.value is new_value
# the assignment of ``cvar`` to ``new_value`` is no longer visible
assert cvar.value == "the default value"
Here, ``cvar.assign(value)`` returns another object, namely ``contextvars.Assignment(cvar, new_value)``. The essential part here is that applying a context variable assignment (``Assignment.__enter__``) is paired with a de-assignment (``Assignment.__exit__``). These operations set the bounds for the scope of the assigned value.
Assignments to the same context variable can be nested to override the outer assignment in a narrower context::
@ -79,16 +79,16 @@ Assignments to the same context variable can be nested to override the outer ass
assert cvar.value == "inner"
assert cvar.value == "outer"
assert cvar.value == "the default value"
Also multiple variables can be assigned to in a nested manner without affecting each other::
cvar1 = contextvars.Var()
cvar2 = contextvars.Var()
assert cvar1.value is None # default is None by default
assert cvar2.value is None
with cvar1.assign(value1):
assert cvar1.value is value1
assert cvar2.value is None
@ -99,15 +99,15 @@ Also multiple variables can be assigned to in a nested manner without affecting
assert cvar2.value is None
assert cvar1.value is None
assert cvar2.value is None
Or with more convenient Python syntax::
with cvar1.assign(value1), cvar2.assign(value2):
assert cvar1.value is value1
assert cvar2.value is value2
In another *context*, in another thread or otherwise concurrently executed task or code path, the context variables can have a completely different state. The programmer thus only needs to worry about the context at hand.
Refactoring into subroutines
@ -123,8 +123,8 @@ Code using contextvars can be refactored into subroutines without affecting the
assert cvar.value is new_value
assi.__exit__()
assert cvar.value == "the default value"
Or similarly in an asynchronous context where ``await`` expressions are used. The subroutine can now be a coroutine::
assi = cvar.assign(new_value)
@ -135,15 +135,15 @@ Or similarly in an asynchronous context where ``await`` expressions are used. Th
assert cvar.value is new_value
assi.__exit__()
assert cvar.value == "the default value"
Or when the subroutine is a generator::
def apply():
yield
assi.__enter__()
which is called using ``yield from apply()`` or with calls to ``next`` or ``.send``. This is discussed further in later sections.
Semantics for generators and generator-based coroutines
@ -156,7 +156,7 @@ Generators, coroutines and async generators act as subroutines in much the same
assert cvar.value is new_value
yield
assert cvar.value is new_value
g = genfunc()
next(g)
assert cvar.value == "the default value"
with cvar.assign(another_value):
@ -172,7 +172,7 @@ However, the outer context visible to the generator may change state across yiel
yield
with cvar.assign(value3):
assert cvar.value is value3
with cvar.assign(value1):
g = genfunc()
with cvar.assign(value2):
@ -182,7 +182,7 @@ However, the outer context visible to the generator may change state across yiel
assert cvar.value is value1
Similar semantics apply to async generators (defined by ``async def ... yield ...``).
By default, values assigned inside a generator do not leak through yields to the code that drives the generator. However, the assignment contexts entered and left open inside the generator *do* become visible outside the generator after the generator has finished with a ``StopIteration`` or another exception::
@ -191,7 +191,7 @@ By default, values assigned inside a generator do not leak through yields to the
yield
assi.__enter__()
yield
g = genfunc()
assert cvar.value == "the default value"
next(g)
@ -221,7 +221,7 @@ Using the ``contextvars.leaking_yields`` decorator, one can choose to leak the c
yield
assert cvar.value == "inner"
assert cvar.value == "outer"
g = genfunc()
with cvar.assign("outer"):
assert cvar.value == "outer"
@ -254,8 +254,8 @@ Using ``contextvars.capture()``, one can capture the assignment contexts that ar
delta.reapply()
assert cvar1.value is value2
assert cvar2.value == 2
However, reapplying the "delta" if its net contents include deassignments may not be possible (see also Implementation and Open Issues).
@ -273,22 +273,22 @@ Although it is possible to revert all applied context changes using the above pr
with contextvars.clean_context():
# here, all context vars start off with their default values
# here, the state is back to what it was before the with block.
Implementation
--------------
This section describes to a variable level of detail how the described semantics can be implemented. At present, an implementation aimed at simplicity but sufficient features is described. More details will be added later.
Alternatively, a somewhat more complicated implementation offers minor additional features while adding some performance overhead and requiring more code in the implementation.
Data structures and implementation of the core concept
''''''''''''''''''''''''''''''''''''''''''''''''''''''
Each thread of the Python interpreter keeps its own stack of ``contextvars.Assignment`` objects, each having a pointer to the previous (outer) assignment like in a linked list. The local state (also returned by ``contextvars.get_local_state()``) then consists of a reference to the top of the stack and a pointer/weak reference to the bottom of the stack. This allows efficient stack manipulations. An object produced by ``contextvars.capture()`` is similar, but refers to only a part of the stack with the bottom reference pointing to the top of the stack as it was in the beginning of the capture block.
Now, the stack evolves according to the assignment ``__enter__`` and ``__exit__`` methods. For example::
cvar1 = contextvars.Var()
cvar2 = contextvars.Var()
# stack: []
@ -298,26 +298,26 @@ Now, the stack evolves according to the assignment ``__enter__`` and ``__exit__`
with cvar1.assign("outer"):
# stack: [Assignment(cvar1, "outer")]
assert cvar1.value == "outer"
with cvar1.assign("inner"):
# stack: [Assignment(cvar1, "outer"),
# Assignment(cvar1, "inner")]
assert cvar1.value == "inner"
with cvar2.assign("hello"):
# stack: [Assignment(cvar1, "outer"),
# Assignment(cvar1, "inner"),
# Assignment(cvar2, "hello")]
assert cvar2.value == "hello"
# stack: [Assignment(cvar1, "outer"),
# Assignment(cvar1, "inner")]
assert cvar1.value == "inner"
assert cvar2.value is None
# stack: [Assignment(cvar1, "outer")]
assert cvar1.value == "outer"
# stack: []
assert cvar1.value is None
assert cvar2.value is None
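The stack discipline illustrated above can be sketched in pure Python (a
simplified model with one strictly nested stack per thread; the proposed C
implementation would differ in detail)::

    import threading

    _local = threading.local()

    def _stack():
        # one assignment stack per thread, top of stack at the end
        if not hasattr(_local, "stack"):
            _local.stack = []
        return _local.stack

    class Var:
        def __init__(self, default=None, description=""):
            self.default = default
            self.description = description

        @property
        def value(self):
            # the innermost assignment of this variable wins
            for var, val in reversed(_stack()):
                if var is self:
                    return val
            return self.default

        def assign(self, value):
            return Assignment(self, value)

    class Assignment:
        def __init__(self, var, value):
            self.var, self.value = var, value

        def __enter__(self):
            _stack().append((self.var, self.value))

        def __exit__(self, *exc):
            _stack().pop()   # assumes strictly nested (LIFO) use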
@ -339,7 +339,7 @@ Within generators, coroutines and async generators, assignments and deassignment
return self._old_send(value)
try:
with contextvars.capture() as delta:
if self.gi_contextvars:
# non-zero captured content from previous iteration
self.gi_contextvars.reapply()
ret = self._old_send(value)
@ -352,7 +352,7 @@ Within generators, coroutines and async generators, assignments and deassignment
return ret
The corresponding modifications to the other methods are essentially identical. The same applies to coroutines and async generators.
For code that does not use ``contextvars``, the additions are O(1) and essentially reduce to a couple of pointer comparisons. For code that does use ``contextvars``, the additions are still O(1) in most cases.
@ -364,9 +364,9 @@ The rest of the functionality, including ``contextvars.leaking_yields``, context
Backwards compatibility
=======================
There are no *direct* backwards-compatibility concerns, since a completely new feature is proposed.
However, various traditional uses of thread-local storage may need a smooth transition to ``contextvars`` so they can be concurrency-safe. There are several approaches to this, including emulating task-local storage with a little bit of help from async frameworks. A fully general implementation cannot be provided, because the desired semantics may depend on the design of the framework.
Another way to deal with the transition is for code to first look for a context created using ``contextvars``. If that fails because a new-style context has not been set or because the code runs on an older Python version, a fallback to thread-local storage is used.

View File

@ -1,4 +1,4 @@
PEP: 561
Title: Distributing and Packaging Type Information
Author: Ethan Smith <ethan@ethanhs.me>
Status: Accepted
@ -29,7 +29,7 @@ to keep Python 2 compatibility while using newer annotation syntax. However,
there is no standard method to distribute packages with type information.
Also, if one wished to ship stub files privately the only method available
would be via setting ``MYPYPATH`` or the equivalent to manually point to
stubs. If the package can be released publicly, it can be added to
typeshed [1]_. However, this does not scale and becomes a burden on the
maintainers of typeshed. In addition, it ties bug fixes in stubs to releases
of the tool using typeshed.
@ -39,7 +39,7 @@ section [2]_ the PEP recommends using ``shared/typehints/pythonX.Y/`` for
shipping stub files. However, manually adding a path to stub files for each
third party library does not scale. The simplest approach people have taken
is to add ``site-packages`` to their ``MYPYPATH``, but this causes type
checkers to fail on packages that are highly dynamic (e.g. sqlalchemy
and Django).
@ -80,7 +80,7 @@ create:
3. A third party or package maintainer would like to share stub files for
a package, but the maintainer does not want to include them in the source
of the package.
This PEP aims to support all three scenarios and make them simple to add to
packaging and deployment.
@ -266,7 +266,7 @@ Version History
* Name of marker file changed from ``.typeinfo`` to ``py.typed``
* 2017-11-10
* Specification re-written to use package metadata instead of distribution
metadata.
* Removed stub-only packages and merged into third party packages spec.
@ -274,7 +274,7 @@ Version History
* Implementations updated to reflect PEP changes.
* 2017-10-26
* Added implementation references.
* Added acknowledgements and version history.
@ -295,7 +295,7 @@ References
.. [2] PEP 484, Storing and Distributing Stub Files
(https://www.python.org/dev/peps/pep-0484/#storing-and-distributing-stub-files)
.. [3] PEP 426 definitions
(https://www.python.org/dev/peps/pep-0426/)

View File

@ -118,7 +118,7 @@ a heap type.
Currently, most exception types, apart from the ones in ``builtins``, are
heap types. This is likely simply because there is a convenient way
to create them: ``PyErr_NewException``.
Heap types generally have a mutable ``__dict__``.
In most cases, this mutability is harmful. For example, exception types
from the ``sqlite`` module are mutable and shared across subinterpreters.
@ -176,7 +176,7 @@ By contrast, extension methods are typically implemented as normal C functions.
This means that they only have access to their arguments and C level thread-local
and process-global states. Traditionally, many extension modules have stored
their shared state in C-level process globals, causing problems when:
* running multiple initialize/finalize cycles in the same process
* reloading modules (e.g. to test conditional imports)
* loading extension modules in subinterpreters
@ -194,7 +194,7 @@ Proposal
========
Currently, a bound extension method (``PyCFunction`` or ``PyCFunctionWithKeywords``) receives only
``self``, and (if applicable) the supplied positional and keyword arguments.
While module-level extension functions already receive access to the defining module object via their
``self`` argument, methods of extension types don't have that luxury: they receive the bound instance
@ -321,7 +321,7 @@ to private API is required, now ``_PyMethodDef_RawFastCallDict`` and
``_PyMethodDef_RawFastCallKeywords`` will receive ``PyTypeObject *cls``
as one of their arguments.
A new macro ``PyCFunction_GET_CLASS(cls)`` will be added for easier access to mm_class.
Method construction and calling code and will be updated to honor
``METH_METHOD``.
@ -467,7 +467,7 @@ New macros:
New types:
* PyCMethodObject
New structures:

View File

@ -304,7 +304,7 @@ see which operations provide audit events.
.. csv-table:: Table 1: Suggested Audit Hooks
:header: "API Function", "Event Name", "Arguments", "Rationale"
:widths: 2, 2, 3, 6
``PySys_AddAuditHook``, ``sys.addaudithook``, "", "Detect when new
audit hooks are being added.
"
@ -364,7 +364,7 @@ see which operations provide audit events.
.. csv-table:: Table 2: Potential CPython Audit Hooks
:header: "API Function", "Event Name", "Arguments", "Rationale"
:widths: 2, 2, 3, 6
``_PySys_ClearAuditHooks``, ``sys._clearaudithooks``, "", "Notifies
hooks they are being cleaned up, mainly in case the event is
triggered unexpectedly. This event cannot be aborted.
@ -387,8 +387,8 @@ see which operations provide audit events.
``_ctypes._CData``, ``ctypes.cdata``, "``(ptr_as_int,)``", "Detect
when code is accessing arbitrary memory using ``ctypes``.
"
"``new_mmap_object``",``mmap.__new__``,"``(fileno, map_size, access,
offset)``", "Detects creation of mmap objects. On POSIX, access may
"``new_mmap_object``",``mmap.__new__``,"``(fileno, map_size, access,
offset)``", "Detects creation of mmap objects. On POSIX, access may
have been calculated from the ``prot`` and ``flags`` arguments.
"
``sys._getframe``, ``sys._getframe``, "``(frame_object,)``", "Detect
@ -409,9 +409,9 @@ see which operations provide audit events.
members that are marked as restricted, and members that may allow
bypassing imports.
"
"``urllib.urlopen``",``urllib.Request``,"``(url, data, headers,
method)``", "Detects URL requests.
"
"``urllib.urlopen``",``urllib.Request``,"``(url, data, headers,
method)``", "Detects URL requests.
"
Performance Impact
==================

View File

@ -95,11 +95,11 @@ the Python executable and any script will behave.
/> python foo/myscript.py
sys.path[0] == 'foo'
sys.path[1] == 'foo/__pypackages__/3.8/lib'
cd foo
foo> /usr/bin/ansible
#! /usr/bin/env python3
foo> python /usr/bin/ansible
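A rough sketch of the lookup implied by these examples (the ``3.8`` path
segment tracks the running interpreter's version)::

    import os
    import sys

    script_dir = os.path.dirname(os.path.abspath(sys.argv[0]))
    pkg_dir = os.path.join(
        script_dir, "__pypackages__",
        "{0.major}.{0.minor}".format(sys.version_info), "lib")
    if os.path.isdir(pkg_dir):
        sys.path.insert(1, pkg_dir)   # lands after sys.path[0], as shown above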

View File

@ -2,12 +2,12 @@ PEP: 585
Title: Type Hinting Usability Conventions
Version: $Revision$
Last-Modified: $Date$
Author: Łukasz Langa <lukasz@python.org>
Discussions-To: Python-Dev <python-dev@python.org>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 03-Mar-2019
Python-Version: 3.8
Status of this PEP

View File

@ -28,7 +28,7 @@ only expressions that have literally the value "4"::
Motivation and Rationale
========================
Python has many APIs that return different types depending on the
value of some argument provided. For example:
- ``open(filename, mode)`` returns either ``IO[bytes]`` or ``IO[Text]``
@ -53,7 +53,7 @@ The typing issue tracker contains some
There is currently no way of expressing the type signatures of these
functions: PEP 484 does not include any mechanism for writing signatures
where the return type varies depending on the value passed in.
Note that this problem persists even if we redesign these APIs to
instead accept enums: ``MyEnum.FOO`` and ``MyEnum.BAR`` are both
considered to be of type ``MyEnum``.
@ -90,7 +90,7 @@ literal type. So, if we have some variable ``foo`` of type ``Literal[3]``
it's safe to do things like ``foo + 5`` since ``foo`` inherits ``int``'s
``__add__`` method. The resulting type of ``foo + 5`` is ``int``.
This "inheriting" behavior is identical to how we
This "inheriting" behavior is identical to how we
`handle NewTypes. <newtypes_>`_.
Equivalence of two Literals
@ -123,11 +123,11 @@ many different literals more ergonomic — for example, functions like
_PathType = Union[str, bytes, int]
@overload
def open(path: _PathType,
mode: Literal["r", "w", "a", "x", "r+", "w+", "a+", "x+"],
) -> IO[Text]: ...
@overload
def open(path: _PathType,
mode: Literal["rb", "wb", "ab", "xb", "r+b", "w+b", "a+b", "x+b"],
) -> IO[bytes]: ...
@ -217,7 +217,7 @@ Illegal parameters for ``Literal`` at type check time
The following parameters are intentionally disallowed by design:
- Arbitrary expressions like ``Literal[3 + 4]`` or
``Literal["foo".replace("o", "b")]``.
``Literal["foo".replace("o", "b")]``.
- Rationale: Literal types are meant to be a
minimal extension to the PEP 484 typing ecosystem and requiring type
@ -265,7 +265,7 @@ any checks at runtime. For example::
def my_function(x: Literal[1 + 2]) -> int:
return x * 3
x: Literal = 3
y: Literal[my_function] = my_function
@ -276,7 +276,7 @@ should execute this program with no errors.
This is partly to help us preserve flexibility in case we want to expand the
scope of what ``Literal`` can be used for in the future, and partly because
it is not possible to detect all illegal parameters at runtime to begin with.
For example, it is impossible to distinguish between ``Literal[1 + 2]`` and
``Literal[3]`` at runtime.
Literals, enums, and forward references
@ -365,7 +365,7 @@ in objects::
m = MyObject()
# ...this assignment would no longer type check
m.field = 4
Using non-Literals in Literal contexts
--------------------------------------
@ -418,9 +418,9 @@ We expect similar behavior when using functions like getattr::
class Test:
def __init__(self, param: int) -> None:
self.myfield = param
def mymethod(self, val: int) -> str: ...
a: Literal["myfield"] = "myfield"
b: Literal["mymethod"] = "mymethod"
c: Literal["blah"] = "blah"
@ -443,11 +443,11 @@ types. For example, consider ``open``::
_PathType = Union[str, bytes, int]
@overload
def open(path: _PathType,
mode: Literal["r", "w", "a", "x", "r+", "w+", "a+", "x+"],
) -> IO[Text]: ...
@overload
def open(path: _PathType,
mode: Literal["rb", "wb", "ab", "xb", "r+b", "w+b", "a+b", "x+b"],
) -> IO[bytes]: ...
@ -486,7 +486,7 @@ classes using Literal types::
def __add__(self, other: Matrix[A, B]) -> Matrix[A, B]: ...
def __matmul__(self, other: Matrix[B, C]) -> Matrix[A, C]: ...
def transpose(self) -> Matrix[B, A]: ...
foo: Matrix[Literal[2], Literal[3]] = Matrix(...)
bar: Matrix[Literal[3], Literal[7]] = Matrix(...)
@ -562,13 +562,13 @@ containment or equality checks::
# Type checker could narrow 'status' to type
# Literal["MALFORMED", "ABORTED"] here.
return expects_bad_status(status)
# Similarly, type checker could narrow 'x' to Literal["PENDING"]
if status == "PENDING":
expects_pending_status(status)
It may also be useful to perform narrowing taking into account expressions
involving Literal bools. For example, we can combine ``Literal[True]``,
``Literal[False]``, and overloads to construct "custom type guards"::
@overload
@ -588,7 +588,7 @@ involving Literal bools. For example, we can combine ``Literal[True]``,
scalar += 3 # Type checks: type of 'scalar' is narrowed to 'int'
else:
scalar += "foo" # Type checks: type of 'scalar' is narrowed to 'str'
Rejected or out-of-scope ideas
@ -608,7 +608,7 @@ let us write signatures like the below::
# A vector has length 'n', containing elements of type 'T'
class Vector(Generic[N, T]): ...
# The type checker will statically verify our function genuinely does
# The type checker will statically verify our function genuinely does
# construct a vector that is equal in length to "len(vec1) + len(vec2)"
# and will throw an error if it does not.
def concat(vec1: Vector[A, T], vec2: Vector[B, T]) -> Vector[A + B, T]:
@ -692,7 +692,7 @@ following threads:
- `Typing for multi-dimensional arrays <arrays-discussion_>`_
The overall design of this proposal also ended up converging into
something similar to how
`literal types are handled in TypeScript <typescript-literal-types_>`_.
.. _typing-discussion: https://github.com/python/typing/issues/478

View File

@ -6,7 +6,7 @@ Type: Standards Track
Content-Type: text/x-rst
Created: 29-Mar-2019
Python-Version: 3.8
Post-History:
Abstract
========
@ -21,9 +21,9 @@ The choice of a calling convention impacts the performance and flexibility of co
Often there is tension between performance and flexibility.
The current ``tp_call`` [2]_ calling convention is sufficiently flexible to cover all cases, but its performance is poor.
The poor performance is largely a result of having to create intermediate tuples, and possibly intermediate dicts, during the call.
This is mitigated in CPython by including special-case code to speed up calls to Python and builtin functions.
Unfortunately this means that other callables such as classes and third party extension objects are called using the
slower, more general ``tp_call`` calling convention.
This PEP proposes that the calling convention used internally for Python and builtin functions is generalized and published
@ -65,7 +65,7 @@ A new flag is added, ``Py_TPFLAGS_HAVE_VECTORCALL``, which is set for any new Py
If ``Py_TPFLAGS_HAVE_VECTORCALL`` is set then ``tp_vectorcall_offset`` is the offset
into the object of the ``vectorcall`` function-pointer.
The unused slot ``printfunc tp_print`` is replaced with ``vectorcall tp_vectorcall``, so that classes
can support the vectorcall calling convention.
Additional flags
@ -100,15 +100,15 @@ in the array remain correct.
Example of how ``PY_VECTORCALL_ARGUMENTS_OFFSET`` is used by a callee to safely avoid allocation [3]_
Whenever they can do so cheaply (without allocation) callers are encouraged to offset the arguments.
Doing so will allow callables such as bound methods to make their onward calls cheaply.
The interpreter already allocates space on the stack for the callable, so it can offset its arguments for no additional cost.
Continued prohibition of callable classes as base classes
---------------------------------------------------------
Currently any attempt to use ``function``, ``method`` or ``method_descriptor`` as a base class for a new class will fail with a ``TypeError``.
This behaviour is desirable as it prevents errors when a subclass overrides the ``__call__`` method.
If callables could be sub-classed then any call to a ``function`` or a ``method_descriptor`` would need an additional check that the ``__call__`` method had not been overridden. By exposing an additional call mechanism, the potential for errors becomes greater. As a consequence, any third-party class implementing the new call interface will not be usable as a base class.
New C API and changes to CPython
@ -170,9 +170,9 @@ The ``PyMethodDef`` protocol and Argument Clinic
================================================
Argument Clinic [4]_ automatically generates wrapper functions around lower-level callables, providing safe unboxing of primitive types and
other safety checks.
Argument Clinic could be extended to generate wrapper objects conforming to the new ``vectorcall`` protocol.
This will allow execution to flow from the caller to the Argument Clinic generated wrapper and
thence to the hand-written code with only a single indirection.
Performance implications of these changes
@ -189,12 +189,12 @@ Alternative Suggestions
PEP 576 and PEP 580
-------------------
Both PEP 576 and PEP 580 are designed to enable 3rd party objects to be both expressive and performant (on a par with
CPython objects). The purpose of this PEP is to provide a uniform way to call objects in the CPython ecosystem that is
both expressive and as performant as possible.
This PEP is broader in scope than PEP 576 and uses variable rather than fixed offset function-pointers.
The underlying calling convention is similar. Because PEP 576 only allows a fixed offset for the function pointer,
it would not allow the improvements to any objects with constraints on their layout.
PEP 580 proposes a major change to the ``PyMethodDef`` protocol used to define builtin functions.

View File

@ -310,7 +310,7 @@ or
between tracebacks, depending whether they are linked by ``__cause__`` or
``__context__`` respectively. Here is a sketch of the procedure::
def print_chain(exc):
if exc.__cause__:
print_chain(exc.__cause__)
print '\nThe above exception was the direct cause...'
@ -391,7 +391,7 @@ try to wrap exceptions by writing this::
or this::
try:
... implementation may raise an exception ...
except Exception, exc:
raise ApplicationError from exc
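Put together, the resulting attribute can be observed directly (Python 3
syntax, with an illustrative exception class)::

    class ApplicationError(Exception):
        pass

    try:
        try:
            1 / 0
        except ZeroDivisionError as exc:
            raise ApplicationError("conversion failed") from exc
    except ApplicationError as err:
        assert isinstance(err.__cause__, ZeroDivisionError)  # explicit chain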

View File

@ -414,7 +414,7 @@ experienced core contributors as listed in the README of the project repo.
To be recommended and invited as a Steering Council member, an individual must
be a Project Contributor who has produced contributions that are substantial in
quality and quantity, and sustained over at least one year. Potential Council
Members are nominated by existing Council members and voted upon by the
existing Council after asking if the potential Member is interested and willing
to serve in that capacity.
@ -428,9 +428,9 @@ without further process. All merged pull requests end up in the next stable
release of a subproject.
There is a weekly, public Project-wide meeting that is recorded and posted on
YouTube. Some larger GitHub organizations, which are subprojects of
Project Jupyter, e.g. JupyterLab and JupyterHub, may
have additional public team meetings on a weekly or monthly schedule.
Discussions occur on Gitter, the Jupyter mailing list, and most frequently an
open issue and/or pull request on GitHub.
@ -445,17 +445,17 @@ The foundations of Project Jupyter's governance are:
During the everyday project activities, Steering Council members participate in
all discussions, code review and other project activities as peers with all
other Contributors and the Community. In these everyday activities,
Council Members do not have any special power or privilege through their
membership on the Council. However, it is expected that because of the quality
and quantity of their contributions and their expert knowledge of the
Project Software and Services that Council Members will provide useful guidance,
both technical and in terms of project direction, to potentially less
experienced contributors.
For controversial issues, the contributor community works together to refine
potential solutions, iterate as necessary, and build consensus by sharing
information and views constructively and openly. The Steering Council may
make decisions when regular community discussion doesn't produce consensus
on an issue in a reasonable time frame.
@ -467,7 +467,7 @@ Rarely, if ever, is voting done for technical decisions.
For other Project issues, the Steering Council may call for a vote for a
decision via a Governance PR or email proposal. Acceptance
requires a minimum of 80% of the Steering Council to vote and at least 2/3 of
the vote must be positive.
The BDFL can act alone to accept or reject changes or override the Steering
Council decision; though this would be an extremely rare event. As Benevolent,

View File

@ -89,7 +89,7 @@ The rationale for the model is that a model that casts everything in concrete wi
have unintended negative side effects. For example, a governance model that
assigns voting rights to Python committers may cause an individual not
to be accepted as a committer because there are already a lot of committers
from the company the new candidate works for.
As another example, setting a fixed percentage for PEP acceptance may lead
to party-formation amongst the voters and individual PEPs no longer be being
@ -119,7 +119,7 @@ public, i.e. the voters are identified and the results are known to all.
Voting may be simple +1/0/-1, but might also be extended with +2/-2 with a
very terse explanation why the voter feels very strong about the issue. Such
an annotation would serve as an explanation to the Council of Elders. Voters
are annotated with their community status (core developer, etc).
The vote is clearly separated from the discussion, by using a well-defined Discourse
category or tag, a special mailing list or a similar technical method
@ -127,7 +127,7 @@ category or tag, a special mailing list or a similar technical method
community status can be automatically added, and their identity can be somewhat
confirmed).
The PEP author presents the PEP and the vote results to the Council of Elders.
The council ponders two things:
- the PEP gravity and its implications,
@ -156,19 +156,19 @@ Council of Elders
The intention of the Council of Elders is that they, together, are capable
of judging whether the will of the Python community is upheld in a specific
vote.
The Council of Elders is *not* a replacement of the BDFL by a group of
people with the same power as the BDFL: it will not provide guidance on the
direction of Python, it only attempts to ensure the outcome of a vote
represents the will of the community.
The Council of Elders is *not* like the US Supreme Court, which has actual
decision power, the council only oversees the voting process to ensure that
the community is represented in the vote. And the Council of Elders is most
definitely not like the Spanish Inquisition, because fear, surprise and
ruthless efficiency are things we can do without (but there is some merit in
using the cute scarlet regalia).
The council is somewhat like the Dutch
`Hoge Raad`_ (which is unfortunately often translated as Supreme Court in
@ -177,7 +177,7 @@ only send cases back for a renewed judgement.
.. _Hoge Raad: https://en.wikipedia.org/wiki/Supreme_Court_of_the_Netherlands
It is also somewhat like the *election commission* that many countries have
(under different names) in that it oversees elections.
Council operation
@ -197,13 +197,13 @@ The proposal attempts to minimize the workload through two methods:
the Council of Elders does not organize the vote and tally the results.
- The idea behind the first tentative decision is mistakes by the Council
of elders (misjudging how far-reaching a PEP is, most likely) are not fatal, because
the community has a chance to point out these mistakes.
Practically speaking this means that the tentative decision can be taken by
a subset of the council, depending on the community to correct them.
Getting seven hard-working professionals together every two weeks, even by
email, may be a bit much to ask.
Clarifying when an individual Elder speaks on behalf of the Council is
probably best done by using a special email address, or some Discourse topic
into which only Elders can post. There is an analogy here with the Pope
@ -248,9 +248,9 @@ Python community, and willing to be impartial *while operating as part of
the council*. Council members may be core developers but this is not a requirement.
Everyone in the community should feel represented by the council so it would
be good if the council is diverse:
- scientists and technologists,
- progressives and conservatives (with respect to the Python language),
- people with different cultural backgrounds, genders, age,
- etc
@ -300,7 +300,7 @@ Discussion
This PEP does not handle other roles of the BDFL, only the voting process.
Most importantly, the direction of Python in the long term is not expected
to be handled by the Council of Elders. This falls to the community as a whole
(or to individual members of the community, most likely).
There is also the role of figurehead or spokesperson to represent Python and
the Python community to the outside world. Again, this is *not* a role that