Merge branch 'master' of github.com:python/peps

This commit is contained in:
Barry Warsaw 2017-09-11 11:10:59 -07:00
commit 8fba9ebb22
5 changed files with 752 additions and 2 deletions


@@ -6,8 +6,8 @@ Type: Standards Track
Content-Type: text/x-rst
Created: 2017-09-08
Python-Version: 3.7
-Resolution: https://mail.python.org/pipermail/python-dev/2017-September/149438.html
 Post-History: 2017-09-09
+Resolution: https://mail.python.org/pipermail/python-dev/2017-September/149438.html
Abstract
@@ -61,7 +61,7 @@ you return ``7`` or ``((7,), {})``? And so on.
The author claims that you won't ever need the return value of ``noop()`` so
it will always return ``None``.
-Coghlin's Dialogs (edited for formatting):
+Coghlan's Dialogs (edited for formatting):
My counterargument to this would be ``map(noop, iterable)``,
``sorted(iterable, key=noop)``, etc. (``filter``, ``max``, and

pep-0560.rst (new file, 202 lines)

@@ -0,0 +1,202 @@
PEP: 560
Title: Core support for generic types
Author: Ivan Levkivskyi <levkivskyi@gmail.com>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 03-Sep-2017
Python-Version: 3.7
Post-History: 09-Sep-2017
Abstract
========
Initially, PEP 484 was designed in such a way that it would not introduce
*any* changes to the core CPython interpreter. Now type hints and
the ``typing`` module are extensively used by the community, e.g. PEP 526
and PEP 557 extend the usage of type hints, and the backport of ``typing``
on PyPI has 1M downloads/month. Therefore, this restriction can be removed.
It is proposed to add two special methods ``__class_getitem__`` and
``__subclass_base__`` to the core CPython for better support of
generic types.
Rationale
=========
The restriction not to modify the core CPython interpreter led to some
design decisions that became questionable when the ``typing`` module started
to be widely used. There are three main points of concern:
the performance of the ``typing`` module, metaclass conflicts, and the large
number of hacks currently used in ``typing``.
Performance:
------------
The ``typing`` module is one of the heaviest and slowest modules in
the standard library even with all the optimizations made. Mainly this is
because subscripted generic types (see PEP 484 for definition of terms
used in this PEP) are class objects (see also [1]_). There are three main
ways the performance can be improved with the help of the proposed special
methods:
- Creation of generic classes is slow, since ``GenericMeta.__new__`` is
  very slow; we will not need it anymore.

- The very long MROs for generic classes will be half as long; they are
  present because we duplicate the ``collections.abc`` inheritance chain
  in ``typing``.

- Instantiation of generic classes will be faster (this is minor, however).
Metaclass conflicts:
--------------------
All generic types are instances of ``GenericMeta``, so if a user uses
a custom metaclass, then it is hard to make a corresponding class generic.
This is particularly hard for library classes that a user doesn't control.
A workaround is to always mix in ``GenericMeta``::

    class AdHocMeta(GenericMeta, LibraryMeta):
        pass

    class UserClass(LibraryBase, Generic[T], metaclass=AdHocMeta):
        ...

but this is not always practical or even possible. With the help of the
proposed special attributes the ``GenericMeta`` metaclass will not be needed.
Hacks and bugs that will be removed by this proposal:
-----------------------------------------------------
- The ``_generic_new`` hack, which exists because ``__init__`` is not called
  on instances with a type differing from the type whose ``__new__`` was
  called, ``C[int]().__class__ is C``.

- The ``_next_in_mro`` speed hack will not be necessary, since subscription
  will not create new classes.

- The ugly ``sys._getframe`` hack. This one is particularly nasty, since it
  looks like we can't remove it without changes outside ``typing``.

- Currently generics do dangerous things with private ABC caches
  to fix large memory consumption that grows at least as O(N\ :sup:`2`),
  see [2]_. This point is also important because it was recently proposed to
  re-implement ``ABCMeta`` in C.

- Problems with sharing attributes between subscripted generics,
  see [3]_. The current solution already uses ``__getattr__`` and
  ``__setattr__``, but it is still incomplete, and solving this without the
  current proposal will be hard and will need ``__getattribute__``.

- The ``_no_slots_copy`` hack, where we clean up the class dictionary on
  every subscription, thus allowing generics with ``__slots__``.

- The general complexity of the ``typing`` module. The new proposal will not
  only allow removing the above-mentioned hacks/bugs, but will also simplify
  the implementation, so that it will be easier to maintain.
Specification
=============
The idea of ``__class_getitem__`` is simple: it is an exact analog of
``__getitem__``, with the exception that it is called on a class that
defines it, not on its instances. This allows us to avoid
``GenericMeta.__getitem__`` for things like ``Iterable[int]``.
``__class_getitem__`` is automatically a class method and
does not require the ``@classmethod`` decorator (similar to
``__init_subclass__``), and it is inherited like normal attributes.
For example::

    class MyList:
        def __getitem__(self, index):
            return index + 1

        def __class_getitem__(cls, item):
            return f"{cls.__name__}[{item.__name__}]"

    class MyOtherList(MyList):
        pass

    assert MyList()[0] == 1
    assert MyList[int] == "MyList[int]"
    assert MyOtherList()[0] == 1
    assert MyOtherList[int] == "MyOtherList[int]"

Note that this method is used as a fallback, so if a metaclass defines
``__getitem__``, then that will take priority.
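As a sketch of this precedence (runnable on Python 3.7+, where
``__class_getitem__`` was ultimately added to CPython; the class names are
illustrative):

```python
# Sketch of the fallback behavior described above: a metaclass
# __getitem__ takes priority over __class_getitem__ on the class.
class Meta(type):
    def __getitem__(cls, item):
        return "metaclass"

class WithMeta(metaclass=Meta):
    def __class_getitem__(cls, item):
        return "fallback"

class Plain:
    def __class_getitem__(cls, item):
        return "fallback"

assert WithMeta[int] == "metaclass"  # metaclass __getitem__ wins
assert Plain[int] == "fallback"      # no metaclass method, fallback used
```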
If an object that is not a class object appears in the bases of a class
definition, the ``__subclass_base__`` is searched on it. If found,
it is called with the original tuple of bases as an argument. If the result
of the call is not ``None``, then it is substituted instead of this object.
Otherwise (if the result is ``None``), the base is simply removed. This is
necessary to avoid inconsistent MRO errors that are currently prevented by
manipulations in ``GenericMeta.__new__``. After creating the class,
the original bases are saved in ``__orig_bases__`` (currently this is also
done by the metaclass).
NOTE: These two method names are reserved for exclusive use by
the ``typing`` module and the generic types machinery, and any other use is
strongly discouraged. The reference implementation (with tests) can be found
in [4]_, the proposal was originally posted and discussed on
the ``typing`` tracker, see [5]_.
Backwards compatibility and impact on users who don't use ``typing``:
=====================================================================
This proposal may break code that currently uses the names
``__class_getitem__`` and ``__subclass_base__``.
This proposal will preserve almost complete backwards compatibility with
the current public generic types API; moreover, the ``typing`` module is
still provisional. There are only two exceptions: currently
``issubclass(List[int], List)`` returns ``True``; with this proposal it will
raise ``TypeError``. Also, ``issubclass(collections.abc.Iterable,
typing.Iterable)`` will return ``False``, which is probably desirable, since
currently we have a (virtual) inheritance cycle between these two classes.
With the reference implementation I measured negligible performance effects
(under 1% on a micro-benchmark) for regular (non-generic) classes.
References
==========
.. [1] Discussion following Mark Shannon's presentation at Language Summit
(https://github.com/python/typing/issues/432)
.. [2] Pull Request to implement shared generic ABC caches
(https://github.com/python/typing/pull/383)
.. [3] An old bug with setting/accessing attributes on generic types
(https://github.com/python/typing/issues/392)
.. [4] The reference implementation
(https://github.com/ilevkivskyi/cpython/pull/2/files)
.. [5] Original proposal
(https://github.com/python/typing/issues/468)
Copyright
=========
This document has been placed in the public domain.
..
Local Variables:
mode: indented-text
indent-tabs-mode: nil
sentence-end-double-space: t
fill-column: 70
coding: utf-8
End:

pep-0561.rst (new file, 179 lines)

@@ -0,0 +1,179 @@
PEP: 561
Title: Distributing and Packaging Type Information
Author: Ethan Smith <ethan@ethanhs.me>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 09-Sep-2017
Python-Version: 3.7
Post-History:
Abstract
========
PEP 484 introduced type hints to Python, with goals of making typing
gradual and easy to adopt. Currently, typing information must be distributed
manually. This PEP provides a standardized means to package and distribute
type information and an ordering for type checkers to resolve modules and
collect this information for type checking using existing packaging
architecture.
Rationale
=========
PEP 484 has a brief section on distributing typing information. In this
section [1]_ the PEP recommends using ``shared/typehints/pythonX.Y/`` for
shipping stub files. However, manually adding a path to stub files for each
third party library does not scale. The simplest approach people have taken
is to add ``site-packages`` to their ``PYTHONPATH``, but this causes type
checkers to fail on packages that are highly dynamic (e.g. sqlalchemy
and Django).
Furthermore, package authors wish to distribute code that has
inline type information, and there is currently no standard method to
distribute packages with inline type annotations or syntax that can
simultaneously be used at runtime and in type checking.
Specification
=============
There are several motivations and methods of supporting typing in a package.
This PEP recognizes three types of packages that may be created:
1. The package maintainer would like to add type information inline.
2. The package maintainer would like to add type information via stubs.
3. A third party would like to share stub files for a package, but the
   maintainer does not want to include them in the source of the package.
This PEP aims to support these scenarios and make them simple to add to
packaging and deployment.
The two major parts of this specification are the packaging specifications
and the resolution order for resolving module type information. This spec
is meant to replace the ``shared/typehints/pythonX.Y/`` spec of PEP 484 [1]_.
Packaging Type Information
--------------------------
Packages must opt in to supporting typing. This will be done through a
distutils extension [2]_, providing a ``typed`` keyword argument to the
distutils ``setup()`` command. The argument value will depend on the kind of
type information the package provides. The distutils extension will be added
to the ``typing`` package. Therefore a package maintainer may write
::

    setup(
        ...
        setup_requires=["typing"],
        typed="inline",
        ...
    )

Inline Typed Packages
'''''''''''''''''''''
Packages that have inline type annotations simply have to pass the value
``"inline"`` to the ``typed`` argument in ``setup()``.
Stub Only Packages
''''''''''''''''''
For package maintainers wishing to ship stub files containing all of their
type information, it is preferred that the ``*.pyi`` stubs are alongside the
corresponding ``*.py`` files. However, the stubs may instead be put in a
sub-folder of the Python sources, with the same name as the folder holding
the ``*.py`` files. For example, the ``flyingcircus`` package would have its
stubs in the folder ``flyingcircus/flyingcircus/``. This path is chosen so
that if stubs are not found in ``flyingcircus/``, the type checker may treat
the subdirectory as a normal package. The normal resolution order of
checking ``*.pyi`` before ``*.py`` will be maintained. The value of the
``typed`` argument to ``setup()`` is ``"stubs"`` for this type of
distribution. The author of the package is encouraged to use
``package_data`` to ensure the stub files are installed alongside the
runtime Python code.
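Under these rules, a stub-only distribution of the ``flyingcircus``
example might be laid out as follows (an illustrative sketch, not a layout
mandated verbatim by this PEP)::

    flyingcircus/              # distribution root; setup(typed="stubs", ...)
        setup.py
        flyingcircus/          # runtime Python sources (*.py)
            __init__.py
            flyingcircus/      # stub sub-folder with the same name
                __init__.pyi
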
Third Party Stub Packages
'''''''''''''''''''''''''
Third parties seeking to distribute stub files are encouraged to contact the
maintainer of the package about distribution alongside the package. If the
maintainer does not wish to maintain or package stub files or type information
inline, then a "third party stub package" should be created. The structure is
similar, but slightly different from that of stub only packages. If the stubs
are for the library ``flyingcircus`` then the package should be named
``flyingcircus-stubs`` and the stub files should be put in a sub-directory
named ``flyingcircus``. This allows the stubs to be checked as if they were in
a regular package. These packages should also pass ``"stubs"`` as the value
of the ``typed`` argument in ``setup()``, and are encouraged to use
``package_data`` to package stub files.
The version of the ``flyingcircus-stubs`` package should match the version of
the ``flyingcircus`` package it is providing types for.
Type Checker Module Resolution Order
------------------------------------
The following is the order that type checkers supporting this PEP should
resolve modules containing type information:
1. User code - the files the type checker is running on.

2. Stubs or Python source in ``PYTHONPATH``. This is to allow the user
   complete control of which stubs to use, and to patch broken stubs or
   inline types from packages.

3. Third party stub packages - these packages can supersede the installed
   untyped packages. They can be found at ``pkg-stubs`` for package ``pkg``;
   however, it is encouraged to check their metadata to confirm that they
   opt in to type checking.

4. Inline packages - finally, if there is nothing overriding the installed
   package, and it opts in to type checking, the package's inline type
   information is used.

5. Typeshed (if used) - provides the stdlib types and several third party
   libraries.
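The ordering above amounts to a first-match search over the sources; a
minimal sketch, where each step is an illustrative placeholder callable
(not a prescribed type-checker API):

```python
# First-match resolution over the ordered sources above; each step
# returns a path to type information, or None if it has nothing.
def resolve_module_types(module, steps):
    for step in steps:
        found = step(module)
        if found is not None:
            return found
    return None

order = [
    lambda m: None,                        # 1. user code
    lambda m: None,                        # 2. stubs/source on PYTHONPATH
    lambda m: None,                        # 3. third party stub packages
    lambda m: None,                        # 4. inline packages that opt in
    lambda m: "typeshed/" + m + ".pyi",    # 5. typeshed fallback
]
assert resolve_module_types("os", order) == "typeshed/os.pyi"
```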
When resolving step (3), type checkers should ensure that the version of the
stubs matches the installed runtime package.

Type checkers that check a different Python version than the version they
run on must find the type information in the
``site-packages``/``dist-packages`` of that Python version. This can be
queried with, e.g.,
``pythonX.Y -c 'import sys; print(sys.exec_prefix)'``. It is also
recommended that the type checker allow the user to point to a particular
Python binary, in case it is not in the path.
To check if a package has opted into type checking, type checkers are
recommended to use the ``pkg_resources`` module to query the package
metadata. If the ``typed`` package metadata has ``None`` as its value, the
package has not opted into type checking, and the type checker should skip that
package.
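A hypothetical sketch of that metadata check using ``pkg_resources``; the
``Typed`` metadata field name and its location in the distribution's
``METADATA`` file are assumptions based on this draft's ``typed`` keyword,
not a finalized convention:

```python
# Hypothetical metadata probe; the "Typed" field name is an assumption
# derived from the draft's ``typed`` setup() keyword.
import email
import pkg_resources

def package_opts_into_typing(name):
    try:
        dist = pkg_resources.get_distribution(name)
    except pkg_resources.DistributionNotFound:
        return False  # not installed, nothing to check
    if not dist.has_metadata('METADATA'):
        return False
    headers = email.message_from_string(dist.get_metadata('METADATA'))
    # Per the draft, a value of None means the package has not opted in.
    return headers.get('Typed') not in (None, 'None')
```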
References
==========
.. [1] PEP 484, Storing and Distributing Stub Files
(https://www.python.org/dev/peps/pep-0484/#storing-and-distributing-stub-files)
.. [2] Distutils Extensions, Adding setup() arguments
(http://setuptools.readthedocs.io/en/latest/setuptools.html#adding-setup-arguments)
Copyright
=========
This document has been placed in the public domain.
..
Local Variables:
mode: indented-text
indent-tabs-mode: nil
sentence-end-double-space: t
fill-column: 70
coding: utf-8
End:

pep-0562.rst (new file, 105 lines)

@@ -0,0 +1,105 @@
PEP: 562
Title: Module __getattr__
Author: Ivan Levkivskyi <levkivskyi@gmail.com>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 09-Sep-2017
Python-Version: 3.7
Post-History: 09-Sep-2017
Abstract
========
It is proposed to support ``__getattr__`` function defined on modules to
provide basic customization of module attribute access.
Rationale
=========
It is sometimes convenient to customize or otherwise have control over
access to module attributes. A typical example is managing deprecation
warnings. Typical workarounds are assigning ``__class__`` of a module object
to a custom subclass of ``types.ModuleType`` or substituting ``sys.modules``
item with a custom wrapper instance. It would be convenient to simplify this
procedure by recognizing ``__getattr__`` defined directly in a module that
would act like a normal ``__getattr__`` method, except that it will be defined
on module *instances*. For example::

    # lib.py

    from warnings import warn

    deprecated_names = ["old_function", ...]

    def _deprecated_old_function(arg, other):
        ...

    def __getattr__(name):
        if name in deprecated_names:
            warn(f"{name} is deprecated", DeprecationWarning)
            return globals()[f"_deprecated_{name}"]
        raise AttributeError(f"module {__name__} has no attribute {name}")

    # main.py

    from lib import old_function  # Works, but emits the warning

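For comparison, the existing ``__class__``-assignment workaround mentioned
above can be sketched as follows (the module and attribute names are
illustrative):

```python
# Existing workaround: replace a module's __class__ with a
# types.ModuleType subclass whose __getattr__ intercepts missing names.
import types

class _DeprecatingModule(types.ModuleType):
    def __getattr__(self, name):
        if name == "old_function":
            # Real code would emit a DeprecationWarning here first.
            return lambda: "result"
        raise AttributeError(
            f"module {self.__name__} has no attribute {name}")

mod = types.ModuleType("demo")
mod.__class__ = _DeprecatingModule  # the workaround this PEP simplifies

assert mod.old_function() == "result"
```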
There is a related proposal, PEP 549, that proposes to support instance
properties for similar functionality. The difference is that this PEP
proposes a faster and simpler mechanism, at the cost of more basic
customization.
An additional motivation for this proposal is that PEP 484 already defines
the use of module ``__getattr__`` for this purpose in Python stub files,
see [1]_.
Specification
=============
The ``__getattr__`` function at the module level should accept one argument,
which is the name of an attribute, and return the computed value or raise
an ``AttributeError``::

    def __getattr__(name: str) -> Any: ...

This function will be called only if ``name`` is not found in the module
through the normal attribute lookup.
The reference implementation for this PEP can be found in [2]_.
Backwards compatibility and impact on performance
=================================================
This PEP may break code that uses the module-level (global) name
``__getattr__``.
The performance implications of this PEP are minimal, since ``__getattr__``
is called only for missing attributes.
References
==========
.. [1] PEP 484 section about ``__getattr__`` in stub files
(https://www.python.org/dev/peps/pep-0484/#stub-files)
.. [2] The reference implementation
(https://github.com/ilevkivskyi/cpython/pull/3/files)
Copyright
=========
This document has been placed in the public domain.
..
Local Variables:
mode: indented-text
indent-tabs-mode: nil
sentence-end-double-space: t
fill-column: 70
coding: utf-8
End:

pep-0563.rst (new file, 264 lines)

@@ -0,0 +1,264 @@
PEP: 563
Title: Postponed Evaluation of Annotations
Version: $Revision$
Last-Modified: $Date$
Author: Łukasz Langa <lukasz@langa.pl>
Discussions-To: Python-Dev <python-dev@python.org>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 8-Sep-2017
Python-Version: 3.7
Post-History:
Resolution:
Abstract
========
PEP 3107 introduced syntax for function annotations, but the semantics
were deliberately left undefined. PEP 484 introduced a standard meaning
to annotations: type hints. PEP 526 defined variable annotations,
explicitly tying them with the type hinting use case.
This PEP proposes changing function annotations and variable annotations
so that they are no longer evaluated at function definition time.
Instead, they are preserved in ``__annotations__`` in string form.
This change is going to be introduced gradually, starting with a new
``__future__`` import in Python 3.7.
Rationale and Goals
===================
PEP 3107 added support for arbitrary annotations on parts of a function
definition. Just like default values, annotations are evaluated at
function definition time. This creates a number of issues for the type
hinting use case:
* forward references: when a type hint contains names that have not been
defined yet, that definition needs to be expressed as a string
literal;
* type hints are executed at module import time, which is not
computationally free.
Postponing the evaluation of annotations solves both problems.
Non-goals
---------
Just like in PEP 484 and PEP 526, it should be emphasized that **Python
will remain a dynamically typed language, and the authors have no desire
to ever make type hints mandatory, even by convention.**
Annotations are still available for arbitrary use besides type checking.
Using ``@typing.no_type_check`` in this case is recommended to
disambiguate the use case.
Implementation
==============
In a future version of Python, function and variable annotations will no
longer be evaluated at definition time. Instead, a string form will be
preserved in the respective ``__annotations__`` dictionary. Static type
checkers will see no difference in behavior, whereas tools using
annotations at runtime will have to perform postponed evaluation.
If an annotation was already a string, this string is preserved
verbatim. In other cases, the string form is obtained from the AST
during the compilation step, which means that it might not preserve
the exact formatting of the source.
Annotations need to be syntactically valid Python expressions, also when
passed as literal strings (i.e. ``compile(literal, '', 'eval')``).
Annotations can only use names present in the module scope, since postponed
evaluation using local names is not reliable.
Note that as per PEP 526, local variable annotations are not evaluated
at all since they are not accessible outside of the function's closure.
Enabling the future behavior in Python 3.7
------------------------------------------
The functionality described above can be enabled starting from Python
3.7 using the following special import::

    from __future__ import annotations

Resolving Type Hints at Runtime
===============================
To resolve an annotation at runtime from its string form to the result
of the enclosed expression, user code needs to evaluate the string.
For code that uses type hints, the ``typing.get_type_hints()`` function
correctly evaluates expressions back from their string form. Note that
all valid code currently using ``__annotations__`` should already be
doing that, since a type annotation can be expressed as a string literal.
For code which uses annotations for other purposes, a regular
``eval(ann, globals, locals)`` call is enough to resolve the
annotation. The trick here is to get the correct value for globals.
Fortunately, in the case of functions, they hold a reference to globals
in an attribute called ``__globals__``. To get the correct module-level
context to resolve class variables, use::

    cls_globals = sys.modules[SomeClass.__module__].__dict__

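A runnable sketch of the ``eval()`` approach described above, using
explicitly written string annotations to mimic the postponed form (the
function name is illustrative):

```python
# String annotations stand in for the postponed (string) form this PEP
# introduces; evaluation uses the function's own __globals__.
def double(x: "int") -> "int":
    return x * 2

ann = double.__annotations__["x"]
assert ann == "int"                          # stored as a string
assert eval(ann, double.__globals__) is int  # resolved back to the type
```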
Runtime annotation resolution and class decorators
--------------------------------------------------
Metaclasses and class decorators that need to resolve annotations for
the current class will fail for annotations that use the name of the
current class. Example::
    def class_decorator(cls):
        annotations = get_type_hints(cls)  # raises NameError on 'C'
        print(f'Annotations for {cls}: {annotations}')
        return cls

    @class_decorator
    class C:
        singleton: 'C' = None

This was already true before this PEP. The class decorator acts on
the class before it's assigned a name in the current definition scope.
The situation is made somewhat stricter when class-level variables are
considered. Previously, when the string form wasn't used in annotations,
a class decorator would be able to cover situations like::
    @class_decorator
    class Restaurant:
        class MenuOption(Enum):
            SPAM = 1
            EGGS = 2

        default_menu: List[MenuOption] = []

This is no longer possible.
Runtime annotation resolution and ``TYPE_CHECKING``
---------------------------------------------------
Sometimes there's code that must be seen by a type checker but should
not be executed. For such situations the ``typing`` module defines a
constant, ``TYPE_CHECKING``, that is considered ``True`` during type
checking but ``False`` at runtime. Example::
    import typing

    if typing.TYPE_CHECKING:
        import expensive_mod

    def a_func(arg: expensive_mod.SomeClass) -> None:
        a_var: expensive_mod.SomeClass = arg
        ...

This approach is also useful when handling import cycles.
Trying to resolve annotations of ``a_func`` at runtime using
``typing.get_type_hints()`` will fail, since the name ``expensive_mod``
is not defined (the ``TYPE_CHECKING`` variable is ``False`` at runtime).
This was already true before this PEP.
Backwards Compatibility
=======================
This is a backwards incompatible change. Applications depending on
arbitrary objects being directly present in annotations will break
if they are not using ``typing.get_type_hints()`` or ``eval()``.
Annotations that depend on locals at the time of the function/class
definition are now invalid. Example::
    def generate_class():
        some_local = datetime.datetime.now()

        class C:
            field: some_local = 1  # NOTE: INVALID ANNOTATION

            def method(self, arg: some_local.day) -> None:  # NOTE: INVALID ANNOTATION
                ...

Annotations using nested classes and their respective state are still
valid, provided they use the fully qualified name. Example::
    class C:
        field = 'c_field'

        def method(self, arg: C.field) -> None:  # this is OK
            ...

        class D:
            field2 = 'd_field'

            def method(self, arg: C.field) -> C.D.field2:  # this is OK
                ...

In the presence of an annotation that cannot be resolved using the
current module's globals, a ``NameError`` is raised at compile time.
Deprecation policy
------------------
In Python 3.7, a ``__future__`` import is required to use the described
functionality and a ``PendingDeprecationWarning`` is raised by the
compiler in the presence of type annotations in modules without the
``__future__`` import. In Python 3.8 the warning becomes a
``DeprecationWarning``. In the next version this will become the
default behavior.
Rejected Ideas
==============
Keep the ability to use local state when defining annotations
-------------------------------------------------------------
With postponed evaluation, this is impossible for function locals. For
classes, it would be possible to keep the ability to define annotations
using the local scope. However, when using ``eval()`` to perform the
postponed evaluation, we need to provide the correct globals and locals
to the ``eval()`` call. In the face of nested classes, the routine to
get the effective "globals" at definition time would have to look
something like this::
    def get_class_globals(cls):
        result = {}
        result.update(sys.modules[cls.__module__].__dict__)
        for child in cls.__qualname__.split('.'):
            result.update(result[child].__dict__)
        return result

This is brittle and doesn't even cover slots. Requiring the use of
module-level names simplifies runtime evaluation and provides the
"one obvious way" to read annotations. It's the equivalent of absolute
imports.
Acknowledgements
================
This document could not be completed without valuable input,
encouragement and advice from Guido van Rossum, Jukka Lehtosalo, and
Ivan Levkivskyi.
Copyright
=========
This document has been placed in the public domain.
..
Local Variables:
mode: indented-text
indent-tabs-mode: nil
sentence-end-double-space: t
fill-column: 70
coding: utf-8
End: