Update PEP 517 to use pyproject.toml from PEP 518 (#51)

Thomas Kluyver 2016-09-22 12:39:58 +01:00 committed by Nick Coghlan
parent 3b87611b4b
commit 94dbee096b
1 changed file with 53 additions and 66 deletions


@@ -2,7 +2,8 @@ PEP: 517
 Title: A build-system independent format for source trees
 Version: $Revision$
 Last-Modified: $Date$
-Author: Nathaniel J. Smith <njs@pobox.com>
+Author: Nathaniel J. Smith <njs@pobox.com>,
+        Thomas Kluyver <thomas@kluyver.me.uk>
 BDFL-Delegate: Nick Coghlan <ncoghlan@gmail.com>
 Discussions-To: <distutils-sig@python.org>
 Status: Draft
@@ -99,63 +100,55 @@ specification is encoded in the source code and documentation of
 ``distutils``, ``setuptools``, ``pip``, and other tools. We'll refer
 to it as the ``setup.py``\-style.
 
-Here we define a new ``pypackage.json``\-style source tree. This
-consists of any directory which contains a file named
-``pypackage.json``. (If a tree contains both ``pypackage.json`` and
-``setup.py`` then it is a ``pypackage.json``\-style source tree, and
-``pypackage.json``\-aware tools should ignore the ``setup.py``; this
-allows packages to include a ``setup.py`` for compatibility with old
-build frontends, while using the new system with new build frontends.)
-
-This file has the following schema. Extra keys are ignored.
-
-schema
-  The version of the schema. This PEP defines version "1". Defaults to "1"
-  when absent. All tools reading the file MUST error on an unrecognised
-  schema version.
-
-bootstrap_requires
-  Optional list of PEP 508 dependency specifications that the
-  build frontend must ensure are available before invoking the build
-  backend. For instance, if using flit, then the requirements might
-  be::
-
-    "bootstrap_requires": ["flit"]
-
-build_backend
-  A mandatory string naming a Python object that will be used to
-  perform the build (see below for details). This is formatted
-  following the same ``module:object`` syntax as a ``setuptools``
-  entry point. For instance, if using flit, then the build system
-  might be specified as::
-
-    "build_system": "flit.api:main"
-
-and this object would be looked up by executing the equivalent of::
-
-  import flit.api
-  backend = flit.api.main
-
-It's also legal to leave out the ``:object`` part, e.g. ::
-
-  "build_system": "flit.api"
-
-which acts like::
-
-  import flit.api
-  backend = flit.api
-
-Formally, the string should satisfy this grammar::
-
-  identifier = (letter | '_') (letter | '_' | digit)*
-  module_path = identifier ('.' identifier)*
-  object_path = identifier ('.' identifier)*
-  entry_point = module_path (':' object_path)?
-
-And we import ``module_path`` and then lookup
-``module_path.object_path`` (or just ``module_path`` if
-``object_path`` is missing).
+Here we define a new style of source tree based around the
+``pyproject.toml`` file defined in PEP 518, extending the
+``[build-system]`` table in that file with one additional key,
+``build_backend``. Here's an example of how it would look::
+
+  [build-system]
+  # Defined by PEP 518:
+  requires = ["flit"]
+  # Defined by this PEP:
+  build_backend = "flit.api:main"
+
+``build_backend`` is a string naming a Python object that will be
+used to perform the build (see below for details). This is formatted
+following the same ``module:object`` syntax as a ``setuptools`` entry
+point. For instance, if the string is ``"flit.api:main"`` as in the
+example above, this object would be looked up by executing the
+equivalent of::
+
+  import flit.api
+  backend = flit.api.main
+
+It's also legal to leave out the ``:object`` part, e.g. ::
+
+  build_backend = "flit.api"
+
+which acts like::
+
+  import flit.api
+  backend = flit.api
+
+Formally, the string should satisfy this grammar::
+
+  identifier = (letter | '_') (letter | '_' | digit)*
+  module_path = identifier ('.' identifier)*
+  object_path = identifier ('.' identifier)*
+  entry_point = module_path (':' object_path)?
+
+And we import ``module_path`` and then lookup
+``module_path.object_path`` (or just ``module_path`` if
+``object_path`` is missing).
+
+If the ``pyproject.toml`` file is absent, or the ``build_backend``
+key is missing, the source tree is not using this specification, and
+tools should fall back to running ``setup.py``.
+
+Where the ``build_backend`` key exists, it takes precedence over
+``setup.py``, and source trees need not include ``setup.py`` at all.
+Projects may still wish to include a ``setup.py`` for compatibility
+with tools that do not use this spec.
 
 =========================
 Build backend interface
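The ``module:object`` lookup and grammar in the new text can be sketched in a few lines of Python; the helper name ``load_build_backend`` is illustrative, not something the PEP defines:

```python
import importlib

def load_build_backend(spec):
    """Resolve a build_backend string such as "flit.api:main".

    Imports the module_path part, then walks the dotted object_path
    (if present) with getattr, per the grammar above.
    """
    module_path, _, object_path = spec.partition(":")
    backend = importlib.import_module(module_path)
    for attr in object_path.split(".") if object_path else []:
        backend = getattr(backend, attr)
    return backend
```

So ``load_build_backend("flit.api:main")`` behaves like ``import flit.api; backend = flit.api.main``, and a spec with no ``:object`` part returns the module itself.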
@@ -170,7 +163,7 @@ argument is described after the individual hooks::
 
   This hook MUST return an additional list of strings containing PEP 508
   dependency specifications, above and beyond those specified in the
-  ``pypackage.json`` file. Example::
+  ``pyproject.toml`` file. Example::
 
     def get_build_requires(config_settings):
         return ["wheel >= 0.25", "setuptools"]
@@ -304,11 +297,11 @@ following criteria:
 
 - The ``get_build_requires`` hook is executed in an environment
   which contains the bootstrap requirements specified in the
-  ``pypackage.json`` file.
+  ``pyproject.toml`` file.
 
 - All other hooks are executed in an environment which contains both
-  the bootstrap requirements specified in the ``pypackage.json`` hook
-  and those specified by the ``get_build_requires`` hook.
+  the bootstrap requirements specified in the ``pyproject.toml``
+  hook and those specified by the ``get_build_requires`` hook.
 
 - This must remain true even for new Python subprocesses spawned by
   the build environment, e.g. code like::
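One way a frontend could keep this guarantee holding for child processes is to expose the installed build requirements via ``PYTHONPATH``; this is only a sketch under the assumption that they are installed into a single directory, since the PEP does not mandate any particular mechanism:

```python
import os
import subprocess

def run_in_build_env(argv, build_deps_dir):
    """Spawn a subprocess whose Python children can also import the
    build requirements installed under build_deps_dir."""
    env = dict(os.environ)
    existing = env.get("PYTHONPATH")
    env["PYTHONPATH"] = (
        build_deps_dir + os.pathsep + existing if existing else build_deps_dir
    )
    return subprocess.run(argv, env=env, check=True)
```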
@@ -370,8 +363,8 @@ explicitly requested build-dependencies. This has two benefits:
 However, there will also be situations where build-requirements are
 problematic in various ways. For example, a package author might
 accidentally leave off some crucial requirement despite our best
-efforts; or, a package might declare a build-requirement on `foo >=
-1.0` which worked great when 1.0 was the latest version, but now 1.1
+efforts; or, a package might declare a build-requirement on ``foo >=
+1.0`` which worked great when 1.0 was the latest version, but now 1.1
 is out and it has a showstopper bug; or, the user might decide to
 build a package against numpy==1.7 -- overriding the package's
 preferred numpy==1.8 -- to guarantee that the resulting build will be
@@ -399,7 +392,7 @@ undefined, but basically comes down to: a file named
 ``{NAME}-{VERSION}.{EXT}``, which unpacks into a buildable source tree
 called ``{NAME}-{VERSION}/``. Traditionally these have always
 contained ``setup.py``\-style source trees; we now allow them to also
-contain ``pypackage.json``\-style source trees.
+contain ``pyproject.toml``\-style source trees.
 
 Integration frontends require that an sdist named
 ``{NAME}-{VERSION}.{EXT}`` will generate a wheel named
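The ``{NAME}-{VERSION}.{EXT}`` convention can be parsed with a small helper; this is a sketch that assumes the version follows the last hyphen in the stem, which holds for normalized project names:

```python
def split_sdist_filename(filename):
    """Split an sdist filename of the form {NAME}-{VERSION}.{EXT}
    into (name, version), stripping a known archive extension."""
    for ext in (".tar.gz", ".tar.bz2", ".zip"):
        if filename.endswith(ext):
            stem = filename[: -len(ext)]
            break
    else:
        raise ValueError("unrecognised sdist extension: %r" % filename)
    name, _, version = stem.rpartition("-")
    return name, version
```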
@@ -410,9 +403,8 @@ Integration frontends require that an sdist named
 Comparison to competing proposals
 ===================================
 
-The primary difference between this and competing proposals (`in
-particular
-<https://github.com/pypa/interoperability-peps/pull/54/files>`_) is
+The primary difference between this and competing proposals (in
+particular, PEP 516) is
 that our build backend is defined via a Python hook-based interface
 rather than a command-line based interface.
@@ -580,7 +572,7 @@ Furthermore, our mechanism should also fulfill two more goals: (a) If
 new versions of e.g. ``pip`` and ``flit`` are both updated to support
 the new interface, then this should be sufficient for it to be used;
 in particular, it should *not* be necessary for every project that
-*uses* ``flit`` to update its individual ``pypackage.json`` file. (b)
+*uses* ``flit`` to update its individual ``pyproject.toml`` file. (b)
 We do not want to have to spawn extra processes just to perform this
 negotiation, because process spawns can easily become a bottleneck when
 deploying large multi-package stacks on some platforms (Windows).
@@ -601,11 +593,10 @@ process, it can easily write it to do something like::
 
 In the alternative where the public interface boundary is placed at
 the subprocess call, this is not possible -- either we need to spawn
 an extra process just to query what interfaces are supported (as was
-included in an earlier version of `this alternative PEP
-<https://github.com/pypa/interoperability-peps/pull/54/files>`_), or
+included in an earlier draft of PEP 516, an alternative to this), or
 else we give up on autonegotiation entirely (as in the current version
 of that PEP), meaning that any changes in the interface will require
-N individual packages to update their ``pypackage.json`` files before
+N individual packages to update their ``pyproject.toml`` files before
 any change can go live, and that any changes will necessarily be
 restricted to new releases.
@@ -658,10 +649,6 @@ above, there are a few other differences in this proposal:
   step and the wheel building step. I guess everyone probably will
   agree this is a good idea?
 
-* We call our config file ``pypackage.json`` instead of
-  ``pypa.json``. This is because it describes a package, rather than
-  describing a packaging authority. But really, who cares.
-
 * We provide more detailed recommendations about the build environment,
   but these aren't normative anyway.
@@ -673,7 +660,7 @@ above, there are a few other differences in this proposal:
 A goal here is to make it as simple as possible to convert old-style
 sdists to new-style sdists. (E.g., this is one motivation for
 supporting dynamic build requirements.) The ideal would be that there
-would be a single static pypackage.json that could be dropped into any
+would be a single static ``pyproject.toml`` that could be dropped into any
 "version 0" VCS checkout to convert it to the new shiny. This is
 probably not 100% possible, but we can get close, and it's important
 to keep track of how close we are... hence this section.
@@ -700,7 +687,7 @@ automatically upgrade packages to the new format:
    check whether they do this, and if so then when upgrading to the
    new system they will have to start explicitly declaring these
    dependencies (either via ``setup_requires=`` or via static
-   declaration in ``pypackage.json``).
+   declaration in ``pyproject.toml``).
 
 2) There currently exist packages which do not declare consistent
    metadata (e.g. ``egg_info`` and ``bdist_wheel`` might get different