PEP 513: Cover unicode ABI compatibility

Nick Coghlan 2016-01-30 18:45:54 +10:00
parent f9e2c09c98
commit 4132e304ce
1 changed file with 28 additions and 15 deletions


@@ -151,8 +151,12 @@ included in the following list: ::
     libgthread-2.0.so.0
     libglib-2.0.so.0
 
-and (b), work on a stock CentOS 5.11 [6]_ system that contains the system
-package manager's provided versions of these libraries.
+and, (b) work on a stock CentOS 5.11 [6]_ system that contains the system
+package manager's provided versions of these libraries. In addition,
+for wheels targeting CPython 3.2 and earlier (including all 2.x
+versions), there is an extra requirement that (c) the wheel be
+built against a version of CPython compiled with 4-byte unicode
+support (i.e. one where ``sys.maxunicode > 0xFFFF``).
 
 Because CentOS 5 is only available for x86_64 and i386 architectures,
 these are the only architectures currently supported by the ``manylinux1``
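
The 4-byte ("wide") unicode requirement in (c) is easy to verify from the
interpreter itself. A minimal sketch using only the standard library (the
helper name is illustrative, not something defined by the PEP)::

    import sys
    import sysconfig

    def is_wide_unicode_build():
        # Narrow builds of CPython <= 3.2 (and 2.x) report
        # sys.maxunicode == 0xFFFF; wide builds, and every CPython 3.3+,
        # report 0x10FFFF.
        return sys.maxunicode > 0xFFFF

    if __name__ == "__main__":
        print("wide unicode build:", is_wide_unicode_build())
        # Older builds also expose the unicode unit size directly; this
        # config var may be None on CPython 3.3+ builds.
        print("Py_UNICODE_SIZE:", sysconfig.get_config_var("Py_UNICODE_SIZE"))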
@@ -344,9 +348,11 @@ suggest that installation tools should error on the side of assuming
 that a system *is* compatible, unless there is specific reason to
 think otherwise.
 
-We know of three main sources of potential incompatibility that are likely to
-arise in practice:
+We know of four main sources of potential incompatibility that are
+likely to arise in practice:
 
+* "Narrow" unicode builds of Python 3.2 and earlier (including all
+  versions of Python 2)
 * Eventually, in the future, there may exist distributions that break
   compatibility with this profile (e.g., if one of the libraries in
   the profile changes its ABI in a backwards-incompatible way)
@@ -354,20 +360,22 @@ arise in practice:
 * A linux distribution that does not use ``glibc`` (e.g. Alpine Linux, which is
   based on musl ``libc``, or Android)
 
-Therefore, we propose a two-pronged approach. To catch the first
-case, we standardize a mechanism for a Python distributor to signal
-that a particular Python install definitely is or is not compatible
-with ``manylinux1``: this is done by installing a module named
-``_manylinux``, and setting its ``manylinux1_compatible``
-attribute. We do not propose adding any such module to the standard
-library -- this is merely a well-known name by which distributors and
-installation tools can rendezvous. However, if a distributor does add
-this module, *they should add it to the standard library* rather than
-to a ``site-packages/`` directory, because the standard library is
+Checking for unicode configuration compatibility is straightforward,
+but the other cases are more subtle. We propose a two-pronged
+approach. To handle potential future incompatibilities, we standardize
+a mechanism for a Python distributor to signal that a particular
+Python install definitely is or is not compatible with ``manylinux1``:
+this is done by installing a module named ``_manylinux``, and setting
+its ``manylinux1_compatible`` attribute. We do not propose adding any
+such module to the standard library -- this is merely a well-known
+name by which distributors and installation tools can
+rendezvous. However, if a distributor does add this module, *they
+should add it to the standard library* rather than to a
+``site-packages/`` directory, because the standard library is
 inherited by virtualenvs (which we want), and ``site-packages/`` in
 general is not.
 
-Then, to handle the latter two cases for existing Python
+Then, to handle the last two cases for existing Python
 distributions, we suggest a simple and reliable method to check for
 the presence and version of ``glibc`` (basically using it as a "clock"
 for the overall age of the distribution).
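
To make the ``_manylinux`` rendezvous concrete: a distributor ships a small
module named ``_manylinux`` alongside the standard library, and installation
tools look for its ``manylinux1_compatible`` attribute before falling back to
heuristics. A sketch of the installer-side check (the function name is
illustrative; only the module and attribute names come from the proposal)::

    # A distributor wanting to opt out of manylinux1 wheels could ship, e.g.:
    #
    #     # _manylinux.py
    #     manylinux1_compatible = False
    #
    # Installation tools would then consult that declaration like so:

    def distributor_declared_manylinux1_compatible():
        try:
            import _manylinux
            return bool(_manylinux.manylinux1_compatible)
        except (ImportError, AttributeError):
            # No module, or no attribute: the distributor has expressed no
            # opinion, so the tool falls back to its own heuristic checks.
            return None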
@@ -380,6 +388,11 @@ Specifically, the algorithm we propose is::
     if get_platform() not in ["linux_x86_64", "linux_i386"]:
         return False
 
+    # "wide" Unicode mode is mandatory (always true on CPython 3.3+)
+    import sys
+    if sys.maxunicode <= 0xFFFF:
+        return False
+
     # Check for presence of _manylinux module
     try:
         import _manylinux
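
Putting the pieces together, a self-contained sketch of the whole
compatibility check might look as follows; the ``have_compatible_glibc``
helper and the glibc 2.5 floor (the version shipped with CentOS 5) are
illustrative assumptions rather than text quoted from this commit::

    import ctypes
    import sys
    from distutils.util import get_platform

    def have_compatible_glibc(major, minimum_minor):
        # Ask the C library already loaded into this process for its version
        # string via gnu_get_libc_version(); non-glibc systems (e.g. musl)
        # do not export that symbol.
        libc = ctypes.CDLL(None)
        try:
            gnu_get_libc_version = libc.gnu_get_libc_version
        except AttributeError:
            return False
        gnu_get_libc_version.restype = ctypes.c_char_p
        version = gnu_get_libc_version().decode("ascii")
        found_major, found_minor = [int(part) for part in version.split(".")[:2]]
        return found_major == major and found_minor >= minimum_minor

    def is_manylinux1_compatible():
        # Only the two architectures CentOS 5 supports (platform strings as
        # written in the hunk above).
        if get_platform() not in ["linux_x86_64", "linux_i386"]:
            return False
        # "wide" unicode mode is mandatory (always true on CPython 3.3+).
        if sys.maxunicode <= 0xFFFF:
            return False
        # Distributor override via the _manylinux rendezvous module.
        try:
            import _manylinux
            return bool(_manylinux.manylinux1_compatible)
        except (ImportError, AttributeError):
            pass
        # Heuristic fallback: CentOS 5 shipped glibc 2.5, so require >= 2.5.
        return have_compatible_glibc(2, 5)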