Fix typos (#1113)
This commit is contained in:
parent 5f39f06484
commit cfb7bd74db
@@ -660,7 +660,7 @@ at the moment, sorry.
 To do these steps, you must have the permission to edit the website. If you
 don't have that, ask someone on pydotorg@python.org for the proper
-permissions. (Or ask Ewa, who coordinated the effort for the new newbsite
+permissions. (Or ask Ewa, who coordinated the effort for the new website
 with RevSys.)

 - Log in to https://www.python.org/admin .
@@ -281,7 +281,7 @@ References
 .. [2] http://www.haskell.org/onlinereport/standard-prelude.html#$vzip


-Greg Wilson's questionaire on proposed syntax to some CS grad students
+Greg Wilson's questionnaire on proposed syntax to some CS grad students
 http://www.python.org/pipermail/python-dev/2000-July/013139.html
@@ -125,7 +125,7 @@ underlying object, and proxy objects which masquerade as the
 original objects as much as possible.

 Reference objects are easy to work with when some additional layer
-of object managemenet is being added in Python; references can be
+of object management is being added in Python; references can be
 checked for liveness explicitly, without having to invoke
 operations on the referents and catching some special exception
 raised when an invalid weak reference is used.
@@ -90,7 +90,7 @@ Proposed Resolutions
 1. Full backwards compatibility can be achieved as follows. When
 an object defines ``tp_compare()`` but not ``tp_richcompare()``, and a
 rich comparison is requested, the outcome of ``tp_compare()`` is
-used in the ovious way. E.g. if "<" is requested, an exception if
+used in the obvious way. E.g. if "<" is requested, an exception if
 ``tp_compare()`` raises an exception, the outcome is 1 if
 ``tp_compare()`` is negative, and 0 if it is zero or positive. Etc.
@@ -160,7 +160,7 @@ The trailing slash would cause the Python compiler to concatenate
 the attribute value and the docstring.

 A modern syntax highlighting editor would easily make this
-accident visible, though, and by simply inserting emtpy lines
+accident visible, though, and by simply inserting empty lines
 between the attribute definition and the docstring you can avoid
 the possible concatenation completely, so the problem is
 negligible.
@@ -73,7 +73,7 @@ name ``__future__``::
 feature: identifier
 name: identifier

-In addition, all future_statments must appear near the top of the module. The
+In addition, all future_statements must appear near the top of the module. The
 only lines that can appear before a future_statement are:

 + The module docstring (if any).
@@ -167,7 +167,7 @@ Standard Module __future__.py
 3. To document when incompatible changes were introduced, and when they will
 be-- or were --made mandatory. This is a form of executable documentation,
-and can be inspected programatically via importing ``__future__`` and
+and can be inspected programmatically via importing ``__future__`` and
 examining its contents.

 Each statement in ``__future__.py`` is of the form::
@@ -187,7 +187,7 @@ CountFishInterface and ColorFishInterface::
 def buySomeFish(quantity=1):
 "Buy some fish at the market"

-The FishMarketInteface extends upon the CountFishInterface and
+The FishMarketInterface extends upon the CountFishInterface and
 ColorfishInterface.
@@ -293,7 +293,7 @@ or a tuple of interface assertions. For example::
 FooInterface

-FooInterface, (BarInteface, BobInterface)
+FooInterface, (BarInterface, BobInterface)

 FooInterface, (BarInterface, (BobInterface, MyClass.__implements__))
@@ -54,7 +54,7 @@ In an attempt to gauge the effect of this proposal, I modified the Pystone
 benchmark program included in the Python distribution to cache global
 functions. Its main function, ``Proc0``, makes calls to ten different
 functions inside its ``for`` loop. In addition, ``Func2`` calls ``Func1``
-repeatedly inside a loop. If local copies of these 11 global idenfiers are
+repeatedly inside a loop. If local copies of these 11 global identifiers are
 made before the functions' loops are entered, performance on this particular
 benchmark improves by about two percent (from 5561 pystones to 5685 on my
 laptop). It gives some indication that performance would be improved by
@@ -120,7 +120,7 @@ well-known immutable types (strings, unicode, numbers) and
 use the hash table for finding the right opcode snippet. If
 this condition is not met, the interpreter should revert to
 the standard if-elif-else processing by simply skipping the
-SWITCH opcode and procedding with the usual if-elif-else byte
+SWITCH opcode and proceeding with the usual if-elif-else byte
 code stream.
@@ -129,7 +129,7 @@ Issues:
 The new optimization should not change the current Python
 semantics (by reducing the number of ``__cmp__`` calls and adding
 ``__hash__`` calls in if-elif-else constructs which are affected
-by the optimiztation). To assure this, switching can only
+by the optimization). To assure this, switching can only
 safely be implemented either if a "from __future__" style
 flag is used, or the switching variable is one of the builtin
 immutable types: int, float, string, unicode, etc. (not
@@ -312,7 +312,7 @@ Additional Ideas
 # we're checking the builtin dict for that *now*, this
 # still works if the builtin first came into existence
 # after we were constructed. Note too that del on
-# namespace dicts is rare, so the expensse of this check
+# namespace dicts is rare, so the expense of this check
 # shouldn't matter.
 if key in self.basedict:
 c.objptr = self.basedict[key]
@@ -91,7 +91,7 @@ During import, this extension works as follows:
 - If no byte-compiled file is found, an attempt to read a
 byte-compiled file from the augmented directory is made.

-- If bytecode generation is required, the generated bytecode is wrtten
+- If bytecode generation is required, the generated bytecode is written
 to the augmented directory if possible.

 Note that this PEP is explicitly *not* about providing
@@ -407,9 +407,9 @@ is equivalent to the proposed::
 synchronize the_lock:
 change_shared_data()

-PEP 310 must synchronize on an exsiting lock, while this PEP
+PEP 310 must synchronize on an existing lock, while this PEP
 proposes that unqualified 'synchronize' statements synchronize on
-a global, internal, transparent lock in addition to qualifiled
+a global, internal, transparent lock in addition to qualified
 'synchronize' statements. The 'with' statement also requires lock
 initialization, while the 'synchronize' statement can synchronize
 on any target object **including** locks.
@@ -418,7 +418,7 @@ incremented by 1 (round toward positive infinity)::
 ``round-floor``: If all of the discarded digits are zero or if the
 sign is positive the result is unchanged; otherwise, the absolute
 value of the result is incremented by 1 (round toward negative
-infinty)::
+infinity)::

 1.123 --> 1.12
 1.128 --> 1.12
@@ -872,7 +872,7 @@ portion of the response that precedes it.
 In these cases, applications will usually return an iterator (often
 a generator-iterator) that produces the output in a block-by-block
-fashion. These blocks may be broken to coincide with mulitpart
+fashion. These blocks may be broken to coincide with multipart
 boundaries (for "server push"), or just before time-consuming
 tasks (such as reading another block of an on-disk file).
@@ -22,7 +22,7 @@ BDFL Pronouncement
 This PEP is rejected. It is considered a feature that ``None`` raises
 an error when called. The proposal falls short in tests for
-obviousness, clarity, explictness, and necessity. The provided Switch
+obviousness, clarity, explicitness, and necessity. The provided Switch
 example is nice but easily handled by a simple lambda definition.
 See python-dev discussion on 17 June 2005 [2]_.
@@ -239,7 +239,7 @@ License (optional)
 Text indicating the license covering the distribution where the license
 is not a selection from the "License" Trove classifiers. See
 "Classifier" below. This field may also be used to specify a
-particular version of a licencse which is named via the ``Classifier``
+particular version of a license which is named via the ``Classifier``
 field, or to indicate a variation or exception to such a license.

 Examples::
@@ -214,7 +214,7 @@ Py_ssize_t is just a typedef for int.
 On a 64-bit system, the compiler will warn in many
 places. If these warnings are ignored, the code will
 continue to work as long as the container sizes don't
-exceeed 2**31, i.e. it will work nearly as good as
+exceed 2**31, i.e. it will work nearly as good as
 it does currently. There are two exceptions to this
 statement: if the extension module implements the
 sequence protocol, it must be updated, or the calling
@@ -472,7 +472,7 @@ equivalent nesting with a much more explicit syntax::
 h1.tail = 'after second h1'

 And if the repetition of the element names here is too much of a DRY
-violoation, it is also possible to eliminate all as-clauses except for
+violation, it is also possible to eliminate all as-clauses except for
 the first by adding a few methods to Element. [10]_

 So are there real use-cases for executing the block in a dict of a
@@ -133,7 +133,7 @@ The import hook system guarantees certain invariants. XXX
 Sample Python implementation
 ----------------------------

-A Python implemenation may look like::
+A Python implementation may look like::

 def notify(name):
 try:
@@ -163,7 +163,7 @@ directories are created on demand.
 ``distutils.command.build_ext`` (setup.py build_ext) gets a new argument
 ``--user`` which adds the include/ and lib/ directories in the user base
-dirctory to the search paths for header files and libraries. It also
+directory to the search paths for header files and libraries. It also
 adds the lib/ directory to rpath.

 The ``site`` module gets two arguments ``--user-base`` and ``--user-site``
@@ -229,7 +229,7 @@ Commentary
 ten-thousands. Eric Smith pointed-out that these are already
 handled by the "n" specifier in the locale module (albeit only
 for integers). This PEP does not attempt to support all of those
-possibilities. It focues on a single, relatively common grouping
+possibilities. It focuses on a single, relatively common grouping
 convention that offers a quick way to improve readability in many
 (though not all) contexts.
@@ -87,7 +87,7 @@ PyPI provides statistics on downloads at `/stats`. This page is
 calculated daily by PyPI, by reading all mirrors' local stats and
 summing them.

-The stats are presented in daily or montly files, under `/stats/days`
+The stats are presented in daily or monthly files, under `/stats/days`
 and `/stats/months`. Each file is a `bzip2` file with these formats:

 - YYYY-MM-DD.bz2 for daily files
@@ -290,7 +290,7 @@ Clients that are browsing PyPI should be able to use a fail-over
 mechanism when PyPI or the used mirror is not responding.

 It is up to the client to decide wich mirror should be used, maybe by
-looking at its geographical location and its responsivness.
+looking at its geographical location and its responsiveness.

 This PEP does not describe how this fail-over mechanism should work,
 but it is strongly encouraged that the clients try to use the nearest
@@ -175,7 +175,7 @@ extensions related to Python should start with ``.py``. Therefore, the
 marker file was renamed to be ``.pyp``.

 Dinu Gherman then observed that using a marker file is not necessary,
-and that a directoy extension could well serve as a such as a
+and that a directory extension could well serve as a such as a
 marker. This is what this PEP currently proposes.

 Phillip Eby designed PEP 402 as an alternative approach to this PEP,
@@ -183,7 +183,7 @@ after comparing Python's package syntax with that found in other
 languages. PEP 402 proposes not to use a marker file at all. At the
 discussion at PyCon DE 2011, people remarked that having an explicit
 declaration of a directory as contributing to a package is a desirable
-property, rather than an obstactle. In particular, Jython developers
+property, rather than an obstacle. In particular, Jython developers
 noticed that Jython could easily mistake a directory that is a Java
 package as being a Python package, if there is no need to declare
 Python packages.
@@ -256,7 +256,7 @@ References
 .. [3] Importlib documentation, Cannon
 (http://docs.python.org/dev/library/importlib)

-.. [4] Reference implentation
+.. [4] Reference implementation
 (https://bitbucket.org/jergosh/gsoc_import_engine/src/default/Lib/importlib/engine.py)
@@ -185,14 +185,14 @@ Blacklist approach: inherit from dict and override write methods to raise an
 exception. It is not truly read-only: it is still possible to call dict methods
 on such "frozen dictionary" to modify it.

-* brownie: `brownie.datastructures.ImmuatableDict
+* brownie: `brownie.datastructures.ImmutableDict
 <https://github.com/DasIch/brownie/blob/HEAD/brownie/datastructures/mappings.py>`_.
 It is hashable if keys and values are hashable. werkzeug project has the
 same code: `werkzeug.datastructures.ImmutableDict
 <https://github.com/mitsuhiko/werkzeug/blob/master/werkzeug/datastructures.py>`_.
 ImmutableDict is used for global constant (configuration options). The Flask
 project uses ImmutableDict of werkzeug for its default configuration.
-* SQLAchemy project: `sqlachemy.util.immutabledict
+* SQLAlchemy project: `sqlalchemy.util.immutabledict
 <http://hg.sqlalchemy.org/sqlalchemy/file/tip/lib/sqlalchemy/util/_collections.py>`_.
 It is not hashable and has an extra method: union(). immutabledict is used
 for the default value of parameter of some functions expecting a mapping.
@@ -81,7 +81,7 @@ Limitations:
 documentation of new functions. The behaviour depends on the
 operating system: see the `Monotonic Clocks`_ section below. Some
 recent operating systems provide two clocks, one including time
-elapsed during system suspsend, one not including this time. Most
+elapsed during system suspend, one not including this time. Most
 operating systems only provide one kind of clock.
 * time.monotonic() and time.perf_counter() may or may not be adjusted.
 For example, ``CLOCK_MONOTONIC`` is slewed on Linux, whereas
@@ -149,7 +149,7 @@ Ownership could be:
 * `zest.releaser`_ is owned and maintained by Zest Software.
 * `Django`_ is owned and maintained by the Django Software
-Fundation.
+Foundation.

 * a group or community.
 Example: `sphinx`_ is maintained by developers of the Sphinx
@@ -262,7 +262,7 @@ Why didn't you mention my favorite Python implementation?
 Why is the ABI tag (the second tag) sometimes "none" in the reference implementation?
 Since Python 2 does not have an easy way to get to the SOABI
 (the concept comes from newer versions of Python 3) the reference
-implentation at the time of writing guesses "none". Ideally it
+implementation at the time of writing guesses "none". Ideally it
 would detect "py27(d|m|u)" analogous to newer versions of Python,
 but in the meantime "none" is a good enough way to say "don't know".
@@ -1029,7 +1029,7 @@ If the standard ``alldev`` extra has no explicitly declared entries, then
 integration tools SHOULD implicitly define it as a dependency on the standard
 ``test``, ``build``, ``doc``, and ``dev`` extras.

-The full set of dependency requirements is then based on the uncondtional
+The full set of dependency requirements is then based on the unconditional
 dependencies, along with those of any requested extras.

 Dependency examples (showing just the ``requires`` subfield)::
@@ -406,7 +406,7 @@ initialization from main interpreter initialization.
 Uninitialized State
 -------------------

-The unitialized state is where an embedding application determines the settings
+The uninitialized state is where an embedding application determines the settings
 which are required in order to be able to correctly pass configurations settings
 to the embedded Python runtime.
@@ -421,7 +421,7 @@ started the initialization process::
 int Py_IsInitializing();

-The query for a completely unitialized environment would then be
+The query for a completely uninitialized environment would then be
 ``!(Py_Initialized() || Py_Initializing())``.
@@ -45,7 +45,7 @@ to integers is semantically meaningful. For most uses of enumerations, it's
 a **feature** to reject comparison to integers; enums that compare to integers
 lead, through transitivity, to comparisons between enums of unrelated types,
 which isn't desirable in most cases. For some uses, however, greater
-interoperatiliby with integers is desired. For instance, this is the case for
+interoperability with integers is desired. For instance, this is the case for
 replacing existing standard library constants (such as ``socket.AF_INET``)
 with enumerations.
@@ -920,7 +920,7 @@ following clauses would match or not as shown::
 == 1.1a1 # Equal, so 1.1a1 matches clause
 == 1.1.* # Same prefix, so 1.1a1 matches clause

-An exact match is also considered a prefix match (this interpreation is
+An exact match is also considered a prefix match (this interpretation is
 implied by the usual zero padding rules for the release segment of version
 identifiers). Given the version ``1.1``, the following clauses would
 match or not as shown::
@@ -1537,7 +1537,7 @@ the initial reference implementation was released in setuptools 8.0 and pip
 * The PEP text and the ``is_canonical`` regex were updated to be explicit
 that numeric components are specifically required to be represented as
-squences of ASCII digits, not arbitrary Unicode [Nd] code points. This
+sequences of ASCII digits, not arbitrary Unicode [Nd] code points. This
 was previously implied by the version parsing regex in Appendix B, but
 not stated explicitly [10]_.
@@ -888,7 +888,7 @@ portion of the response that precedes it.
 In these cases, applications will usually return a ``body`` iterator
 (often a generator-iterator) that produces the output in a
 block-by-block fashion. These blocks may be broken to coincide with
-mulitpart boundaries (for "server push"), or just before
+multipart boundaries (for "server push"), or just before
 time-consuming tasks (such as reading another block of an on-disk
 file).
@@ -256,13 +256,13 @@ uppercase versions of names::
 return 42

 def M(self):
-return "fourtytwo"
+return "fortytwo"

 obj = SillyObject()
 assert obj.m() == "fortytwo"

 As mentioned earlier in this PEP a more realistic use case of this
-functionallity is a ``__getdescriptor__`` method that dynamicly populates the
+functionality is a ``__getdescriptor__`` method that dynamically populates the
 class ``__dict__`` based on attribute access, primarily when it is not
 possible to reliably keep the class dict in sync with its source, for example
 because the source used to populate ``__dict__`` is dynamic as well and does
@@ -339,7 +339,7 @@ changes to the visible behaviour of the ``object.__getattribute__``.
 As with a custom ``__getattribute__`` method `dir()`_ might not see all
 (instance) attributes when using the ``__getdescriptor__()`` method to
-dynamicly resolve attributes.
+dynamically resolve attributes.

 The solution for that is quite simple: classes using ``__getdescriptor__``
 should also implement `__dir__()`_ if they want full support for the builtin
@@ -830,7 +830,7 @@ Other Changes
 * The various finders and loaders provided by importlib will be
 updated to comply with this proposal.
-* Any other implmentations of or dependencies on the import-related APIs
+* Any other implementations of or dependencies on the import-related APIs
 (particularly finders and loaders) in the stdlib will be likewise
 adjusted to this PEP. While they should continue to work, any such
 changes that get missed should be considered bugs for the Python 3.4.x
@@ -478,7 +478,7 @@ bytes_hash() (Objects/bytesobject.c)
 ``bytes_hash`` uses ``_Py_HashBytes`` to provide the tp_hash slot function
 for bytes objects. The function will continue to use ``_Py_HashBytes``
-but withoht a type cast.
+but without a type cast.

 memory_hash() (Objects/memoryobject.c)
 --------------------------------------
@@ -486,7 +486,7 @@ memory_hash() (Objects/memoryobject.c)
 ``memory_hash`` provides the tp_hash slot function for read-only memory
 views if the original object is hashable, too. It's the only function that
 has to support hashing of unaligned memory segments in the future. The
-function will continue to use ``_Py_HashBytes`` but withoht a type cast.
+function will continue to use ``_Py_HashBytes`` but without a type cast.


 unicode_hash() (Objects/unicodeobject.c)
@@ -1005,7 +1005,7 @@ in download links.
 __ http://www.python.org/dev/peps/pep-0470/

-Potentional approaches that PyPI administrators MAY consider to handle
+Potential approaches that PyPI administrators MAY consider to handle
 projects hosted externally:

 1. Download external distributions but do not verify them. The targets
@@ -243,7 +243,7 @@ References
 .. [3] Enhance exceptions by attaching some more information to them
 (https://mail.python.org/pipermail/python-ideas/2014-February/025601.html)

-.. [4] Specifity in AttributeError
+.. [4] Specificity in AttributeError
 (https://mail.python.org/pipermail/python-ideas/2013-April/020308.html)

 .. [5] Add an 'attr' attribute to AttributeError
@@ -71,7 +71,7 @@ than the standard one provided by consuming upstream CPython 2.7 releases
 directly, but other potential challenges have also been pointed out with
 updating embedded Python runtimes and other user level installations of Python.

-Rather than allowing a plethora of mutually incompatibile migration techniques
+Rather than allowing a plethora of mutually incompatible migration techniques
 to bloom, this PEP proposes an additional feature to be added to Python 2.7.12
 to make it easier to revert a process to the past behaviour of skipping
 certificate validation in HTTPS client modules. It also provides additional
@@ -64,7 +64,7 @@ Subtraction of datetime
 A ``tzinfo`` subclass supporting the PDDM, may define a method called
 ``__datetime_diff__`` that should take two ``datetime.datetime``
 instances and return a ``datetime.timedelta`` instance representing
-the time elapced from the time represented by the first datetime
+the time elapsed from the time represented by the first datetime
 instance to another.
@@ -181,9 +181,9 @@ identifiable information [#breaches]_, as well as with failures to take
 security considerations into account when new systems, like motor vehicles
 [#uconnect]_, are connected to the internet. It's also the case that a lot of
 the programming advice readily available on the internet [#search] simply
-doesn't take the mathemetical arcana of computer security into account.
+doesn't take the mathematical arcana of computer security into account.
 Compounding these issues is the fact that defenders have to cover *all* of
-their potential vulnerabilites, as a single mistake can make it possible to
+their potential vulnerabilities, as a single mistake can make it possible to
 subvert other defences [#bcrypt]_.

 One of the factors that contributes to making this last aspect particularly
@@ -275,7 +275,7 @@ Using first DeprecationWarning, and then eventually a RuntimeWarning, to
 advise against implicitly switching to the deterministic PRNG aims to
 nudge future users that need a cryptographically secure RNG away from
 calling ``random.seed()`` and those that genuinely need a deterministic
-generator towards explicitily calling ``random.ensure_repeatable()``.
+generator towards explicitly calling ``random.ensure_repeatable()``.

 Avoiding the introduction of a userspace CSPRNG
 -----------------------------------------------
@@ -87,7 +87,7 @@ The coalesce rule
 *****************

 The ``coalesce`` rule provides the ``??`` binary operator. Unlike most binary
-operators, the right-hand side is not evaulated until the left-hand side is
+operators, the right-hand side is not evaluated until the left-hand side is
 determined to be ``None``.

 The ``??`` operator binds more tightly than other binary operators as most
@@ -124,7 +124,7 @@ environments::
 Optional components of a distribution may be specified using the extras
 field::

-identifer_end = letterOrDigit | (('-' | '_' | '.' )* letterOrDigit)
+identifier_end = letterOrDigit | (('-' | '_' | '.' )* letterOrDigit)
 identifier = letterOrDigit identifier_end*
 name = identifier
 extras_list = identifier (wsp* ',' wsp* identifier)*
@@ -291,7 +291,7 @@ incremented) according to the C standard.
 After an integer overflow, a guard can succeed whereas the watched
 dictionary key was modified. The bug only occurs at a guard check if
-there are exaclty ``2 ** 64`` dictionary creations or modifications
+there are exactly ``2 ** 64`` dictionary creations or modifications
 since the previous guard check.

 If a dictionary is modified every nanosecond, ``2 ** 64`` modifications
@@ -54,7 +54,7 @@ and source distributions.
 Provisional Acceptance
 =======================

-In accordance with the PyPA's specication process, this PEP has been
+In accordance with the PyPA's specification process, this PEP has been
 `provisionally accepted <https://www.pypa.io/en/latest/specifications/#provisional-acceptance>`_
 for initial implementation in ``pip`` and other PyPA tools.
@@ -74,7 +74,7 @@ of issues, such as:
 projects cannot take advantage of newer setuptools features until
 their users naturally upgrade the version of setuptools to a newer
 one.
-* The items listed in ``setup_requires`` get implicily installed
+* The items listed in ``setup_requires`` get implicitly installed
 whenever you execute the ``setup.py`` but one of the common ways
 that the ``setup.py`` is executed is via another tool, such as
 ``pip``, who is already managing dependencies. This means that
@@ -375,7 +375,7 @@ other configuration data such as project name and version number may
 end up in the same file someday where arbitrary code execution is not
 desired.

-And finally, the most popular Python implemenation of YAML is
+And finally, the most popular Python implementation of YAML is
 PyYAML [#pyyaml]_ which is a large project of a few thousand lines of
 code and an optional C extension module. While in and of itself this
 isn't necessarily an issue, this becomes more of a problem for
@@ -489,7 +489,7 @@ meta.toml
 setup.toml
 While keeping with traditional thanks to ``setup.py``, it does not
 necessarily match what the file may contain in the future (.e.g is
-knowing the name of a project inerhently part of its setup?).
+knowing the name of a project inherently part of its setup?).

 pymeta.toml
 Not obvious to newcomers to programming and/or Python.
@@ -171,7 +171,7 @@ which is exactly correct for ``__definition_order__``. Since it
 represents the state of a particular one-time event (execution of
 the class definition body), allowing the value to be replaced would
 reduce confidence that the attribute corresponds to the original class
-body. Furthermore, often an immuntable-by-default approach helps to
+body. Furthermore, often an immutable-by-default approach helps to
 make data easier to reason about.

 However, in this case there still isn't a *strong* reason to counter
@@ -41,7 +41,7 @@ example, this code::
 yield x

 may or may not successfully catch warnings raised by ``g()``, and may
-or may not inadverdantly swallow warnings triggered elsewhere in the
+or may not inadvertently swallow warnings triggered elsewhere in the
 code. The context manager, which was intended to apply only to ``f``
 and its callees, ends up having a dynamic scope that encompasses
 arbitrary and unpredictable parts of its call\ **ers**. This problem
@@ -222,7 +222,7 @@ On Python 3.5.2, os.urandom() uses the
 non-blocking ``/dev/urandom`` if ``getrandom(size, GRND_NONBLOCK)``
 fails with ``EAGAIN``.

-Security experts promotes ``os.urandom()`` to genereate cryptographic
+Security experts promotes ``os.urandom()`` to generate cryptographic
 keys because it is implemented with a `Cryptographically secure
 pseudo-random number generator (CSPRNG)
 <https://en.wikipedia.org/wiki/Cryptographically_secure_pseudorandom_number_generator>`_.
@@ -435,7 +435,7 @@ not acceptable. The application must handle ``BlockingIOError``: poll
 except BlockingIOError:
 pass

-print("Wait for system urandom initialiation: move your "
+print("Wait for system urandom initialization: move your "
 "mouse, use your keyboard, use your disk, ...")
 while 1:
 # Avoid busy-loop: sleep 1 ms
@@ -497,7 +497,7 @@ have a well defined non-blocking API (``getrandom(size,
 GRND_NONBLOCK)``).

 As `Raise BlockingIOError in os.urandom()`_, it doesn't seem worth it to
-make the API more complex for a theorical (or at least very rare) use
+make the API more complex for a theoretical (or at least very rare) use
 case.

 As `Leave os.urandom() unchanged, add os.getrandom()`_, the problem is
@@ -551,7 +551,7 @@ Since ``os.urandom()`` is implemented in the kernel, it doesn't have
 issues of user-space RNG. For example, it is much harder to get its
 state. It is usually built on a CSPRNG, so even if its state is
 "stolen", it is hard to compute previously generated numbers. The kernel
-has a good knowledge of entropy sources and feed regulary the entropy
+has a good knowledge of entropy sources and feed regularly the entropy
 pool.

 That's also why ``os.urandom()`` is preferred over ``ssl.RAND_bytes()``.
@@ -753,7 +753,7 @@ The same logic applies to ``else``, only reversed::
 >>> is_not_none(data.get("key")) else y

 This expression returns ``data.get("key")`` if it is not ``None``, otherwise it
-evaluates and returns ``y``. To understand the mechancics, we rewrite the
+evaluates and returns ``y``. To understand the mechanics, we rewrite the
 expression as follows::

 >>> maybe_value = is_not_none(data.get("key"))
@@ -486,7 +486,7 @@ comprehensions):
 - ``__del__`` calls ``self.close()`` (same as now), and additionally
 issues a ``ResourceWarning`` if the generator wasn't exhausted. This
 warning is hidden by default, but can be enabled for those who want
-to make sure they aren't inadverdantly relying on CPython-specific
+to make sure they aren't inadvertently relying on CPython-specific
 GC semantics.

 Async generator objects (including those created by async generator
@@ -901,7 +901,7 @@ current process and any subprocesses that inherit the current environment.
 Avoiding setting LANG for UTF-8 locale coercion
 -----------------------------------------------

-Earlier versions of this PEP proposed setting the ``LANG`` category indepdent
+Earlier versions of this PEP proposed setting the ``LANG`` category independent
 default locale, in addition to setting ``LC_CTYPE``.

 This was later removed on the grounds that setting only ``LC_CTYPE`` is
@@ -94,7 +94,7 @@ The new TSS API does not provide functions which correspond to
 ``PyThread_delete_key_value`` and ``PyThread_ReInitTLS``, because these
 functions were needed only for CPython's now defunct built-in TLS
 implementation; that is the existing behavior of these functions is treated
-as follows: ``PyThread_delete_key_value(key)`` is equalivalent to
+as follows: ``PyThread_delete_key_value(key)`` is equivalent to
 ``PyThread_set_key_value(key, NULL)``, and ``PyThread_ReInitTLS()`` is a
 no-op [8]_.
@@ -157,7 +157,7 @@ are:
 presented by a remote peer.
 11. Finding a way to get hold of these interfaces at run time.

-For the sake of simplicitly, this PEP proposes to take a unified approach to
+For the sake of simplicity, this PEP proposes to take a unified approach to
 (2) and (3) (that is, buffers and sockets). The Python socket API is a
 sizeable one, and implementing a wrapped socket that has the same behaviour as
 a regular Python socket is a subtle and tricky thing to do. However, it is
@@ -492,7 +492,7 @@ Create Github Repository
 ------------------------

 Create a repository named "python-docs-{LANGUAGE_TAG}" (IETF language
-tag, without redundent region subtag, with a dash, and lowercased.) on
+tag, without redundant region subtag, with a dash, and lowercased.) on
 the Python GitHub organization (See `Repository For Po Files`_.), and
 grant the language coordinator push rights to this repository.
@@ -457,7 +457,7 @@ to the language (or stdlib) has a cost in increasing the size of
 the language. So an addition must pay for itself. In this case,
 subinterpreters provide a novel concurrency model focused on isolated
 threads of execution. Furthermore, they provide an opportunity for
-changes in CPython that will allow simulateous use of multiple CPU
+changes in CPython that will allow simultaneous use of multiple CPU
 cores (currently prevented by the GIL).

 Alternatives to subinterpreters include threading, async, and
@@ -1384,7 +1384,7 @@ executes.
 The idea is rejected because the benefit is small and the cost is high.
 The difference from the capability in the C-API would be potentially
-confusing. The implcit creation of threads is magical. The early
+confusing. The implicit creation of threads is magical. The early
 creation of threads is potentially wasteful. The inability to run
 arbitrary interpreters in an existing thread would prevent some valid
 use cases, frustrating users. Tying interpreters to threads would
@@ -778,7 +778,7 @@ For example::
 Because the ``__post_init__`` function is the last thing called in the
 generated ``__init__``, having a classmethod constructor (which can
-also execute code immmediately after constructing the object) is
+also execute code immediately after constructing the object) is
 functionally equivalent to being able to pass parameters to a
 ``__post_init__`` function.
@@ -455,7 +455,7 @@ Instead of requiring the following import::
 the PEP could call the feature more explicitly, for example
 ``string_annotations``, ``stringify_annotations``,
-``annotation_strings``, ``annotations_as_strings``, ``lazy_anotations``,
+``annotation_strings``, ``annotations_as_strings``, ``lazy_annotations``,
 ``static_annotations``, etc.

 The problem with those names is that they are very verbose. Each of
@@ -666,7 +666,7 @@ This document could not be completed without valuable input,
 encouragement and advice from Guido van Rossum, Jukka Lehtosalo, and
 Ivan Levkivskyi.

-The implementation was throroughly reviewed by Serhiy Storchaka who
+The implementation was thoroughly reviewed by Serhiy Storchaka who
 found all sorts of issues, including bugs, bad readability, and
 performance problems.
@@ -222,7 +222,7 @@ material for ROP gadgets, mapped into the process. [14]_
 ``vsyscall=emulated`` has been the default configuration in most
 distribution's kernels for many years.

-Unfortunately, ``vsyscall`` emulation still exposes predicatable code
+Unfortunately, ``vsyscall`` emulation still exposes predictable code
 at a reliable memory location, and continues to be useful for
 return-oriented programming. [15]_ Because most distributions have now
 upgraded to ``glibc`` versions that do not depend on ``vsyscall``,
@@ -437,7 +437,7 @@ easier, a helper will be added::
 This function takes a heap type and on success, it returns pointer to state of the
 module that the heap type belongs to.

-On failure, two scenarios may occure. When a type without a module is passed in,
+On failure, two scenarios may occur. When a type without a module is passed in,
 ``SystemError`` is set and ``NULL`` returned. If the module is found, pointer
 to the state, which may be ``NULL``, is returned without setting any exception.
@@ -352,7 +352,7 @@ such as Numpy arrays, need to be backed by a mutable buffer for full
 operation. Pickle consumers that use the ``buffer_callback`` and ``buffers``
 arguments will have to be careful to recreate mutable buffers. When doing
 I/O, this implies using buffer-passing API variants such as ``readinto``
-(which are also often preferrable for performance).
+(which are also often preferable for performance).

 Data sharing
 ------------
@@ -823,7 +823,7 @@ In the Python standard library for example,
 changes are needed in the ``doctest`` module because of this.

 Also, tools which take various kinds of functions as input will need to deal
-with the new function hieararchy and the possibility of custom
+with the new function hierarchy and the possibility of custom
 function classes.

 Python functions
@@ -52,7 +52,7 @@ Currently the majority of calls are dispatched to ``function``\s and ``method_de
 Continued prohibition of callable classes as base classes
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

-Currently any attempt to use ``function``, ``method`` or ``method_descriptor`` as a base class for a new class will fail with a ``TypeError``. This behaviour is desirable as it prevents errors when a subclass overrides the ``__call__`` method. If callables could be sub-classed then any call to a ``function`` or a ``method_descriptor`` would need an additional check that the ``__call__`` method had not been overridden. By exposing an additional call mechanism, the potential for errors becomes greater. As a consequence, any third-partyy class implementing the addition call interface will not be usable as a base class.
+Currently any attempt to use ``function``, ``method`` or ``method_descriptor`` as a base class for a new class will fail with a ``TypeError``. This behaviour is desirable as it prevents errors when a subclass overrides the ``__call__`` method. If callables could be sub-classed then any call to a ``function`` or a ``method_descriptor`` would need an additional check that the ``__call__`` method had not been overridden. By exposing an additional call mechanism, the potential for errors becomes greater. As a consequence, any third-party class implementing the addition call interface will not be usable as a base class.


 New classes and changes to existing classes
@@ -28,7 +28,7 @@ technically allows inline augmented assignments to be written using the
 ...

 The restriction to simple names as inline assignment targets means that the
-target expession can always be repeated without side effects, and thus avoids
+target expression can always be repeated without side effects, and thus avoids
 the ambiguity that would arise from allowing actual embedded augmented
 assignments (it's still a bad idea, since it would almost certainly be hard
 for humans to read, this note is just about the theoretical limits of language
@@ -157,7 +157,7 @@ the current directory, instead if there is a ``__pypackages__`` directory in the
 same path of the script, that will be used.

 For example, if we execute ``python /usr/share/myproject/fancy.py`` from the
-``/tmp`` dirctory and if there is a ``__pypackages__`` directory inside of
+``/tmp`` directory and if there is a ``__pypackages__`` directory inside of
 ``/usr/share/myproject/`` directory, it will be used. Any potential
 ``__pypackages__`` directory in ``/tmp`` will be ignored.
@@ -331,7 +331,7 @@ The ``legacy_windows_fs_encoding`` field is only available on Windows.
 * ``_config_version`` (``int``):
 Configuration version, used for ABI compatibility.
 * ``_config_init`` (``int``):
-Function used to initalize ``PyConfig``, used for preinitialization.
+Function used to initialize ``PyConfig``, used for preinitialization.

 ``PyMem_SetAllocator()`` can be called after ``Py_PreInitialize()`` and
 before ``Py_InitializeFromConfig()`` to install a custom memory
@@ -585,7 +585,7 @@ Options`_.
 * ``_config_version`` (``int``):
 Configuration version, used for ABI compatibility.
 * ``_config_init`` (``int``):
-Function used to initalize ``PyConfig``, used for preinitialization.
+Function used to initialize ``PyConfig``, used for preinitialization.
 * ``_install_importlib`` (``int``):
 Install importlib?
 * ``_init_main`` (``int``):
@@ -111,7 +111,7 @@ The call takes the form ``((vectorcallfunc)(((char *)o)+offset))(o, args, n, kwn
 ``offset`` is ``Py_TYPE(o)->tp_vectorcall_offset``.
 The caller is responsible for creating the ``kwnames`` tuple and ensuring that there are no duplicates in it.

-``n`` is the number of postional arguments plus possibly the ``PY_VECTORCALL_ARGUMENTS_OFFSET`` flag.
+``n`` is the number of positional arguments plus possibly the ``PY_VECTORCALL_ARGUMENTS_OFFSET`` flag.

 PY_VECTORCALL_ARGUMENTS_OFFSET
 ------------------------------
@@ -262,7 +262,7 @@ A related feature to final classes would be Scala-style sealed
 classes, where a class is allowed to be inherited only by classes
 defined in the same module. Sealed classes seem most useful in
 combination with pattern matching, so it does not seem to justify the
-complexity in our case. This could be revisisted in the future.
+complexity in our case. This could be revisited in the future.

 It would be possible to have the ``@final`` decorator on classes
 dynamically prevent subclassing at runtime. Nothing else in ``typing``
@@ -94,7 +94,7 @@ the original order to install the now yanked file, then it acts as if it
 had not been yaned.

 An installer **MUST** ignore yanked releases, if the selection constraints
-can be satisified with a non-yanked version, and **MAY** refuse to use a
+can be satisfied with a non-yanked version, and **MAY** refuse to use a
 yanked release even if it means that the request cannot be satisfied at all.
 An implementation **SHOULD** choose a policy that follows the spirit of the
 intention above, and that prevents "new" dependencies on yanked
@@ -45,7 +45,7 @@ on GitHub Issues.
 workflow evolved.

 It is possible to gradually improve it and avoid the disruption
-that a switch to a different system would inevitabily bring to
+that a switch to a different system would inevitably bring to
 the workflow.

 * **Open-source and Python powered.** Roundup is an open-source
@@ -292,7 +292,7 @@ only when someone wants to keep working on them. This approach
 has several issues, but there are also other issues that will
 need to be addressed regardless of the approach used:

-* **Vendor lock-in.** GitHub is properietary and there is risk
+* **Vendor lock-in.** GitHub is proprietary and there is risk
 of vendor lock-in. Their business model might change and they
 could shut down altogether. For example, several projects
 decided to move away from GitHub after Microsoft acquisition.
@@ -210,7 +210,7 @@ And there are many difficulty there:
 * Omitting ``encoding`` option is very common.

 * If we raise ``DeprecationWarning`` always, it will be too noisy.
-* We can not assume how user use it. Complicated heuritics may be
+* We can not assume how user use it. Complicated heuristics may be
 needed to raise ``DeprecationWarning`` only when it is really
 needed.
@@ -497,7 +497,7 @@ to the way that PEP 596 proposes to starting publishing Python 3.9.0 alpha
 releases during the Python 3.8.0 release candidate period).

 However, rather than setting specific timelines for that at a policy level,
-it may make sense to leave that decision to invididual release managers, based
+it may make sense to leave that decision to individual release managers, based
 on the specific changes that are being proposed for the release they're
 managing.
@@ -60,7 +60,7 @@ Other Resources
 I've barely skimmed the surface of the many examples put forward to point out
 just how much *easier* and more *sensible* many aspects of mathematics become
 when conceived in terms of ``tau`` rather than ``pi``. If you don't find my
-specific examples sufficiently persausive, here are some more resources that
+specific examples sufficiently persuasive, here are some more resources that
 may be of interest:

 * Michael Hartl is the primary instigator of Tau Day in his `Tau Manifesto`_
@@ -189,7 +189,7 @@ to match on a single expression; use::
 case EXPR, EXPR, ...:

-to match on mulltiple expressions. The is interpreted so that if EXPR
+to match on multiple expressions. The is interpreted so that if EXPR
 is a parenthesized tuple or another expression whose value is a tuple,
 the switch expression must equal that tuple, not one of its elements.
 This means that we cannot use a variable to indicate multiple cases.
@@ -432,7 +432,7 @@ that overkill.)
 Personally, I'm in school II: I believe that the dict-based dispatch
 is the one true implementation for switch statements and that we
-should face the limitiations up front, so that we can reap maximal
+should face the limitations up front, so that we can reap maximal
 benefits. I'm leaning towards school IIb -- duplicate cases should be
 resolved by the ordering of the cases instead of flagged as errors.
@@ -206,7 +206,7 @@ reasonable because of this fact.
 - Wrapper for SGI libimage library for imglib image files
 (``.rgb`` files).
-- Python Imaging Library provdes read-only support [#pil]_.
+- Python Imaging Library provides read-only support [#pil]_.
 - Not uniquely edited in 13 years.

 + IN
@@ -362,7 +362,7 @@ The members of the bufferinfo structure are:
 set to NULL or an PyExc_BufferError raised if this is not possible.

 For clarity, here is a function that returns a pointer to the
-element in an N-D array pointed to by an N-dimesional index when
+element in an N-D array pointed to by an N-dimensional index when
 there are both non-NULL strides and suboffsets::

 void *get_item_pointer(int ndim, void *buf, Py_ssize_t *strides,
@@ -125,7 +125,7 @@ bytearray:
 <encoding>[, <errors>])``: encode a text string. Note that the
 ``str.encode()`` method returns an *immutable* bytes object. The
 <encoding> argument is mandatory; <errors> is optional.
-<encoding> and <errrors>, if given, must be ``str`` instances.
+<encoding> and <errors>, if given, must be ``str`` instances.

 - ``bytes(<memory view>)``, ``bytearray(<memory view>)``: construct
 a bytes or bytearray object from anything that implements the PEP
@@ -55,7 +55,7 @@ We can use ``print(aJapaneseString)`` to get a readable string, but we
 don't have a similar workaround for printing strings from collections
 such as lists or tuples. ``print(listOfJapaneseStrings)`` uses repr()
 to build the string to be printed, so the resulting strings are always
-hex-escaped. Or when ``open(japaneseFilemame)`` raises an exception,
+hex-escaped. Or when ``open(japaneseFilename)`` raises an exception,
 the error message is something like ``IOError: [Errno 2] No such file
 or directory: '\u65e5\u672c\u8a9e'``, which isn't helpful.
@@ -717,7 +717,7 @@ Rejected Alternatives
 the general structure of the syntax as defined in this PEP (For example,
 allowing a subexpression like ``?given`` or ``:given`` to be used in
 expressions to indicate a direct reference to the implied closure, thus
-preventig it from being called automatically to create the local namespace).
+preventing it from being called automatically to create the local namespace).
 All such attempts have appeared unattractive and confusing compared to
 the simpler decorator-inspired proposal in PEP 403.
@@ -437,7 +437,7 @@ Naming
 Various naming controversies can arise. One of them is whether all
 exception class names should end in "``Error``". In favour is consistency
-with the rest of the exception hiearchy, against is concision (especially
+with the rest of the exception hierarchy, against is concision (especially
 with long names such as ``ConnectionAbortedError``).

 Exception attributes
@@ -983,7 +983,7 @@ portion of the response that precedes it.
 In these cases, applications will usually return an iterator (often
 a generator-iterator) that produces the output in a block-by-block
-fashion. These blocks may be broken to coincide with mulitpart
+fashion. These blocks may be broken to coincide with multipart
 boundaries (for "server push"), or just before time-consuming
 tasks (such as reading another block of an on-disk file).
@@ -283,7 +283,7 @@ cryptographic systems for Condorcet ballots, the CIVS system was chosen to
 act as a trusted party.

 More information about the security and privacy afforded by CIVS, including
-how a malicous voter, election supervisor, or CIVS administrator can
+how a malicious voter, election supervisor, or CIVS administrator can
 influence the election can be be found
 `here <https://civs.cs.cornell.edu/sec_priv.html>`_.
@@ -64,7 +64,7 @@ Key people and their functions
 In the Rust project there are teams responsible for certain areas. For language features
 there is a "lang team", for tooling there's "dev tools" and "Cargo", and so on.
 Contentious issues have facilitators to drive discussion who often aren't the decision
-makers. Typically the faciliators are authors of the proposed changes (see
+makers. Typically the facilitators are authors of the proposed changes (see
 "Controversial decision process" below). They ensure all key decision makers are
 involved along with interested community members. They push towards an agreeable
 outcome via iteration.
@@ -622,7 +622,7 @@ Controversial decision process
 ------------------------------

 Hejlsberg is the central figure of the project in terms of language
-design, sythesizing community needs into a cohesive whole. There is
+design, synthesizing community needs into a cohesive whole. There is
 no formal process to externally contribute to the design of the
 language.
@@ -941,7 +941,7 @@ Jeremy Stanley, Chris Dent, Julia Kreger, Sean McGinnis, Emmet Hikory,
 and Thierry Carrez contributed to the OpenStack section.

 The Project Jupyter Steering Council created the Main Governance Document for
-Project Jupyter, and Carol Willing summarized the key points of that documennt
+Project Jupyter, and Carol Willing summarized the key points of that document
 for the Jupyter section.

 Thank you to Carl Meyer from the Django team for explanation how their
@@ -187,7 +187,7 @@ Rationale
 =========

 **Inclusive** The Community Model is the most inclusive model. No single person
-or a small group of people is in a distiguished position of power over
+or a small group of people is in a distinguished position of power over
 others. Contributors and any workgroups in this model are self-selecting.

 **Pragmatic** This model ensures no user group is put at a disadvantage due to
@@ -278,9 +278,9 @@ member can be retired by a unanimous vote by the rest of the council.
 There is an emergency brake procedure to get rid of a non-functioning council.
 A single Elder or a group of 10 core developers or PSF voting members can ask for
-an immedeate reinstating vote of the council as a whole (presumably with the
+an immediate reinstating vote of the council as a whole (presumably with the
 intention that the council lose their mandate). If this vote has been requested by an
-Elder that individual immedeately lose their council position, independent of
+Elder that individual immediately lose their council position, independent of
 the outcome of the vote. If the vote has been requested by community members and
 the council is reinstated this procedure cannot be invoked again for a year.
@@ -304,7 +304,7 @@ to be handled by the Council of Elders. This falls to the community as a whole
 There is also the role of figurehead or spokesperson to represent Python and
 the Python community to the outside world. Again, this is *not* a role that
-should be handled by the Council of Elders, in my opionion, but by some
+should be handled by the Council of Elders, in my opinion, but by some
 other person or body.

 Note that this proposal most likely favors conservatism over progression. Or, at least, the