PEP 675: Mark malicious code example with red sidebar (#3574)

Co-authored-by: Aliaksei Urbanski <aliaksei.urbanski@gmail.com>
Hugo van Kemenade 2023-12-12 01:22:29 +02:00 committed by GitHub
parent d9e47a206b
commit 8ce4ba9e8a
1 changed file with 16 additions and 11 deletions


@@ -1,14 +1,11 @@
 PEP: 675
 Title: Arbitrary Literal String Type
-Version: $Revision$
-Last-Modified: $Date$
 Author: Pradeep Kumar Srinivasan <gohanpra@gmail.com>, Graham Bleaney <gbleaney@gmail.com>
 Sponsor: Jelle Zijlstra <jelle.zijlstra@gmail.com>
 Discussions-To: https://mail.python.org/archives/list/typing-sig@python.org/thread/VB74EHNM4RODDFM64NEEEBJQVAUAWIAW/
 Status: Accepted
 Type: Standards Track
 Topic: Typing
-Content-Type: text/x-rst
 Created: 30-Nov-2021
 Python-Version: 3.11
 Post-History: 07-Feb-2022
@@ -50,7 +47,8 @@ However, the user-controlled data ``user_id`` is being mixed with the
 SQL command string, which means a malicious user could run arbitrary
 SQL commands:

-::
+.. code-block::
+   :class: bad

    # Delete the table.
    query_user(conn, "user123; DROP TABLE data;")
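The hunk above marks the PEP's SQL-injection example with the red "bad" sidebar. For context, here is a runnable sketch of the safe pattern the PEP advocates: the SQL text stays a literal (so it satisfies a ``LiteralString`` annotation) while user data is passed as a bound parameter. The ``query_user`` name follows the PEP's example; the ``sqlite3`` schema and setup are our own illustrative assumptions.

```python
import sqlite3

try:  # LiteralString is new in Python 3.11 (added by this PEP)
    from typing import LiteralString
except ImportError:
    LiteralString = str  # runtime stand-in on older interpreters

def query_user(conn: sqlite3.Connection, user_id: str) -> list:
    # The SQL text is a fixed literal, so it satisfies a LiteralString
    # annotation; user-controlled data travels separately as a bound
    # parameter and can never change the shape of the statement.
    sql: LiteralString = "SELECT * FROM data WHERE user_id = ?"
    return conn.execute(sql, (user_id,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (user_id TEXT)")
conn.execute("INSERT INTO data VALUES ('user123')")

# The payload from the diff is now matched as an ordinary string value,
# not executed as SQL: no rows match, and the table survives.
assert query_user(conn, "user123; DROP TABLE data;") == []
assert query_user(conn, "user123") == [("user123",)]
```

With this shape, a PEP 675-aware type checker rejects any call that splices external data into ``sql`` itself, while the parameter binding keeps the runtime safe regardless.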
@@ -883,7 +881,8 @@ or ``do_mark_safe`` in `Jinja2
 <https://github.com/pallets/jinja/blob/077b7918a7642ff6742fe48a32e54d7875140894/src/jinja2/filters.py#L1264>`_,
 which cause XSS vulnerabilities:

-::
+.. code-block::
+   :class: bad

    dangerous_string = django.utils.safestring.mark_safe(f"<script>{user_input}</script>")
    return(dangerous_string)
@@ -891,7 +890,8 @@ which cause XSS vulnerabilities:
 This vulnerability could be prevented by updating ``mark_safe`` to
 only accept ``LiteralString``:

-::
+.. code-block::
+   :class: good

    def mark_safe(s: LiteralString) -> str: ...
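A minimal runnable sketch of the "good" signature above, with our own illustrative body: the ``LiteralString`` restriction is enforced by type checkers, not at runtime, so dynamic data still needs a runtime defense such as ``html.escape``.

```python
import html

try:  # LiteralString is new in Python 3.11
    from typing import LiteralString
except ImportError:
    LiteralString = str  # runtime stand-in on older interpreters

def mark_safe(s: LiteralString) -> str:
    # A PEP 675-aware type checker rejects call sites whose argument is
    # not built from literals; at runtime the annotation is inert.
    return s

user_input = "<script>alert(1)</script>"

safe_markup = mark_safe("<b>static banner</b>")  # literal: accepted
# mark_safe(f"<b>{user_input}</b>")  # would be flagged by the checker

# Dynamic data still needs escaping before it reaches the page:
escaped = html.escape(user_input)
assert escaped == "&lt;script&gt;alert(1)&lt;/script&gt;"
```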
@@ -913,7 +913,8 @@ insert expressions which execute arbitrary code and `compromise
 <https://www.onsecurity.io/blog/server-side-template-injection-with-jinja2/>`_
 the application:

-::
+.. code-block::
+   :class: bad

    malicious_str = "{{''.__class__.__base__.__subclasses__()[408]('rm - rf /',shell=True)}}"
    template = jinja2.Template(malicious_str)
@@ -923,7 +924,8 @@ the application:
 Template injection exploits like this could be prevented by updating
 the ``Template`` API to only accept ``LiteralString``:

-::
+.. code-block::
+   :class: good

    class Template:
        def __init__(self, source: LiteralString): ...
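To show the idea without a third-party dependency, the sketch below swaps Jinja2 for the stdlib ``string.Template``, which only substitutes values and never evaluates expressions; the ``render`` helper and its signature are our own illustrative shape, mirroring the proposed ``Template.__init__`` restriction.

```python
from string import Template

try:  # LiteralString is new in Python 3.11
    from typing import LiteralString
except ImportError:
    LiteralString = str  # runtime stand-in on older interpreters

def render(source: LiteralString, **values: str) -> str:
    # Mirrors the proposed restriction: the template text must be a
    # literal, and external data enters only through substitution.
    return Template(source).safe_substitute(values)

# Substituted values are inserted verbatim and never re-parsed as
# template syntax, so a payload cannot climb out of its slot.
payload = "${__class__}"
assert render("Hello, $name!", name=payload) == "Hello, ${__class__}!"
```

Under PEP 675, passing externally derived text as ``source`` (rather than as a ``values`` entry) is exactly what the type checker would flag.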
@@ -945,7 +947,8 @@ options which are vulnerable to Denial of Service attacks from
 externally controlled logging strings. The following example
 illustrates a simple denial of service scenario:

-::
+.. code-block::
+   :class: bad

    external_string = "%(foo)999999999s"
    ...
@@ -957,7 +960,8 @@ string passed to the logger be a ``LiteralString`` and that all
 externally controlled data be passed separately as arguments (as
 proposed in `Issue 46200 <https://bugs.python.org/issue46200>`_):

-::
+.. code-block::
+   :class: good

    def info(msg: LiteralString, *args: object) -> None:
        ...
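A small runnable sketch of the two call shapes contrasted above, using the stdlib ``logging`` module (the ``"demo"`` logger name is our own illustration; the payload is the one from the PEP's example, demonstrated at the ``%`` operator level with a harmless width):

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("demo")  # illustrative logger name

external_string = "%(foo)999999999s"  # the DoS payload from the example

# Bad: external_string becomes the format string, so the logger would
# try to build a roughly 1 GB padded message:
# logger.info(external_string, {"foo": "bar"})

# Good (the shape proposed in Issue 46200): the format string is a
# literal; external data rides along as an argument and is only ever
# interpolated as a value, never interpreted as a format spec.
logger.info("received external string: %s", external_string)

# The underlying %-formatting behavior, with a harmless width of 10:
assert "%(foo)10s" % {"foo": "bar"} == "       bar"
```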
@@ -983,7 +987,8 @@ warnings about non-literal strings.
 4. Trivial functions could be constructed to convert a ``str`` to a
    ``LiteralString``:

-   ::
+   .. code-block::
+      :class: bad

       def make_literal(s: str) -> LiteralString:
           letters: Dict[str, LiteralString] = {