Withdrawn by author; marked rejected.
parent 9b7cec2cd0
commit 26511b4f25
pep-0270.txt | 19
@@ -3,13 +3,30 @@ Title: uniq method for list objects
 Version: $Revision$
 Last-Modified: $Date$
 Author: jp@demonseed.net (Jason Petrone)
-Status: Draft
+Status: Rejected
 Type: Standards Track
 Created: 21-Aug-2001
 Python-Version: 2.2
 Post-History:
 
 
+Notice
+
+    This PEP is withdrawn by the author.  He writes:
+
+        Removing duplicate elements from a list is a common task, but
+        there are only two reasons I can see for making it a built-in.
+        The first is if it could be done much faster, which isn't the
+        case.  The second is if it makes it significantly easier to
+        write code.  The introduction of sets.py eliminates this
+        situation since creating a sequence without duplicates is just
+        a matter of choosing a different data structure: a set instead
+        of a list.
+
+        As described in PEP 218, sets are being added to the standard
+        library for Python 2.3.
+
+
 Abstract
 
     This PEP proposes adding a method for removing duplicate elements to
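The notice's point — that deduplication is a matter of choosing a set rather than a list method — can be sketched briefly. This uses the modern built-in set and dict.fromkeys rather than the sets.py module the notice refers to; the variable names are illustrative only:

```python
items = ["a", "b", "a", "c", "b"]

# Choosing a set as the data structure removes duplicates
# automatically (element order is not preserved).
unique = set(items)
assert unique == {"a", "b", "c"}

# If first-seen order matters, dict.fromkeys gives an ordered
# deduplication without any new list method.
ordered_unique = list(dict.fromkeys(items))
assert ordered_unique == ["a", "b", "c"]
```

Either form makes a dedicated list.uniq method unnecessary, which is the argument the author gives for withdrawal.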