Update to reflect the equivalence:

Decimal(12345)==Decimal('12345') where the precision is three
This commit is contained in:
Raymond Hettinger 2004-06-30 04:46:47 +00:00
parent 04361245af
commit 304ff42785
1 changed file with 7 additions and 4 deletions

@@ -728,10 +728,13 @@ So, here I define the behaviour again for each data type.
 From int or long
 ''''''''''''''''
-Aahz suggested the need of an explicit conversion from int, but also
-thinks it's OK if the precision in the current Context is not
-exceeded; in that case you raise ValueError. Votes in
-comp.lang.python agreed with this.
+An int or long is treated like a Decimal explicitly constructed from
+Decimal(str(x)) in the current context (meaning that the to-string rules
+for rounding are applied and the appropriate flags are set). This
+guarantees that expressions like ``Decimal('1234567') + 13579`` match
+the mental model of ``Decimal('1234567') + Decimal('13579')``. That
+model works because all integers are representable as strings without
+representation error.
 From string
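
The behaviour described in the added text can be sketched with Python's
``decimal`` module. One caveat: in the module as eventually shipped,
``Decimal(12345)`` converts exactly and does not consult the context; the
context-applying conversion the commit describes is closest to
``Context.create_decimal``, which is what this sketch uses.

```python
from decimal import Decimal, localcontext, Inexact, Rounded

# Sketch of the proposed equivalence: converting an int applies the same
# rounding rules (and sets the same flags) as converting str(x), here with
# a precision of three significant digits.
with localcontext() as ctx:
    ctx.prec = 3
    d = ctx.create_decimal(12345)       # int form, rounded to 3 digits
    s = ctx.create_decimal('12345')     # str(x) form, same result
    print(d, s)                         # 1.23E+4 1.23E+4
    print(ctx.flags[Inexact], ctx.flags[Rounded])   # True True

# And the mental model from the diff: mixing an int into Decimal
# arithmetic matches converting it through its string form first.
assert Decimal('1234567') + 13579 == Decimal('1234567') + Decimal('13579')
```

Because every integer has an exact string representation, the two operands in
the last line denote the same value, so the equivalence holds regardless of
precision limits on the intermediate conversion.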