Expecting Integer.valueOf(String) to accept Literal format ...
Bernd Eckenfels
ecki at zusammenkunft.net
Wed Apr 13 20:30:53 UTC 2016
On Wed, 13 Apr 2016 13:16:45 -0500,
Paul Benedict <pbenedict at apache.org> wrote:
> I think the argument for changing Integer.valueOf(String) hinges on
> the belief that the method is meant to parse JLS integer syntax. If
> it is, the Javadocs don't speak to that responsibility. If it's an
> omission from the docs, I would never have guessed.
I agree it is not. It points to parseInt(), which itself describes
its grammar exactly:
# Parses the string argument as a signed decimal integer.
# The characters in the string must all be decimal digits,
# except that the first character may be an ASCII minus sign
# '-' ('\u002D') to indicate a negative value or an ASCII plus
# sign '+' ('\u002B') to indicate a positive value.
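A quick self-contained demo of that grammar (parseInt() is of course
real JDK API; the examples and their outcomes merely restate the spec
quoted above):

    public class ParseIntDemo {
        public static void main(String[] args) {
            // Accepted: decimal digits with an optional leading sign.
            System.out.println(Integer.parseInt("42"));   // 42
            System.out.println(Integer.parseInt("-42"));  // -42
            System.out.println(Integer.parseInt("+42"));  // 42 ('+' since Java 7)

            // Rejected: anything outside the documented grammar.
            for (String s : new String[] {"0x2A", "1_000", " 42"}) {
                try {
                    Integer.parseInt(s);
                } catch (NumberFormatException e) {
                    System.out.println(s + " -> NumberFormatException");
                }
            }
        }
    }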
That is quite concrete. The decode() method, which mentions the JLS
(and supports the integer literal prefixes), is a more likely
candidate for that. But that method explicitly disclaims its heritage:
# DecimalNumeral, HexDigits, and OctalDigits are as defined in section
# 3.10.1 of The Java™ Language Specification, except that underscores
# are not accepted between digits.
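To see that disclaimer in action (again real JDK API; the examples
just exercise the quoted spec):

    public class DecodeDemo {
        public static void main(String[] args) {
            // decode() understands the JLS literal prefixes ...
            System.out.println(Integer.decode("0x1F")); // 31
            System.out.println(Integer.decode("#1F"));  // 31
            System.out.println(Integer.decode("010"));  // 8 (octal)

            // ... but, per its spec, not the Java 7 underscore syntax.
            try {
                Integer.decode("1_000");
            } catch (NumberFormatException e) {
                System.out.println("1_000 -> NumberFormatException");
            }
        }
    }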
So while I sympathise with having a full-featured JLS integer parser,
the spec would require a change to accept it. Maybe a new method...
(it could even take a JLS version argument :)
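Purely as a sketch of what such a method could look like (the class,
the method name, and the version parameter are all made up; nothing
like this exists in the JDK):

    public final class JlsIntegers {

        // Hypothetical: parse an (optionally signed, like decode())
        // JLS-style integer literal. Underscores and binary literals
        // are accepted only for jlsVersion >= 7, mirroring the Java 7
        // literal syntax changes.
        public static int parseIntegerLiteral(String s, int jlsVersion) {
            String t = s;
            boolean negative = false;
            if (!t.isEmpty() && (t.charAt(0) == '+' || t.charAt(0) == '-')) {
                negative = t.charAt(0) == '-';
                t = t.substring(1);
            }
            int radix = 10;
            if (t.startsWith("0x") || t.startsWith("0X")) {
                radix = 16;
                t = t.substring(2);
            } else if (t.startsWith("0b") || t.startsWith("0B")) {
                if (jlsVersion < 7)
                    throw new NumberFormatException("binary literals need JLS 7+");
                radix = 2;
                t = t.substring(2);
            } else if (t.length() > 1 && t.charAt(0) == '0') {
                radix = 8;
                t = t.substring(1);
            }
            if (t.indexOf('_') >= 0) {
                if (jlsVersion < 7)
                    throw new NumberFormatException("underscores need JLS 7+");
                // The JLS allows underscores only between digits.
                if (t.startsWith("_") || t.endsWith("_"))
                    throw new NumberFormatException("misplaced underscore: " + s);
                t = t.replace("_", "");
            }
            // Re-attach the sign before parsing so Integer.MIN_VALUE works.
            return Integer.parseInt(negative ? "-" + t : t, radix);
        }

        public static void main(String[] args) {
            System.out.println(parseIntegerLiteral("1_000_000", 7)); // 1000000
            System.out.println(parseIntegerLiteral("0xFF", 5));      // 255
            System.out.println(parseIntegerLiteral("0b1010", 8));    // 10
        }
    }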
>
> Regarding how it parses today, I wouldn't be surprised if there are
> validation libraries out there using Integer.valueOf(String) for
> user-input validation. Good or bad decision, I expect them to exist.
> They probably have relied on it not being JLS compliant. So changing
> the behavior would break the assumption and their code.
>
> Cheers,
> Paul
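To make that concern concrete, a validator in the style Paul
describes might look like this (the class is hypothetical, but the
reliance pattern is real):

    public class StrictIntValidator {

        // Relies on Integer.valueOf(String) rejecting everything
        // outside the plain signed-decimal grammar.
        static boolean isValidUserInt(String input) {
            try {
                Integer.valueOf(input);
                return true;
            } catch (NumberFormatException e) {
                return false;
            }
        }

        public static void main(String[] args) {
            System.out.println(isValidUserInt("42"));    // true
            System.out.println(isValidUserInt("0x2A"));  // false today;
                // would flip to true if valueOf() became JLS-aware
            System.out.println(isValidUserInt("1_000")); // false today
        }
    }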