Integer.parseInt
Joe Darcy
joe.darcy at oracle.com
Thu Apr 12 03:00:26 UTC 2012
On 4/11/2012 7:45 AM, Rémi Forax wrote:
> On 04/11/2012 04:18 PM, Benedict Elliott Smith wrote:
>> Hi,
>>
>> Looking at the source code, it doesn't appear as though there is
>> any reason to require that the input be a String - only length()
>> and charAt() are called, both of which are declared by CharSequence.
>> Is there a reason I can't fathom, or was it an oversight? It would
>> be nice to widen the type for this (and potentially other
>> equivalent methods).
>
> Integer.parseInt (1.0 or 1.1) pre-dates CharSequence (1.4);
> that's why it uses a String and not a CharSequence.
>
> If you don't want to break already-compiled programs, you can't
> just replace String with CharSequence, because the exact signature
> of the method (including the parameter types) is encoded in the
> bytecode. Joe Darcy wrote a cool blog post on that [1].
That is a kinder description of the blog post than I would expect :-)
FYI, a fuller exploration of that issue in a broader context is written
up in:
http://cr.openjdk.java.net/~darcy/OpenJdkDevGuide/OpenJdkDevelopersGuide.v0.777.html
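To make the binary-compatibility point concrete, here is a minimal
sketch (ParseLib and Caller are hypothetical classes invented for
illustration, not real JDK code):

    // ParseLib.java -- a stand-in library class
    public class ParseLib {
        public static int parse(String s) {
            return Integer.parseInt(s);
        }
    }

    // Caller.java -- compiled against the String version
    public class Caller {
        public static void main(String[] args) {
            // The compiler records the exact method descriptor in the
            // class file:
            //   invokestatic ParseLib.parse:(Ljava/lang/String;)I
            System.out.println(ParseLib.parse("42"));
        }
    }

If ParseLib.parse is later changed to take a CharSequence and
recompiled, the old Caller.class no longer links: the JVM looks up
parse:(Ljava/lang/String;)I, finds only
parse:(Ljava/lang/CharSequence;)I, and throws NoSuchMethodError.
Recompiling Caller.java would succeed, so the change is source
compatible but not binary compatible.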
>
> The best option here is to add new methods that take a CharSequence,
> move the code that uses a String into them, and change the methods
> that take a String to delegate to the ones that take a CharSequence.
>
> cheers,
> Rémi
> [1] https://blogs.oracle.com/darcy/entry/kinds_of_compatibility
>
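As a rough sketch of the overload-and-delegate shape Rémi describes
(the parsing logic below is deliberately simplified, with no radix,
sign, or overflow handling, and is not the real java.lang.Integer
code):

    public final class IntParser {
        // New method: the actual logic, written against CharSequence.
        public static int parseInt(CharSequence cs) {
            int len = cs.length();
            if (len == 0)
                throw new NumberFormatException("empty input");
            int result = 0;
            for (int i = 0; i < len; i++) {
                char c = cs.charAt(i);
                if (c < '0' || c > '9')
                    throw new NumberFormatException("bad digit: " + c);
                result = result * 10 + (c - '0');
            }
            return result;
        }

        // Existing method keeps its exact signature, so previously
        // compiled callers still link; it simply delegates. The cast
        // makes overload resolution pick the CharSequence version
        // instead of recursing on this method.
        public static int parseInt(String s) {
            return parseInt((CharSequence) s);
        }
    }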
Rémi and I have in the past had differences of opinion on the utility
of introducing CharSequence versions of such methods.
One benefit of using a String is that the object is immutable; there
are no time-of-check/time-of-use races to worry about. Robust code
should arguably work sensibly even with mutable CharSequences, and the
easiest way to ensure that is to call the toString method of a
CharSequence passed as a parameter.
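A small sketch of that defensive pattern, using a hypothetical wrapper
method:

    public final class SafeParse {
        // A mutable CharSequence (e.g. a StringBuilder) can change
        // between the moment it is validated and the moment it is
        // read: the time-of-check/time-of-use hazard mentioned above.
        public static int parseIntDefensively(CharSequence cs) {
            // Snapshot once; String is immutable, so every later read
            // sees exactly the characters that were checked. (If cs is
            // already a String, toString() returns it without copying.)
            String snapshot = cs.toString();
            return Integer.parseInt(snapshot);
        }

        public static void main(String[] args) {
            StringBuilder sb = new StringBuilder("123");
            System.out.println(parseIntDefensively(sb)); // prints 123
            sb.setCharAt(0, 'x'); // later mutation can't affect the snapshot
        }
    }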
-Joe