Integer.parseInt
Benedict Elliott Smith
lists at laerad.com
Wed Apr 11 14:50:12 UTC 2012
Sounds like a perfectly good reason! It also sounds like it should be a
relatively safe change to implement. Any volunteers? I'd be happy to, but I
expect the overhead of having a non-contributor do it would exceed the
actual work by several orders of magnitude.
On 11 April 2012 15:45, Rémi Forax <forax at univ-mlv.fr> wrote:
> On 04/11/2012 04:18 PM, Benedict Elliott Smith wrote:
>
>> Hi,
>>
>> Looking at the source code, it doesn't appear as though there is any
>> reason
>> to require the input be a String - only length() and charAt() are called,
>> which are both declared by CharSequence. Is there a reason I can't fathom,
>> or was it an oversight? It would be nice to widen the type for this (and
>> potentially other equivalent methods).
>>
>
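
For a sense of what the widened signature would buy callers, here is a
hedged sketch; the CharSequence overload in the comment is hypothetical
and does not exist in the JDK as of this thread:

    StringBuilder sb = new StringBuilder().append(8080);
    // Desired call shape (hypothetical overload, would NOT compile
    // against the current JDK, where parseInt only accepts a String):
    //     int port = Integer.parseInt(sb);
    // What callers must write today, paying for an extra String copy:
    int port = Integer.parseInt(sb.toString());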
> Integer.parseInt (1.0 or 1.1) pre-dates CharSequence (1.4);
> that's why it uses a String and not a CharSequence.
>
> If you don't want to break already-compiled programs,
> you can't simply replace String with CharSequence, because the exact
> signature of the method (including the parameter types) is encoded
> in the bytecode.
> Joe Darcy wrote a cool blog post on that [1].
>
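
To make the binary-compatibility point concrete: every compiled call
site embeds the full method descriptor, parameter types included, so a
class compiled against parseInt(String) would fail to link (with a
NoSuchMethodError) if the parameter type later became CharSequence. A
minimal way to observe this, using a caller class like the one below
(javap output abbreviated, constant-pool index will vary):

    // Compile, then disassemble with `javap -c Caller`; the call site
    // shows the descriptor baked into the bytecode:
    //     invokestatic #2 // Method java/lang/Integer.parseInt:(Ljava/lang/String;)I
    // If Integer.parseInt's parameter became CharSequence, that
    // descriptor would no longer resolve and this class would fail
    // at link time with NoSuchMethodError.
    class Caller {
        int n = Integer.parseInt("123");
    }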
> The best option here is to add new methods that take a CharSequence,
> move the code that currently uses a String into them, and change the
> methods that take a String to delegate to the CharSequence versions.
>
> cheers,
> Rémi
> [1] https://blogs.oracle.com/darcy/entry/kinds_of_compatibility
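
The delegation Rémi describes might look roughly like the sketch below.
This is a standalone illustration, not a patch against java.lang.Integer,
and the parsing loop is deliberately simplified (radix 10, non-negative
values, no overflow checking):

    public final class ParseIntSketch {

        // New method: does the real work against the wider type.
        public static int parseInt(CharSequence cs) {
            if (cs == null || cs.length() == 0)
                throw new NumberFormatException("empty input");
            int result = 0;
            for (int i = 0; i < cs.length(); i++) {
                char c = cs.charAt(i);
                if (c < '0' || c > '9')
                    throw new NumberFormatException("bad digit at index " + i);
                result = result * 10 + (c - '0');
            }
            return result;
        }

        // Old method keeps its binary-compatible (String) signature and
        // delegates; the cast avoids recursing into this same overload.
        public static int parseInt(String s) {
            return parseInt((CharSequence) s);
        }
    }

Already-compiled callers keep resolving parseInt:(Ljava/lang/String;)I,
while new code can pass any CharSequence, which only needs to supply
length() and charAt() here.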