Optimize (Linked-)HashMap lookup when backing array is null
Claes Redestad
claes.redestad at oracle.com
Wed May 20 19:11:31 UTC 2020
Hi,
seems acceptable to me, too.
I'd be happy to sponsor this, but I'm just starting a long weekend here, so
I might not have time before the start of next week.
/Claes
On 2020-05-19 22:21, Roger Riggs wrote:
> Hi,
>
> Right, it's only in the case of removing the node because the new value
> was null.
> Besides being infrequent, that should not be a problem.
>
> Thanks, Roger
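[To make the trade-off concrete: here is a small probe, not part of the patch, that counts hashCode() invocations for a plain get() versus a computeIfPresent() whose remapping function returns null (the removal case Roger mentions). The class and method names are illustrative only.]

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.atomic.AtomicInteger;

// Probe key that counts hashCode() invocations, so the cost of a
// non-trivial hash function can be observed per map operation.
final class CountingKey {
    static final AtomicInteger HASH_CALLS = new AtomicInteger();
    private final String value;
    CountingKey(String value) { this.value = value; }
    @Override public int hashCode() {
        HASH_CALLS.incrementAndGet();
        return value.hashCode();
    }
    @Override public boolean equals(Object o) {
        return o instanceof CountingKey && ((CountingKey) o).value.equals(value);
    }
}

public class HashCallProbe {
    // How many times get() computes the key's hash on a populated map.
    static int hashCallsForGet() {
        Map<CountingKey, Integer> map = new HashMap<>();
        CountingKey k = new CountingKey("k");
        map.put(k, 1);
        CountingKey.HASH_CALLS.set(0);
        map.get(k);
        return CountingKey.HASH_CALLS.get();
    }

    // How many times computeIfPresent computes the hash when the
    // remapping function returns null, i.e. the mapping is removed.
    static int hashCallsForComputeIfPresentRemoval() {
        Map<CountingKey, Integer> map = new HashMap<>();
        CountingKey k = new CountingKey("k");
        map.put(k, 1);
        CountingKey.HASH_CALLS.set(0);
        map.computeIfPresent(k, (key, v) -> null);
        return CountingKey.HASH_CALLS.get();
    }

    public static void main(String[] args) {
        System.out.println("get: " + hashCallsForGet());
        System.out.println("computeIfPresent removal: "
                + hashCallsForComputeIfPresentRemoval());
    }
}
```

[get() computes the hash once; whether the removal path inside computeIfPresent recomputes it depends on which variant of the patch is applied, which is exactly the caveat being weighed here.]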
>
>
> On 5/19/20 3:56 PM, Christoph Dreis wrote:
>> Hi Roger,
>>
>>> How does the performance degrade when the computation of the hash
>>> is non-trivial. For example, with an array as a key or an object with
>>> fields that are objects?
>>> The original code made a point of computing the hash only once.
>> HashMap.get() and HashMap.containsKey() etc. would still calculate it
>> only once.
>> Only computeIfPresent would degrade. Did you mean that?
>>
>> Cheers,
>> Christoph
>>
>> On 5/19/20 3:21 PM, Claes Redestad wrote:
>>>> Thanks for producing the simpler variant and getting some comparative
>>>> runs done.
>>>>
>>>> On 2020-05-19 19:50, Christoph Dreis wrote:
>>>>> I would probably go for the first version although it is a bit more
>>>>> complicated and has the computeIfPresent caveat, because it doesn't
>>>>> slow down the common Map.get() case when the map is actually filled.
>>>> It is a bit of a step up in complexity, but getting the win without
>>>> adding
>>>> a branch to Map.get() _is_ a nice property.
>>>>
>>>>> Let me know what you think.
>>>> If core libs maintainers are fine with the added complexity of your
>>>> original patch, I think it looks ok.
>>>>
>>>> /Claes
>>
>>
>
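[For readers without the webrev at hand, a toy sketch of the idea, not the actual JDK code: HashMap allocates its backing table lazily, so a map that has never been written to can answer get() without computing the key's hash at all. This sketch shows the simpler variant, which pays for the win with an extra null check in get(); the variant preferred above restructures the lookup so get() gains no such branch.]

```java
import java.util.Objects;

// Toy chaining map (illustrative only, not java.util.HashMap) whose
// backing array is allocated lazily, like HashMap's table.
public class MiniMap<K, V> {
    private static final int CAPACITY = 16;

    private static final class Node<K, V> {
        final K key; V value; Node<K, V> next;
        Node(K key, V value, Node<K, V> next) {
            this.key = key; this.value = value; this.next = next;
        }
    }

    private Node<K, V>[] table; // null until the first put

    public V get(Object key) {
        Node<K, V>[] tab = table;
        if (tab == null) {
            // Backing array never allocated: the map is empty, so return
            // before paying for a possibly expensive hashCode() call.
            return null;
        }
        int idx = (Objects.hashCode(key) & 0x7fffffff) % tab.length;
        for (Node<K, V> n = tab[idx]; n != null; n = n.next) {
            if (Objects.equals(n.key, key)) return n.value;
        }
        return null;
    }

    @SuppressWarnings("unchecked")
    public void put(K key, V value) {
        if (table == null) {
            table = (Node<K, V>[]) new Node[CAPACITY];
        }
        int idx = (Objects.hashCode(key) & 0x7fffffff) % table.length;
        for (Node<K, V> n = table[idx]; n != null; n = n.next) {
            if (Objects.equals(n.key, key)) { n.value = value; return; }
        }
        table[idx] = new Node<>(key, value, table[idx]);
    }

    public static void main(String[] args) {
        MiniMap<String, Integer> m = new MiniMap<>();
        System.out.println(m.get("a")); // prints null; hash never computed
        m.put("a", 1);
        System.out.println(m.get("a")); // prints 1
    }
}
```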