HashMap.putAll can cause redundant space waste
Xeno Amess
xenoamess at gmail.com
Fri Feb 4 10:45:21 UTC 2022
I also found some other places with the same problem.
If some of the existing committers agree, I would create a PR, but
according to the rules it seems I should wait for now.
Xeno Amess <xenoamess at gmail.com> wrote on Fri, Feb 4, 2022 at 18:42:
> import java.lang.reflect.Array;
> import java.lang.reflect.Field;
> import java.util.HashMap;
> import java.util.Map;
>
> public class TestMap {
>
>     public static void main(String[] args) throws NoSuchFieldException, IllegalAccessException {
>         HashMap<Object, Object> a = new HashMap<>();      // default capacity, grown by put
>         fill12(a);
>         HashMap<Object, Object> b = new HashMap<>(12);    // pre-sized via initialCapacity
>         fill12(b);
>         HashMap<Object, Object> c = new HashMap<>(a);     // copy constructor
>         HashMap<Object, Object> d = new HashMap<>();
>         d.putAll(a);                                      // empty map, then putAll
>         System.out.println("a : " + getArrayLength(a));
>         System.out.println("b : " + getArrayLength(b));
>         System.out.println("c : " + getArrayLength(c));
>         System.out.println("d : " + getArrayLength(d));
>     }
>
>     public static void fill12(Map<Object, Object> map) {
>         for (int i = 0; i < 12; i++) {
>             map.put(i, i);
>         }
>     }
>
>     // Reads the length of HashMap's internal table array via reflection.
>     // (On JDK 16+ this may need --add-opens java.base/java.util=ALL-UNNAMED.)
>     public static int getArrayLength(Map<Object, Object> map) throws NoSuchFieldException, IllegalAccessException {
>         Field field = HashMap.class.getDeclaredField("table");
>         field.setAccessible(true);
>         Object table = field.get(map);
>         return Array.getLength(table);
>     }
>
> }
>
> Run this, and we get the following output:
>
> a : 16
> b : 16
> c : 32
> d : 32
>
> So I went to look at the code.
>
>     /**
>      * Implements Map.putAll and Map constructor.
>      *
>      * @param m the map
>      * @param evict false when initially constructing this map, else
>      * true (relayed to method afterNodeInsertion).
>      */
>     final void putMapEntries(Map<? extends K, ? extends V> m, boolean evict) {
>         int s = m.size();
>         if (s > 0) {
>             if (table == null) { // pre-size
>                 float ft = ((float)s / loadFactor) + 1.0F;
>                 int t = ((ft < (float)MAXIMUM_CAPACITY) ?
>                          (int)ft : MAXIMUM_CAPACITY);
>                 if (t > threshold)
>                     threshold = tableSizeFor(t);
>             } else {
>                 // Because of linked-list bucket constraints, we cannot
>                 // expand all at once, but can reduce total resize
>                 // effort by repeated doubling now vs later
>                 while (s > threshold && table.length < MAXIMUM_CAPACITY)
>                     resize();
>             }
>
>             for (Map.Entry<? extends K, ? extends V> e : m.entrySet()) {
>                 K key = e.getKey();
>                 V value = e.getValue();
>                 putVal(hash(key), key, value, false, evict);
>             }
>         }
>     }
>
> Yes, I do think *((float)s / loadFactor) + 1.0F* here is wrong.
>
> It should be *Math.ceil((float)s / loadFactor)*.
>
> When s is an exact multiple of the load factor, the extra 1.0F pushes the
> result just past a sufficient power of two: for s = 12 and the default
> loadFactor = 0.75, the current formula gives 12 / 0.75 + 1 = 17, which
> tableSizeFor rounds up to 32, while ceil(12 / 0.75) = 16 would stay at 16.
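>
> To make the difference concrete, here is a small standalone sketch (my own
> demo code, not JDK code; SizingDemo and its local copy of tableSizeFor's
> next-power-of-two rounding are only illustrative) comparing the two
> formulas for s = 12 and the default load factor 0.75:
>
> public class SizingDemo {
>
>     // Local stand-in for HashMap.tableSizeFor: smallest power of two >= cap.
>     static int tableSizeFor(int cap) {
>         int n = -1 >>> Integer.numberOfLeadingZeros(cap - 1);
>         return (n < 0) ? 1 : n + 1;
>     }
>
>     public static void main(String[] args) {
>         int s = 12;
>         float loadFactor = 0.75F;
>
>         int current  = (int) (((float) s / loadFactor) + 1.0F);   // 12 / 0.75 + 1 = 17
>         int proposed = (int) Math.ceil((float) s / loadFactor);   // ceil(12 / 0.75) = 16
>
>         // 17 gets rounded up to a table of 32, while 16 stays at 16.
>         System.out.println("current : " + current + " -> table " + tableSizeFor(current));
>         System.out.println("proposed: " + proposed + " -> table " + tableSizeFor(proposed));
>     }
> }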
>
> So I would like to create a pull request.
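>
> A minimal sketch of what the pre-size branch might look like with that
> change (just my proposal, not an actual patch; only the ft line differs
> from the current code):
>
>             if (table == null) { // pre-size
>                 // Round up instead of unconditionally adding 1, so an exact
>                 // multiple of the load factor no longer over-allocates.
>                 float ft = (float) Math.ceil((float) s / loadFactor);
>                 int t = ((ft < (float)MAXIMUM_CAPACITY) ?
>                          (int)ft : MAXIMUM_CAPACITY);
>                 if (t > threshold)
>                     threshold = tableSizeFor(t);
>             }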
>
> Anyone interested?
>
>