Java – HashMap construction performance when I know the final number of elements


HashMap construction performance when I know the final number of elements

If I know the final number of elements that will be stored in a HashMap, what is the best way to construct it from a performance perspective? Based on the JavaDoc, to avoid rehashing you can do either of the following:

int TOTAL_ELEMENTS_TO_BE_STORED = 10;
... = new HashMap<T, Q>( TOTAL_ELEMENTS_TO_BE_STORED + 1, 1.0f );

Alternatively:

... = new HashMap<T, Q>( (int) Math.ceil(TOTAL_ELEMENTS_TO_BE_STORED * 1.333) + 1 );
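The second approach can be wrapped in a small helper. The sketch below (the helper name `newHashMapWithExpectedSize` is my own, modeled on Guava's utility of the same name) sizes the map so that the expected number of entries fits under the default 0.75 load factor, which avoids any rehash during population:

```java
import java.util.HashMap;
import java.util.Map;

public class PresizedMaps {

    // Hypothetical helper: computes an initial capacity large enough that
    // 'expectedSize' entries fit under the default 0.75 load factor,
    // so no rehash occurs while the map is being filled.
    static <K, V> Map<K, V> newHashMapWithExpectedSize(int expectedSize) {
        int initialCapacity = (int) Math.ceil(expectedSize / 0.75) + 1;
        return new HashMap<>(initialCapacity);
    }

    public static void main(String[] args) {
        Map<String, Integer> map = newHashMapWithExpectedSize(10);
        for (int i = 0; i < 10; i++) {
            map.put("key" + i, i);
        }
        System.out.println(map.size()); // prints 10
    }
}
```

If Guava is already on your classpath, `Maps.newHashMapWithExpectedSize(n)` does the same computation for you.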

The HashMap JavaDoc says:

Higher values decrease the space overhead but increase the lookup cost (reflected in most of the operations of the HashMap class, including get and put).

Are lookups really more expensive? In that case, is it generally recommended to keep the default 0.75 load factor and provide more capacity instead, or vice versa?

Solution

Yes, lookups will cost more.

The choice depends on your requirements.

  • You need fast lookups and your data set is small enough – keep the 0.75 load factor.
  • You have a lot of data and don’t want to waste too much memory – use 1.0 as the load factor.

By the way, the load factor is not restricted to the range [0.75, 1] – you can choose any positive value. The higher the value, the less memory you need, but the longer lookups take.
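To illustrate, here is a minimal sketch showing that the constructor accepts load factors on either side of the usual range, and that the choice only affects the memory/speed trade-off, not the map's observable behavior:

```java
import java.util.HashMap;
import java.util.Map;

public class LoadFactorDemo {
    public static void main(String[] args) {
        // A load factor well above 1.0 is legal: each bucket holds more
        // entries on average before a resize, trading lookup speed for memory.
        Map<String, Integer> compact = new HashMap<>(16, 4.0f);

        // A load factor below 0.75 resizes earlier, spending memory on
        // mostly-empty buckets to keep collision chains short.
        Map<String, Integer> sparse = new HashMap<>(16, 0.25f);

        for (int i = 0; i < 100; i++) {
            compact.put("k" + i, i);
            sparse.put("k" + i, i);
        }

        // Functionally the two maps are identical.
        System.out.println(compact.equals(sparse)); // prints "true"
    }
}
```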
