HashMap Vs. ConcurrentHashMap Vs. SynchronizedMap – How a HashMap can be Synchronized in Java
HashMap is a very powerful data structure in Java. We use it every day and in almost all applications. There are quite a few examples which I have written before, such as How to Implement a Threadsafe Cache and How to Convert a HashMap to an ArrayList.
We used HashMap in both of the above examples, but those are pretty simple use cases of HashMap. HashMap is a non-synchronized collection class.
Do you have any of the below questions?
- What’s the difference between ConcurrentHashMap and Collections.synchronizedMap(Map)?
- What’s the difference between ConcurrentHashMap and Collections.synchronizedMap(Map) in terms of performance?
- ConcurrentHashMap vs Collections.synchronizedMap()
- Popular HashMap and ConcurrentHashMap interview questions
In this tutorial we will go over all of the above queries and explain why and how we can synchronize a HashMap.
Why?
A Map is an associative container that stores elements as a combination of a uniquely identifying key and a mapped value. If you have a highly concurrent application in which you may want to modify or read key values from different threads, then it’s ideal to use ConcurrentHashMap. The best example is a producer-consumer setup, which handles concurrent reads and writes.
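Below is a minimal sketch of that idea, assuming a simple producer thread and a consumer thread sharing one map (the class and key names are illustrative only, not part of this tutorial’s code): one thread writes to a shared ConcurrentHashMap while another reads from it, with no external locking.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class CrunchifyProducerConsumerSketch {

    public static void main(String[] args) throws InterruptedException {

        // Shared map: safe for concurrent reads and writes without external locking
        final Map<String, Integer> sharedMap = new ConcurrentHashMap<String, Integer>();

        // "Producer" thread: writes key/value pairs
        Thread producer = new Thread(() -> {
            for (int i = 0; i < 1000; i++) {
                sharedMap.put("key-" + i, i);
            }
        });

        // "Consumer" thread: reads values, possibly before the producer has written them
        Thread consumer = new Thread(() -> {
            int found = 0;
            for (int i = 0; i < 1000; i++) {
                // get() may return null if the producer hasn't written this key yet
                if (sharedMap.get("key-" + i) != null) {
                    found++;
                }
            }
            System.out.println("Consumer found " + found + " entries while producer was running");
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();

        System.out.println("Entries in map after both threads finished: " + sharedMap.size());
    }
}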
So what does a thread-safe Map mean? If multiple threads access a hash map concurrently, and at least one of the threads modifies the map structurally, it must be synchronized externally to avoid an inconsistent view of the contents.
How?
There are two ways we can synchronize a HashMap:
- Java Collections synchronizedMap() method
- Use ConcurrentHashMap
// Hashtable
Map<String, String> normalMap = new Hashtable<String, String>();

// synchronizedMap
Map<String, String> synchronizedHashMap = Collections.synchronizedMap(new HashMap<String, String>());

// ConcurrentHashMap
Map<String, String> concurrentHashMap = new ConcurrentHashMap<String, String>();
ConcurrentHashMap
- You should use ConcurrentHashMap when you need very high concurrency in your project.
- It is thread safe without synchronizing the whole map.
- Reads can happen very fast, while writes are done with a lock.
- There is no locking at the object level.
- The locking is at a much finer granularity, at the hashmap bucket level.
- ConcurrentHashMap doesn’t throw a ConcurrentModificationException if one thread tries to modify it while another is iterating over it (see the sketch after this list).
- ConcurrentHashMap uses a multitude of locks.
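To illustrate the ConcurrentModificationException point, here is a small standalone sketch (the class name is illustrative only): it structurally modifies a ConcurrentHashMap while iterating over its key set, and the weakly consistent iterator does not throw ConcurrentModificationException.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class CrunchifyConcurrentIterationSketch {

    public static void main(String[] args) {
        Map<String, Integer> crunchifyMap = new ConcurrentHashMap<String, Integer>();
        crunchifyMap.put("one", 1);
        crunchifyMap.put("two", 2);
        crunchifyMap.put("three", 3);

        // Structural modification during iteration: no ConcurrentModificationException is thrown.
        // The newly added key may or may not be seen by this ongoing iteration.
        for (String key : crunchifyMap.keySet()) {
            crunchifyMap.put("four", 4);
            System.out.println(key);
        }

        System.out.println("Final size: " + crunchifyMap.size());
    }
}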
SynchronizedHashMap
- Synchronization is at the object level.
- Every read/write operation needs to acquire the lock.
- Locking the entire collection is a performance overhead.
- This essentially gives access to only one thread for the entire map and blocks all the other threads.
- It may cause contention.
- SynchronizedHashMap returns an Iterator which fails fast on concurrent modification, so iteration must be wrapped in a synchronized block (see the sketch after this list).
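As a contrast, the sketch below (class name illustrative only) shows the idiom documented for Collections.synchronizedMap(): iteration over the returned map must happen inside a synchronized block on the map itself, otherwise a structural change made by another thread during iteration can make the fail-fast iterator throw ConcurrentModificationException.

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class CrunchifySynchronizedMapIterationSketch {

    public static void main(String[] args) {
        Map<String, Integer> syncMap = Collections.synchronizedMap(new HashMap<String, Integer>());
        syncMap.put("one", 1);
        syncMap.put("two", 2);
        syncMap.put("three", 3);

        // Iteration over a synchronizedMap must be manually synchronized on the map.
        // Without this block, a structural change from another thread during iteration
        // could make the fail-fast iterator throw ConcurrentModificationException.
        synchronized (syncMap) {
            for (String key : syncMap.keySet()) {
                System.out.println(key + " = " + syncMap.get(key));
            }
        }
    }
}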
Now let’s take a look at the code:
- Create class CrunchifyConcurrentHashMapVsSynchronizedMap.java
- Create an object for each of Hashtable, synchronizedMap and ConcurrentHashMap
- Add and retrieve 500k entries from each Map
- Measure start and end time and display the time in milliseconds
- We will use ExecutorService to run 5 threads in parallel
Here is the Java code:
package crunchify.com.tutorials;
import java.util.Collections;
import java.util.HashMap;
import java.util.Hashtable;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
/**
 * @author Crunchify.com
 *
 */
public class CrunchifyConcurrentHashMapVsSynchronizedMap {

    public final static int THREAD_POOL_SIZE = 5;

    public static Map<String, Integer> crunchifyHashTableObject = null;
    public static Map<String, Integer> crunchifySynchronizedMapObject = null;
    public static Map<String, Integer> crunchifyConcurrentHashMapObject = null;

    public static void main(String[] args) throws InterruptedException {

        // Test with Hashtable Object
        crunchifyHashTableObject = new Hashtable<String, Integer>();
        crunchifyPerformTest(crunchifyHashTableObject);

        // Test with synchronizedMap Object
        crunchifySynchronizedMapObject = Collections.synchronizedMap(new HashMap<String, Integer>());
        crunchifyPerformTest(crunchifySynchronizedMapObject);

        // Test with ConcurrentHashMap Object
        crunchifyConcurrentHashMapObject = new ConcurrentHashMap<String, Integer>();
        crunchifyPerformTest(crunchifyConcurrentHashMapObject);
    }

    public static void crunchifyPerformTest(final Map<String, Integer> crunchifyThreads) throws InterruptedException {

        System.out.println("Test started for: " + crunchifyThreads.getClass());
        long averageTime = 0;

        for (int i = 0; i < 5; i++) {

            long startTime = System.nanoTime();
            ExecutorService crunchifyExServer = Executors.newFixedThreadPool(THREAD_POOL_SIZE);

            for (int j = 0; j < THREAD_POOL_SIZE; j++) {
                crunchifyExServer.execute(new Runnable() {
                    @SuppressWarnings("unused")
                    @Override
                    public void run() {

                        for (int i = 0; i < 500000; i++) {
                            Integer crunchifyRandomNumber = (int) Math.ceil(Math.random() * 550000);

                            // Retrieve value. We are not using it anywhere
                            Integer crunchifyValue = crunchifyThreads.get(String.valueOf(crunchifyRandomNumber));

                            // Put value
                            crunchifyThreads.put(String.valueOf(crunchifyRandomNumber), crunchifyRandomNumber);
                        }
                    }
                });
            }

            // Initiates an orderly shutdown in which previously submitted tasks are executed, but no new tasks will be
            // accepted. Invocation has no additional effect if already shut down.
            // This method does not wait for previously submitted tasks to complete execution. Use awaitTermination to do that.
            crunchifyExServer.shutdown();

            // Blocks until all tasks have completed execution after a shutdown request, or the timeout occurs, or the
            // current thread is interrupted, whichever happens first.
            crunchifyExServer.awaitTermination(Long.MAX_VALUE, TimeUnit.DAYS);

            long endTime = System.nanoTime();
            long totalTime = (endTime - startTime) / 1000000L;
            averageTime += totalTime;
            System.out.println("500K entries added/retrieved in " + totalTime + " ms");
        }

        System.out.println("For " + crunchifyThreads.getClass() + " the average time is " + averageTime / 5 + " ms\n");
    }
}
shutdown() means the executor service takes no more incoming tasks. awaitTermination() is invoked after a shutdown request. Hence, you need to first shut down the executor service and then block and wait for the threads to finish.
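Here is a minimal standalone sketch of the same shutdown()/awaitTermination() pattern (the class name, pool size and 1-minute timeout are illustrative choices, not taken from the benchmark above):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class CrunchifyShutdownSketch {

    public static void main(String[] args) throws InterruptedException {
        ExecutorService crunchifyExecutor = Executors.newFixedThreadPool(2);

        crunchifyExecutor.execute(() -> System.out.println("Task running on " + Thread.currentThread().getName()));

        // 1. Stop accepting new tasks; already submitted tasks keep running
        crunchifyExecutor.shutdown();

        // 2. Block the current thread until all tasks finish or the timeout expires
        if (!crunchifyExecutor.awaitTermination(1, TimeUnit.MINUTES)) {
            // Tasks did not finish in time; attempt to cancel them
            crunchifyExecutor.shutdownNow();
        }
    }
}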
Eclipse Console Result:
Test started for: class java.util.Hashtable
500K entries added/retrieved in 1832 ms
500K entries added/retrieved in 1723 ms
500K entries added/retrieved in 1782 ms
500K entries added/retrieved in 1607 ms
500K entries added/retrieved in 1851 ms
For class java.util.Hashtable the average time is 1759 ms

Test started for: class java.util.Collections$SynchronizedMap
500K entries added/retrieved in 1923 ms
500K entries added/retrieved in 2032 ms
500K entries added/retrieved in 1621 ms
500K entries added/retrieved in 1833 ms
500K entries added/retrieved in 2048 ms
For class java.util.Collections$SynchronizedMap the average time is 1891 ms

Test started for: class java.util.concurrent.ConcurrentHashMap
500K entries added/retrieved in 1068 ms
500K entries added/retrieved in 1029 ms
500K entries added/retrieved in 1165 ms
500K entries added/retrieved in 840 ms
500K entries added/retrieved in 1017 ms
For class java.util.concurrent.ConcurrentHashMap the average time is 1023 ms