CS102: Deadlock

Copyright © 1999, Kenneth J. Goldman


We've seen that by using thread synchronization, we can prevent threads from observing the intermediate states that exist while another thread is executing a method. The synchronized modifier acquires a lock on the object at the beginning of the method's execution and releases it when the method finishes, making the method appear to execute in a single atomic step.
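For instance, here is a minimal sketch (the Counter class is a hypothetical example, not from the course materials) in which synchronized makes increment() atomic:

```java
// A hypothetical Counter whose increment() is atomic thanks to synchronized.
class Counter {
    private int count = 0;

    // The lock on this Counter is held for the whole method body, so no
    // other thread can observe count between the read and the write.
    synchronized void increment() {
        count = count + 1;
    }

    synchronized int getCount() {
        return count;
    }
}
```

Without synchronized, two threads could both read the same value of count and write back the same incremented value, losing an update.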

When Not to Synchronize

So, the next logical question is: Why not put the synchronized modifier in front of every method?

It would be nice if it were that easy, but certain problems can arise if we acquire a lock on an object before the execution of every method. Let's consider an example:

class Foo {
    private int x = 0;

    synchronized void setMaximum(int v, Foo f) {
        x = Math.max(v, f.getValue());
    }

    synchronized int getValue() {
        return x;
    }
}
So far, Foo looks harmless. Now, let's suppose we create two objects.

Foo a = new Foo();
Foo b = new Foo();

Now, let's start up two threads, as follows:

Thread 1                Thread 2
a.setMaximum(3,b);      b.setMaximum(4,a);

Either thread might run to completion before the other starts, but suppose the following happens:

Thread 1                        Thread 2
acquire lock on a
                                acquire lock on b
attempt call to b.getValue()
                                attempt call to a.getValue()

Each thread is waiting for the other thread to release its lock before it can continue, but neither thread will release its lock until it finishes. This situation is known as deadlock.
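The interleaving above can be forced in code. In the sketch below, the sleep() is an assumption added only to make the unlucky timing near-certain; deadlock is possible without it:

```java
// Sketch of the deadlock scenario above. The sleep gives the other thread
// time to acquire its own lock before either thread attempts the cross call.
class Foo {
    private int x = 0;

    synchronized void setMaximum(int v, Foo f) {
        try { Thread.sleep(100); } catch (InterruptedException ie) {}
        x = Math.max(v, f.getValue());   // blocks until f's lock is free
    }

    synchronized int getValue() {
        return x;
    }
}

class DeadlockDemo {
    public static void main(String[] args) throws InterruptedException {
        final Foo a = new Foo();
        final Foo b = new Foo();
        Thread t1 = new Thread() { public void run() { a.setMaximum(3, b); } };
        Thread t2 = new Thread() { public void run() { b.setMaximum(4, a); } };
        t1.start();
        t2.start();
        t1.join(2000);                   // give up waiting after two seconds
        t2.join(2000);
        System.out.println(t1.isAlive() ? "still deadlocked" : "completed");
    }
}
```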

Causes of Deadlock

Deadlock occurs when there is a cycle of threads waiting for each other. We can draw a diagram of this situation as a graph, where an arrow represents a "waiting for" relation.

Or, to be more specific, we can add the objects (resources) to the graph. Here, an arrow from a thread to a resource means the thread is waiting for a lock on that resource, and an arrow from a resource to a thread means the resource is locked by that thread.

In general, if there is a cycle in the graph, deadlock exists and all threads involved in the cycle will be blocked forever. Java does not detect this situation--you'll notice it only when things come to a grinding halt. And no amount of testing will guarantee deadlock can't happen unless you test all possible interleavings. Therefore, it's important to reason carefully about your program in order to prove that deadlock cannot occur.
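The cycle test on the wait-for graph can be sketched as a depth-first search. (The WaitForGraph class below is an illustrative assumption, not something Java provides; the JVM does not build this graph for you.)

```java
import java.util.*;

// A minimal sketch of the cycle test described above, over a hypothetical
// wait-for graph: an edge u -> v means "u is waiting for v".
class WaitForGraph {
    private final Map<String, List<String>> edges = new HashMap<String, List<String>>();

    void addEdge(String from, String to) {
        if (!edges.containsKey(from)) edges.put(from, new ArrayList<String>());
        edges.get(from).add(to);
    }

    // Depth-first search: revisiting a node already on the current path
    // means some set of threads and resources is waiting on itself.
    boolean hasCycle() {
        Set<String> visited = new HashSet<String>();
        for (String node : edges.keySet())
            if (dfs(node, visited, new HashSet<String>())) return true;
        return false;
    }

    private boolean dfs(String node, Set<String> visited, Set<String> onPath) {
        if (onPath.contains(node)) return true;
        if (visited.contains(node)) return false;
        visited.add(node);
        onPath.add(node);
        for (String next : edges.getOrDefault(node, Collections.<String>emptyList()))
            if (dfs(next, visited, onPath)) return true;
        onPath.remove(node);
        return false;
    }
}
```

In the example above, the cycle is Thread 1 -> b -> Thread 2 -> a -> Thread 1.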

Avoiding Deadlock

You can avoid problems like the example above by making sure that synchronized methods never call (directly or indirectly) synchronized methods on other objects that could be locked by other threads. However, calling another synchronized method on the same object is perfectly safe, since you already hold the lock. To fix the problem above, either drop synchronized from getValue, or change the interface so that setMaximum takes in two values and therefore has no need to call getValue on another object.
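The second fix can be sketched as follows (a variant of the Foo class above; the exact signature is an assumption):

```java
// Sketch: setMaximum takes two plain values, so the synchronized method
// never needs to acquire a lock on a second object.
class Foo {
    private int x = 0;

    synchronized void setMaximum(int v, int other) {
        x = Math.max(v, other);          // only this object's lock is held
    }

    synchronized int getValue() {
        return x;
    }
}
```

A caller now writes a.setMaximum(3, b.getValue()); it holds b's lock only during getValue and a's lock only during setMaximum, never both at once, so no waiting cycle can form.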

Note: If two threads may access shared data through methods that are not synchronized, it's important to declare those variables as volatile so the compiler will not perform optimizations that result in local copies.
For example,
private volatile int x;
Volatile here tells the compiler that multiple threads may access the data.
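A common use is a shutdown flag written by one thread and read, without synchronization, by another (the Worker class below is a hypothetical illustration):

```java
// Hypothetical example: volatile keeps run() from looping forever on a
// stale cached copy of the flag.
class Worker extends Thread {
    private volatile boolean done = false;

    public void run() {
        while (!done) {
            // perform one unit of work, then recheck the flag
            Thread.yield();
        }
    }

    void shutDown() {
        done = true;                     // visible to run() on its next check
    }
}
```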

Now, let's look at another kind of situation that can cause deadlock.

What's wrong with the following class definition?

class SureToBlock extends Thread {
    int counter = 10;

    public void run() {
        while (true) takeStep();
    }

    synchronized void takeStep() {
        if (counter == 0)
            suspend();              // suspends while still holding the lock!
        else
            counter--;
        try {
            sleep(500);
        } catch (InterruptedException ie) {}
    }

    synchronized void resetCounter(int n) {
        counter = n;
        resume();                   // meant to wake the suspended thread
    }
}
What's wrong? If the counter reaches zero, the thread will suspend itself. Presumably, the resetCounter method is supposed to wake it up. But any thread that tries to call resetCounter will be blocked waiting for takeStep to return--and takeStep won't return because its thread is suspended!

Solution: call suspend() from within run(), outside the synchronized method, so the thread is not holding the lock while it is suspended.
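An alternative sketch uses wait/notifyAll instead of suspend/resume: wait() releases the object's lock while the thread is blocked, so resetCounter can always get in. (This rewrite is an assumption, not part of the original notes; suspend and resume are deprecated in current Java precisely because of lock-holding problems like this one.)

```java
// wait() atomically releases this object's lock and blocks, so the class
// below cannot deadlock the way SureToBlock does.
class WillNotBlock extends Thread {
    private int counter = 10;

    public void run() {
        while (true) takeStep();
    }

    synchronized void takeStep() {
        while (counter == 0) {
            try {
                wait();              // releases the lock until notified
            } catch (InterruptedException ie) {}
        }
        counter--;
    }

    synchronized void resetCounter(int n) {
        counter = n;
        notifyAll();                 // wake any thread waiting in takeStep
    }
}
```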