Concurrency in Java
One of the most powerful features of Java is its synchronization mechanism (the monitor). However, we rarely use it directly when developing information systems. With the advent of J2EE, the application server offers an execution environment where the programmer doesn't need to worry about concurrency, threads, locks, and so on. Yet in reality, virtually every program we create runs concurrently, whether it's a servlet, an EJB, an applet, or a standalone Java application; it's just that the operating system, the JVM or, at a higher level, the application server's containers carry the burden of controlling the concurrent processes.
But who knows? One day you may need to write an application where concurrent processes (threads, in Java) access a shared resource and therefore need to be synchronized. If you've been through this situation before, you know that the complexity of the problem cannot be ignored. To address it, Computer Science has produced, over the past three decades, several theoretical and mathematical models (Petri nets, process algebras such as CSP, etc.) as well as language constructs and hardware solutions (mutual exclusion algorithms, semaphores, the test-and-set instruction, message passing, monitors). The mechanism present in Java is not new; the monitor was developed by Brinch Hansen and implemented in Concurrent Pascal in the 70's. It's a high-level solution that is especially useful for building multithreaded applications.
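To make the idea concrete, here is a minimal sketch of a monitor in Java (the class and method names are mine, purely for illustration): a synchronized method acquires the object's intrinsic lock on entry and releases it on exit, so at most one thread at a time runs inside the monitor.

    // Minimal sketch of Java's built-in monitor: every object has an intrinsic
    // lock, and a synchronized method acquires it on entry and releases it on exit.
    public class Counter {
        private int value = 0;

        // At most one thread at a time executes a synchronized method of this instance.
        public synchronized void increment() {
            value++;
        }

        public synchronized int get() {
            return value;
        }
    }

Without the synchronized keyword, two threads calling increment() at the same time could lose updates, because value++ is not an atomic operation.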
But why is this kind of program complex? To make a long story short, imagine several threads sharing resources and executing concurrently. We can't tell in which order control will pass from one thread to another; in other words, the possible interleavings are numerous. So we need to ensure synchronization whenever a thread uses a shared resource, that is, there must be mutual exclusion during the execution of a critical section of code. It follows that, as long as one thread holds a resource, the others that need the same resource must block and wait. Now picture a thread that holds one resource while waiting for another resource that is being used by a second thread, which, in turn, is waiting for the resource held by the first. Yes, this is the simplest case of "deadlock", which in more complex situations may involve a "circular wait" among N threads. Therefore, you need to worry about preventing, avoiding, or at least detecting deadlocks.
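To illustrate the circular wait just described, here is a sketch (the class and variable names are hypothetical) in which two threads acquire the same two locks in opposite order; run it and it will almost always hang.

    // Two threads acquire the same two locks in opposite order: the simplest
    // circular wait. t1 holds A and waits for B; t2 holds B and waits for A.
    public class DeadlockDemo {
        private static final Object resourceA = new Object();
        private static final Object resourceB = new Object();

        public static void main(String[] args) {
            Thread t1 = new Thread(() -> {
                synchronized (resourceA) {          // t1 holds A...
                    pause();
                    synchronized (resourceB) {      // ...and waits for B
                        System.out.println("t1 got both resources");
                    }
                }
            });
            Thread t2 = new Thread(() -> {
                synchronized (resourceB) {          // t2 holds B...
                    pause();
                    synchronized (resourceA) {      // ...and waits for A: deadlock
                        System.out.println("t2 got both resources");
                    }
                }
            });
            t1.start();
            t2.start();
        }

        // Small delay to make the unlucky interleaving very likely.
        private static void pause() {
            try { Thread.sleep(100); } catch (InterruptedException e) { }
        }
    }

A simple way to break the cycle is to make every thread acquire the locks in the same global order.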
But mutual exclusion and deadlock are not the only headaches for those who develop multithreaded applications. In some cases, your program will have to guarantee "fair" chances of execution among the threads, that is, prevent a thread from waiting indefinitely for a resource (starvation).
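One way to reduce the risk of starvation, assuming a JDK that ships java.util.concurrent (Java 5 or later), is a fair lock, which grants access roughly in arrival order. This is only a sketch of one approach, not the only one.

    import java.util.concurrent.locks.ReentrantLock;

    // Sketch: a fair ReentrantLock hands the lock to waiting threads in
    // approximately FIFO order, so no thread is passed over indefinitely.
    public class FairResource {
        private final ReentrantLock lock = new ReentrantLock(true); // true = fair ordering

        public void use() {
            lock.lock();
            try {
                // critical section: one thread at a time, served roughly in arrival order
            } finally {
                lock.unlock();   // always release, even if the critical section throws
            }
        }
    }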
I brought up this subject to say that the good news is that there are tools to model and analyze a system with concurrent processes even before you code it. If some day you need to write a program with these characteristics, or if you are simply interested in the subject, I'd like to recommend a book: "Concurrency: State Models & Java Programs", by Jeff Magee & Jeff Kramer. It teaches the theory, presents a formal notation (Finite State Processes, FSP) for modeling concurrent programs, and includes a CD with a tool called LTSA (Labeled Transition System Analyzer) that can be used to analyze and validate your solution. The idea is: before coding, you model the system in FSP and verify whether, and how, it can deadlock, along with other desirable or undesirable properties of concurrent programs. You can even check properties that are specific to your system (e.g., two cars traveling in opposite directions must not enter a single lane bridge at the same time). The book also teaches how to implement concurrent programs using threads and monitors in Java (the "synchronized" primitive). I believe you'll be more confident when writing your next multithreaded Java program. Read more at www-dse.doc.ic.ac.uk/concurrency/
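To give a flavor of the single lane bridge scenario, here is my own rough monitor-style sketch in Java (my simplification, not the authors' code): cars heading east may enter only while no car is heading west, and vice versa.

    // Rough sketch of a single lane bridge monitor (my own version). A guarded
    // wait() blocks a car until the opposite direction is clear; notifyAll()
    // wakes the waiting cars whenever the bridge may change direction.
    public class SingleLaneBridge {
        private int eastbound = 0;   // cars currently crossing eastwards
        private int westbound = 0;   // cars currently crossing westwards

        public synchronized void enterEast() throws InterruptedException {
            while (westbound > 0) {  // guard: wait until no opposing traffic
                wait();
            }
            eastbound++;
        }

        public synchronized void exitEast() {
            eastbound--;
            notifyAll();
        }

        public synchronized void enterWest() throws InterruptedException {
            while (eastbound > 0) {
                wait();
            }
            westbound++;
        }

        public synchronized void exitWest() {
            westbound--;
            notifyAll();
        }
    }

Note that this naive version can starve one direction under heavy opposing traffic, which is exactly the kind of property a model checker such as LTSA helps you detect before you write any code.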