In a shared-memory model, you don't want P and Q to be writing to the shared variable at the same time. This (safety) property of having at most one process access a shared variable at once is called mutual exclusion. To synchronize access to the shared variable, i.e., to implement mutual exclusion, you resort to using one or more synchronization constructs that have been designed and thoroughly studied over the years, e.g., semaphores, mutexes, condition variables, and monitors.
You also usually want Q to always read the ``last'' value written. Here ``last'' must be defined in terms of some ordering. Usually this ordering is a partial order on the set of operations performed by the concurrent processes; sometimes it's a real-time order.
A good example of mutually exclusive access to a shared resource is the highway system. When cars approach a four-way intersection, we do not want all cars to enter the intersection at once. We could allow cars moving parallel to each other (but in opposite directions) to enter the intersection at the same time, but certainly not cars whose paths are perpendicular to each other. So, we use traffic lights to mediate access to the shared resource.
In a message-passing model, the sending and receiving processes must coordinate with each other so that every message sent is eventually received, and every message received was actually sent. In effect, they synchronize access to the shared channel.
There are two kinds of communication over this shared channel: asynchronous and synchronous. In asynchronous message passing, the sender is non-blocking; it sends its message and proceeds immediately to do more work, not waiting for the receiver to receive the message. The sender and receiver execute independently of each other.
A good example of asynchronous message passing is the postal system where the sender drops a piece of mail in the mailbox and continues merrily along (shopping, eating, whatever); the recipient receives the mail sometime later. (It would be terrible if the sender had to wait until the mail was actually delivered to the recipient; it would get awfully tired and hungry!)
In synchronous message passing, both sender and receiver are blocking, and the channel provides a direct link between the two processes. A process sending a message delays until the other process is ready to receive it. An exchange of a message represents a synchronization point between the two processes. Thus, communication and synchronization are tightly coupled.
A good example of synchronous message passing is the telephone system where the caller places a call and waits for the callee to answer; the caller is blocked (and does not do more work) until the callee answers. Note that initially the callee is blocked too; for example, it does not periodically pick up its phone to see if there's someone trying to talk to it. We can also model buffered message passing, in which the channel has capacity. Here, the sender delays if the channel is full; thus, if the callee is busy, then the caller has to wait until the callee is off the phone before being able to place the call successfully.
CSP, a model of concurrent systems that we will study in detail this term, is based on synchronous message passing.