Markov modeling is a common approach to analyzing the performance of computer systems, service facilities, and manufacturing systems. The performance (such as mean response time) of a system can be analyzed by modeling the system as a Markov chain and analyzing the stationary probabilities in the Markov chain. However, the performance analysis of a multiserver system with multiple classes of jobs has a common source of difficulty: the Markov chain that models the system behavior has a state space that grows infinitely in multiple dimensions. We start this chapter by providing two simple examples of such multidimensional Markov chains (Markov chains on a multidimensionally infinite state space).
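To make the connection between stationary probabilities and performance concrete, the following sketch computes the mean response time of a simple 1D example, the M/M/1 queue, from its (geometric) stationary distribution and Little's law. The parameter values are assumptions chosen only for illustration.

```python
# Sketch: mean response time of an M/M/1 queue from its stationary
# probabilities. Parameter values are illustrative assumptions.
lam, mu = 0.7, 1.0          # arrival rate, service rate
rho = lam / mu              # utilization; stable since rho < 1

# Stationary probabilities of the 1D chain: pi_n = (1 - rho) * rho**n
N = 1000                    # truncation level for the numerical sum
pi = [(1 - rho) * rho**n for n in range(N)]

mean_jobs = sum(n * p for n, p in enumerate(pi))   # E[N] = rho / (1 - rho)
mean_response = mean_jobs / lam                    # Little's law: E[T] = E[N] / lam

print(round(mean_jobs, 4))      # → 2.3333
print(round(mean_response, 4))  # → 3.3333
```

For a 1D chain the stationary probabilities have this simple closed form; the difficulty addressed in this chapter is that no such form is available when the state space grows infinitely in multiple dimensions.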
First, consider two processors, each serving its own M/M/1 queue,
where one of the processors (the ``donor'') can help the other
processor (the ``beneficiary'') with its jobs during times when the
donor queue is empty. Specifically, the beneficiary jobs
(respectively, donor jobs) arrive according to a Poisson process with
rate $\lambda_B$ (respectively, $\lambda_D$). The service demands of
these jobs have exponential distributions with rates $\mu_B$ and
$\mu_D$, respectively. When there are no donor jobs and there are at
least two beneficiary jobs, the donor server also processes a beneficiary
job, and hence beneficiary jobs complete with rate $2\mu_B$.
Figure 3.1(a) shows a Markov chain
on a two-dimensionally infinite state space (2D Markov chain)
that models the behavior of this system. In this Markov chain, one
dimension tracks the number of donor jobs, and the other dimension
tracks the number of beneficiary jobs. Since the behavior of
beneficiary jobs depends on the number of donor jobs
(specifically, whether there are $0$ or $\geq 1$ donor jobs) in the system,
the 2D Markov chain cannot simply be decomposed into two 1D Markov
chains (Markov chains on a one-dimensionally infinite state space).
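Although the 2D chain does not decompose, its stationary probabilities can be approximated by brute force: truncate each dimension at a finite level and solve the balance equations numerically. The sketch below does this for the donor/beneficiary system; it is not the DR method developed in this chapter, and the arrival and service rates are assumptions chosen only for illustration.

```python
# Brute-force baseline (not DR): truncate the 2D donor/beneficiary chain
# at C jobs per class and solve pi Q = 0 numerically.
# All parameter values are illustrative assumptions.
import numpy as np

lam_B, lam_D = 0.5, 0.3    # beneficiary / donor arrival rates
mu_B, mu_D = 1.0, 1.0      # beneficiary / donor service rates
C = 30                     # truncation level per dimension

idx = lambda b, d: b * (C + 1) + d        # state = (b beneficiaries, d donors)
n = (C + 1) ** 2
Q = np.zeros((n, n))                      # generator of the truncated chain
for b in range(C + 1):
    for d in range(C + 1):
        s = idx(b, d)
        if b < C: Q[s, idx(b + 1, d)] += lam_B     # beneficiary arrival
        if d < C: Q[s, idx(b, d + 1)] += lam_D     # donor arrival
        if d > 0: Q[s, idx(b, d - 1)] += mu_D      # donor completion
        if b > 0:                                  # beneficiary completion;
            rate = 2 * mu_B if (d == 0 and b >= 2) else mu_B   # donor helps
            Q[s, idx(b - 1, d)] += rate
np.fill_diagonal(Q, -Q.sum(axis=1))

# Solve pi Q = 0 subject to sum(pi) = 1 (least squares).
A = np.vstack([Q.T, np.ones(n)])
pi = np.linalg.lstsq(A, np.r_[np.zeros(n), 1.0], rcond=None)[0]
mean_B = sum(b * pi[idx(b, d)] for b in range(C + 1) for d in range(C + 1))
print(mean_B / lam_B)   # mean beneficiary response time, by Little's law
```

Truncation works for a single small example, but the state space (and hence the cost of solving the linear system) grows rapidly with the truncation level and the number of dimensions, which is one motivation for the DR approach of this chapter. Note also that the donor jobs are unaffected by the beneficiary jobs, so the donor marginal here behaves as an ordinary M/M/1 queue.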
Next, consider an M/M/2 queue with two priority classes, where high
priority jobs have preemptive priority over low priority jobs.
Specifically, the low (respectively, high) priority jobs
arrive according to a Poisson process with rate $\lambda_L$
(respectively, $\lambda_H$).
The service demands of these jobs have exponential distributions
with rates $\mu_L$ and $\mu_H$, respectively.
When there are $n$ high priority jobs,
only $\max(2-n,0)$ servers are available for low priority jobs.
Figure 3.1(b) shows a 2D Markov chain
that models the behavior of this system. In this Markov chain, one
dimension tracks the number of high priority jobs, and the other
dimension tracks the number of low priority jobs. Again, since the
behavior of low priority jobs depends on the number of high priority
jobs in the system, the 2D Markov chain cannot simply be decomposed into
two 1D Markov chains. As we will see, when there are $m$ priority
classes, the performance analysis of the lowest priority class
involves an $m$D Markov chain (Markov chain on an
$m$-dimensionally infinite state space).
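The same brute-force truncation idea applies to the priority model: bound both dimensions, build the generator, and solve for the stationary probabilities. Again, this is only a baseline for comparison, not the DR method, and the rates below are illustrative assumptions.

```python
# Brute-force baseline (not DR): truncated 2D chain for the M/M/2 queue
# with preemptive priority. Parameter values are illustrative assumptions.
import numpy as np

lam_H, lam_L = 0.5, 0.4    # high / low priority arrival rates
mu_H = mu_L = 1.0          # service rates
C = 30                     # truncation level per dimension

idx = lambda h, l: h * (C + 1) + l        # state = (h high, l low priority jobs)
n = (C + 1) ** 2
Q = np.zeros((n, n))
for h in range(C + 1):
    for l in range(C + 1):
        s = idx(h, l)
        if h < C: Q[s, idx(h + 1, l)] += lam_H             # high arrival
        if l < C: Q[s, idx(h, l + 1)] += lam_L             # low arrival
        if h > 0: Q[s, idx(h - 1, l)] += min(h, 2) * mu_H  # high completion
        free = max(2 - h, 0)               # servers left for low priority jobs
        if l > 0 and free > 0:
            Q[s, idx(h, l - 1)] += min(l, free) * mu_L     # low completion
np.fill_diagonal(Q, -Q.sum(axis=1))

A = np.vstack([Q.T, np.ones(n)])
pi = np.linalg.lstsq(A, np.r_[np.zeros(n), 1.0], rcond=None)[0]
mean_L = sum(l * pi[idx(h, l)] for h in range(C + 1) for l in range(C + 1))
print(mean_L / lam_L)   # mean low-priority response time (Little's law)
```

Because preemption makes the high priority jobs oblivious to the low priority jobs, the high-priority marginal of this chain is an ordinary M/M/2 queue; it is only the lowest class whose analysis requires the full multidimensional chain, and with $m$ classes this brute-force approach would need an $m$-dimensional truncation.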
The goal of this chapter is to provide an analytical tool, which we refer to as dimensionality reduction (DR), that allows us to analyze these and more complex multidimensional Markov chains. DR reduces a multidimensional Markov chain to a 1D Markov chain, which closely approximates the multidimensional Markov chain and can be analyzed efficiently. We will define a broad class of (multidimensional) Markov chains, which we will refer to as RFB/GFB processes (recursive foreground-background processes or generalized foreground-background processes), that allow us to model many interesting multiserver systems with multiple classes of jobs, and then show an analysis of these Markov chains via DR. DR involves approximations, but we will show that the mean response time computed via DR is usually within a few percent of the simulated value. DR will enable one to analyze the performance of many multiserver systems by simply modeling them as RFB/GFB processes.