Next: 6.10 Application: parallel scheduling Up: 6. Aggregate Solutions of Previous: 6.8 Generation of the

6.9 Multiple upper-lower classes

In our exposition, we have restricted ourselves to a single pair of upper and lower sets. Here, we present an extension of our upper-lower aggregation algorithm, based on recursion, that allows us to handle multiple upper and lower classes.

The partition is defined by splitting each level set into multiple upper sets and lower sets, consistent with a partition of the set of state indices within a level set into classes.

The aggregate upper-set quantities are defined as the unions of the corresponding per-class sets; the aggregate lower-set quantities are defined analogously.

Given lower sets and upper sets that satisfy the main conditions, i.e., there are no transitions among sets of the same type (except, of course, through the boundary set), there are transitions only from the upper sets to the lower sets, and communication from the lower sets to the upper sets can occur only through the boundary set, we can apply Algorithm 6.4 in a recursive manner.
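These structural conditions amount to requiring that certain off-diagonal blocks of the infinitesimal generator are zero. The following sketch, on a hypothetical 6-state CTMC with made-up rates and an illustrative `check_partition` helper (not part of the algorithm itself), shows how the block-zero conditions can be verified numerically:

```python
import numpy as np

# Hypothetical 6-state CTMC: boundary {0}, upper sets U1={1,2}, U2={3},
# lower set L1={4,5}.  All rates are illustrative only.
Q = np.array([
    [-4.0,  1.0,  1.0,  1.0,  1.0,  0.0],   # boundary may reach anything
    [ 2.0, -3.0,  1.0,  0.0,  0.0,  0.0],   # U1 -> boundary, within U1
    [ 0.0,  1.0, -3.0,  0.0,  2.0,  0.0],   # U1 -> lower set allowed
    [ 1.0,  0.0,  0.0, -2.0,  0.0,  1.0],   # U2 -> boundary, U2 -> lower
    [ 2.0,  0.0,  0.0,  0.0, -3.0,  1.0],   # lower -> upper only via boundary
    [ 3.0,  0.0,  0.0,  0.0,  1.0, -4.0],
])

def check_partition(Q, boundary, uppers, lowers):
    """Check the structural conditions on the generator blocks."""
    def block_is_zero(rows, cols):
        return bool(np.all(Q[np.ix_(rows, cols)] == 0))
    # (a) no direct transitions among distinct sets of the same type
    same_type = all(block_is_zero(a, b)
                    for sets in (uppers, lowers)
                    for a in sets for b in sets if a is not b)
    # (b) lower sets may not reach upper sets directly (only via boundary)
    lower_to_upper = all(block_is_zero(l, u) for l in lowers for u in uppers)
    return same_type and lower_to_upper

ok = check_partition(Q, [0], uppers=[[1, 2], [3]], lowers=[[4, 5]])
print(ok)   # -> True for this toy generator
```

A partition that violates the conditions (e.g., one that groups two lower sets with a direct transition between them) would make `check_partition` return `False`.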

Algorithm 6.9 (Recursive aggregation of given upper and lower classes)

Step 1: Determine the upper and lower classes in the state space.
Select one lower set and redefine the remaining sets so as to obtain a new partition of the state space that consists of only one upper set and one lower set, such that there is a special "gate" state through which all paths from one part to the other must pass. The remaining sets are defined as in Algorithm 6.4.

If no such partitioning exists, then the algorithm first attempts to solve the process with a known method; if this is not possible, the algorithm exits. Otherwise, the algorithm continues with Step 2.

Step 2: Generate and solve the "upper CTMC". Using stochastic complementation, identify a new CTMC that observes the behavior of the original CTMC only in the corresponding states. The infinitesimal generator of the new process is the stochastic complement. Observe that the new "upper" CTMC is composed of the upper sets and lower sets of the original partitioning.
Step 2a: If the new "upper CTMC" still contains lower sets, then recursively call Recursive aggregation of given upper and lower classes.
Step 2b: If the new "upper CTMC" is composed only of the "upper" sets of the original process, then the resulting process is of the GI/M/1-type. This process can be solved using the matrix-geometric method, or by applying stochastic complementation to each of its sets.
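For a finite chain, the stochastic complement used in Step 2 has the well-known closed form: if the generator is partitioned into blocks over the kept states and the rest, the complement's generator is Q_kk + Q_kr(-Q_rr)^{-1}Q_rk, and its stationary vector equals the conditional stationary distribution of the original chain over the kept states. A minimal numerical sketch, on a made-up 4-state CTMC:

```python
import numpy as np

# Toy 4-state irreducible CTMC; rates are illustrative only.
Q = np.array([
    [-3.0,  1.0,  1.0,  1.0],
    [ 2.0, -4.0,  1.0,  1.0],
    [ 1.0,  1.0, -3.0,  1.0],
    [ 1.0,  2.0,  1.0, -4.0],
])
keep = [0, 1]          # states observed by the complement
rest = [2, 3]

# Generator of the stochastic complement: Q_kk + Q_kr (-Q_rr)^{-1} Q_rk
Qc = (Q[np.ix_(keep, keep)]
      + Q[np.ix_(keep, rest)] @ np.linalg.inv(-Q[np.ix_(rest, rest)])
      @ Q[np.ix_(rest, keep)])

def stationary(G):
    """Solve pi G = 0, sum(pi) = 1 by least squares."""
    n = G.shape[0]
    A = np.vstack([G.T, np.ones(n)])
    b = np.zeros(n + 1); b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

pi = stationary(Q)                # stationary vector of the full chain
pi_c = stationary(Qc)             # stationary vector of the complement
cond = pi[keep] / pi[keep].sum()  # conditional distribution on `keep`
print(np.allclose(pi_c, cond))    # the two agree, as the theory predicts
```

The rows of `Qc` sum to zero by construction, so the complement is itself a valid CTMC generator.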

Step 3: Generate and solve the lower CTMC''. Same as in Algorithm 6.4.

Step 4: Compute the coupling factors. Same as in Algorithm 6.4.

Step 5: Generate the stationary distribution of the original process. Same as in Algorithm 6.4.
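The control flow of the recursion above can be sketched in a structure-only form. The toy representation below (nested dicts, hypothetical set names `L1`, `L2`, `U1`, `U2`) replaces all numerical work with bookkeeping; it only illustrates the order in which the steps are visited, not the actual computations:

```python
# Structure-only sketch of the recursion; all names are hypothetical and
# each numerical step is replaced by an entry appended to `trace`.

def solve_recursive(chain, trace):
    """`chain` is a dict {'lower': name, 'upper': nested dict or list of
    GI/M/1-type blocks} describing one level of the partition."""
    trace.append(('step1-partition', chain['lower']))   # Step 1
    upper = chain['upper']                              # Step 2
    if isinstance(upper, dict):                         # Step 2a: recurse
        solve_recursive(upper, trace)
    else:                                               # Step 2b: base case
        for block in upper:
            trace.append(('step2b-gim1', block))
    trace.append(('step3-lower', chain['lower']))       # Step 3
    trace.append(('steps4-5-couple', chain['lower']))   # Steps 4 and 5

# Two nested lower sets L1, L2 above two GI/M/1-type upper blocks U1, U2.
trace = []
solve_recursive({'lower': 'L1',
                 'upper': {'lower': 'L2', 'upper': ['U1', 'U2']}}, trace)
print(trace)
```

On this toy input the innermost upper blocks are solved first, and the lower sets are then solved in reverse order of their discovery, mirroring the example discussed below.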

Remarks: We assume that there are several upper sets. Because there is strictly no connection among their infinite parts (except, of course, through the boundary set), applying stochastic complementation to these sets in the CTMC that results from Step 2b is an easy problem. We observe that the single-entry condition must be satisfied for each of the stochastic complements to be solved efficiently, that each stochastic complement results in a process of the GI/M/1-type, and that more than one of the new processes may share the same gate state.

To illustrate the idea of multiple lower and upper sets, we give an example of applying Algorithm 6.9. The CTMC in Figure 6.3(a) has a gate state which we use to construct the pseudo-stochastic complement of the lower states. This is Step 1 of Algorithm 6.9. We proceed with Step 2 and construct the stochastic complement of the rest of the CTMC, which by itself is an independent CTMC, as shown in Figure 6.3(b). We continue with Step 2a and identify a new lower subchain in the CTMC of Figure 6.3(b); since the condition of Step 2a is satisfied, we recursively continue with Step 1. We identify the gate state for the second lower subchain and build the CTMC of its pseudo-stochastic complement. We go on with Step 2 and generate the stochastic complement, as shown in Figure 6.3(c). Now the condition of Step 2a is not satisfied, and we continue with Step 2b by identifying the upper sets of the CTMC in Figure 6.3(c). We find their gate states and build the stochastic complements for the corresponding states, as shown in Figure 6.3(d). After solving these two stochastic complements, we can apply the classic aggregation on the sets of the chain in Figure 6.3(c), as shown in Figure 6.3(d). This completes Step 2b, and we continue with Step 3, namely solving the lower subsets identified in the recursive Step 1. Since we have computed the conditional probabilities of the whole set of upper states, we can proceed with the solution of the pseudo-stochastic complements. Finally, we apply Step 4 and Step 5 of Algorithm 6.9 to complete the solution of the given CTMC.

We close this section with the following observation. Given the stationary probabilities, we can compute various performance measures of interest, such as the expected queue length or its higher moments. These measures are computed at Steps 2 and 3 of both Algorithm 6.4 and Algorithm 6.9, where the conditional stationary probabilities of the stochastic complements or pseudo-stochastic complements are obtained, and are then scaled according to the coupling factors defined in Step 4.
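Once the per-level stationary probabilities have been assembled and scaled, the moments follow by direct summation. A minimal sketch, with made-up level probabilities (`pi_level` is hypothetical, truncated to four levels for illustration):

```python
# Hypothetical aggregate stationary probabilities per level of the process
# (already scaled by the coupling factors of Step 4); values are made up.
pi_level = [0.5, 0.25, 0.125, 0.125]

# Expected queue length and its second moment from the level probabilities.
mean = sum(j * p for j, p in enumerate(pi_level))
second_moment = sum(j * j * p for j, p in enumerate(pi_level))
print(mean, second_moment)   # -> 0.875 1.875
```

For an infinite level structure the sums would of course be truncated or evaluated in closed form from the matrix-geometric representation.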

Alma Riska 2003-01-13