Principles of Operational Research

General Theory of Operational Research (070105M04001H)

1.1

Suppose the time spent waiting for a bus on the North Carolina State Campus can be represented by an exponential random variable with a mean of three minutes. Compute the probability of having to wait more than five minutes.

Solution:

Given that the interarrival times, denoted by \(X_n\), are i.i.d. exponential random variables with \(E(X_n)=3,\ n=1,2,...\), the bus arrivals form a Poisson process with rate \(\lambda=\frac{1}{3}\). ("the 4th definition of Poisson process", Remark 2.2, Ⅱ-20/41) Thus, the probability is \(P\{X_1>5\}=e^{-5\lambda}=e^{-\frac{5}{3}}\approx0.189\). ("\(X_n,\ n=1,2,...\) are i.i.d.", Proposition 2.1, Ⅱ-14/41)
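As a quick numeric sanity check (not part of the course solution), one can simulate exponential waiting times with mean 3 minutes and compare the empirical frequency of waits longer than 5 minutes against the closed form \(e^{-5/3}\):

```python
import math
import random

random.seed(0)

# Simulate exponential waiting times with mean 3 (rate 1/3) and estimate
# P{wait > 5}; compare with the closed form e^{-5/3}.
mean_wait = 3.0
n = 100_000
hits = sum(random.expovariate(1 / mean_wait) > 5 for _ in range(n))
estimate = hits / n
exact = math.exp(-5 / 3)  # ~0.1889
```

With this many samples the Monte Carlo estimate should agree with the exact value to about two decimal places.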

1.2

Two individuals, \(A\) and \(B\), both require kidney transplants. If she does not receive a new kidney, then \(A\) will die after an exponential time with rate \(\mu_A\), and \(B\) after an exponential time with rate \(\mu_B\). New kidneys arrive in accordance with a Poisson process having rate \(\lambda\). It has been decided that the first kidney will go to \(A\) (or to \(B\) if \(B\) is alive and \(A\) is not at that time) and the next one to \(B\) (if still living).

  1. What is the probability that \(A\) obtains a new kidney?
  2. What is the probability that neither \(A\) nor \(B\) obtains a new kidney?
  3. What is the probability that both \(A\) and \(B\) obtain new kidneys?

Solution:

Denote the times of death (in the absence of a transplant) of \(A\) and \(B\) by \(D_A,D_B\), respectively. Then we have \(D_A\sim Exp(\mu_A),\ D_B\sim Exp(\mu_B)\). Denote the arrival time of the \(n\)th kidney by \(S_n\), where \(S_0=0,\ S_n=\sum\limits_{i=1}^n X_i,\ n\ge1\), and the \(X_i\) are the i.i.d. \(Exp(\lambda)\) interarrival times. Then we have \(S_n\sim Erlang(n,\lambda)\). ("waiting time distribution", Ⅱ-17/41). All cases are shown in the table below.

| Ordering in time (\(0\xrightarrow{\qquad}t\)) | Life-or-death outcome |
| --- | --- |
| \(D_A\le D_B\le S_1\le S_2\) | both \(A\) and \(B\) die |
| \(D_A\le S_1\le D_B\le S_2\) | \(A\) dies, \(B\) lives |
| \(D_A\le S_1\le S_2\le D_B\) | \(A\) dies, \(B\) lives |
| \(D_B\le D_A\le S_1\le S_2\) | both \(A\) and \(B\) die |
| \(D_B\le S_1\le D_A\le S_2\) | \(A\) lives, \(B\) dies |
| \(D_B\le S_1\le S_2\le D_A\) | \(A\) lives, \(B\) dies |
| \(S_1\le D_A\le D_B\le S_2\) | \(A\) lives, \(B\) dies |
| \(S_1\le D_B\le D_A\le S_2\) | \(A\) lives, \(B\) dies |
| \(S_1\le D_A\le S_2\le D_B\) | both \(A\) and \(B\) live |
| \(S_1\le D_B\le S_2\le D_A\) | \(A\) lives, \(B\) dies |
| \(S_1\le S_2\le D_A\le D_B\) | both \(A\) and \(B\) live |
| \(S_1\le S_2\le D_B\le D_A\) | both \(A\) and \(B\) live |

Throughout these three questions, we use the continuous form of conditional expectation. ("conditional expectation", Ⅰ-43/58).

(a)

The probability that \(A\) obtains a new kidney is \[ \begin{aligned} P\{D_A>S_1\}&=\int^\infty_0 P\{D_A>S_1|S_1=s\}f_{S_1}(s)\mathrm{d}s\\ &=\int^\infty_0e^{-\mu_As}\cdot \lambda e^{-\lambda s}\mathrm{d}s\\ &=\frac{\lambda}{\mu_A+\lambda}. \end{aligned} \] ("birth and death processes", Ⅴ-10/48)

(b)

The probability that neither \(A\) nor \(B\) obtains a new kidney is \[ \begin{aligned} P\{D_A<S_1,\ D_B<S_1\}&=\int_{0}^{\infty} P\{D_A<S_1,D_B<S_1|S_1=s\}f_{S_1}(s)\mathrm{d}s\\ &=\int_{0}^{\infty}P\{D_A<s\}P\{D_B<s\}f_{S_1}(s)\mathrm{d}s\qquad (\text{since}\ D_A \perp \!\!\! \perp D_B) \\ &=\int_{0}^{\infty}(1-e^{-\mu_A s})(1-e^{-\mu_B s})\lambda e^{-\lambda s}\mathrm{d}s \\ &=1-\frac{\lambda}{\lambda+\mu_A}-\frac{\lambda}{\lambda+\mu_B}+\frac{\lambda}{\lambda+\mu_A+\mu_B}. \end{aligned} \]

(c)

The probability that both \(A\) and \(B\) obtain new kidneys is \[ \begin{aligned} P\{D_A>S_1,\ D_B>S_2\}&=\int_{0}^{\infty}P\{D_A>S_1,D_B>S_2|S_1=s\}f_{S_1}(s)\mathrm{d}s \\ &=\int_{0}^{\infty}P\{D_A>s\}P\{D_B-X_2>s\}f_{S_1}(s)\mathrm{d}s. \qquad \\ &(\text{since}\ D_A \perp \!\!\! \perp D_B,D_A \perp \!\!\! \perp X_2 \Longrightarrow D_A \perp \!\!\! \perp D_B-X_2) \end{aligned} \] (reference: https://math.stackexchange.com/a/1682608)

We need to calculate \(P\{D_B-X_2>s\}\). Since \(D_B \perp \!\!\! \perp X_2\), the joint density function of \(D_B\) and \(X_2\) is \(f_{D_B,X_2}(b,x)=\mu_B\lambda\, e^{-\mu_B b-\lambda x}\) in the first quadrant and \(0\) elsewhere. Letting \(S=D_B-X_2\), we have

\[ \begin{aligned} F_S(s)&=P\{S\le s\}=P\{D_B-X_2\le s\}\\ &=\int^\infty_0\biggl(\int^{x+s}_0 \mu_B\lambda\cdot e^{-\mu_B b-\lambda x}\mathrm{d}b\biggr)\mathrm{d}x\\ &=1-\frac{\lambda}{\lambda+\mu_B}e^{-\mu_B s},\qquad \text{for}\ s\ge0. \end{aligned} \]

Thus, we have \[ \begin{aligned} P\{D_A>S_1,\ D_B>S_2\}&=\int_{0}^{\infty}P\{D_A>s\}P\{D_B-X_2>s\}f_{S_1}(s)\mathrm{d}s \qquad \\ &=\int^\infty_0 e^{-\mu_A s}\cdot\frac{\lambda}{\lambda+\mu_B}e^{-\mu_B s}\cdot\lambda e^{-\lambda s}\mathrm{d}s\\ &= \frac{\lambda^2}{(\lambda+\mu_B)(\mu_A+\mu_B+\lambda)}. \end{aligned} \] (reference: https://math.stackexchange.com/a/115062)
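A Monte Carlo sketch can check all three closed forms at once. The rates below are hypothetical (the problem gives no numbers; any positive values would do), and the events are read straight from the problem's rules: \(A\) obtains a kidney iff \(D_A>S_1\), neither obtains one iff both die before \(S_1\), and both obtain one iff \(D_A>S_1\) and \(D_B>S_2\).

```python
import random

random.seed(1)

# Hypothetical rates, chosen only for this check.
lam, mu_A, mu_B = 1.0, 0.5, 0.8
n = 200_000
a_gets = neither = both = 0
for _ in range(n):
    d_a = random.expovariate(mu_A)     # death time of A without a kidney
    d_b = random.expovariate(mu_B)     # death time of B without a kidney
    s1 = random.expovariate(lam)       # arrival of the first kidney
    s2 = s1 + random.expovariate(lam)  # arrival of the second kidney
    a_gets += d_a > s1
    neither += (d_a < s1) and (d_b < s1)
    both += (d_a > s1) and (d_b > s2)

# Closed forms derived from the integrals above.
p_a = lam / (mu_A + lam)
p_neither = 1 - lam/(lam+mu_A) - lam/(lam+mu_B) + lam/(lam+mu_A+mu_B)
p_both = lam**2 / ((lam + mu_B) * (lam + mu_A + mu_B))
```

With 200,000 trials the empirical frequencies should match the closed forms to within about one percentage point.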

1.3

Let \(X\) be an exponentially distributed random variable with parameter \(\lambda = 1\). Compare the upper bound on the probability \(P\{X > 4\}\) obtained from the Chebychev inequality and the exact value of this probability.

Solution:

Given \(X\sim Exp(1)\), we have \(E[X]=1\) and \(Var(X)=1\). Since \(X\ge0\), the event \(\{X>4\}\) coincides with \(\{|X-1|>3\}\), so Chebyshev's inequality gives \(P\{X>4\}=P\{|X-1|>3\}\le \frac{1}{9}\approx0.111\), while the exact value is \(P\{X>4\}=e^{-4}\approx0.0183\).

("Chebychev's inequality: \(P\{|X-\mu|\ge k\}\le\frac{\sigma^2}{k^2}\)", Ⅰ-51/64)
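A two-line numeric comparison (a sketch, not part of the graded solution) makes the looseness of the bound concrete:

```python
import math

# Exp(1) has mean 1 and variance 1; Chebyshev with k = 3 bounds P{|X - 1| >= 3}.
chebyshev_bound = 1 / 9          # sigma^2 / k^2 with sigma = 1, k = 3
exact = math.exp(-4)             # P{X > 4} for Exp(1), ~0.0183
ratio = chebyshev_bound / exact  # how loose the Chebyshev bound is here
```

The bound is roughly six times larger than the true probability, which is typical: Chebyshev uses only the mean and variance, not the exponential tail.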

1.4

Suppose that the interarrival distribution for a renewal process is Poisson distributed with mean \(\mu\). That is, \[ P\{X_n=k\}=e^{-\mu}\frac{\mu^k}{k!},\ k=0,1,... \]

  1. Find the distribution of \(S_n\).
  2. Calculate \(P\{N(t) = n\}\).

Solution:

Let \(X\sim Poisson(\lambda_1)\), \(Y\sim Poisson(\lambda_2)\), and \(X\perp \!\!\! \perp Y\). Consider the distribution of \(Z=X+Y\). We have \[ \begin{aligned} P\{Z= z\}&=P\{X+Y= z\}\\ &=\sum\limits_{y=0}^z P\{X+Y= z|Y=y\}P\{Y=y\}\qquad \text{("conditional expectation", Ⅰ-43/58)}\\ &=\sum\limits_{y=0}^z P\{X= z-y\}P\{Y=y\}\qquad(\text{since}\ X\perp \!\!\! \perp Y)\\ &=\sum\limits_{y=0}^z \frac{e^{-\lambda_1} \lambda_1^{z-y}}{(z-y)!}\frac{e^{-\lambda_2}\lambda_2^{y}}{y!}\\ &=\bigg[\sum\limits_{y=0}^z \frac{z!}{(z-y)!y!}\lambda_1^{z-y}\lambda_2^{y}\bigg]\frac{e^{-(\lambda_1+\lambda_2)}}{z!}\\ &=(\lambda_1+\lambda_2)^z \frac{e^{-(\lambda_1+\lambda_2)}}{z!}\qquad(\text{using the binomial theorem in reverse})\\ &=e^{-\lambda}\frac{\lambda^z}{z!}.\qquad(\lambda=\lambda_1+\lambda_2) \end{aligned} \] Thus, we conclude that \(Z\sim Poisson(\lambda_1+\lambda_2)\).

(reference: https://llc.stat.purdue.edu/2014/41600/notes/prob1805.pdf)
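The convolution identity can be verified numerically; the rates \(1.3\) and \(2.1\) below are arbitrary illustrative values:

```python
import math

def poisson_pmf(k, lam):
    # P{Poisson(lam) = k}
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Convolve Poisson(lam1) with Poisson(lam2) and compare the result with
# Poisson(lam1 + lam2) over the first few support points.
lam1, lam2 = 1.3, 2.1
max_err = max(
    abs(sum(poisson_pmf(z - y, lam1) * poisson_pmf(y, lam2) for y in range(z + 1))
        - poisson_pmf(z, lam1 + lam2))
    for z in range(15)
)
```

The maximum discrepancy is at the level of floating-point rounding, as the identity predicts.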

(a)

Given \(\{X_n,\ n=1,2,...\}\), a sequence of nonnegative independent random variables with a common distribution \(X_n\sim Poisson(\mu),\ n=1,2,...\) ("renewal process", Ⅲ-4/40), and \(S_0=0,\ S_n=\sum\limits_{i=1}^n X_i,\ n\ge1\), the closure of the Poisson family under independent sums shown above gives \(S_n\sim Poisson(n\mu)\); that is, \(P\{S_n=k\}=e^{-n\mu}\frac{(n\mu)^k}{k!},\ k=0,1,...\)

(b)

We have \[ \begin{aligned} P\{N(t)=n\}&=P\{N(t)\ge n\}-P\{N(t)\ge n+1\}\\ &=P\{S_n\le t\}-P\{S_{n+1}\le t\}\\ &=\sum\limits_{k=0}^{\lfloor t\rfloor} \bigg(e^{-n\mu}\frac{(n\mu)^k}{k!}-e^{-(n+1)\mu}\frac{[(n+1)\mu]^k}{k!}\bigg)\qquad(S_n\ \text{is integer-valued})\\ &=e^{-n\mu}\sum\limits_{k=0}^{\lfloor t\rfloor}\frac{\mu^k}{k!}\big[n^k-e^{-\mu}(n+1)^k\big]. \end{aligned} \] Note that the bracket \(\big[n^k-e^{-\mu}(n+1)^k\big]\) depends on \(k\) and must stay inside the sum. ("distribution of \(N(t)\) ", Ⅲ-7/40)
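As a check (with illustrative values of \(\mu\) and \(t\), which the problem leaves symbolic), the difference of the two Poisson CDFs can be compared against the collected single sum \(e^{-n\mu}\sum_{k\le\lfloor t\rfloor}\frac{\mu^k}{k!}[n^k-e^{-\mu}(n+1)^k]\):

```python
import math

def poisson_cdf(x, lam):
    # P{Poisson(lam) <= floor(x)}
    return sum(math.exp(-lam) * lam ** k / math.factorial(k)
               for k in range(int(x) + 1))

# Illustrative values; any mu > 0 and t > 0 would do.
mu, t = 1.5, 6.0
direct, collected = [], []
for n in range(1, 6):
    # P{N(t) = n} = P{S_n <= t} - P{S_{n+1} <= t}, with S_n ~ Poisson(n mu)
    direct.append(poisson_cdf(t, n * mu) - poisson_cdf(t, (n + 1) * mu))
    # the same quantity with the k-dependent bracket kept inside the sum
    collected.append(math.exp(-n * mu) * sum(
        (mu ** k / math.factorial(k)) * (n ** k - math.exp(-mu) * (n + 1) ** k)
        for k in range(int(t) + 1)
    ))
```

The two evaluations agree to machine precision, since the terms are equal identically in \(k\).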

2.1

A Markov chain \(\{X_n, n \ge 0\}\) with states \(0, 1, 2\), has the transition probability matrix \[ \left[\begin{array}{lll} 1 / 2 & 1 / 3 & 1 / 6 \\ 0 & 1 / 3 & 2 / 3 \\ 1 / 2 & 0 & 1 / 2 \end{array}\right] \] (a) If \(P\{X_0=0\}=P\{X_0=1\}=\frac{1}{4}\), find \(E[X_2]\). (b) What are the limiting probabilities?

Solution:

(a)

Let \(\pi^{(0)}=[0.25,0.25,0.5]\), we have \[ \begin{aligned} \pi^{(0)} P^{(2)}&=[0.25,0.25,0.5] \left[\begin{array}{lll} 1 / 2 & 1 / 3 & 1 / 6 \\ 0 & 1 / 3 & 2 / 3 \\ 1 / 2 & 0 & 1 / 2 \end{array}\right] \left[\begin{array}{lll} 1 / 2 & 1 / 3 & 1 / 6 \\ 0 & 1 / 3 & 2 / 3 \\ 1 / 2 & 0 & 1 / 2 \end{array}\right]\\ &= [3/8,1/6,11/24] \left[\begin{array}{lll} 1 / 2 & 1 / 3 & 1 / 6 \\ 0 & 1 / 3 & 2 / 3 \\ 1 / 2 & 0 & 1 / 2 \end{array}\right]\\ &=[5/12, 13/72, 29/72] \end{aligned} \] Thus \(P\{X_2=0\}=5/12,\ P\{X_2=1\}=13/72,\ P\{X_2=2\}=29/72\). We have \[ \begin{aligned} E[X_2]&=0\cdot P\{X_2=0\}+1\cdot P\{X_2=1\}+2\cdot P\{X_2=2\}\\ &=13/72+2\cdot 29/72\\ &=71/72\approx0.9861. \end{aligned} \]

(b)

Let \(\pi=[\pi_0,\pi_1,\pi_2]\). We have from ("Long-Run Proportions", Theorem 4.2, Ⅳ-44/60) \[ \left\{\begin{aligned} & [\pi_0,\pi_1,\pi_2] =[\pi_0,\pi_1,\pi_2]\left[\begin{array}{lll} 1 / 2 & 1 / 3 & 1 / 6 \\ 0 & 1 / 3 & 2 / 3 \\ 1 / 2 & 0 & 1 / 2 \end{array}\right]\\ & \pi_0+\pi_1+\pi_2=1 \end{aligned}\right. \] Solving yields \(\pi=[0.4,0.2,0.4]\).
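Both parts can be checked with exact rational arithmetic; this is a verification sketch, not part of the graded solution:

```python
from fractions import Fraction as F

# Transition matrix of the chain, in exact fractions.
P = [[F(1, 2), F(1, 3), F(1, 6)],
     [F(0),    F(1, 3), F(2, 3)],
     [F(1, 2), F(0),    F(1, 2)]]

def step(v):
    # one step of the distribution: v -> v P
    return [sum(v[i] * P[i][j] for i in range(3)) for j in range(3)]

# Part (a): distribution of X_2 from the initial distribution [1/4, 1/4, 1/2].
pi2 = step(step([F(1, 4), F(1, 4), F(1, 2)]))
e_x2 = sum(j * pi2[j] for j in range(3))

# Part (b): the claimed limiting probabilities [0.4, 0.2, 0.4].
pi = [F(2, 5), F(1, 5), F(2, 5)]
is_stationary = step(pi) == pi and sum(pi) == 1
```

Exact fractions avoid any rounding doubt: `pi2` comes out as \([5/12, 13/72, 29/72]\), `e_x2` as \(71/72\), and \(\pi P=\pi\) holds exactly.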

2.2

  1. Write down the forward equations for the pure birth process and prove that \[ \begin{aligned} &P_{i i}(t)=e^{-\lambda_{i} t}, \quad i \geqslant 0, \\ &P_{i j}(t)=\lambda_{j-1} e^{-\lambda_{j} t}\int_{0}^{t} e^{\lambda_{j} s} P_{i, j-1}(s) \mathrm{d} s, \quad j \geqslant i+1. \end{aligned} \]
  2. Write down the forward equations for the birth and death process.

Solution:

(a)

The forward equations for the pure birth process are \[ \begin{aligned} &P_{i i}^{\prime}(t)=- P_{i i}(t)\lambda_{i},\qquad i\ge0,\\ &P_{i j}^{\prime}(t)=P_{i, j-1}(t)\lambda_{j-1}- P_{i j}(t)\lambda_{j},\qquad j\ge i+1. \end{aligned} \] ("Kolmogorov’s Forward Equations", Theorem 3.2, Ⅴ-23/48)

We have \[ \begin{aligned} P_{i i}^{\prime}(t)+ P_{i i}(t)\lambda_{i}&=0\\ P_{i i}^{\prime}(t)e^{\lambda_i t}+ P_{i i}(t)\lambda_{i}e^{\lambda_i t}&=0\\ \frac{d(P_{ii}(t)e^{\lambda_i t})}{dt}&=0\\ P_{ii}(t)&=Ce^{-\lambda_i t} \end{aligned} \] We know \(P_{ii}(0)=1\). Thus we obtain \(P_{ii}(t)=e^{-\lambda_i t}\).

Similarly, for \(j\ge i+1\) we have \[ \begin{aligned} P_{i j}^{\prime}(t)+P_{i j}(t)\lambda_{j}&=P_{i, j-1}(t)\lambda_{j-1}\\ P_{i j}^{\prime}(t)e^{\lambda_j t}+ P_{i j}(t)\lambda_{j}e^{\lambda_j t}&=P_{i, j-1}(t)\lambda_{j-1}e^{\lambda_j t}\\ \frac{\mathrm{d}(P_{i j}(t)e^{\lambda_j t})}{\mathrm{d}t}&=P_{i, j-1}(t)\lambda_{j-1}e^{\lambda_j t}\\ P_{ij}(t)&=\lambda_{j-1}e^{-\lambda_j t}\int_{0}^{t} e^{\lambda_{j} s} P_{i, j-1}(s) \mathrm{d} s\qquad(\text{using}\ P_{ij}(0)=0) \end{aligned} \] The proof is complete.

(Ross S M. Introduction to probability models[M]. Academic press, 2014. Proposition 6.4)
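A numeric sanity check of the integral recursion: in the special case \(\lambda_i\equiv\lambda\) the pure birth process is a Poisson process, so \(P_{ij}(t)\) should equal the Poisson pmf \(e^{-\lambda t}(\lambda t)^{j-i}/(j-i)!\). The sketch below evaluates the recursion by midpoint quadrature under that assumption:

```python
import math

# Constant-rate special case: lambda_i = lam for all i (a Poisson process).
lam, t, steps = 2.0, 1.0, 20_000
ds = t / steps
grid = [(k + 0.5) * ds for k in range(steps)]  # midpoint quadrature nodes

def p(m, s):
    # Poisson transition probability for j - i = m at time s
    return math.exp(-lam * s) * (lam * s) ** m / math.factorial(m)

recursion_vals, poisson_vals = [], []
for m in (1, 2, 3):  # m = j - i
    # P_{ij}(t) = lam * e^{-lam t} * integral_0^t e^{lam s} P_{i,j-1}(s) ds
    integral = sum(math.exp(lam * s) * p(m - 1, s) for s in grid) * ds
    recursion_vals.append(lam * math.exp(-lam * t) * integral)
    poisson_vals.append(p(m, t))
```

The quadrature evaluation of the recursion matches the Poisson pmf to within the discretization error, as expected.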

(b)

The forward equations for the birth and death process are \[ \begin{aligned} &P_{i 0}^{\prime}(t)=P_{i1}(t)\mu_{1}- P_{i 0}(t)\lambda_{0},\\ &P_{i j}^{\prime}(t)=P_{i, j-1}(t)\lambda_{j-1}+P_{i,j+1}(t)\mu_{j+1}- P_{i j}(t)(\lambda_{j}+\mu_j),\qquad j\ge1. \end{aligned} \]

(Ross S M. Introduction to probability models[M]. Academic press, 2014. Example 6.12)

2.3

Consider a continuous-time Markov chain with infinitesimal generator matrix \[ \boldsymbol{Q}=\left[\begin{array}{cccc} -4 & 4 & 0 & 0 \\ 3 & -6 & 3 & 0 \\ 0 & 2 & -4 & q_{1} \\ 0 & 0 & 1 & q_{2} \end{array}\right] \] (a) What are \(q_1\) and \(q_2\)? (b) Compute the limiting probabilities.

Solution:

(a)

Each row of an infinitesimal generator sums to zero, so \(q_1=2\) and \(q_2=-1.\)

(b)

Let \(\alpha=[\alpha_0,\alpha_1,\alpha_2,\alpha_3]\). We have from ("Limiting Probabilities ", Ⅴ-31/48) \[ \left\{\begin{aligned} & [\alpha_0,\alpha_1,\alpha_2,\alpha_3]\left[\begin{array}{cccc} -4 & 4 & 0 & 0 \\ 3 & -6 & 3 & 0 \\ 0 & 2 & -4 & 2 \\ 0 & 0 & 1 & -1 \end{array}\right]=0\\ & \alpha_0+\alpha_1+\alpha_2+\alpha_3=1 \end{aligned}\right. \] Solving yields \(\alpha=[0.12,0.16,0.24,0.48]\).
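The balance equations \(\alpha Q=0\) can be verified with exact rational arithmetic (a check sketch, using the fraction forms of \(0.12, 0.16, 0.24, 0.48\)):

```python
from fractions import Fraction as F

# Generator with q_1 = 2 and q_2 = -1 filled in (rows sum to zero).
Q = [[-4,  4,  0,  0],
     [ 3, -6,  3,  0],
     [ 0,  2, -4,  2],
     [ 0,  0,  1, -1]]
alpha = [F(3, 25), F(4, 25), F(6, 25), F(12, 25)]  # 0.12, 0.16, 0.24, 0.48

# Global balance: each component of alpha Q must vanish.
balance = [sum(alpha[i] * Q[i][j] for i in range(4)) for j in range(4)]
```

Every component of `balance` is exactly zero and \(\alpha\) sums to one, confirming the limiting probabilities.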

2.4

Customers arrive at a service center according to a Poisson process with a rate of one every \(15\) minutes. The service time is exponentially distributed and the average time is \(10\) minutes. Answer the following questions:

  1. What is the probability that an arriving customer will have to wait?
  2. What is the probability that there are more than four customers in the system?

Solution:

This is an M/M/1 queueing system with \(\lambda=\frac{1}{15}\) and \(\mu=\frac{1}{10}\) (both per minute), so \(\frac{\lambda}{\mu}=\frac{2}{3}<1\) and the queue is stable.

(a) An arriving customer has to wait exactly when the server is busy. Since \(\lambda<\mu\), we have \(P\{N=0\}=1-\frac{\lambda}{\mu}\), so \[ P\{N \geqslant 1\}=1-P\{N =0\}=\frac{\lambda}{\mu}=\frac{2}{3}. \] (b) "More than four customers" means \(N\ge5\). For the stable M/M/1 queue, \(P\{N\ge n\}=\big(\frac{\lambda}{\mu}\big)^n\), so \[ P\{N >4\}=P\{N \geqslant5\}=\Big(\frac{\lambda}{\mu}\Big)^5=\frac{32}{243}\approx0.1317. \]

("M/M/1 Queue", Ⅵ-18/53)
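A short numeric sketch of the two answers, reading "more than four" strictly as \(N>4\) and using the standard M/M/1 tail formula \(P\{N\ge n\}=\rho^n\):

```python
# M/M/1 with arrival rate 1/15 and service rate 1/10 (per minute).
lam, mu = 1 / 15, 1 / 10
rho = lam / mu               # traffic intensity, 2/3 < 1 so the queue is stable

p_wait = rho                 # P{N >= 1}: an arrival waits iff the server is busy
p_more_than_4 = rho ** 5     # P{N > 4} = P{N >= 5} = rho^5
```

The geometric tail makes such probabilities one-liners: each additional customer in the system multiplies the tail probability by \(\rho\).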