Lecture 3: Palm Probabilities and Rate Conservation Laws

Ravi R. Mazumdar, Professor and University Research Chair
Dept. of Electrical and Computer Engineering, University of Waterloo, Waterloo, ON, Canada
Palm Probabilities and Stationary Queueing Systems

The idea of Palm probabilities is to condition on a point in time at which an event takes place. Let $\{T_n\}$ denote a sequence of r.v.'s such that $\cdots < T_{-1} < T_0 \le 0 < T_1 < \cdots$. The r.v.'s correspond to a sequence of time points. Assume that the sequence is stationary, i.e., the differences $T_{i+1} - T_i$ are identically distributed. Define:

$$N_t = N(0,t) = \sum_k \mathbf{1}_{[0,t)}(T_k)$$

Then $N_t$ is said to be a simple point process (it counts how many points lie in $[0,t)$). Now suppose $\{X_t\}$ is a stochastic process defined on a probability space $(\Omega, \mathcal{F}_t, P)$ on which $\{N_t\}$ is also defined.
A Palm probability tries to make sense of the following: $P(X_t \in A \mid N(\{t\}) = 1)$, i.e., the probability that $X_t \in A$ given that a point occurs at $t$. Note that the event $\{N(\{t\}) = 1\}$ occurs on a set of measure 0, and thus making sense of such a conditional probability needs some care. Let us see some examples.
Suppose $\{N_t\}_{t\ge 0}$ is a Poisson process with intensity $\lambda$, i.e., $N_t - \lambda t$ is an $\mathcal{F}_t$-martingale. Consider $X_t$ to be $\mathcal{F}_t$-adapted. Then

$$E[X_t \mid N(\{t\}) = 1] = \lim_{\delta \to 0} \frac{E\big[X_t\, \mathbf{1}_{[N[t,t+\delta)=1]}\big]}{E\big[\mathbf{1}_{[N[t,t+\delta)=1]}\big]}$$

By the definition of the stochastic intensity, $E[\mathbf{1}_{[N[t,t+\delta)=1]} \mid \mathcal{F}_t] = \lambda\delta + o(\delta)$. And hence we see that

$$E[X_t \mid N(\{t\}) = 1] = E[X_t]$$

We can make this argument completely rigorous. The key is that conditioning w.r.t. points of a Poisson process does not affect the probabilities. This is an apparition of the so-called PASTA property. We will see this in more detail later. In general, conditioning does affect the expectation.
Let us see another example. Once again let $\{N_t\}$ be a Poisson process with intensity $\lambda$. Let $\{X_t\}$ be a stochastic process adapted to $\mathcal{F}_t$. Then

$$E\Big[\int_0^t X_s\, dN_s\Big] = E\Big[\sum_n X_{T_n}\mathbf{1}_{[T_n \le t]}\Big] = \lambda\int_0^t E[X_s]\, ds$$

This is sometimes called Campbell's formula. In this case it just follows from the martingale property. However, when $N_t$ is a stationary point process we can still obtain a similar formula if we replace the expectation on the r.h.s. by the expectation w.r.t. the Palm probability, and $\lambda$ by $E[N[0,1)]$.
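Campbell's formula is easy to check numerically. The sketch below (my own illustrative choices, not from the lecture: $\lambda = 2$, horizon $t = 10$, and the deterministic integrand $X_s = s$) compares a Monte Carlo estimate of $E[\int_0^t X_s\, dN_s] = E[\sum_n T_n \mathbf{1}_{[T_n \le t]}]$ with $\lambda\int_0^t s\, ds = \lambda t^2/2$:

```python
import random

random.seed(5)
# Monte Carlo check of Campbell's formula for a Poisson process:
# E[sum_{T_n <= t} X_{T_n}] = lam * int_0^t E[X_s] ds.
# We take the deterministic integrand X_s = s, so the r.h.s. equals
# lam * t^2 / 2 = 100 for lam = 2, t = 10 (illustrative values).
lam, t, reps = 2.0, 10.0, 20_000
total = 0.0
for _ in range(reps):
    s = random.expovariate(lam)       # first point of the Poisson process
    while s < t:
        total += s                    # add X_{T_n} = T_n for each point
        s += random.expovariate(lam)  # next inter-point gap
mean_sum = total / reps               # estimate of E[int_0^t X_s dN_s]
print(mean_sum)                       # should be close to lam * t**2 / 2
```

The agreement holds for any predictable integrand; a deterministic one is used here only to make the right-hand side computable in closed form.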
Let us now see a more general situation in the discrete-time case (when there is no problem in defining the conditional probability). Let $\{\xi_n\}$ be a stationary sequence of $\{0,1\}$-valued r.v.'s with $P(\xi_n = 1) = \lambda$. Let $\{T_n\}$ be the set of times when $\xi_n = 1$, and we adopt the convention $\cdots < T_{-1} < T_0 \le 0 < T_1 < \cdots$. Let $N(\{n\}) = \xi_n$. We can now define for any $A \in \mathcal{F}$:

$$P^0(X_n \in A) = P(X_n \in A \mid N(\{n\}) = 1) = \frac{P(X_n \in A,\, N(\{n\}) = 1)}{P(N(\{n\}) = 1)} = \frac{1}{\lambda} P(X_n \in A,\, N(\{n\}) = 1)$$

The probability on the r.h.s. is well defined since $N_k, X_k$ are jointly defined. Note that by convention we take $P^0(T_0 = 0) = 1$.
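To see concretely that the Palm and stationary distributions differ in general, here is a small sketch (my own illustrative construction, not from the lecture): a stationary two-state Markov chain $X_n$ in which events $\xi_n$ can only occur when $X_n = 1$, so that $P^0(X_0 = 1) = 1$ while $P(X_0 = 1) = 0.25$:

```python
import random

random.seed(0)
# Two-state Markov chain X_n in {0,1} with flip probabilities
# 0->1 w.p. 0.1 and 1->0 w.p. 0.3, so pi(1) = 0.1/(0.1+0.3) = 0.25.
# Events xi_n occur only when X_n = 1 (w.p. 0.5), hence under the
# Palm probability P^0 we have X_0 = 1 with probability one.
n = 200_000
x, xs, xis = 0, [], []
for _ in range(n):
    flip_p = 0.1 if x == 0 else 0.3
    x = (1 - x) if random.random() < flip_p else x
    xs.append(x)
    xis.append(1 if (x == 1 and random.random() < 0.5) else 0)

lam = sum(xis) / n                       # P(xi_0 = 1), about 0.125
time_avg = sum(xs) / n                   # E[X_0], about 0.25
# Palm expectation E^0[X_0] = E[X_0 1{xi_0=1}] / lam
palm_avg = sum(x for x, xi in zip(xs, xis) if xi == 1) / sum(xis)
print(time_avg, palm_avg)
```

Here `palm_avg` equals 1 by construction while `time_avg` is near 0.25: conditioning on an event point changes the distribution, unlike in the Poisson example above.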
$$E\Big[\sum_n X_{T_n}\mathbf{1}_{[0\le T_n\le k]}\Big] = \sum_{n=0}^{k} E\big[X_n\mathbf{1}_{[N(\{n\})=1]}\big] = \lambda\sum_{n=0}^{k} E^0[X_n]$$

by the definition of Palm probability above and stationarity. This is exactly the analog of the previous result.
In the continuous-time case we can do the following. Let $\lambda = E[N[0,1)]$. Now clearly for any r.v. $X$ we can define a measure $\nu(\cdot)$ for Borel sets $A$ as follows:

$$\nu(A) = \frac{1}{\lambda} E\Big[\sum_{T_n} X\, \mathbf{1}_{[T_n \in A]}\Big]$$

This is absolutely continuous w.r.t. Lebesgue measure and hence by the Radon-Nikodym theorem we can define a density, say $p^0(t)$, with $\nu(A) = \int_A p^0(s)\, ds$, where $p^0(t) = E[X \mid N(\{t\}) = 1]$. Hence in particular:

$$E[X N(A)] = \lambda \int_A E^t[X]\, dt$$

This is a special case of the Campbell-Mecke formula. In lecture 3 we will see these concepts more rigorously.
Inversion Formula and the Waiting Time Paradox

In general, how do we relate the Palm probability and the reference probability? This is given by the inversion formula:

$$E[X_k] = E[X_0] = \lambda E^0\Big[\sum_{k=0}^{T_1-1} X_k\Big]$$

Indeed,

$$\lambda E^0\Big[\sum_{k=0}^{T_1-1} X_k\Big] = E\Big[\xi_0 \sum_{k=0}^{T_1-1} X_k\Big] = \sum_{k\ge 0} E\big[X_0\, \mathbf{1}_{[\xi_{-k}=1;\; \xi_m=0,\; -k+1\le m\le 0]}\big] = E\Big[X_0 \sum_{k\ge 0} \mathbf{1}_{[\xi_{-k}=1;\; \xi_m=0,\; -k+1\le m\le 0]}\Big]$$

where we have used stationarity. Now $\sum_{k\ge 0} \mathbf{1}_{[\xi_{-k}=1;\; \xi_m=0,\; -k+1\le m\le 0]} = 1$ a.s., since by definition $T_0 > -\infty$, and this just corresponds to stating that there exists a point before 0 at a finite distance. Hence the result follows. A simple consequence of this result is $E^0[T_1 - T_0] = \frac{1}{\lambda}$, obtained by taking $X_k = 1$.
In continuous time the corresponding result is:

$$E[X_t] = \lambda E^0\Big[\int_0^{T_1} X_s\, ds\Big]$$

where $\lambda = E[N[0,1)]$. Let us now see a consequence of the inversion formula: the famous inspection paradox.
Let $N_t$ be a point process. Define $A(t) = T_{N_t+1} - t$. Then $A(t)$ is the forward recurrence time: the time to the next point given we arrive at $t$. Similarly, define $B(t) = t - T_{N_t}$, the backward recurrence time. Then $A(t) + B(t) = T_{N_t+1} - T_{N_t}$ is the inter-point time interval. By stationarity $E^0[T_{N_t+1} - T_{N_t}] = E^0[T_1 - T_0] = E^0[T_1]$, since under $P^0$ we have $T_0 = 0$. Taking $X_t = T_1 - t$ and $X_t = t - T_0$ and using the inversion formula we have:

$$E[A(t) + B(t)] = E[T_{N_t+1} - T_{N_t}] = \lambda E^0[T_1^2]$$

Noting $\lambda = (E^0[T_1])^{-1}$ and the fact that $E[X^2] \ge (E[X])^2$, we see that $E[T_{N_t+1} - T_{N_t}] \ge E^0[T_1 - T_0]$. The exact difference is $\frac{\mathrm{var}^0(T_1)}{E^0[T_1]}$. What this says is that observing an interval between two points biases us: if we arrive at an arbitrary time between two points, then we are more likely to arrive in a long interval.
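The paradox is easy to see numerically. For a renewal process with Exp(1) inter-point times, $E^0[T_1] = 1$ and $\mathrm{var}^0(T_1) = 1$, so the interval covering a uniformly chosen inspection time should have mean $1 + 1/1 = 2$, twice the typical interval. A minimal sketch (illustrative parameters):

```python
import bisect
import random

random.seed(1)
# Renewal process with Exp(1) inter-point times: mean interval 1,
# but the interval covering a uniformly chosen inspection time
# should have mean E^0[T1] + var^0(T1)/E^0[T1] = 2.
horizon = 200_000.0
points, t = [0.0], 0.0
while t < horizon:
    t += random.expovariate(1.0)
    points.append(t)

covering = []
for _ in range(50_000):
    u = random.uniform(0.0, horizon)
    i = bisect.bisect_right(points, u)     # interval containing u
    covering.append(points[i] - points[i - 1])
mean_cov = sum(covering) / len(covering)
print(mean_cov)                            # close to 2, not 1
```

A uniformly placed observer lands in an interval with probability proportional to its length, which is exactly the length-biasing the inversion formula quantifies.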
Motivation

Let $\{X(t)\}, t \in \mathbb{R}$, be a real-valued stochastic process and let $N$ be a point process on $\mathbb{R}$. The time average of $\{X(t)\}$ up to time $t$ is

$$\frac{1}{t}\int_0^t X(s)\, ds \qquad (1.1)$$

while the event average is

$$\frac{1}{N(t)}\int_{(0,t]} X(s)\, N(ds) = \frac{1}{N(t)}\sum_n X(T_n)\mathbf{1}_{[T_n \le t]} \qquad (1.2)$$
When the processes are stationary and ergodic, (1.1) corresponds to the mean under the stationary measure, while the event average (1.2) converges to the mean under a measure termed the Palm probability. The natural questions are: how does one formally define the Palm probability, how does one compute it, and what role does it play in queues?
Equality $\pi^0(i) = \pi(i)$ holds for all $i \in E$ if and only if $q_i \equiv$ constant, which is equivalent to $\{T_n\}$ being a Poisson process.

What happens if $X(t)$ is not Markov and the point process is not Poisson? This will bring us to Palm probabilities.
Palm Probability

Let $(\Omega, \mathcal{F}, P)$ be a complete probability space which carries a measurable flow (shift) $\{\theta_t\}$. Let $P$ be stationary w.r.t. $\{\theta_t\}$, i.e., $P \circ \theta_t^{-1} = P$. Let $N$ be a stationary point process (w.r.t. the flow $\{\theta_t\}$) defined on $(\Omega, \mathcal{F}, P)$:

$$N(\theta_t\omega, C) = N(\omega, C + t)$$

where $C$ is a Borel set in $\mathbb{R}$.

Let $\lambda_N$ denote the average intensity of $N$, given by $\lambda_N = E[N(0,1]]$.
The Palm probability of an event $A \in \mathcal{F}$ is then defined as:

$$P^0_N(A) = \frac{1}{\lambda_N\, \ell(C)}\, E\Big[\int_C \mathbf{1}_A(\theta_s\omega)\, N(ds)\Big]$$

where $\ell(C)$ denotes the Lebesgue measure of $C$; the definition does not depend on the choice of $C$.
Properties of $P^0_N$

An immediate consequence of the definition is the so-called Matthes-Mecke formula:

$$\lambda_N E^0_N\Big[\int_{\mathbb{R}} v(s)\, ds\Big] = E\Big[\int_{\mathbb{R}} v(0)\circ\theta_s\, N(ds)\Big]$$

Associated with the point process is a unique increasing predictable process $\{A(t)\}$ such that

$$E\Big[\int_0^\infty X(s)\, N(ds)\Big] = E\Big[\int_0^\infty X(s)\, A(ds)\Big]$$

for all non-negative $\mathcal{F}_t$-predictable processes $\{X(t)\}, t \in \mathbb{R}$. The process $\{A(t)\}$ is called the $(P, \mathcal{F}_t)$-compensator of the point process $N$. Moreover $\Delta A(t) \le 1$ and

$$\{\omega \mid \lim_{t\to\infty} N(t,\omega) = \infty\} = \{\omega \mid \lim_{t\to\infty} A(t,\omega) = \infty\}$$

In fact, the process $\{A(t)\}$ is such that $M = N - A$ is a local martingale. If $A_t$ is absolutely continuous w.r.t. Lebesgue measure, its density $\{\lambda_t\}$, given by $A_t = \int_0^t \lambda_s\, ds$, is called the $\mathcal{F}_t$-(stochastic) intensity.
If for all $t$

$$\int_{(0,t]} |X(s)|\, A(ds) < \infty \quad \text{a.s.}$$

the process $\{M(t)\}$ defined by $M(t) = \int_{(0,t]} X(s)\, (N-A)(ds)$ is an $\mathcal{F}_t$-local martingale. It is also true that the local martingale $M = N - A$ is locally square integrable, and more generally if for all $t$

$$\int_{(0,t]} |X(s)|^2 (1 - \Delta A(s))\, A(ds) < \infty \quad \text{a.s.}$$

then $M(t)$ is a locally square integrable martingale. The condition above is automatically satisfied if the process $\{X(t)\}$ is bounded and $N$ is locally finite.
The quadratic variation process of $M$, denoted $\langle M\rangle$, is defined as the unique predictable, non-negative, increasing process that makes $M^2 - \langle M\rangle$ a local martingale. For the local martingale $M$ as defined earlier, one has the explicit characterization:

$$\langle M\rangle_t = \int_{(0,t]} |X(s)|^2 (1 - \Delta A(s))\, A(ds)$$
Martingale SLLN

A. If $M_t$ is a locally square integrable martingale with quadratic variation process $\langle M\rangle$ and if $\langle M\rangle_\infty(\omega) < \infty$, then $M_t(\omega)$ converges to a finite limit.

B. If $M_t$ is a locally square integrable martingale with quadratic variation process $\langle M\rangle$ and if $\langle M\rangle_\infty(\omega) = \infty$, then

$$\frac{M_t(\omega)}{\langle M\rangle_t(\omega)} \to 0$$
Papangelou's Theorem

One of the fundamental theorems that links the Palm probabilities with the stochastic intensity theory is the Papangelou theorem.

Theorem: $P^0_N \ll P$ on $\mathcal{F}_0$ iff $N$ admits an $\mathcal{F}_t$-intensity $\{\lambda(t)\}$. Moreover, in that case $\lambda(t, \omega) = \lambda(0, \theta_t\omega)$, where

$$\frac{dP^0_N}{dP}\Big|_{\mathcal{F}_0} = \frac{\lambda(0)}{\lambda_N}$$
If the number of jumps of $\{X_t\}$ in each compact interval is finite, then the process $\{N_t\}$ defined by

$$N_t = \sum_{s\le t} \mathbf{1}_{[X_s \ne X_{s-}]}$$

is a point process, and a cadlag process of bounded variation admits the decomposition

$$X_t = X_0 + \int_0^t \dot X_s\, ds + \int_0^t Y_s\, dN_s$$

where $Y_t = X_t - X_{t-}$ denotes the jump of $X$ at $t$.
Rate Conservation Law (RCL)

We now state the main result.

Theorem: Let $\{X_t\}$ be a cadlag process of bounded variation that is stationary w.r.t. $\theta_t$ on $(\Omega, \mathcal{F}, P)$. Then:

$$E[\dot X_0] + \lambda_N E^0_N[X_0 - X_{0^-}] = 0$$
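A quick numerical sanity check of the RCL (my own illustrative example): for the workload $W_t$ of an M/M/1 queue, $\dot W_t = -\mathbf{1}_{[W_t>0]}$ between arrivals and the jump at an arrival is the service requirement $\sigma$, so the RCL gives $-P(W_0 > 0) + \lambda_N E^0_N[\sigma_0] = 0$, i.e., the busy fraction equals $\rho$. A minimal sketch with assumed parameters $\lambda = 0.5$, $\mu = 1$:

```python
import random

random.seed(2)
# Workload of an M/M/1 queue: W decreases at rate 1 while positive and
# jumps by an Exp(mu) service requirement at each Poisson(lam) arrival.
# RCL with X = W: -P(W > 0) + lam * E^0[jump] = 0, so the long-run
# busy fraction should equal rho = lam / mu.
lam, mu = 0.5, 1.0
rho = lam / mu
t, w, busy, horizon = 0.0, 0.0, 0.0, 500_000.0
while t < horizon:
    gap = random.expovariate(lam)          # time to next arrival
    busy += min(w, gap)                    # server busy until w hits 0
    w = max(w - gap, 0.0) + random.expovariate(mu)
    t += gap
busy_fraction = busy / t
print(busy_fraction)                       # close to rho = 0.5
```

The RCL thus recovers the classical identity $P(W > 0) = \rho$ without any Markovian reasoning; only stationarity is used.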
Applications

The first simple application is the level crossing formula due to Brill and Posner.

Theorem: Let $\{X_t\}$ be a stationary cadlag process that possesses a density $p(\cdot)$. Then:

$$p(x)\, E[\dot X_0 \mid X_0 = x] = \lambda_N E^0_N\big[\mathbf{1}_{[X_{0^-} > x]} - \mathbf{1}_{[X_0 > x]}\big]$$

Noting that

$$\mathbf{1}_{[X_{0^-} > x]} - \mathbf{1}_{[X_0 > x]} = \mathbf{1}_{[X_{0^-} > x]}\mathbf{1}_{[X_0 \le x]} - \mathbf{1}_{[X_{0^-} \le x]}\mathbf{1}_{[X_0 > x]}$$

we can re-write the result as:

$$p(x)\, E[\dot X_0 \mid X_0 = x] = \lambda_N E^0_N\big[\mathbf{1}_{[X_{0^-} > x]}\mathbf{1}_{[X_0 \le x]} - \mathbf{1}_{[X_{0^-} \le x]}\mathbf{1}_{[X_0 > x]}\big]$$
Proof: Let $T_+(t)$ be the first point of $N$ after $t$. Define

$$Y_t = \int_t^{T_+(t)} X_s\, ds$$

Then $\dot Y_t = -X_t$, and $Y_{0^-} = 0$ under $P^0_N$ since $T_+(0^-) = T_0 = 0$. Hence the RCL gives

$$E[X_0] = \lambda_N E^0_N[Y_0 - Y_{0^-}] = \lambda_N E^0_N\Big[\int_0^{T_1} X_s\, ds\Big]$$
The cycle representation: if $N$ and $\hat N$ are point processes jointly stationary w.r.t. $\theta_t$, then

$$\lambda_{\hat N} E^0_{\hat N}[f(0)] = \lambda_N E^0_N\Big[\int_0^{T_1} (f\circ\theta_t)\, d\hat N_t\Big]$$

Proof: We give a proof assuming a stochastic intensity $\hat\lambda_t$ for $\hat N$. Define

$$g(t) = \int_t^{T_+(t)} f_s \hat\lambda_s\, ds$$

Then it is easy to see that $\dot g_t = -f_t \hat\lambda_t$ and $g(0) = \int_0^{T_1} f_s \hat\lambda_s\, ds$ under $P^0_N$. Applying the RCL w.r.t. $N$ and using Papangelou's formula we have:

$$\lambda_{\hat N} E^0_{\hat N}[f(0)] = \lambda_N E^0_N\Big[\int_0^{T_1} f_s\, d\hat N_s\Big]$$

Taking $f = 1$ we see $\lambda_N E^0_N[\hat N[0,T_1]] = \lambda_{\hat N}$, and hence

$$E^0_{\hat N}[f(0)] = \frac{E^0_N\Big[\int_0^{T_1} (f\circ\theta_t)\, d\hat N_t\Big]}{E^0_N[\hat N[0,T_1]]}$$

which gives the cycle representation (the Palm distribution can be obtained as an average over a selection of points of the original point process). Actually, the points of $\hat N$ do not have to be a subset of those of $N$; the two processes need only be jointly defined and compatible w.r.t. $\theta_t$ on the same space.
Little's Formula

Consider a queueing system in which arrivals take place as a stationary point process and each arrival at time $T_n$ brings an amount of work $\sigma_n$, where $\{\sigma_n\}$ is a stationary sequence. Assume the arrivals are serviced in the order they arrive (FIFO). Let $W_t$ denote the workload in the queue at time $t$:

$$W_t = W_0 + \sum_{n=0}^{N_t} \sigma_n - \int_0^t \mathbf{1}_{[W_s > 0]}\, ds$$

When there exists a stationary distribution (i.e., when $\rho = \lambda_N E^0_N[\sigma_0] < 1$) we obtain the so-called Little's formula, given by:

$$E[Q] = \lambda_N E^0_N[W_0]$$
The proof follows by applying the RCL to the total sojourn time process $\{V_t\}$ in the queue.
Pollaczek-Khinchine Formula

Consider the function $f(W_t) = W_t^2$ and take $\sigma_n$ to be i.i.d. Then applying the RCL to this function we obtain:

$$E[W_0] = \frac{\lambda_N E^0_N[\sigma_0^2]}{2(1-\rho)}$$

where $\rho = \lambda_N E^0_N[\sigma_0] < 1$.
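The formula can be checked numerically with the Lindley recursion for FIFO waiting times. The sketch below uses assumed illustrative parameters ($\lambda = 0.5$, Exp($\mu$) service with $\mu = 1$, so $E^0_N[\sigma_0^2] = 2/\mu^2$) and compares the simulated mean waiting time with the formula:

```python
import random

random.seed(3)
# M/G/1 FIFO waiting times via the Lindley recursion
#   W_{n+1} = max(W_n + sigma_n - tau_n, 0),
# where tau_n are Exp(lam) inter-arrival times. Pollaczek-Khinchine
# predicts E[W] = lam * E[sigma^2] / (2 * (1 - rho)).
lam, mu = 0.5, 1.0                    # Poisson arrivals, Exp(mu) service
rho = lam / mu
pk = lam * (2.0 / mu**2) / (2.0 * (1.0 - rho))   # E[sigma^2] = 2/mu^2
w, total, n = 0.0, 0.0, 500_000
for _ in range(n):
    total += w
    w = max(w + random.expovariate(mu) - random.expovariate(lam), 0.0)
mean_w = total / n
print(mean_w, pk)                     # both close to 1.0 here
```

With these parameters the predicted mean wait is $0.5 \cdot 2 / (2 \cdot 0.5) = 1$, and the simulated average agrees.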
PASTA

PASTA says that if $N$ is Poisson then $E[X_0] = E^0_N[X_0]$. This readily follows from Papangelou's theorem, because if $N$ is Poisson then $\lambda_t(\omega) = \lambda = \lambda_N$. On the other hand, via the martingale SLLN, for a PASTA-type result to hold we do not even require stationarity. We state the result below:

Theorem: If $N$ is a simple point process with $\mathcal{F}_t$-intensity $\lambda_t$ and $\{X_t\}$ is an $\mathcal{F}_t$-predictable process, then on the set

$$\Gamma = \{\omega \mid \lim_{t\to\infty} A(t,\omega) = \infty\}$$

one has the pointwise result

$$\lim_{t\to\infty}\Big[\frac{1}{N(t)}\int_0^t X(s)\, N(ds) - \frac{1}{A(t)}\int_0^t X(s)\, A(ds)\Big] = 0$$
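A small simulation illustrates PASTA on an M/M/1 queue (my own illustrative setup): the time-average queue length and the average queue length seen just before Poisson arrival epochs (the left limit, which is predictable) should both converge to $\rho/(1-\rho)$:

```python
import random

random.seed(4)
# PASTA check on an M/M/1 queue with lam = 0.5, mu = 1: the
# time-average queue length and the average seen just before
# Poisson arrivals should both be near rho/(1-rho) = 1.
lam, mu = 0.5, 1.0
t, q = 0.0, 0
time_int, seen, arrivals = 0.0, 0, 0
while t < 200_000.0:
    rate = lam + (mu if q > 0 else 0.0)    # total event rate
    dt = random.expovariate(rate)
    time_int += q * dt                     # accumulate time average
    t += dt
    if random.random() < lam / rate:       # next event is an arrival
        seen += q                          # state just BEFORE arrival
        arrivals += 1
        q += 1
    else:                                  # next event is a departure
        q -= 1
time_avg = time_int / t
event_avg = seen / arrivals
print(time_avg, event_avg)
```

Repeating the experiment with non-Poisson arrivals (e.g., deterministic inter-arrival times) breaks the agreement, which is exactly the point of the theorem.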
Fluid Queues

For $c > 0$, define $(Q(t), t \ge 0)$ as:

$$Q(t) = Q(0) + A(t) - ct + Z(t),$$

where $(Z(t), t \ge 0)$ is an increasing process, null at 0, which satisfies:
For all $t \ge 0$, $Q(0) + A(t) - ct + Z(t) \ge 0$;
The support of $Z(dt)$ is included in the set $\{s \ge 0,\ Q(s) = 0\}$.
Define $\rho_A = \frac{\lambda_A}{c}$. Then, under the condition $\rho_A < 1$, it can be shown that there exists a stationary regime for $Q$, i.e., there is a unique $\{\theta_t\}$-consistent solution defined on the same probability space $(\Omega, \mathcal{F}, P)$.
Assume that $A$ is a continuous, stationary, increasing random measure with $E[A(0,1]] = \lambda_A$ and $\rho_A = \lambda_A c^{-1} < 1$. Then:

1) For all continuous functions $f$,
$$c\, E\big[f(Q(0))\mathbf{1}_{\{Q(0)>0\}}\big] = \lambda_A E_A\big[f(Q(0))\mathbf{1}_{\{Q(0)>0\}}\big].$$

2) For all Borel sets $B$ of $\mathbb{R}$ which do not contain the origin $0$,
Little's Law for Fluid Queues

Under the hypotheses above:

$$E[Q(0)] = \lambda_A E_A[Q(0)]$$

Note that, unlike the classical Little's law, which relates the average number in system to the average waiting time, here we just have a kind of Mecke formula.
Now consider an input of the ON-OFF type given by the following description:

[Figure: unfinished workload $W_t$ of a buffer served at rate $c$, driven by an ON-OFF arrival-rate process; $A(0,t)$ denotes the workload arrived up to time $t$.]
Let $N$ be a stationary marked point process with points $\{T_n;\ n \in \mathbb{Z}\}$ and marks $\{(L_n, S_n, F_n),\ n \in \mathbb{Z}\}$ such that:
$T_0 \le 0 < T_1$;
The random marks $(L_n)$ are positive;
Each triplet $(T_n, L_n, S_n)$ satisfies $T_{n+1} - T_n = L_n + S_n$;
The marks $F_n$ are continuous increasing processes, null at 0, constant on $]L_n, +\infty[$, and such that $F_n(t) \ge ct$ on $\{0 \le t \le L_n\}$ (burstiness assumption).
The cumulative input is defined by

$$A(B) = \sum_n \int \mathbf{1}_B(T_n + t)\, F_n(dt)$$

where $B$ is a Borel set in $\mathbb{R}$. Then $A(t) = A((0,t])$ is a continuous, stationary, increasing process that specifies the cumulative input up to time $t$.
We assume that, under $P^0_N$ (the Palm measure associated with $N$), the sequences $(F_n)$, $(L_n)$ and $(S_n)$ are i.i.d. and mutually independent, and in addition the r.v.'s $S_n$ are exponentially distributed. Define $m = E^0_N[T_1]$, $\nu = E^0_N[F_0(L_0)] = E^0_N[A(0,T_1]] = \lambda_A m$, and $q = P[T_0 + L_0 < 0] = \frac{E^0_N[S_0]}{m}$.
With respect to the filtration $(\mathcal{F}_t)$ generated by the process $A([u,t])_{t\ge 0},\ u < t$, the stochastic intensity of the point process $(N_t)$ is given by $\lambda_t = (qm)^{-1}\mathbf{1}_{\{\Lambda_t = 0\}}$, where $\Lambda_t \in \{0,1\}$ takes the value 1 if the source is ON at time $t$ and 0 otherwise.
E[Q(0)] =
where $m = E^0_N[T_1]$, $F_0(t)$ denotes the cumulative input on $[0,t]$ for the source when ON under $P^0_N$, $L_0$ is the length of an ON period of the source, and $\lambda_A = E[A(0,1)]$. Note the difference with the Pollaczek-Khinchine formula in the point process case.
References

Key papers

P. Brémaud, R. Kannurpatti and R. Mazumdar; Event and time averages: A review, Advances in Applied Probability, 24, (1992), pp. 377-411.

P.H. Brill and M.J.M. Posner; Level crossings in point processes applied to queues: Single server case, Operations Research 25, (1977), pp. 662-674.

M. Miyazawa; The derivation of invariance relations in complex queueing systems with stationary inputs, Advances in Applied Probability 15 (1983), pp. 875-885.

M. Miyazawa; Rate conservation laws: A survey, QUESTA, vol. 15, 1994, pp. 1-58.

R. Mazumdar and F. Guillemin; Forward equations for reflected diffusions with jumps, Applied Mathematics and Optimization, Vol. 33, No. 1, 1996, pp. 81-102.

R. Mazumdar, V. Badrinath, F. Guillemin and C. Rosenberg; Pathwise rate conservation for a class of semi-martingales, Stochastic Processes and their Applications, Aug. 1993, pp. 119-131.

T. Konstantopoulos and G. Last; On the dynamics and performance of stochastic fluid systems, J. Appl. Probab. 37 (2000), no. 3, pp. 652-667.