These 64-page class notes were uploaded by Ryleigh Heller on Monday, September 28, 2015. The notes belong to TCOM 501 at the University of Pennsylvania, taught by Staff in Fall. Since their upload, they have received 15 views. For similar materials see /class/215359/tcom501-university-of-pennsylvania in Telecommunication at the University of Pennsylvania.
TCOM 501: Networking Theory & Fundamentals
Lecture 3, January 29, 2003
Prof. Yannis A. Korilis

Topics:
- Markov chains
- Discrete-time Markov chains: calculating the stationary distribution
- Global balance equations
- Detailed balance equations
- Birth-death processes
- Generalized Markov chains
- Continuous-time Markov chains

Markov Chains
A Markov chain is a stochastic process that takes values in a countable set, e.g. {0, 1, ..., m} or {0, 1, 2, ...}. The elements represent the possible states, and the chain jumps from state to state. Memoryless (Markov) property: given the present state, future jumps of the chain are independent of the past history. Markov chains may be discrete- or continuous-time.

Discrete-Time Markov Chains
A discrete-time Markov chain is a discrete-time stochastic process {X_n : n = 0, 1, ...} taking values in {0, 1, 2, ...}, with the memoryless property

  P{X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0} = P{X_{n+1} = j | X_n = i} = P_ij.

The transition probabilities satisfy P_ij >= 0 and Σ_j P_ij = 1, and are collected in the transition probability matrix P = [P_ij].

Chapman-Kolmogorov Equations
The n-step transition probabilities are

  P^n_ij = P{X_{n+m} = j | X_m = i},  n, m >= 0, i, j >= 0.

The Chapman-Kolmogorov equations state

  P^{n+m}_ij = Σ_{k=0}^∞ P^n_ik P^m_kj,  n, m >= 0, i, j >= 0.

Thus P^n_ij is the (i, j) element of the matrix power P^n, which allows recursive computation of the state probabilities.

State Probabilities and the Stationary Distribution
The state probabilities are time dependent: let π^n_j = P{X_n = j} and π^n = (π^n_0, π^n_1, ...). By total probability,

  P{X_n = j} = Σ_i P{X_{n-1} = i} P{X_n = j | X_{n-1} = i}  =>  π^n_j = Σ_i π^{n-1}_i P_ij,

or in matrix form π^n = π^{n-1} P = π^{n-2} P^2 = ... = π^0 P^n. If the time-dependent distribution converges to a limit π = lim_{n→∞} π^n, then π = πP, and π is called the stationary distribution. Its existence depends on the structure of the Markov chain.

Classification of Markov Chains
States i and j communicate if there exist n, m >= 0 with P^n_ij > 0 and P^m_ji > 0. A Markov chain is irreducible if all states communicate. State i is periodic if there exists d > 1 such that P^n_ii > 0 only when n is a multiple of d; a Markov chain is aperiodic if none of its states is periodic.

Limit Theorems
Theorem 1: For an irreducible, aperiodic Markov chain and every state j, the limit

  π_j = lim_{n→∞} P{X_n = j | X_0 = i},  i = 0, 1, 2, ...

exists and is independent of the initial state i. Moreover, if N_j(k) denotes the number of visits to state j up to time k, then

  P{ lim_{k→∞} N_j(k)/k = π_j | X_0 = i } = 1,

i.e. π_j is the long-run frequency with which the process visits state j.
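The relations above, P^{n+m} = P^n P^m and π^n = π^0 P^n converging to the stationary π, can be checked numerically on a small chain. A minimal Python sketch; the two-state matrix here is an arbitrary illustration, not one from the notes:

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_pow(P, n):
    """n-th power of a square matrix P (n >= 1)."""
    R = P
    for _ in range(n - 1):
        R = mat_mul(R, P)
    return R

P = [[0.7, 0.3],
     [0.4, 0.6]]

# Chapman-Kolmogorov: P^(n+m) equals P^n times P^m (here n=3, m=5).
lhs = mat_pow(P, 8)
rhs = mat_mul(mat_pow(P, 3), mat_pow(P, 5))

# pi^n = pi^0 P^n converges to the stationary distribution.
pi_n = mat_mul([[1.0, 0.0]], mat_pow(P, 100))[0]
print(pi_n)   # ~ [0.5714, 0.4286], i.e. (4/7, 3/7) for this P
```

Solving π = πP by hand for this matrix gives π = (4/7, 3/7), which the iteration reproduces.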
Existence of the Stationary Distribution
Theorem 2: For an irreducible, aperiodic Markov chain there are two possibilities for the scalars

  π_j = lim_{n→∞} P{X_n = j | X_0 = i}:

1. π_j = 0 for all states j; then no stationary distribution exists.
2. π_j > 0 for all states j; then π = (π_0, π_1, ...) is the unique stationary distribution.

Remark: If the number of states is finite, case 2 is the only possibility.

Ergodic Markov Chains
In a Markov chain with a stationary distribution π_j > 0, j = 0, 1, 2, ..., the states are positive recurrent: the process returns to each state j infinitely often. A positive recurrent and aperiodic Markov chain is called ergodic. Ergodic chains have a unique stationary distribution

  π_j = lim_{n→∞} P^n_ij.

Ergodicity implies that time averages coincide with stochastic (ensemble) averages.

Calculation of the Stationary Distribution
A. Finite number of states: solve explicitly the system of equations

  π_j = Σ_{i=0}^m π_i P_ij, j = 0, ..., m,  with Σ_{i=0}^m π_i = 1,

or determine π numerically from P^n, which converges to a matrix with all rows equal to π. Suitable for a small number of states.
B. Infinite number of states: the previous methods cannot be applied to a problem of infinite dimension. Instead, guess a solution to the recurrence and verify it.

Example: Finite Markov Chain
An absent-minded professor uses two umbrellas when commuting between home and office. If it rains and an umbrella is available at her location, she takes it; if it does not rain, she always forgets to take an umbrella. Let p be the probability of rain each time she commutes. What is the probability that she gets wet on any given day?

Markov chain formulation: state i is the number of umbrellas available at her current location. The transition matrix is

      |  0    0    1 |
  P = |  0   1-p   p |
      | 1-p   p    0 |

Solving π = πP together with Σ_i π_i = 1:

  π_0 = (1-p) π_2,  π_1 = (1-p) π_1 + p π_2,  π_2 = π_0 + p π_1

gives

  π = ( (1-p)/(3-p), 1/(3-p), 1/(3-p) ).

She gets wet when no umbrella is available at her location and it rains:

  P{gets wet} = π_0 p = p(1-p)/(3-p).

Taking p = 0.1:

  π = (0.310, 0.345, 0.345),  P{gets wet} ≈ 0.031.

Numerically, with P = [0 0 1; 0 0.9 0.1; 0.9 0.1 0], lim_{n→∞} P^n converges to the matrix with all rows equal to (0.310, 0.345, 0.345). The effectiveness of this numerical approach depends on the structure of P.
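The umbrella example can be verified numerically by iterating π^{n+1} = π^n P, as a small Python sketch; the only inputs are p and the transition matrix from the example:

```python
p = 0.1
P = [[0.0, 0.0, 1.0],
     [0.0, 1 - p, p],
     [1 - p, p, 0.0]]

def step(pi, P):
    """One step of pi <- pi P."""
    return [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

pi = [1.0, 0.0, 0.0]          # arbitrary initial distribution
for _ in range(2000):         # iterate until (numerical) convergence
    pi = step(pi, P)

closed_form = [(1 - p) / (3 - p), 1 / (3 - p), 1 / (3 - p)]
print(pi)                      # ~ [0.3103, 0.3448, 0.3448]
print(pi[0] * p)               # P{gets wet} ~ 0.031
```

The iterate agrees with the closed-form solution ((1-p)/(3-p), 1/(3-p), 1/(3-p)); the chain is irreducible and aperiodic (state 1 has a self-loop), so the convergence is guaranteed by Theorem 1.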
Global Balance Equations
Consider a Markov chain with an infinite number of states. The Global Balance Equations (GBE) are

  π_j Σ_{i=0}^∞ P_ji = Σ_{i=0}^∞ π_i P_ij,  j >= 0,  or equivalently  Σ_{i≠j} π_j P_ji = Σ_{i≠j} π_i P_ij.

π_j P_ji is the frequency of transitions from j to i, so the GBE state: frequency of transitions out of j = frequency of transitions into j. Intuition: j is visited infinitely often; for each transition out of j there must be a subsequent transition into j, with probability 1.

Proof of the GBE: by stationarity π_j = Σ_i π_i P_ij, and since Σ_i P_ji = 1,

  π_j Σ_i P_ji = Σ_i π_i P_ij;

removing the i = j term from both sides yields the GBE.

Alternative form of the GBE: for any set of states S ⊆ {0, 1, 2, ...},

  Σ_{j∈S} Σ_{i∉S} π_j P_ji = Σ_{j∈S} Σ_{i∉S} π_i P_ij.

If a probability distribution satisfies the GBE, then it is the unique stationary distribution of the Markov chain. Finding the stationary distribution: guess the distribution from properties of the system, then verify that it satisfies the GBE. Special structure of the Markov chain simplifies the task.

Birth-Death Process
A one-dimensional Markov chain with transitions only between neighboring states: P_ij = 0 if |i - j| > 1. The nonzero transition probabilities are P_{n,n+1}, P_{n,n-1}, and P_{n,n}.

Detailed Balance Equations (DBE):

  π_n P_{n,n+1} = π_{n+1} P_{n+1,n},  n = 0, 1, ...

Proof: the GBE with S = {0, 1, ..., n} give

  Σ_{j=0}^n Σ_{i=n+1}^∞ π_j P_ji = Σ_{j=0}^n Σ_{i=n+1}^∞ π_i P_ij  =>  π_n P_{n,n+1} = π_{n+1} P_{n+1,n},

since the only transitions across the boundary are between n and n+1.

Example: Discrete-Time Queue
In a time slot there is one arrival with probability p, or zero arrivals with probability 1-p. In a time slot the customer in service departs with probability q, or stays with probability 1-q. Arrivals and service times are independent. The state is the number of customers in the system. The nonzero transition probabilities are

  P_01 = p,  P_00 = 1-p,
  P_{n,n+1} = p(1-q),  P_{n,n-1} = q(1-p),  P_{n,n} = 1 - p(1-q) - q(1-p),  n >= 1.

The DBE give

  π_0 p = π_1 q(1-p),
  π_n p(1-q) = π_{n+1} q(1-p),  n >= 1.

Define

  α ≡ p / [q(1-p)],  ρ ≡ p(1-q) / [q(1-p)],

where ρ < 1 if and only if p < q. Then

  π_1 = α π_0,  π_{n+1} = ρ π_n (n >= 1)  =>  π_n = α ρ^{n-1} π_0,  n >= 1.

How do we calculate the normalization constant π_0? By the probability conservation law,

  1 = Σ_{n=0}^∞ π_n = π_0 [ 1 + α Σ_{n=1}^∞ ρ^{n-1} ] = π_0 [ 1 + α/(1-ρ) ].

Noting that α/(1-ρ) = p/(q-p), this gives

  π_0 = 1 - p/q,  π_n = (1 - p/q) α ρ^{n-1},  n >= 1.
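The closed-form queue probabilities can be cross-checked against the detailed balance recursion directly. A small Python sketch; the values of p and q are arbitrary illustrations with p < q:

```python
p, q = 0.2, 0.5                     # arrival and departure probabilities (p < q)
alpha = p / (q * (1 - p))
rho = p * (1 - q) / (q * (1 - p))

pi0 = 1 - p / q                     # closed-form normalization constant
pi = [pi0] + [pi0 * alpha * rho ** (n - 1) for n in range(1, 200)]

print(abs(sum(pi) - 1.0) < 1e-12)  # truncated sum is (almost) exactly 1

# Detailed balance: pi_0 p = pi_1 q(1-p), and pi_n p(1-q) = pi_{n+1} q(1-p).
ok = abs(pi[0] * p - pi[1] * q * (1 - p)) < 1e-12 and all(
    abs(pi[n] * p * (1 - q) - pi[n + 1] * q * (1 - p)) < 1e-12
    for n in range(1, 198))
print(ok)
```

Both checks print True: the truncation error is negligible because ρ = 0.25 here, and the DBE hold term by term by construction of π_n.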
Detailed Balance Equations (General Case)

  π_j P_ji = π_i P_ij,  for all i, j = 0, 1, ...

The DBE imply the GBE, but need not hold for a given Markov chain. When they do hold, they greatly simplify the calculation of the stationary distribution. Methodology: assume the DBE hold (one has to guess their form) and solve the system defined by the DBE together with Σ_i π_i = 1. If the system is inconsistent, then the DBE do not hold. If the system has a solution {π_i : i = 0, 1, ...}, then this is the unique stationary distribution.

Generalized Markov Chains
Consider a process on a set of states {0, 1, ...} such that, whenever it enters state i:
- The next state that will be entered is j with probability P_ij.
- Given that the next state entered will be j, the time the process spends at state i until the transition occurs is a random variable with distribution F_ij.

The process {Z(t) : t >= 0} describing the state the chain is in at time t is a generalized Markov chain, or semi-Markov process. It does not have the Markov property: the future depends on both the present state and the length of time the process has spent in this state.

Let T_i be the time the process spends at state i before making a transition (the holding time). Its probability distribution function is

  H_i(t) = P{T_i <= t} = Σ_{j} P{T_i <= t | next state j} P_ij = Σ_j F_ij(t) P_ij,

with mean E[T_i] = ∫_0^∞ t dH_i(t). Let T_ii be the time between successive transitions into i. The sequence {X_n : n = 0, 1, ...} of states visited is a Markov chain, the embedded Markov chain, with transition probabilities P_ij. A semi-Markov process is irreducible if its embedded Markov chain is irreducible.

Limit Theorems
Theorem 3: For an irreducible semi-Markov process with E[T_ii] < ∞, for any state j the limit

  p_j = lim_{t→∞} P{Z(t) = j | Z(0) = i},  i = 0, 1, ...

exists and is independent of the initial state, with

  p_j = E[T_j] / E[T_jj],

and p_j is equal to the proportion of time spent at state j.

Occupancy Distribution
Theorem 4: For an irreducible semi-Markov process with E[T_ii] < ∞ whose embedded Markov chain is ergodic with stationary distribution π (π_j >= 0, Σ_i π_i = 1, π = πP), the occupancy distribution of the semi-Markov process is

  p_j = π_j E[T_j] / Σ_i π_i E[T_i],  j = 0, 1, ...

Here π_j is the proportion of transitions into state j and E[T_j] is the mean time spent at j, so the probability of being at j is proportional to π_j E[T_j].
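Theorem 4 is easy to exercise numerically: pick an embedded chain, attach mean holding times, and form the occupancy distribution. A minimal Python sketch; the two-state chain and the holding times are invented for illustration:

```python
# Embedded two-state chain: from 0 go to 1 w.p. 1, from 1 go to 0 w.p. 1.
# Its stationary distribution is pi = (0.5, 0.5).
pi = [0.5, 0.5]
ET = [2.0, 8.0]          # mean holding times E[T_0], E[T_1] (illustrative)

norm = sum(pi[i] * ET[i] for i in range(2))
p = [pi[j] * ET[j] / norm for j in range(2)]
print(p)                  # [0.2, 0.8]: the process spends 80% of its time in state 1
```

Although the embedded chain alternates states evenly, the long holding time at state 1 tilts the occupancy distribution toward it, exactly as Theorem 4 predicts.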
Continuous-Time Markov Chains
A continuous-time process {X(t) : t >= 0} taking values in {0, 1, 2, ...} such that, whenever it enters state i:
- The time it spends at state i is exponentially distributed with parameter ν_i.
- When it leaves state i, it enters state j with probability P_ij, where Σ_{j≠i} P_ij = 1.

A continuous-time Markov chain is a semi-Markov process with

  F_ij(t) = 1 - e^{-ν_i t},  i, j = 0, 1, 2, ...

Because of the exponential (memoryless) holding times, a continuous-time Markov chain has the Markov property.

When at state i, the process makes transitions to state j ≠ i with rate

  q_ij ≡ ν_i P_ij.

The total rate of transitions out of state i is Σ_{j≠i} q_ij = ν_i Σ_{j≠i} P_ij = ν_i, and the average time spent at state i before making a transition is E[T_i] = 1/ν_i.

Occupancy Probability
Consider an irreducible and regular continuous-time Markov chain: the embedded Markov chain is irreducible, and the number of transitions in any finite time interval is finite with probability 1. From Theorem 3, for any state j the limit

  p_j = lim_{t→∞} P{X(t) = j | X(0) = i},  i = 0, 1, ...

exists and is independent of the initial state. p_j is the steady-state occupancy probability of state j, and is equal to the proportion of time spent at state j.

Global Balance Equations
There are two possibilities for the occupancy probabilities: p_j = 0 for all j, or p_j > 0 for all j with Σ_j p_j = 1. The Global Balance Equations are

  p_j Σ_{i≠j} q_ji = Σ_{i≠j} p_i q_ij,  j = 0, 1, ...

(rate of transitions out of j = rate of transitions into j). If a distribution {p_j : j = 0, 1, ...} satisfies the GBE, then it is the unique occupancy distribution of the Markov chain. Alternative form of the GBE: for any S ⊆ {0, 1, ...},

  Σ_{j∈S} Σ_{i∉S} p_j q_ji = Σ_{j∈S} Σ_{i∉S} p_i q_ij.

Detailed Balance Equations

  p_j q_ji = p_i q_ij,  i, j = 0, 1, ...

They simplify the calculation of the stationary distribution, but need not hold for any given Markov chain. Examples where they do hold: birth-death processes and reversible Markov chains.
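For a chain with finitely many states, the GBE plus normalization determine the occupancy probabilities, and one standard way to compute them is uniformization (a technique not covered in the notes): convert the rate matrix into a discrete-time chain with the same stationary distribution and iterate. A Python sketch; the 3-state rate values are an arbitrary illustration:

```python
# Transition rates q[i][j] for i != j; diagonal entries unused. Illustrative values.
q = [[0.0, 2.0, 1.0],
     [3.0, 0.0, 1.0],
     [1.0, 2.0, 0.0]]
n = 3

# Uniformization: P = I + Q/Lam is a stochastic matrix whose stationary
# distribution equals the CTMC occupancy distribution p.
Lam = max(sum(q[i][j] for j in range(n) if j != i) for i in range(n)) + 1.0
P = [[(q[i][j] / Lam if i != j
       else 1.0 - sum(q[i][k] for k in range(n) if k != i) / Lam)
      for j in range(n)] for i in range(n)]

p = [1.0 / n] * n
for _ in range(5000):                       # p <- p P until convergence
    p = [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]

# Check the GBE: rate of transitions out of j equals rate into j.
for j in range(n):
    out_rate = p[j] * sum(q[j][i] for i in range(n) if i != j)
    in_rate = sum(p[i] * q[i][j] for i in range(n) if i != j)
    assert abs(out_rate - in_rate) < 1e-9
print(p)
```

The final assertion is exactly the GBE of the notes, verified at the numerical fixed point.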
Birth-Death Process
Transitions occur only between neighboring states: with birth rates λ_n and death rates μ_n,

  q_{n,n+1} = λ_n,  q_{n,n-1} = μ_n,  q_ij = 0 for |i - j| > 1.

Detailed Balance Equations:

  λ_n p_n = μ_{n+1} p_{n+1},  n = 0, 1, ...

Proof: the GBE with S = {0, 1, ..., n} give

  Σ_{j=0}^n Σ_{i=n+1}^∞ p_j q_ji = Σ_{j=0}^n Σ_{i=n+1}^∞ p_i q_ij  =>  λ_n p_n = μ_{n+1} p_{n+1}.

Solving recursively,

  p_n = (λ_{n-1}/μ_n) p_{n-1} = (λ_{n-1} λ_{n-2} / μ_n μ_{n-1}) p_{n-2} = ... = p_0 Π_{i=0}^{n-1} (λ_i / μ_{i+1}),

and the normalization Σ_n p_n = 1 gives

  p_0 = [ 1 + Σ_{n=1}^∞ Π_{i=0}^{n-1} (λ_i / μ_{i+1}) ]^{-1},

provided the sum converges. Use the DBE to determine the state probabilities as a function of p_0, then use the probability conservation law to find p_0.

Using DBE in problems: either prove that the DBE hold, or justify their validity (e.g. for a reversible process), or assume they hold, guess their form, and solve the resulting system.

M/M/1 Queue
- Arrival process: Poisson with rate λ.
- Service times: i.i.d. exponential with parameter μ.
- Service times and interarrival times independent.
- Single server, infinite waiting room.
- State: N(t) = number of customers in the system at time t.

This is a birth-death process with λ_n = λ and μ_n = μ for all n, so the DBE give

  λ p_{n-1} = μ p_n  =>  p_n = ρ p_{n-1} = ρ^n p_0,  where ρ ≡ λ/μ.

Normalization (for ρ < 1):

  1 = Σ_{n=0}^∞ p_n = p_0 Σ_{n=0}^∞ ρ^n = p_0 / (1-ρ)  =>  p_0 = 1 - ρ,

so the stationary distribution is

  p_n = ρ^n (1-ρ),  n = 0, 1, ...

The M/M/1 Queue: Averages
Average number of customers in the system:

  N = Σ_{n=0}^∞ n p_n = (1-ρ) Σ_{n=0}^∞ n ρ^n = ρ/(1-ρ) = λ/(μ-λ).

Applying Little's Theorem, the average time in the system is

  T = N/λ = 1/(μ-λ).

Similarly, the average waiting time and the average number of customers in the queue are

  W = T - 1/μ = ρ/(μ-λ)  and  N_Q = λW = ρ²/(1-ρ).
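The M/M/1 formulas can be sanity-checked numerically from the stationary distribution. A short Python sketch; the values of λ and μ are arbitrary illustrations with ρ < 1:

```python
lam, mu = 3.0, 5.0            # arrival and service rates (illustrative)
rho = lam / mu                # utilization, here 0.6

# Stationary distribution p_n = rho^n (1 - rho), truncated far into the tail.
p = [(1 - rho) * rho ** n for n in range(2000)]

N = sum(n * p[n] for n in range(2000))     # average number in system
T = N / lam                                # Little's theorem
print(N, rho / (1 - rho))                  # both ~ 1.5
print(T, 1 / (mu - lam))                   # both ~ 0.5
```

The truncated sum matches the closed forms N = ρ/(1-ρ) and T = 1/(μ-λ) to machine precision, since ρ^2000 is negligible.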
TCOM 501: Networking Theory & Fundamentals
Lecture 11, April 16, 2003
Prof. Yannis A. Korilis

Topics:
- Routing in data networks
- Graph representation of a network
- Undirected graphs; spanning trees and minimum weight spanning trees
- Directed graphs
- Shortest path algorithms: Bellman-Ford, Dijkstra, Floyd-Warshall
- Distributed asynchronous Bellman-Ford algorithm

Introduction to Routing
What is routing? The creation of state information in the network to enable efficient delivery of packets to their intended destination. Two main components:
- Information acquisition: topology, addresses.
- Information usage: computing good paths to all destinations.
Questions: Where is B? How to reach B? How to best reach B? How to best distribute all traffic (not only from A to B)?

Graph-Theoretic Concepts
An undirected graph G = (N, A) consists of a finite nonempty set of nodes N and a collection of arcs A interconnecting pairs of distinct nodes from N. If i and j are nodes in N and (i, j) is an arc in A, the arc is said to be incident on i and j.
- Walk: a sequence of nodes (n_1, n_2, ..., n_k) where (n_1, n_2), (n_2, n_3), ..., (n_{k-1}, n_k) are arcs.
- Path: a walk with no repeated nodes.
- Cycle: a walk (n_1, n_2, ..., n_k) with n_1 = n_k and no other repeated nodes.
- Connected graph: for all i, j in N there exists a path (n_1, ..., n_k) with i = n_1, j = n_k.
(Figure: example graph with nodes A through G illustrating walks, paths, and cycles.)

Spanning Trees
A graph G' = (N', A') with N' ⊆ N and A' ⊆ A is called a subgraph of G. A tree is a connected graph that contains no cycles. A spanning tree of a graph G is a subgraph of G that is a tree and contains all nodes of G, that is, N' = N.

Lemma: Let G be a connected graph G = (N, A) and S a nonempty strict subset of the set of nodes N. Then there exists at least one arc (i, j) such that i ∈ S and j ∉ S.

Spanning Tree Algorithm:
1. Select an arbitrary node n ∈ N and initialize G' = (N', A') with N' = {n}, A' = ∅.
2. If N' = N, STOP: G' = (N', A') is a spanning tree. Else go to step 3.
3. Let (i, j) ∈ A with i ∈ N' and j ∈ N - N'. Update N' := N' ∪ {j}, A' := A' ∪ {(i, j)}. Go to step 2.

Proof of correctness: use induction to establish that after each new node j is added, G' remains connected and does not contain any cycles, therefore it is a tree. After N-1 iterations the algorithm terminates, and G' contains N nodes and N-1 arcs.

Construction of a Spanning Tree (example run on the figure's graph):
- N' = {A}, A' = ∅
- N' = {A, E, C}, A' = {(A,E), (A,C)}
- N' = {A, E, C, D}, A' = {(A,E), (A,C), (C,D)}
- N' = {A, E, C, D, F}, A' = {(A,E), (A,C), (C,D), (C,F)}
- N' = {A, E, C, D, F, B}, A' = {(A,E), (A,C), (C,D), (C,F), (F,B)}
- N' = {A, E, C, D, F, B, G}, A' = {(A,E), (A,C), (C,D), (C,F), (F,B), (E,G)}
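The three-step construction translates into a few lines of code. A minimal Python sketch; the small graph literal is invented for illustration, and a connected input is assumed (the lemma guarantees a crossing arc always exists in that case):

```python
def spanning_tree(nodes, arcs, start):
    """Grow a spanning tree per the three-step algorithm:
    repeatedly add an arc (i, j) with exactly one endpoint in the tree."""
    N1, A1 = {start}, []
    while N1 != set(nodes):
        # Step 3: find any arc crossing from the current tree to the outside.
        i, j = next((i, j) for (i, j) in arcs
                    if (i in N1) != (j in N1))
        N1.add(j if i in N1 else i)
        A1.append((i, j))
    return A1

nodes = ["A", "B", "C", "D", "E"]                   # invented example
arcs = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]
tree = spanning_tree(nodes, arcs, "A")
print(tree)   # [('A', 'B'), ('A', 'C'), ('C', 'D'), ('D', 'E')]
```

As the proposition below the algorithm asserts, the result has exactly N-1 = 4 arcs and no cycles.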
Spanning Trees: Proposition
Let G be a connected graph with N nodes and A links. Then:
1. G contains a spanning tree.
2. A >= N - 1.
3. G is a tree if and only if A = N - 1.

Proof: The spanning tree construction algorithm starts with a single node and at each iteration augments the tree by one node and one arc. Therefore the tree is constructed after N-1 iterations and has N-1 links; evidently A >= N-1. If A = N-1, the spanning tree includes all arcs of G, thus G is a tree. If A > N-1, there is a link (i, j) that does not belong to the tree. Considering the path connecting nodes i and j along the spanning tree, that path together with link (i, j) forms a cycle; thus G is not a tree.

Use of Spanning Trees
Problem: how to distribute information to all nodes in a graph/network, e.g. address and topology information.
- Flooding: each node forwards information to all its neighbors. Simple, but multiple copies of the same packet traverse each link and are received by each node.
- Spanning tree: nodes forward information only along links that belong to the spanning tree. More efficient: information reaches each node only once and traverses each link at most once. Note that the spanning tree is bidirectional.

Minimum Weight Spanning Trees
A weight w_ij quantifies the cost of using link (i, j); examples: delay, offered load, distance. The weight (cost) of a tree is the sum of the weights of all its links, since packets traverse all tree links once.

Definition: A Minimum Weight Spanning Tree (MST) is a spanning tree with minimum sum of link weights.
Definition: A subtree of an MST is called a fragment. An arc with one node in a fragment and the other node not in this fragment is called an outgoing arc from the fragment.

Lemma: Given a fragment F, let e = (i, j) be an outgoing arc with minimum weight, where j ∉ F. Then F extended by arc e and node j is a fragment.
Proof: Let T be the MST that includes fragment F. If e ∈ T, we are done. Assume e ∉ T; then a cycle is formed by e and the arcs of T. Since j ∉ F, there is an arc e' ≠ e which belongs to the cycle and to T and is outgoing from F. Let T' = T - {e'} ∪ {e}. This is a subgraph with N-1 arcs and no cycles (since j ∉ F), i.e. a spanning tree. Since w_e <= w_{e'}, the weight of T' is less than or equal to the weight of T. Then T' is an MST, and F extended by arc e and node j is a fragment.

Minimum Weight Spanning Tree Algorithms
Inductive algorithms based on the previous lemma:
- Prim's Algorithm: start with an arbitrary node as the initial fragment; enlarge the current fragment by successively adding minimum weight outgoing arcs.
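Prim's rule, repeatedly attaching the minimum-weight outgoing arc per the fragment-extension lemma, can be sketched as follows in Python; the weighted graph literal is an invented example:

```python
def prim(nodes, w, start):
    """Grow an MST fragment from `start`, per the fragment-extension lemma.
    w is a dict {(i, j): weight} listing each undirected arc once."""
    frag, tree = {start}, []
    while frag != set(nodes):
        # Minimum-weight outgoing arc from the current fragment.
        e = min((a for a in w if (a[0] in frag) != (a[1] in frag)),
                key=lambda a: w[a])
        i, j = e
        frag.add(j if i in frag else i)
        tree.append(e)
    return tree

nodes = ["A", "B", "C", "D"]
w = {("A", "B"): 1, ("A", "C"): 4, ("B", "C"): 2, ("C", "D"): 3, ("B", "D"): 5}
print(prim(nodes, w, "A"))   # [('A', 'B'), ('B', 'C'), ('C', 'D')], total weight 6
```

Each iteration is one application of the lemma, so the final fragment spans all nodes and is an MST.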
- Kruskal's Algorithm: all vertices are initial fragments; successively combine fragments using minimum weight arcs that do not introduce a cycle.
(Figures: step-by-step example runs of Prim's and Kruskal's algorithms.)

Shortest Path Algorithms
Problem: given nodes A and B, find the best route for sending traffic from A to B. "Best" means the route with minimum cost, where typically the cost of a path is equal to the sum of the costs of the links in the path. This is an important problem in various networking applications. For routing of traffic over network links we need to distinguish the direction of flow, so the appropriate network model is a directed graph.

Directed Graphs
A directed graph, or digraph, G = (N, A) consists of a finite nonempty set of nodes N and a collection of directed arcs, i.e. ordered pairs of distinct nodes from N. Directed walks, directed paths, and directed cycles are defined by extending the corresponding definitions for undirected graphs. Given a digraph G = (N, A), there is an associated undirected graph G' = (N', A') with N' = N and (i, j) ∈ A' if (i, j) ∈ A or (j, i) ∈ A. A digraph is called connected if its associated undirected graph is connected, and strongly connected if for every i, j ∈ N there exists a directed path (n_1, n_2, ..., n_k) with i = n_1, j = n_k.

Shortest Path Algorithms: Problem Formulation
Consider an N-vertex digraph G = (N, A) with link lengths d_ij for each edge (i, j), and d_ij = ∞ if (i, j) ∉ A. Problem: find minimum-distance paths from all vertices in N to vertex 1 (alternatively, find minimum-weight paths from vertex 1 to all vertices in N). The general approach is again iterative,

  D_i := min_j [ d_ij + D_j ],

and the three main algorithms, Bellman-Ford, Dijkstra, and Floyd-Warshall, differ in how the iterations proceed.

Bellman-Ford Algorithm
The iterative step is over increasing hop count. Define D_i^h as the length of a shortest walk from node i to node 1 that contains at most h edges; D_1^h = 0 for all h by definition.
- Iteration: D_i^{h+1} = min_j [ d_ij + D_j^h ], for all i ≠ 1.
- Initial conditions: D_i^0 = ∞ for i ≠ 1.
Then:
(a) The scalars D_i^h are equal to the shortest walk lengths from node i to node 1 using at most h hops.
(b) The algorithm terminates after at most N iterations if there are no negative-length cycles that do not contain node 1, and it yields the shortest path lengths to node 1.

Proof of Bellman-Ford (1)
The proof of (a) is by induction on the hop count h. For h = 1 we have D_i^1 = d_i1 for all i ≠ 1, so the result holds. Assume that the result holds for all k <= h; we need to show it still holds for h+1. Two cases need to be considered:
1. The shortest walk from i to 1 with at most h+1 edges has at most h edges: its length is then D_i^h.
2. The shortest walk from i to 1 with at most h+1 edges has exactly h+1 edges: it consists of the concatenation of an edge (i, j) followed by a shortest h-hop walk from vertex j to vertex 1. This second case is the one we focus on.

Proof of Bellman-Ford (2)
From cases 1 and 2,

  shortest <= (h+1)-hop walk length = min { D_i^h, min_j [ d_ij + D_j^h ] }.

From the induction hypothesis and the initial conditions, D_i^{k+1} <= D_i^k for all k <= h, so that

  D_i^{h+1} = min_j [ d_ij + D_j^h ] <= min_j [ d_ij + D_j^{h-1} ] = D_i^h.

Combining the two gives

  shortest <= (h+1)-hop walk length = min { D_i^h, D_i^{h+1} } = D_i^{h+1},

which completes the proof of part (a).

Proof of Bellman-Ford (3)
Assume that the algorithm terminates after h steps, i.e. D_i^k = D_i^h for all i and k >= h. This means that adding more edges cannot decrease the length of any of those walks; hence there cannot be negative-length cycles that do not include node 1, since such cycles could be repeated many times to make walk lengths arbitrarily small. Conversely, suppose there are no such negative cycles. Deleting all cycles from a walk yields paths of less than or equal length, so for all i we have a path to vertex 1 of h hops or less with length D_i^h. Since those paths have no cycles, they contain at most N-1 edges, and therefore D_i^N = D_i^{N-1} for all i: the algorithm terminates after at most N steps, which proves part (b).

Bellman-Ford Complexity
Worst-case amount of computation to find the shortest path lengths: the algorithm iterates up to N times; each iteration is done for N-1 nodes (all i ≠ 1); and the minimization step requires considering up to N-1 alternatives. The computational complexity is therefore O(N^3).
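The iteration D_i^{h+1} = min_j (d_ij + D_j^h) is short to implement. A minimal Python sketch over an edge-length dictionary; the 4-node graph and its weights are invented for illustration, not the lecture's example:

```python
INF = float("inf")

def bellman_ford(d, nodes, dest=1):
    """Shortest path lengths from every node to `dest`.
    d[(i, j)] is the length of directed edge i -> j (absent = infinity)."""
    D = {i: (0 if i == dest else INF) for i in nodes}
    for _ in range(len(nodes) - 1):          # at most N-1 useful iterations
        D = {i: (0 if i == dest else
                 min(d.get((i, j), INF) + D[j] for j in nodes if j != i))
             for i in nodes}
    return D

# Invented example; vertex 1 is the destination.
d = {(2, 1): 4, (3, 2): 1, (3, 1): 7, (4, 3): 2, (4, 2): 8}
print(bellman_ford(d, [1, 2, 3, 4]))  # {1: 0, 2: 4, 3: 5, 4: 7}
```

Each pass rebuilds D from the previous pass's values, so after h passes D holds exactly the at-most-h-hop walk lengths of claim (a).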
More careful accounting yields a computational complexity of O(hM), where M is the number of edges and h <= N the number of iterations.

Constructing Shortest Paths
The Bellman-Ford algorithm yields shortest path lengths, but we are also interested in the actual paths. Start with the Bellman-Ford equation

  D_i = min_j [ d_ij + D_j ],  for all i ≠ 1,  with D_1 = 0.

For every vertex i ≠ 1, pick an edge (i, j) that minimizes the BF equation. This generates a subgraph with N-1 edges: a tree. For any vertex i, follow the edges from vertex i along this subgraph until vertex 1 is reached. Note: in graphs without zero- or negative-length cycles, the BF equation defines a system of N-1 equations with a unique solution.
(Figure: worked example constructing the shortest-path tree from the BF equations on a small graph.)

Dijkstra's Algorithm (1)
Different iteration criteria: the algorithm proceeds by increasing path length instead of increasing hop count. Start with the closest vertex to the destination, use it to find the next closest vertex, and so on. Successful iteration requires nonnegative edge weights. Dijkstra's algorithm maintains two sets of vertices:
- L: labeled vertices, whose shortest path is known.
- C: candidate vertices, whose shortest path is not yet known.
One vertex is moved from C to L in each iteration.

Dijkstra's Algorithm (2)
Initialization: L = {1} and C = N - {1} (vertex 1 is the destination); D_1 = 0 and D_j = d_j1 for j ≠ 1.
Iteration steps:
1. Find the next closest vertex not in L, i.e. the vertex i ∈ C such that D_i = min_{j∈C} D_j. Update L := L ∪ {i} and C := C - {i}.
2. Update the lengths of the paths of the vertices remaining in C:

  D_j := min { D_j, D_i + d_ji },  for all j ∈ C.

The algorithm stops when L = N.

Proof of Dijkstra's Algorithm (1)
At the beginning of each step 1 we have:
(a) D_i <= D_j for all i ∈ L and j ∉ L.
(b) For any vertex j, D_j is the shortest distance from j to 1 over paths that use only vertices in L (except possibly j itself) as intermediate vertices; this relies on the nonnegative edge weights.
Proof of condition (a): it is satisfied initially, and since edge weights are nonnegative we have D_i = min_{j∈C} D_j <= D_i + d_ji, so condition (a) is preserved by the update equation D_j := min{D_j, D_i + d_ji}, for all j ∈ C.
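The two-set iteration translates directly into code. A minimal Python sketch without a priority queue, matching the lecture's move-one-vertex-per-step description; the 4-node graph literal is invented for illustration:

```python
INF = float("inf")

def dijkstra(d, nodes, dest=1):
    """Shortest distances from every node to `dest`; weights must be nonnegative.
    d[(j, i)] is the length of directed edge j -> i (absent = infinity)."""
    L = {dest}
    C = set(nodes) - L
    D = {j: (0 if j == dest else d.get((j, dest), INF)) for j in nodes}
    while C:
        i = min(C, key=lambda j: D[j])   # step 1: next closest vertex not in L
        L.add(i)
        C.remove(i)
        for j in C:                      # step 2: relax paths through i
            D[j] = min(D[j], D[i] + d.get((j, i), INF))
    return D

# Invented example; vertex 1 is the destination.
d = {(2, 1): 4, (3, 2): 1, (3, 1): 7, (4, 3): 2, (4, 2): 8}
print(dijkstra(d, [1, 2, 3, 4]))  # {1: 0, 2: 4, 3: 5, 4: 7}
```

The inner loop over C makes this the O(N^2) variant; a heap-based candidate set would speed up step 1 on sparse graphs.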