Algorithms & Data Structures (CS 561)

Notes by: Trent Dare
Instructor: Jared Saia

These 33 pages of class notes were uploaded by Trent Dare on Wednesday, September 23, 2015. The notes belong to CS 561 at the University of New Mexico, taught by Jared Saia in Fall 2015. Since upload, they have received 38 views. For similar materials see /class/212192/cs-561-university-of-new-mexico in Computer Science at the University of New Mexico.


CS 561 Lecture 11 -- Skip Lists
Jared Saia, University of New Mexico

Skip Lists
- Technically not a BST, but they implement all of the same operations.
- A very elegant randomized data structure: simple to code, but the analysis is subtle.
- They guarantee that, with high probability, all the major operations take O(log n) time.

Structure
- A skip list is basically a collection of doubly-linked lists L1, L2, ..., Lx, for some integer x.
- Each list has a special head and tail node; the keys of these nodes are assumed to be -MAXNUM and +MAXNUM, respectively.
- The keys in each list are in sorted (non-decreasing) order.
- Every node is stored in the bottom list. For each node in the bottom list, we flip a coin over and over until we get tails; for each heads, we make a duplicate of the node.
- The duplicates are stacked up in levels, and the nodes on each level are strung together in sorted linked lists.
- Each node v stores a search key key(v), a pointer to its next lower copy down(v), and a pointer to the next node in its level right(v).

Search
- To do a search for a key x, we start at the leftmost node L in the highest level.
- We then scan through each level as far as we can without passing the target value x, and then proceed down to the next level.
- The search ends either when we find the key x, or when we fail to find x on the lowest level.

    SkipListFind(x, L):
      v = L
      while v != NULL and key(v) != x:
        if key(right(v)) > x:
          v = down(v)
        else:
          v = right(v)
      return v

Insert
- p is a constant between 0 and 1, typically p = 1/2; let rand() return a random value between 0 and 1.

    Insert(k):
      First call Search(k); let pLeft be the leftmost elem < k in L1.
      Insert k in L1, to the right of pLeft.
      i = 2
      while rand() < p:
        insert k in the appropriate place in L_i
        i = i + 1

Deletion
- Deletion is very simple.
- First, do a search for the key to be deleted.
- Then delete that key from all the lists it appears in, from the bottom up, making sure to zip up the lists after the deletion.

Analysis
- Intuitively, each level of the skip list has about half the number of nodes of the previous level, so we expect the total number of levels to be about O(log n).
- Similarly, each time we add another level, we cut the search time in half, except for a constant overhead, so after O(log n) levels we would expect a search time of O(log n).
- We will now formalize these two intuitive observations.

Height of Skip List
- For some key i, let X_i be the maximum height of i in the skip list.
- Q: What is the probability that X_i >= 2 log n?
- A: Since P(heads) = 1/2, we have P(X_i >= 2 log n) = (1/2)^(2 log n) = 1/n^2.
- Thus the probability that a particular key i achieves height 2 log n is 1/n^2.
- Q: What is the probability that any key achieves height 2 log n?
- A: We want P(X_1 >= 2 log n or X_2 >= 2 log n or ... or X_n >= 2 log n). By a union bound, this probability is no more than P(X_1 >= 2 log n) + P(X_2 >= 2 log n) + ... + P(X_n >= 2 log n), which equals n * (1/n^2) = 1/n.
- This probability gets small as n gets large. In particular, the probability of having a skip list of height exceeding 2 log n is o(1).
- If an event occurs with probability 1 - o(1), we say that it occurs with high probability.
- Key point: the height of a skip list is O(log n) with high probability.

A Trick for Expectations
- A trick for computing expectations of discrete positive random variables: let X be a discrete r.v. that takes on values from 1 to n. Then

    E[X] = sum_{i=1}^{n} P(X >= i)

- To see this, note that P(X>=1) + P(X>=2) + P(X>=3) + ... counts P(X=1) once, P(X=2) twice, P(X=3) three times, and so on, which is exactly 1*P(X=1) + 2*P(X=2) + 3*P(X=3) + ... = E[X].

In-Class Exercise
- Q: How much memory do we expect a skip list to use?
- Let X_i be the number of lists that element i is inserted in.
- Q: What is P(X_i >= 1)? P(X_i >= 2)? P(X_i >= 3)?
- Q: What is P(X_i >= k) for general k?
- Q: What is E[X_i]?
- Q: Let X = sum_{i=1}^{n} X_i. What is E[X]?

Search Time
- It's easier to analyze the search time if we imagine running the search backwards.
- Imagine that we start at the found node v in the bottom-most list and trace the path backwards to the top leftmost sentinel L.
- This gives us the length of the search path from L to v, which is the time required to do the search.

    SLFback(v):
      while v != L:
        if up(v) != NIL:
          v = up(v)
        else:
          v = left(v)

Backward Search
- For every node v in the skip list, up(v) exists with probability 1/2. So, for purposes of analysis, SLFback is the same as the following algorithm:

    FlipWalk(v):
      while v != L:
        if COINFLIP == HEADS:
          v = up(v)
        else:
          v = left(v)

- For this algorithm, the expected number of heads is exactly the same as the expected number of tails.
- Thus, the expected run time of the algorithm is twice the expected number of upward jumps.
- Since we already know that the number of upward jumps is O(log n) with high probability, we can conclude that the expected search time is O(log n).

CS 561 Lecture 2
Jared Saia, University of New Mexico

L'Hopital
- For any functions f(n) and g(n) which approach infinity and are differentiable, L'Hopital's rule tells us that

    lim_{n->inf} f(n)/g(n) = lim_{n->inf} f'(n)/g'(n)

Today's Outline
- L'Hopital's Rule
- Log Facts
- Recurrence Relations

Example
- Q: Which grows faster, ln n or sqrt(n)?
- Let f(n) = ln n and g(n) = n^(1/2). Then f'(n) = 1/n and g'(n) = (1/2) n^(-1/2). So we have

    lim_{n->inf} ln n / n^(1/2) = lim_{n->inf} (1/n) / ((1/2) n^(-1/2)) = lim_{n->inf} 2 / n^(1/2) = 0

- Thus sqrt(n) grows faster than ln n, and so ln n = o(sqrt(n)).

A Digression on Logs
- "It rolls down stairs, alone or in pairs, and over your neighbor's dog; it's great for a snack, or to put on your back; it's log, log, log!" -- The Log Song, from the Ren and Stimpy Show
- The log function shows up very frequently in algorithm analysis.
- As computer scientists, when we use log we'll mean log base 2; i.e., if no base is given, assume base 2.

Examples
- log 1 = 0
- log 2 = 1
- log 32 = 5
- log 2^k = k
- Note: log n is way, way smaller than n for large values of n.

Definition
- log_x y is, by definition, the value z such that x^z = y.
- So x^(log_x y) = y, by definition.

Examples
- log_3 9 = 2
- log_5 125 = 3
- log_4 16 = 2
- log_24 24^100 = 100

Facts about Exponents
- Recall that x^a x^b = x^(a+b) and (x^a)^b = x^(ab).
- From these we can derive some facts about logs.

Log Facts to Memorize
- Fact 1: log(xy) = log x + log y
- Fact 2: log(x^c) = c log x
- Fact 3 (change of base): log_c a = log a / log c. To prove this, consider the equation a = c^(log_c a), take the log of both sides, and use Fact 2.
- To prove the first two equations, raise 2 to the power of both sides and use the facts about exponents.
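These identities can be spot-checked numerically (a quick sanity check, not a proof; the sample values are arbitrary):

```python
import math

# Arbitrary sample values for spot-checking the three log facts (base 2).
x, y, c = 8.0, 32.0, 5.0

# Fact 1: log(xy) = log x + log y
assert math.isclose(math.log2(x * y), math.log2(x) + math.log2(y))

# Fact 2: log(x^c) = c log x
assert math.isclose(math.log2(x ** c), c * math.log2(x))

# Fact 3 (change of base): log_c a = log a / log c
a, base = 125.0, 5.0
assert math.isclose(math.log(a, base), math.log2(a) / math.log2(base))

print("all log facts hold")
```

The same checks pass for any positive x, y, a, and base, since each fact is an identity.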
- These facts are sufficient for all your logarithm needs; you just need to figure out how to use them.

Logs and O Notation
- Note that log(8n) = log n + log 8.
- Note that log(600 n^200) = 200 log n + log 600.
- Note that log(100000^30 n^2) = 2 log n + 30 log 100000.
- Thus log(8n), log(600 n^200), and log(100000^30 n^2) are all O(log n).
- In general, for any constants k1 and k2, log(k1 n^k2) = k2 log n + log k1, which is just O(log n).

Important Note
- log^2 n = (log n)^2 is O(log^2 n), not O(log n).
- This is true since log^2 n grows asymptotically faster than log n.
- More generally, all log functions of the form k1 log^2(k2 n^k3), for constants k1, k2, and k3, are O(log^2 n).

Take Away
- All log functions of the form k1 log_{k2}(k3 n^k4), for constants k1, k2, k3, and k4, are O(log n).
- For this reason, we don't really care about the base of the log function when we do asymptotic notation.
- Thus, binary search, ternary search, and k-ary search all take O(log n) time.

At-Home Exercise
- Simplify and give O notation for the following functions; in the big-O notation, write all logs base 2: 2 log n, log^2 n, log log n.

Does Big-O Really Matter?
- Let n = 100,000, at one operation per microsecond:

    log n     1.7 x 10^-5 seconds
    sqrt(n)   3.2 x 10^-4 seconds
    n         0.1 seconds
    n log n   1.2 seconds
    n^(3/2)   31.6 seconds
    n^2       2.8 hours
    n^3       31.7 years
    2^n       > 1 century

- (From Classic Data Structures in C++, by Timothy Budd.)

Recurrence Relations
- "Oh, how should I not lust after eternity and after the nuptial ring of rings, the ring of recurrence?" -- Friedrich Nietzsche, Thus Spoke Zarathustra
- Getting the run times of recursive algorithms can be challenging.
- Consider an algorithm for binary search (below).
- Let T(n) be the run time of this algorithm on an array of size n.
- Then we can write T(1) = 1, T(n) = T(n/2) + 1.

Algorithm: Binary Search

    bool BinarySearch(int arr[], int s, int e, int key){
      if (e - s < 0)
        return false;
      int mid = (s + e) / 2;
      if (key == arr[mid])
        return true;
      else if (key < arr[mid])
        return BinarySearch(arr, s, mid - 1, key);
      else
        return BinarySearch(arr, mid + 1, e, key);
    }

Recurrence Relations (cont.)
- T(n) = T(n/2) + 1 is an example of a recurrence relation.
- A recurrence relation is any equation for a function T where T appears on both the left and right sides of the equation.
- We always want to solve these recurrence relations by getting an equation for T where T appears on just the left side of the equation.
- Whenever we analyze the run time of a recursive algorithm, we will first get a recurrence relation.
- To get the actual run time, we need to solve the recurrence relation.

Example
- Let's guess that the solution to T(n) = T(n/2) + 1, T(1) = 1, is T(n) = O(log n).
- In other words, T(n) <= c log n for all n >= n0, for some positive constants c and n0.
- We can prove that T(n) <= c log n is true by plugging back into the recurrence.

Substitution Method
- One way to solve recurrences is the substitution method, aka "guess and check".
- What we do is make a good guess for the solution to T(n), and then try to prove this is the solution by induction.

Proof
- We prove this by induction.
- Base case: T(2) = 2 <= c log 2, provided that c >= 2.
- Inductive hypothesis: for all j < n, T(j) <= c log j.
- Inductive step:

    T(n) = T(n/2) + 1
        <= c log(n/2) + 1
         = c log n - c log 2 + 1
         = c log n - c + 1
        <= c log n

- The last step holds for all n > 0 if c >= 1. Thus the entire proof holds if n >= 2 and c >= 2.

Recurrences and Induction
- Recurrences and induction are closely related.
- To find a solution to f(n), solve a recurrence.
- To prove that a solution for f(n) is correct, use induction.
- For both recurrences and induction, we always solve a big problem by reducing it to smaller problems.

Some Examples
- The next several problems can be attacked by induction/recurrences.
- For each problem, we'll need to reduce it to smaller problems.
- Question: How can we reduce each problem to a smaller subproblem?

The Problems
- Sum Problem: f(n) is the sum of the integers 1, ..., n.
- Tree Problem: f(n) is the maximum number of leaf nodes in a binary tree of height n. (Recall: in a binary tree, each node has at most two children; a leaf node is a node with no children; the height of a tree is the length of the longest path from the root to a leaf node.)
- Binary Search Problem: f(n) is the maximum number of queries that need to be made for binary search on a sorted array of size n.
- Dominoes Problem: f(n) is the number of ways to tile a 2-by-n rectangle with dominoes.

Simpler Subproblems
- Sum Problem: What is the sum of all numbers between 1 and n-1 (i.e., f(n-1))?
- Tree Problem: What is the maximum number of leaf nodes in a binary tree of height n-1 (i.e., f(n-1))?
- Binary Search Problem: What is the maximum number of queries that need to be made for binary search on a sorted array of size n/2 (i.e., f(n/2))?
- Dominoes Problem: What is the number of ways to tile a 2-by-(n-1) rectangle with dominoes? A 2-by-(n-2) rectangle (i.e., f(n-1) and f(n-2))?

Recurrences
- Sum Problem: f(n) = f(n-1) + n, f(1) = 1
- Tree Problem: f(n) = 2 f(n-1), f(0) = 1
- Binary Search Problem: f(n) = f(n/2) + 1, f(2) = 1
- Dominoes Problem: f(n) = f(n-1) + f(n-2), f(1) = 1, f(2) = 2

Guesses
- Sum Problem: f(n) = n(n+1)/2
- Tree Problem: f(n) = 2^n
- Binary Search Problem: f(n) = log n

Inductive Proofs
- "Trying is the first step to failure." -- Homer Simpson
- Now that we've made these guesses, we can try using induction to prove they're correct (the substitution method).
- We'll give inductive proofs that these guesses are correct for the first three problems.

Sum Problem
- Want to show that f(n) = n(n+1)/2. Prove by induction on n.
- Base case: f(1) = 1 = 1(2)/2.
- Inductive hypothesis: for all j < n, f(j) = j(j+1)/2.
- Inductive step:

    f(n) = f(n-1) + n
         = (n-1)n/2 + n
         = n(n+1)/2

Tree Problem
- Want to show that f(n) = 2^n. Prove by induction on n.
- Base case: f(0) = 2^0 = 1.
- Inductive hypothesis: for all j < n, f(j) = 2^j.
- Inductive step:

    f(n) = 2 f(n-1)
         = 2 * 2^(n-1)
         = 2^n

Binary Search Problem
- Want to show that f(n) = log n (assume n is a power of 2). Prove by induction on n.
- Base case: f(2) = log 2 = 1.
- Inductive hypothesis: for all j < n, f(j) = log j.
- Inductive step:

    f(n) = f(n/2) + 1
         = log(n/2) + 1
         = log n - log 2 + 1
         = log n
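The recurrences and their guessed closed forms can be checked against each other by evaluating each recurrence directly (a sketch; the function names are ours):

```python
from functools import lru_cache

def f_sum(n):                 # f(n) = f(n-1) + n,       f(1) = 1
    return 1 if n == 1 else f_sum(n - 1) + n

def f_tree(n):                # f(n) = 2 f(n-1),         f(0) = 1
    return 1 if n == 0 else 2 * f_tree(n - 1)

def f_bsearch(n):             # f(n) = f(n/2) + 1,       f(2) = 1  (n a power of 2)
    return 1 if n == 2 else f_bsearch(n // 2) + 1

@lru_cache(maxsize=None)
def f_dominoes(n):            # f(n) = f(n-1) + f(n-2),  f(1) = 1, f(2) = 2
    return n if n <= 2 else f_dominoes(n - 1) + f_dominoes(n - 2)

for n in range(1, 20):
    assert f_sum(n) == n * (n + 1) // 2       # guess: n(n+1)/2
    assert f_tree(n) == 2 ** n                # guess: 2^n
for k in range(1, 10):
    assert f_bsearch(2 ** k) == k             # guess: log2 n

print(f_dominoes(10))  # → 89 (Fibonacci-like growth)
```

Agreement for small n is not a proof, of course; the inductive proofs above are what establish the closed forms.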
In-Class Exercise
- Consider the recurrence f(n) = 2 f(n/2) + 1, f(1) = 1.
- Guess that f(n) <= cn - 1.
- Q1: Show the base case; for what values of c does it hold?
- Q2: What is the inductive hypothesis?
- Q3: Show the inductive step.

Todo
- Read Chapters 3 and 4 in the text.
- Work on Homework 1.

CS 561 Lecture 1
Jared Saia, University of New Mexico

Today's Outline
- Administrative Info
- Asymptotic Analysis

Why Study Algorithms? I
- "Seven years of college down the toilet." -- John Belushi, in Animal House
- Q: Can I get a programming job without knowing something about algorithms and data structures?
- A: Yes, but do you really want to be programming GUIs your entire life?

Why Study Algorithms? II
- Almost all big companies want programmers with knowledge of algorithms: Microsoft, Google, Oracle, IBM, Yahoo, Sandia, Los Alamos, etc.
- In most programming job interviews, they will ask you several questions about algorithms and/or data structures. Your knowledge of algorithms will set you apart from the large masses of interviewees who know only how to program.
- If you want to start your own company, you should know that many startups are successful because they've found better algorithms for solving a problem (e.g., Google, Akamai, etc.).

Why Study Algorithms? III
- You'll improve your research skills in almost any area.
- You'll write better, faster code.
- You'll learn to think more abstractly and mathematically.
- It's one of the most challenging and interesting areas of CS.

A Real Job Interview Question
- The following is a question commonly asked in job interviews in 2002 (thanks to Maksim Noy; see the career center link from the dept web page for the full compilation of questions).
- You are given an array with integers between 1 and 1,000,000.
- All integers between 1 and 1,000,000 are in the array at least once, and one of those integers is in the array twice.
- Q: Can you determine which integer is in the array twice? Can you do it while iterating through the array only once?

Solution
- Ideas on how to solve this problem? What if we allowed multiple iterations?

Naive Algorithm
- Create a new array of ints between 1 and 1,000,000, which we'll use to count the occurrences of each number. Initialize all entries to 0.
- Go through the input array, and each time a number is seen, update its count in the new array.
- Go through the count array and see which number occurs twice. Return this number.

Naive Algorithm Analysis
- Q: How long will this algorithm take?
- A: We iterate through the numbers 1 to 1,000,000 three times.
- Note that we also use up a lot of space with the extra array.
- This is wasteful of time and space, particularly as the input array gets very large (e.g., it might be a huge data stream).
- Q: Can we do better?

A Better Algorithm
- Iterate through the input array, summing up all the numbers; let S be this sum.
- Let x = S - (1,000,000)(1,000,001)/2.
- Return x.

Analysis
- Note that sum_{i=1}^{n} i = n(n+1)/2.
- Let S be the sum of the input array, and let x be the value of the repeated number.
- Then S = (1,000,000)(1,000,001)/2 + x.
- Thus x = S - (1,000,000)(1,000,001)/2.
- This algorithm iterates through the input array just once.
- It uses up essentially no extra space.
- It is at least three times faster than the naive algorithm.
- Further, if the input array is so large that it won't fit in memory, this is the only algorithm which will work.
- These time and space bounds are the best possible.

Take Away
- Designing good algorithms matters.
- It's not always this easy to improve an algorithm; however, with some thought and work, you can almost always get a better algorithm than the naive approach.

How to Analyze an Algorithm
- There are several resource bounds we could be concerned about: time, space, communication bandwidth, logic gates, etc. However, we are usually most concerned about time.
- Recall that algorithms are independent of programming languages and machine types.
- Q: So how do we measure resource bounds of algorithms?

Random Access Machine Model
- We will use the RAM model of computation in this class.
- All instructions operate in serial.
- All basic operations (e.g., add, multiply, compare, read, store, etc.) take unit time.
- All atomic data (chars, ints, doubles, pointers, etc.) take unit space.

Worst Case Analysis
- We'll generally be pessimistic when we evaluate resource bounds: we'll evaluate the run time of the algorithm on the worst possible input sequence.
- Amazingly, in most cases we'll still be able to get pretty good bounds.
- Justification: the average case is often about as bad as the worst case.

Example Analysis
- Consider the problem discussed last Tuesday about finding a redundant element in an array.
- Let's consider the more general problem where the numbers are 1 to n instead of 1 to 1,000,000.

Algorithm 1
- Create a new "count" array of ints of size n, which we'll use to count the occurrences of each number. Initialize all entries to 0.
- Go through the input array, and each time a number is seen, update its count in the count array.
- As soon as a number is seen in the input array which has already been counted once, return this number.

Algorithm 2
- Iterate through the input array, summing up all the numbers; let S be this sum.
- Let x = S - n(n+1)/2.
- Return x.

Example Analysis: Time
- Worst case, Algorithm 1 does 5n operations: n inits to 0 of entries in the count array, n reads of the input array, n reads of the count array (to see if the value is 1), n increments, and n stores into the count array.
- Worst case, Algorithm 2 does 2n + 4 operations: n reads of the input array, n additions to the value S, and 4 computations to determine x given S.

Example Analysis: Space
- Worst case, Algorithm 1 uses n additional units of space to store the count array.
- Worst case, Algorithm 2 uses 2 additional units of space.

Asymptotic Analysis
- A tool for analyzing time and space usage of algorithms.
- Assumes the input size is a variable, say n, and gives time and space bounds as a function of n.
- Ignores multiplicative and additive constants.
- Concerned only with the rate of growth.
- E.g., treats run times of n, n + 10000, n/2000, and 5n + 2 all the same; we use the term O(n) to refer to all of them.
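The sum-based, one-pass algorithm from this lecture can be sketched as follows (a sketch in Python; the small example array is hypothetical):

```python
def find_duplicate(arr, n):
    # arr has length n+1 and contains every integer 1..n,
    # with exactly one value x appearing twice.
    # Sum identity: sum(arr) = n(n+1)/2 + x, so x = sum(arr) - n(n+1)/2.
    s = 0
    for v in arr:      # single pass, O(1) extra space
        s += v
    return s - n * (n + 1) // 2

# Hypothetical small example: n = 6, with the value 4 repeated.
print(find_duplicate([3, 1, 4, 6, 2, 4, 5], 6))  # → 4
```

Because only a running sum is kept, this works even when the array arrives as a stream too large to fit in memory, exactly as the lecture observes.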
A Simpler Analysis
- The analysis above can be tedious for more complicated algorithms.
- In many cases we don't care about constants: 5n is about the same as 2n + 4, which is about the same as an + b for any constants a and b.
- However, we do still care about the difference in space: n is very different from 2.
- Asymptotic analysis is the solution to removing the tedium while ensuring good analysis.

What is Asymptotic Analysis?
- Informally, O notation is the leading (i.e., quickest growing) term of a formula with the coefficient stripped off.
- O is sort of a relaxed version of "<=". E.g., n is O(n), and n is also O(n^2).
- By convention, we use the smallest possible O value; i.e., we say n is O(n) rather than n is O(n^2).

More Examples
- n + 10000, n/2000, and 5n + 2 are all O(n).
- n, log n, and sqrt(n) are O(n).
- n^2, n log n, and 10n^2 + sqrt(n) are O(n^2).
- n log n + 10n is O(n log n).
- 10 log^2 n is O(log^2 n).
- 10000, 250, and 4 are O(1).
- Algorithms 1 and 2 both take time O(n).
- But Algorithm 1 uses O(n) extra space, while Algorithm 2 uses O(1) extra space.

Formal Definition of Big-O
- A function f(n) is O(g(n)) if there exist positive constants c and n0 such that f(n) <= c g(n) for all n >= n0.

Example
- Let's show that f(n) = 10n + 100 is O(g(n)), where g(n) = n.
- We need to give constants c and n0 such that f(n) <= c g(n) for all n >= n0; in other words, 10n + 100 <= cn for all n >= n0.
- We can solve for appropriate constants: 10n + 100 <= cn holds iff 10 + 100/n <= c.
- So if n > 1, then c = 110 suffices; in other words, for all n > 1, 10n + 100 <= 110n.
- So 10n + 100 is O(n).

Questions
- Express the following in O notation:
  - n^3/1000 - 100n^2 - 100n + 3
  - log(n + 100)
  - 10 log^2 n + 100

Relatives of Big-O
- "<=" : O
- "="  : Θ (Theta)
- ">=" : Ω (Omega)
- "<"  : o (little-o)
- ">"  : ω (little-omega)

When Would You Use Each of These? Examples:
- "This algorithm is O(n^2)", i.e., the worst case is Θ(n^2).
- "This algorithm is Θ(n)": the best and worst case are Θ(n).
- "Any comparison-based algorithm for sorting is Ω(n log n)." Can you write an algorithm for sorting that is Θ(n^2)?
- "This algorithm is not linear; it can take time ω(n)."

Rule of Thumb
- Let f(n), g(n) be two functions of n.
- Let f1(n) be the fastest growing term of f(n), stripped of its coefficient; define g1(n) similarly for g(n).
- Then we can say:
  - If f1(n) <= g1(n), then f(n) = O(g(n)).
  - If f1(n) >= g1(n), then f(n) = Ω(g(n)).
  - If f1(n) = g1(n), then f(n) = Θ(g(n)).
  - If f1(n) < g1(n), then f(n) = o(g(n)).
  - If f1(n) > g1(n), then f(n) = ω(g(n)).

Problems
- True or false? Justify your answer:
  - n^3 + 4 is O(n^2)
  - n log(n^3) is Θ(n log n)
  - log_3(5n^2) is Θ(log n)
  - 10^10 n^2 + n is ω(n)
  - n log n is o(n)
  - n^3 + 4 is o(n^4)

More Examples
- The following are all true statements:
  - sum_{i=1}^{n} i^2 is O(n^3), Ω(n^3), and Θ(n^3).
  - log n is o(sqrt(n)).
  - log n is o(log^2 n).
  - 10000 n^2 + 25n is Θ(n^2).

Formal Definitions
- O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 <= f(n) <= c g(n) for all n >= n0 }
- Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 <= c1 g(n) <= f(n) <= c2 g(n) for all n >= n0 }
- Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 <= c g(n) <= f(n) for all n >= n0 }
- o(g(n)) = { f(n) : for any positive constant c > 0, there exists n0 > 0 such that 0 <= f(n) < c g(n) for all n >= n0 }
- ω(g(n)) = { f(n) : for any positive constant c > 0, there exists n0 > 0 such that 0 <= c g(n) < f(n) for all n >= n0 }

Another Example
- Let f(n) = 10 log^2 n + log n and g(n) = log^2 n. Let's show that f(n) = Θ(g(n)).
- We want positive constants c1, c2, and n0 such that 0 <= c1 g(n) <= f(n) <= c2 g(n) for all n >= n0:

    0 <= c1 log^2 n <= 10 log^2 n + log n <= c2 log^2 n

- Dividing by log^2 n, we get:

    0 <= c1 <= 10 + 1/log n <= c2

- If we choose c1 = 1, c2 = 11, and n0 = 2, then the above inequality will hold for all n >= n0.

In-Class Exercise
- Show that for f(n) = n + 100 and g(n) = (1/2) n^2, f(n) != Θ(g(n)).
- What statement would be true if f(n) = Θ(g(n))?
- Show that this statement cannot be true.

Todo
- Read the syllabus.
- Visit the class web page: www.cs.unm.edu/~saia561
- Sign up for the class mailing list, cs561.
- Read Chapters 3 and 4 in the text.

CS 561 Lecture 24
Jared Saia, University of New Mexico

Outline
- All-Pairs Shortest Paths
- TSP approximation algorithm

All-Pairs Shortest Paths
- For the single-source shortest paths problem, we wanted to find the shortest path from a source vertex s to all the other vertices in the graph.
- We will now generalize this problem further, to finding the shortest path from every possible source to every possible destination.
- In particular, for every pair of vertices u and v, we need to compute the following information:
  - dist(u, v) is the length of the shortest path (if any) from u to v.
  - pred(u, v) is the second-to-last vertex (if any) on the shortest path (if any) from u to v.

Example
- For any vertex v, we have dist(v, v) = 0 and pred(v, v) = NULL.
- If the shortest path from u to v is only one edge long, then dist(u, v) = w(u -> v) and pred(u, v) = u.
- If there's no shortest path from u to v, then dist(u, v) = infinity and pred(u, v) = NULL.

Output
- The output of our shortest path algorithm will be a pair of |V| x |V| arrays encoding all |V|^2 distances and predecessors.
- Many maps contain such a distance matrix: to find the distance from, say, Albuquerque to, say, Ruidoso, you look in the row labeled "Albuquerque" and the column labeled "Ruidoso".
- In this class, we'll focus only on computing the distance array. The predecessor array, from which you would compute the actual shortest paths, can be computed with only minor additions to the algorithms presented here.

Lots of Single Sources
- The most obvious solution to APSP is to just run an SSSP algorithm |V| times, once for every possible source vertex.
- Specifically, to fill in the subarray dist(s, *), we invoke either Dijkstra's or Bellman-Ford starting at the source vertex s.
- We'll call this algorithm ObviousAPSP:

    ObviousAPSP(V, E, w):
      for every vertex s:
        dist(s, *) = SSSP(V, E, w, s)

Analysis
- The running time of this algorithm depends on which SSSP algorithm we use, run once for every vertex s.
- If we use Bellman-Ford, the overall running time is O(|V|^2 |E|) = O(|V|^4).
- If all the edge weights are positive, we can use Dijkstra's instead, which decreases the run time to Θ(|V||E| + |V|^2 log |V|) = O(|V|^3).

Problem
- We'd like to have an algorithm that takes O(|V|^3) time but which can also handle negative edge weights.
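The ObviousAPSP pattern, invoking an SSSP algorithm once per source, can be sketched with Dijkstra's algorithm (assuming nonnegative edge weights; the adjacency-list representation and the tiny example graph are ours):

```python
import heapq

def dijkstra(graph, s):
    # graph: {u: [(v, w), ...]} with nonnegative edge weights
    dist = {u: float('inf') for u in graph}
    dist[s] = 0
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue                      # stale queue entry; skip it
        for v, w in graph[u]:
            if d + w < dist[v]:           # relax edge u -> v
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

def obvious_apsp(graph):
    # One SSSP call per source vertex, as in ObviousAPSP.
    return {s: dijkstra(graph, s) for s in graph}

g = {'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}
print(obvious_apsp(g)['a']['c'])  # → 3, via a -> b -> c
```

With a binary heap, each Dijkstra call is O((|E| + |V|) log |V|), so the whole sketch matches the O(|V||E| log |V|)-ish bound discussed above, not the tighter Fibonacci-heap bound.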
- We'll see that a dynamic programming algorithm, the Floyd-Warshall algorithm, achieves this.
- Note: the book discusses another algorithm, Johnson's algorithm, which is asymptotically better than Floyd-Warshall on sparse graphs. However, we will not be discussing this algorithm in class.

Dynamic Programming
- Recall: Dynamic Programming = Recursion + Memoization.
- Thus, we first need to come up with a recursive formulation of the problem.
- We might recursively define dist(u, v) as follows:

    dist(u, v) = 0                                        if u = v
                 min over x of ( dist(u, x) + w(x -> v) )  otherwise

- In other words, to find the shortest path from u to v, try all possible predecessors x, compute the shortest path from u to x, and then add the last edge x -> v.

The Problem
- Unfortunately, this recurrence doesn't work.
- To compute dist(u, v), we first must compute dist(u, x) for every other vertex x, but to compute any dist(u, x), we first need to compute dist(u, v). We're stuck in an infinite loop.

The Solution
- To avoid this circular dependency, we need some additional parameter that decreases at each recursion and eventually reaches zero at the base case.
- One possibility is to include the number of edges in the shortest path as this third "magic" parameter.
- So define dist(u, v, k) to be the length of the shortest path from u to v that uses at most k edges.
- Since we know that the shortest path between any two vertices uses at most |V| - 1 edges, what we want to compute is dist(u, v, |V| - 1).

The Recurrence

    dist(u, v, k) = 0           if u = v
                    infinity    if k = 0 and u != v
                    min over x of ( dist(u, x, k-1) + w(x -> v) )  otherwise

The Algorithm
- It's not hard to turn this recurrence into a dynamic programming algorithm.
- Even before we write down the algorithm, though, we can tell that its running time will be Θ(|V|^4), just because the recurrence has four variables -- u, v, k, and x -- each of which can take on |V| different values.
- Except for the base cases, the algorithm is just four nested for loops:

    DP-APSP(V, E, w):
      for all vertices u:
        for all vertices v:
          if u == v: dist(u, v, 0) = 0
          else:      dist(u, v, 0) = infinity
      for k = 1 to |V| - 1:
        for all vertices u:
          for all vertices v:
            dist(u, v, k) = infinity
            for all vertices x:
              if dist(u, v, k) > dist(u, x, k-1) + w(x -> v):
                dist(u, v, k) = dist(u, x, k-1) + w(x -> v)

The Problem
- This algorithm still takes O(|V|^4), which is no better than the ObviousAPSP algorithm.
- If we use a certain divide-and-conquer technique, there is a way to get this down to O(|V|^3 log |V|) (think about how you might do this).
- However, to get down to O(|V|^3) run time, we need to use a different third parameter in the recurrence.

Floyd-Warshall
- Number the vertices arbitrarily from 1 to |V|.
- Define dist(u, v, r) to be the shortest path from u to v where all intermediate vertices (if any) are numbered r or less.
- If r = 0, we can't use any intermediate vertices, so the shortest path from u to v is just the weight of the edge (if any) between u and v.
- If r > 0, then either the shortest legal path from u to v goes through vertex r, or it doesn't.
- We need to compute the shortest path distance from u to v with no restrictions, which is just dist(u, v, |V|).

The Recurrence

    dist(u, v, r) = w(u -> v)   if r = 0
                    min( dist(u, v, r-1), dist(u, r, r-1) + dist(r, v, r-1) )  otherwise

The Algorithm

    FloydWarshall(V, E, w):
      for u = 1 to |V|:
        for v = 1 to |V|:
          dist(u, v, 0) = w(u -> v)
      for r = 1 to |V|:
        for u = 1 to |V|:
          for v = 1 to |V|:
            if dist(u, v, r-1) < dist(u, r, r-1) + dist(r, v, r-1):
              dist(u, v, r) = dist(u, v, r-1)
            else:
              dist(u, v, r) = dist(u, r, r-1) + dist(r, v, r-1)

Analysis
- There are three variables here, each of which takes on |V| possible values.
- Thus the run time is Θ(|V|^3).
- The space required is also Θ(|V|^3) as written.

Take Away
- Floyd-Warshall solves the APSP problem in Θ(|V|^3) time, even with negative edge weights.
- Floyd-Warshall uses dynamic programming to compute APSP.
- We've seen that sometimes, for a dynamic program, we need to introduce an extra variable to break dependencies in the recurrence.
- We've also seen that the choice of this extra variable can have a big impact on the run time of the dynamic program.

TSP
- A version of the TSP problem is: given a weighted graph G, what is the shortest Hamiltonian cycle of G? (A Hamiltonian cycle is a path that visits each node in G exactly once and returns to the starting node.)
- This TSP problem is NP-hard, by a reduction from Hamiltonian Cycle.
- However, there is a 2-approximation algorithm for this problem if the edge weights obey the triangle inequality.

Triangle Inequality
- In many practical problems, it's reasonable to assume that the weights c of the edges obey the triangle inequality.
- The triangle inequality says that for all vertices u, v, w in V: c(u, w) <= c(u, v) + c(v, w).
- In other words, the cheapest way to get from u to w is always to just take the edge (u, w).
- In the real world, this is often a pretty natural assumption. For example, it holds if the vertices are points in a plane and the cost of traveling between two vertices is just the Euclidean distance between them.

Approximation Algorithm
- Given a weighted graph G, the algorithm first computes an MST of G, T, and then arbitrarily selects a root node r of T.
- It then lets L be the list of the vertices visited in a depth-first traversal of T, starting at r.
- Finally, it returns the Hamiltonian cycle H that visits the vertices in the order L.
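The Floyd-Warshall algorithm from this lecture can be put in runnable form as follows (a sketch; it collapses the third index r into a single 2-D table, a standard space optimization that does not change the computed distances):

```python
def floyd_warshall(n, w):
    # Vertices are 0..n-1; w is a dict {(u, v): weight} of directed edges.
    # dist[u][v] starts as the direct edge weight (infinity if absent).
    INF = float('inf')
    dist = [[0 if u == v else w.get((u, v), INF) for v in range(n)]
            for u in range(n)]
    # Allow intermediate vertices one at a time: after the pass for r,
    # dist[u][v] is the shortest path using intermediates numbered <= r,
    # matching the recurrence above.
    for r in range(n):
        for u in range(n):
            for v in range(n):
                if dist[u][r] + dist[r][v] < dist[u][v]:
                    dist[u][v] = dist[u][r] + dist[r][v]
    return dist

# Small example with a negative edge (but no negative cycle):
w = {(0, 1): 4, (0, 2): 11, (1, 2): -2}
print(floyd_warshall(3, w)[0][2])  # → 2, via 0 -> 1 -> 2
```

Note the negative edge is handled correctly, which is exactly the advantage over repeated Dijkstra; only negative cycles are off-limits. The three nested loops make the Θ(|V|^3) bound immediate.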

