# Algorithms and Data Structures CSE 331

These 14 pages of class notes were uploaded by Donnell Kertzmann on Saturday, September 19, 2015. The notes belong to CSE 331 (Algorithms and Data Structures) at Michigan State University, taught by Staff in Fall.


## CSE 331 Summer 2002, Lecture 16 (6/21/02)

1. Computational Complexity
   a. Goal (from the first day of class): with asymptotic notation, create a rough set of classes of complexity. These can be thought of as sets of problems that have the same level of "difficulty" to solve.
   b. We have seen many different upper bounds on problems: O(log N), O(N), O(N^2), O(N^k), O(2^N).
   c. A problem is as difficult to solve as its best algorithm's running time; it belongs in the same class as other problems with the same running time. Example: sorting and heap construction are in the same class, O(N log N).
   d. For most algorithms we've looked at, the running time was a small polynomial with respect to the input size. If the input size doubles, the running time increases by only a constant factor. This is our definition of a "good" solution, and problems with such solutions are generally known as "well-solved" problems.
   e. More formally: complexity classes.
      i. Let TIME(f(n)) denote the set of problems solvable in O(f(n)) time.
      ii. Sorting, FindMinimum, and TopologicalSort are in TIME(n^2); All-Pairs Shortest Path is not in TIME(n^2).
      iii. The complexity class P = U_{k>0} TIME(n^k).
      iv. P is the set of all problems solvable in polynomial time.
      v. Polynomial time is "good".
      vi. Most problems we have encountered are in P; many more are not.
   f. Eulerian Circuits
      i. Problem: does a given graph contain a circuit that travels each edge once and only once?
      ii. Easy solution: if the degree of every vertex is even, then the answer is yes.
   g. Hamiltonian Cycles
      i. Problem: given a graph, does it contain a cycle that visits each vertex in the graph exactly once?
      ii. |V|! permutations of vertices to be checked.
   h. Traveling Salesman Problem (TSP)
      i. Problem: what is the minimum-length tour that visits each vertex of a given graph exactly once?
      ii. A simple greedy solution is not optimal.
      iii. Again, |V|! permutations of vertices.
   i. Decision Problems / Optimization Problems
      i. Hamiltonian Path is a decision problem: the answer is yes or no.
      ii. TSP is an optimization problem: the answer is an optimal tour.
      iii. We can convert an optimization problem into a decision problem.
      iv. Example (TSP): does there exist a tour of length L or less?
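The |V|! blow-up mentioned above can be made concrete with a brute-force Hamiltonian cycle checker. This is an illustrative sketch (not from the notes): it literally tries every permutation of the vertices.

```python
from itertools import permutations

def has_hamiltonian_cycle(vertices, edges):
    """Brute-force check: try every permutation of vertices (|V|! of them)."""
    edge_set = {frozenset(e) for e in edges}
    for perm in permutations(vertices):
        # Check that each pair of consecutive vertices (wrapping around)
        # is joined by an edge of the graph.
        if all(frozenset((perm[i], perm[(i + 1) % len(perm)])) in edge_set
               for i in range(len(perm))):
            return True
    return False
```

Even on a handful of vertices this enumerates every ordering, which is exactly why the decision problem is considered hard.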
   j. Verification of a Problem Solution
      i. Sorting: given a permutation, are the elements sorted?
      ii. Can be verified in O(N) time.
      iii. Find Minimum: given a list element, is it the minimum?
      iv. Can be verified in O(N) time.
      v. Hamiltonian Cycle: given a permutation of vertices, is it a legal cycle?
      vi. Can be verified in O(|V|) time: given a permutation of vertices, check that there is an edge between each pair of consecutive vertices.
      vii. TSP: given a tour, does it cover all cities in length less than L?
      viii. Can be verified in O(|V|) time: add up the length of the tour and check that it is less than L and visits all cities.
      ix. All are verifiable in polynomial time.
   k. Solving and Verifying
      i. Obviously, verifiability in polynomial time does not imply solvability in polynomial time.
      ii. Example: a Hamiltonian Path solution can be verified in polynomial time; however, there are an exponential number of possible permutations.
      iii. Solvability in O(f(n)) time does not imply verifiability in O(f(n)) time.
   l. "Nondeterministic" Algorithms
      i. There are two stages to a nondeterministic algorithm: a guessing stage and a verification stage.
      ii. Example (TSP): (1) nondeterministically guess a tour in O(N) time; (2) verify that the guessed tour is a valid solution.
      iii. Notion of being "solved": if there exists a solution to an instance I of the given problem, then some answer S that we guessed nondeterministically will lead the checking stage to detect a solution and output "yes". Conversely, if there is no solution to instance I, nothing we guess will be right, and the checking stage will always answer "no".
   m. The Class NP
      i. NP is defined by nondeterministic algorithms.
      ii. Let NTIME(f(n)) be the set of problems that can be solved in O(f(n)) time by nondeterministic algorithms.
      iii. Then the complexity class NP = U_{k>0} NTIME(n^k).
      iv. NP is the set of problems that can be solved in "polynomial time" by nondeterministic algorithms.
      v. The Hamiltonian Path Problem and TSP are in NP.
      vi. The guessing stage can only take polynomial time.
      vii. The size of the guess must be polynomial with respect to the input size.
      viii. The guess must be verified in polynomial time.
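The TSP verification step above can be sketched in a few lines. This is an illustrative helper (the `dist` nested-dict representation is my assumption, not from the notes): it checks in O(N) that a proposed tour is a permutation of the cities and that its total length is at most L.

```python
def verify_tour(tour, cities, dist, L):
    """Verify in O(N) that `tour` visits every city exactly once
    and that its total (cyclic) length is at most L.
    `dist` maps city -> {city: distance} (assumed representation)."""
    if sorted(tour) != sorted(cities):
        return False  # not a permutation: misses or repeats a city
    length = sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
                 for i in range(len(tour)))
    return length <= L
```

Note the contrast: verifying one candidate tour is linear, while finding a good tour from scratch appears to require examining exponentially many candidates.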
   n. P and NP
      i. Obviously, P is a subset of NP.
      ii. It is believed that P != NP.
      iii. Implication: a problem in NP that is not in P would require more than polynomial time to solve.
      iv. Famous open problem; the answer is as yet unknown.
   o. NP-complete
      i. A special property of a subset of NP (sometimes referred to as its own complexity class), defined as follows: if problem L is NP-complete, then it must be in the set NP, and it must be as hard as any problem in NP.
      ii. In other words, for any given problem L that is NP-complete, if you find a polynomial-time solution for L, then you have found a polynomial-time solution for all problems in NP.
      iii. These are the hardest problems of NP.
      iv. Many everyday problems have been proven NP-complete: Hamiltonian Cycle, TSP, Longest Path, etc.
   p. P = NP?
      i. It would be nice to prove this; to date, no one has.

## CSE 331 Summer 2002, Lecture 3

1. Quiz
2. Asymptotic Notation
   a. Asymptotically tight bound: Theta.
3. Recurrence relations
   a. Bounding methods
      i. Master Method/Theorem
      ii. Precedence troubles: N^((log_b a) - epsilon), not N^(log_b (a - epsilon)).
      iii. Doesn't matter for some cases; does for others.
4. Algorithm examples from Chapter 2
   a. Maximum Subsequence Sum (pp. 24, 26, 27, 29; Algorithms 1-4). Notes:
      i. Algorithms 2, 3: "recursify", divide and conquer.
      ii. Algorithm 3: cute.
      iii. Algorithm 4: lean and mean.
   b. Binary Search (p. 31)
      i. Like guessing numbers.
      ii. Searching is good: O(log N). Update is not so good: O(N).
      iii. We will see better data structures for searching.
   c. Pow (p. 33): another cute recursion example.
5. Linear Data Structures (Chapter 3)
   a. Abstract Data Type basics
      i. A well-defined interface is important.
      ii. Large programs heavily use instantiated ADTs, so running time is important.
      iii. Generally we will look at the running time of the interfaces for comparison.
   b. List
      i. Generic interfaces: Insert, Delete, Find.
      ii. Somewhat malleable.
   c. List: contiguous allocation (arrays)
      i. Generally static in size; this can make Insert and Delete expensive, O(N).
      ii. Generally not used for more than simple lists.
   d. List: pointers (linked lists)
      i. Uses an extra data field to store a pointer to the next list element.
      ii. Insert and Delete become O(1).
      iii. Find is still O(N).
      iv. Doubly-linked lists allow reverse traversal; often fast and convenient.
      v. Example: Radix Sort
         A. AKA card sort; a generalized bucket sort.
         B. Bucket sort: (1) read the input, (2) stuff elements into individual array locations, (3) read out the array locations in order.
         C. Bucket sort is bad for big numbers: 2^32 buckets for most machines.

## CSE 331 Summer 2002, Lecture 13 (6/14/02)

1. Breadth-First Search
   a. Akin to the tree case: we pick a node, traverse to all of its children, then do their children.
   b. Pseudocode:

```
bfs(graph g, vertex s)
{
    foreach vertex u in g.vertices - {s}       // Initialization begins
    {
        color[u] = white
        distance[u] = infinity
        predecessor[u] = NULL
    }
    color[s] = grey
    distance[s] = 0
    predecessor[s] = NULL
    q = {s}                                    // Initialization ends
    while (!q.empty())                         // While we have not visited all vertices
    {
        u = q.head()                           // Take the first one not yet visited
        foreach v adjacent to u                // Look at its neighbors
        {
            if (color[v] == white)             // If they are not yet visited:
            {
                color[v] = grey                // Change their color to grey
                distance[v] = distance[u] + 1  // Update their distance from s
                predecessor[v] = u             // Tell them their predecessor in the BFS tree
                q.enqueue(v)                   // Schedule traversal from them
            }
        }
        // We might want to do something with vertex u here; we have scheduled
        // traversal to its neighbors and are about to move on to them.
        q.dequeue()                            // Dequeue the completed vertex
        color[u] = black                       // We're done with it; color it black
    }
}
```

   c. Running time: initialization requires Theta(|V|); the main loop requires |V| iterations; the inner loop requires an average of |E|/|V| iterations. Counting the outer loop's statements, the loop requires a total of about |V| + |E|; summing this with the initialization section we get Theta(|V| + |E|), or O(|V| + |E|).
   d. In the process, a BFS tree (or predecessor tree) is formed, sometimes useful in other algorithms.
   e. Note: BFS does not discover components not connected to the given BFS root.

2. Depth-First Search
   a. Akin to the tree case: we pick a node, traverse to its first child, then to its first child, as far as we can go.
   b. Pseudocode:

```
dfs(graph g)                        // Sometimes dfs(graph g, vertex s)
{
    foreach vertex u in g.vertices  // Initialization begin
    {                               // (Sometimes g.vertices - {s})
        color[u] = white
        predecessor[u] = NULL
    }
    time = 0                        // Initialization end
    foreach vertex u in g.vertices  // For every vertex, try to descend
        if (color[u] == white)      // (Sometimes just dfs_visit(s))
            dfs_visit(u)
}

dfs_visit(vertex u)
{
    color[u] = grey                 // The vertex has been visited; color it
    discovery[u] = ++time           // Timestamp the discovery
    foreach v adjacent to u         // Look at its neighbors
    {
        if (color[v] == white)      // If they have not been visited:
        {
            predecessor[v] = u      // Set their predecessor
            dfs_visit(v)            // Visit them
        }
    }
    color[u] = black                // Finished with all of u's neighbors
    finish[u] = ++time              // Set u's finish time
}
```
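The BFS pseudocode above translates directly into runnable Python. This is an illustrative sketch, not the course's official code: the white/grey/black colors are folded into a single "distance is still infinity" check, which is equivalent for BFS.

```python
from collections import deque
import math

def bfs(adj, s):
    """BFS from s over an adjacency-list graph adj (vertex -> list of neighbors).
    Returns (distance, predecessor) maps; unreachable vertices keep distance inf."""
    distance = {u: math.inf for u in adj}
    predecessor = {u: None for u in adj}
    distance[s] = 0
    q = deque([s])
    while q:
        u = q.popleft()                  # dequeue the completed vertex
        for v in adj[u]:
            if distance[v] == math.inf:  # "white": not yet visited
                distance[v] = distance[u] + 1
                predecessor[v] = u
                q.append(v)              # schedule traversal from v
    return distance, predecessor
```

As in the notes, unreachable components simply never enter the queue, so their distances stay infinite.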
   c. Running time: initialization requires about Theta(|V|) operations. dfs()'s main loop executes once for every vertex, giving us |V| operations; for each iteration it calls dfs_visit(), which executes about an average of |E|/|V| operations. Including the check in dfs()'s main loop and initialization, we come to Theta(|V| + |E|), or O(|V| + |E|), operations.
   d. In the process a DFS forest is created, sometimes useful in other algorithms.

3. Strongly-Connected Components (DFS-based)
   a. A strongly connected component of a graph is a subgraph in which all vertices are reachable from each other.
   b. In the DFS of any such component, all of the component's vertices must be in the same tree.
   c. One vertex, by necessity, will have to serve as a DFS ancestor of all the component's vertices; call this vertex p. By definition, p will have the maximum finish time of any vertex in the component set.
   d. So upon performing a DFS, we know that we can find at least one such p: the vertex with the highest-valued finish time.
   e. But how to find all vertices capable of reaching it? We know which are reachable from it; they are in our predecessor subtree.
   f. If we reverse the directed edges, forming G^T, we can find just this set of edges by performing another DFS.
   g. Once we find the vertices capable of reaching the current p vertex, we can eliminate them from consideration (they are their own strongly-connected component) and report them as an SCC.
   h. Now we search for the next strongly-connected component by doing the same DFS traversal from the vertex with the next-highest-ranked finish time; it will be the next p.
   i. One of the classic uses of DFS.
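The two-pass procedure sketched above (record finish times, then DFS the reversed graph in decreasing finish-time order) is commonly known as Kosaraju's algorithm; the notes describe it without naming it. A minimal runnable sketch:

```python
def strongly_connected_components(adj):
    """Two-pass DFS (Kosaraju's algorithm) on a directed adjacency-list graph."""
    visited, order = set(), []

    def dfs1(u):
        visited.add(u)
        for v in adj[u]:
            if v not in visited:
                dfs1(v)
        order.append(u)              # appended at u's finish time

    for u in adj:
        if u not in visited:
            dfs1(u)

    rev = {u: [] for u in adj}       # build G^T
    for u in adj:
        for v in adj[u]:
            rev[v].append(u)

    seen, sccs = set(), []

    def dfs2(u, comp):
        seen.add(u)
        comp.append(u)
        for v in rev[u]:
            if v not in seen:
                dfs2(v, comp)

    for u in reversed(order):        # highest finish time first: each root is a "p"
        if u not in seen:
            comp = []
            dfs2(u, comp)
            sccs.append(comp)
    return sccs
```

Each tree found in the second pass is exactly one strongly-connected component, matching the argument in points c-g above.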
4. Minimum Spanning Tree: Problem Definition
   a. Problem: given the weighted graph G, find an acyclic subgraph of G with minimum weight that connects (or "spans") all the vertices. Note: this is not possible for disconnected graphs.
   b. A cut C = (S, V - S) is defined on graph G = (V, E) by a set S, a proper subset of V, and its complement set V - S. An edge is said to "cross the cut" defined by S if one of its endpoints is in S and the other is in V - S.
   c. An edge is a "light edge" if its weight is the minimum of any edge crossing a cut.
   d. We would like to show that for any minimum spanning subtree M = (V_M, E_M), adding the light edge crossing the cut defined by V_M is "safe", in that it creates a new minimum spanning subtree.
   e. Proof: Let w(a, b) be the weight (or cost) of the edge between any two vertices a and b. Let w(C) be the total weight (or cost) of all edges in tree C. Let (u, v) and (x, y) be edges which both cross the cut, with w(u, v) <= w(x, y). Assume that (x, y) is part of a minimum spanning tree T which includes M, but that (u, v) is not. Since they cross the same cut, addition of (u, v) to T would form a cycle in T including (x, y). Removing (x, y) disconnects T; since they cross the same cut, re-adding (u, v) would reconnect T. Let T' = T - (x, y) + (u, v). Since we know that w(u, v) <= w(x, y), it must also be true that w(T') <= w(T). Thus T' must also be a minimum spanning tree.
   f. So the basic algorithm works generally as follows:

```
set S = {}
while S does not form a spanning tree
    find an edge (u,v) that is safe for S
    S = S union {(u,v)}
```

   g. Two main algorithms exist; both grow an MST piece by piece by adding light edges which cross the cut between the current minimum spanning subtree and the rest of the graph.
5. Minimum Spanning Trees: Kruskal's Algorithm
   a. Kruskal's algorithm grows an MST by forming larger and larger forests. It initializes by including all vertices as single-vertex trees in its forest. From there it chooses to include the lowest-cost edge (a safe edge) which does not create a cycle in any tree in its forest. It repeats this step |V| - 1 times.
   b. Commonly implemented using set objects.
   c. Rough pseudocode:

```
kruskal(graph g, weight w)
{
    set allsets                            // Holder of all our trees
    for each vertex v in g.vertices        // Create a tree for each vertex
        allsets.add(create_set_from(v))
    g.edges = sort(g.edges, w, increasing) // Sort edges by weight
    for each edge (u,v) in g.edges         // Pull out edges in order
        if ((s1 = allsets.findset(u)) !=   // If u and v are not in
            (s2 = allsets.findset(v)))     //   the same set (tree),
            allsets.union(s1, s2)          //   it is safe to put them in the same tree
}
```

   d. Running time: without investigating the set class it's difficult to see; however, initialization requires O(|V|), sorting the edges requires O(|E| log |E|), and there are O(|E|) operations on allsets, each of which is known to require O(log |E|) time. Summing these gets us Theta(|E| log |E|), or O(|E| log |E|).
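The set objects in the pseudocode above can be realized with a concrete union-find (disjoint-set) structure. This is an illustrative sketch, with path halving as one of several standard find optimizations; edges are (weight, u, v) tuples of my choosing, not the notes' representation.

```python
def kruskal(vertices, edges):
    """Kruskal's MST: sort edges by weight; union-find rejects cycle-forming edges.
    `edges` is a list of (weight, u, v) tuples. Returns the chosen MST edges."""
    parent = {v: v for v in vertices}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving keeps trees shallow
            v = parent[v]
        return v

    mst = []
    for w, u, v in sorted(edges):          # pull out edges in increasing weight order
        ru, rv = find(u), find(v)
        if ru != rv:                       # safe edge: endpoints in different trees
            parent[ru] = rv
            mst.append((w, u, v))
    return mst
```

With union by rank and path compression the set operations become nearly constant, but even this simple version respects the O(|E| log |E|) bound since sorting dominates.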
6. Minimum Spanning Trees: Prim's Algorithm
   a. Prim's algorithm grows an MST by forming larger and larger minimum spanning subtrees. It initializes a subtree of the graph's MST by arbitrarily choosing a root vertex. For this subtree of the MST, it examines its adjacent vertices on the other side of the cut to find the light edge. When found, it adds the vertex holding the light edge and grows its vertex set by one. It repeats this step until a tree of size |V| - 1 edges is formed.
   b. Pseudocode:

```
prim(graph g, weight w, vertex r)
{
    priorityqueue q = g.vertices              // Initialization begin
    for each vertex v in q.elements
    {
        q.increasekey(v, infinity)
        key[v] = infinity
    }
    q.decreasekey(r, 0); key[r] = 0
    predecessor[r] = NULL                     // Initialization end
    while (!q.empty())                        // While there are still vertices:
    {
        vertex u = q.deletemin()              // Take the minimum
        for each vertex v adjacent to u       // We can reach its neighbors, so
            if (v is in q && w(u,v) < key[v]) // if their key is made less by
            {                                 // this new edge, update:
                predecessor[v] = u            // adjust their predecessor and
                q.decreasekey(v, w(u,v))      // percolate them up;
                key[v] = w(u,v)               // record the new key
            }
    }
}
```

   c. Running time: initialization requires O(|V| log |V|); the main loop requires |V| deletemin operations at O(log |V|) each plus O(|E|) decreasekey operations at O(log |V|) each, for a total of O((|V| + |E|) log |V|). If |E| >> |V|, |E| dominates |V| and we reach O(|E| log |V|). A faster bound of O(|E| + |V| log |V|) is known to be possible using Fibonacci heaps.

7. Single-Source Shortest Path: Problem Definition
   a. Given as input a weighted graph G = (V, E) and a distinguished source/root vertex s, find the shortest weighted path from s to every other vertex in G.

8. SS Shortest Path Problem sans cost: BFS-Based Algorithm
   a. Recall the BFS algorithm: each level in the graph away from the source vertex receives one unit greater distance than the previous. The distance array in the pseudocode above stores this value.
   b. If all edges have unit cost, there can be no shorter path to any given vertex than that which BFS finds first; if there were, we would have found and marked it in a previous iteration.
   c. Thus BFS itself provides a shortest path algorithm for non-edge-weighted graphs in O(|V| + |E|).

9. SS Shortest Path Problem: Dijkstra's Algorithm
   a. Similar to Prim's algorithm: it builds a subgraph of the given graph, starting from the given vertex, by forming a set of vertices connected by edges of least weight.
   b. Pseudocode:

```
edge_evaluate(vertex u, vertex v, weight w, priorityqueue q)
{
    // Given two vertices and a weight function:
    if (distance[v] > distance[u] + w(u,v)) // If v's current distance exceeds
    {                                       // the distance through u,
        distance[v] = distance[u] + w(u,v)  // use the path with edge (u,v) instead
        if (v is in q)                      // Test can be done with an array in O(1)
            q.decreasekey(v, distance[v])   // If v is in q, percolate it up
        predecessor[v] = u                  // Change the predecessor
    }
}

dijkstra(graph g, weight w, vertex s)
{
    foreach vertex v in g.vertices          // Initialization begin
    {
        distance[v] = infinity
        predecessor[v] = NULL
    }
    distance[s] = 0
    set S = {}                              // Initialization end
    priorityqueue q = g.vertices            // Keyed on the distance metric
    while (!q.empty())                      // While there are still vertices:
    {
        u = q.deletemin()                   // Grab the lowest
        S = S union {u}                     // Add it to our set
        foreach vertex v adjacent to u      // For all adjacent to the new vertex,
            edge_evaluate(u, v, w, q)       //   re-evaluate their distance from s
    }
}
```
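The pseudocode above can be sketched in runnable Python. One deviation, flagged here as an implementation choice: Python's `heapq` has no decrease-key, so this sketch pushes duplicate entries and lazily skips stale ones, which gives the same answers within the same asymptotic bound.

```python
import heapq
import math

def dijkstra(adj, s):
    """Dijkstra's algorithm; adj maps u -> list of (v, weight) pairs.
    Uses lazy deletion in place of decrease-key (heapq has none)."""
    distance = {u: math.inf for u in adj}
    distance[s] = 0
    pq = [(0, s)]
    done = set()
    while pq:
        d, u = heapq.heappop(pq)        # grab the lowest
        if u in done:
            continue                    # stale entry: a shorter path already won
        done.add(u)                     # add u to our settled set S
        for v, w in adj[u]:             # re-evaluate each neighbor's distance
            if d + w < distance[v]:
                distance[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return distance
```

With nonnegative weights each vertex is settled with its final distance; the negative-weight questions raised below are exactly where this invariant breaks.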
   c. Running time: initialization requires O(|V|); building the priority queue requires O(|V| log |V|) operations; the main loop executes O(|V|) times; the inner loop seeks each edge adjacent to the vertex chosen by the outer loop, and therefore runs a total of |E| times, or an average of |E|/|V| per iteration; edge_evaluate() runs in O(log |V|) time. Adding these together, we arrive at a total of O(|V| + |V| log |V| + |E| log |V|), i.e. O((|V| + |E|) log |V|). Improvement to O(|E| + |V| log |V|) is possible using more complex data structures (Fibonacci heaps).
   d. What about negative edge weights? What happens?
   e. What about negative-weight cycles?

10. Graph-Theoretic Networks
    a. A "network" in graph-theoretic terms is quite different in model from the networks we use daily to exchange data. A graph-theoretic network is defined as a directed graph with two specially marked vertices: a source s and a sink t. A source should have indegree zero; a sink should have outdegree zero.
    b. A flow network adds a capacity function (similar to the weighting function of weighted graphs) that maps from edges to edge capacities, and the restriction that at any vertex v that is not s or t, the total incoming flow must equal the total outgoing flow.

## CSE 331 Summer 2002, Lecture 16 (6/21/02)

1. Dynamic Programming
   a. Studied examples: Fibonacci recursion lab.
   b. If we think of recursion as being a "top-down" computational tree traversal, dynamic programming is a sort of bottom-up cousin that uses tables to avoid performing repeated computations.
   c. Essentially involves recursion and sub-solution memoization for later use.
   d. DP example: the All-Pairs Shortest Path Problem and the Floyd-Warshall algorithm.
      i. Pseudocode:

```
floyd_warshall(weightmatrix w)
{
    n = w.numrows
    d[0] = w
    for (k = 1; k <= n; k++)
        for (i = 1; i <= n; i++)
            for (j = 1; j <= n; j++)
                d[k][i][j] = min(d[k-1][i][j], d[k-1][i][k] + d[k-1][k][j])
    return d[n]
}
```

      ii. Simple example.
      iii. Running time: O(|V|^3).
      iv. Running time for Dijkstra is O((|V| + |E|) log |V|), but for dense graphs this becomes O(|V|^2 log |V|). All-pairs Dijkstra for dense graphs is then O(|V|^3 log |V|), so Floyd-Warshall is likely an improvement for this case.
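The Floyd-Warshall recurrence above can be run directly. This sketch folds the d[k] layers into a single matrix, a standard space optimization that is safe because entries involving intermediate vertex k do not change during round k.

```python
import math

def floyd_warshall(w):
    """All-pairs shortest paths on an n x n weight matrix.
    math.inf marks a missing edge; w[i][i] should be 0. O(|V|^3) time."""
    n = len(w)
    d = [row[:] for row in w]        # copy so the input matrix is untouched
    for k in range(n):               # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```

The bottom-up table fill is exactly the DP pattern described above: each round k reuses the memoized round k-1 answers instead of recursing.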
   e. DP conclusion: DP is powerful but can be quite tricky. It generally involves dividing a problem up into subproblems, coming up with a recurrence relation for solving it, and then computing the recurrence in a bottom-up fashion.

2. Backtracking Algorithms (pp. 395-407)
   a. Turnpike Reconstruction Problem
      i. Problem: given distances between points, reconstruct, if possible, a configuration of n points whose interpoint distances match those of the given set.
      ii. In other words, say you are given only the distances between the exits on I-96. This problem asks you to reconstruct a valid set of exit mile-marker numbers to match the distances given.
      iii. We set the first point (exit) at 0; otherwise we could come up with an infinite number of trivial variations of any single configuration.
      iv. The algorithm to accomplish this is essentially an exhaustive search of all the possibilities, "backtracking" past those sets of possibilities we know cannot be possible using the information at hand.
   b. Turnpike Reconstruction Solution (pp. 395-400)

```
place(distancelist distances, placementlist placements)
{
    // Assign the first two exits: the largest value must be
    // between the beginning and the end.
    d = distances.highest()
    dc = distances - {d}
    foreach placement1 in placements
    {
        // Try placing a point (exit) d units away from placement1.
        // If this is possible, let p = placement1 + d.
        if (there is a distance of distance(placement1, p) in dc)
        {
            dc = dc - distance(placement1, p)
            foreach placement2 in placements - {placement1}
            {
                if (there is a distance of distance(placement2, p) in dc)
                    delete it
                // Otherwise, we did not succeed this time
            }
            if (we succeeded every time and dc is empty)
                return our placement
            if (we succeeded every time and dc is not empty)
                call place(dc, placements + {p})
            dc = distances - {d}
        }
    }
    // Otherwise, if dc is still non-empty, return failure
}
```

   c. MiniMax Gaming Strategy (pp. 400-407)
      i. MiniMax is a name for an algorithm which searches for a best move among a finite set of options. It searches for this best move by choosing one of the set and then searching for its opponent's best move; it "changes hats" (or turns the tables on itself) as if it did take the given option.
      ii. But how does it determine which is better? It recursively assigns a value to each vertex on the way up. For example, in chess, taking a given piece may give a certain value; check may give another. These values are sent up the call stack: at each level in the recursion tree, one of the set of options is declared best and its value is sent up.
      iii. In other words, to determine what the best move is, the computer essentially tries all available options. It tries them by picking each one and recursing on it, playing the human role at the next level, which then recurses and plays the computer role, and so on.
      iv. MiniMax gets its name from the recursion: at successive levels in the recursion, it attempts to minimize the value of the opponent and then maximize among its own options.
   d. MiniMax Gaming Algorithm (pp. 402-406)

```
movevalue_t minimax(options_t options, minimax_t mode)
{
    if (there are no more options)
        return (draw, drawvalue)
    if (we can win in one move)
        return (win, winvalue)
    foreach option in options
    {
        if (mode == minimize)
            movevalue = minimax(options - option, maximize)
        else
            movevalue = minimax(options - option, minimize)
        if ((mode == minimize && movevalue < bestmovevalue) ||
            (mode == maximize && movevalue > bestmovevalue))
            bestmovevalue = movevalue
    }
}
```

   e. MiniMax improvement: alpha-beta pruning (p. 404)
      i. By noticing a simple property of the tree during the recursion, we can save some effort.
      ii. Think of these choices as forming a tree from which we choose a minimum at one level and a maximum at the next. We recurse downward, finish some parts of the recursion tree, and obtain values for their nodes; at some point we will have not yet chosen a final minimum (maximum) value for the current vertex from among the options. If at some point the current minimum (maximum) is less than (greater than) the current maximum of the level below it, no more exploration of that level needs to be done.
      iii. In other words, consider any two levels in the tree in which recursion is active. The lower level will choose a value at least as large (small) as the one it is currently holding. The level above it has a current minimum (maximum), and it will choose a value at least as small (large) as that. So if the value it is currently holding as minimum (maximum) is less than (greater than) the value currently held in the node below as the maximum (minimum), we know we need no longer consider that node or anything below it.
      iv. The name alpha-beta pruning comes from the two types of pruning being done: pruning of a node looking for a minimum (an alpha-pruning) or of a node looking for a maximum (a beta-pruning).
   f. Backtracking algorithms are often used when the problem we are looking at can be described reasonably well by a decision tree to be searched for a solution, and no better methods (methods that might have some knowledge of the intrinsic organization of the data) exist for finding solutions. If this is the case, we can use some techniques for pruning the search space, such as alpha-beta pruning and others.
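Minimax with alpha-beta pruning can be demonstrated on a toy game of my choosing (not from the notes): a Nim variant where players alternately take 1 or 2 stones and taking the last stone wins. The alpha/beta window implements exactly the cutoff described above.

```python
def minimax(pile, maximizing, alpha=float("-inf"), beta=float("inf")):
    """Alpha-beta minimax for take-1-or-2 Nim; taking the last stone wins.
    Returns +1 if the maximizing player can force a win from this state,
    -1 otherwise (value is always from the maximizer's point of view)."""
    if pile == 0:
        # The previous player took the last stone and won.
        return -1 if maximizing else 1
    best = float("-inf") if maximizing else float("inf")
    for take in (1, 2):
        if take > pile:
            break
        value = minimax(pile - take, not maximizing, alpha, beta)
        if maximizing:
            best = max(best, value)
            alpha = max(alpha, best)
        else:
            best = min(best, value)
            beta = min(beta, best)
        if beta <= alpha:   # prune: the opponent will never allow this branch
            break
    return best
```

Piles that are multiples of 3 are losing positions for the player to move, which the search confirms; the pruning changes only how much of the tree is explored, never the value returned.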
## CSE 331 Summer 2002, Lecture 1

1. Algorithms, Data Structures & Computer Science
   a. The core of computer science, one of the youngest of the mathematical disciplines.
   b. Definitions, purposes.
2. Today: Functions & Rates of Growth: O, Omega, Theta, and o
   a. An algorithm is good, bad, better than, best, or worst: how? In terms of what?
   b. The requirement of notation for comparison.
   c. Definitions. Let n, c, c1, c2 be positive integers; let all functions be of non-negative range.
      i. T(N) = O(f(N)) iff there exist c, n such that for all N > n, T(N) <= c * f(N).
      ii. T(N) = Omega(g(N)) iff there exist c, n such that for all N > n, T(N) >= c * g(N).
      iii. T(N) = Theta(h(N)) iff there exist c1, c2, n such that for all N > n, c2 * h(N) <= T(N) <= c1 * h(N); equivalently, T(N) = O(h(N)) and T(N) = Omega(h(N)).
      iv. T(N) = o(p(N)) iff T(N) = O(p(N)) and T(N) != Theta(p(N)).
   d. Basic examples.
   e. Rules. If T1(N) = O(f(N)) and T2(N) = O(g(N)):
      i. T1(N) + T2(N) = max(O(f(N)), O(g(N)))
      ii. T1(N) * T2(N) = O(f(N) * g(N))
      iii. If T(N) is a polynomial of degree m, then T(N) = Theta(N^m).
      iv. log^k N = O(N) for any constant k.
3. Examples.
4. Style: the good, the bad, and the just plain wrong.
5. Assumed model of computation: a standard computer (Turing machine).
6. General rules: statements, control constructs; examples.
7. Recursion, recurrence relations
   a. Recurrence relation formation example: Fibonacci.
   b. Solving: three general methods: substitution, iteration, master.
      i. Substitution.

## CSE 331 Summer 2002, Lecture 4 (5/20/02)

1. Sorting
   a. Internal vs. external: most algorithms we study will be internal.
   b. Array vs. linked list: we will assume use of arrays.
   c. Records and keys
      i. The record is the actual data, generally much larger than the key.
      ii. The key is what is used to perform the comparison.
      iii. Pros: aids comprehension, simplifies analysis. Cons: a hidden memory-fetch factor.
   d. Type: elements may be of any comparable type (names, numbers, etc.); we'll generally assume integers.
   e. Code
      i. C-like.
      ii. The array to be sorted will be a.
      iii. The function swap(a, i, j) simply swaps positions i and j in array a.
2. Bubble Sort
   a. Likely the simplest of all. (Why "bubble"? I don't exactly know.)
   b. Algorithm: progress through the array, comparing each two elements encountered. If the larger one is left of the smaller, the elements are swapped. Repeat this array scan until no more element swaps are needed.

```
done = 0
while (!done)
{
    i = 0
    done = 1
    while (i < size-1)
    {
        if (a[i] > a[i+1])
        {
            done = 0
            swap(a, i, i+1)
        }
        i++
    }
}
```

   c. Correctness: does this work? Why?
   d. Worst-case analysis: when does the worst case occur? How many iterations?
   e. Comparison: when is this worse than bubble sort? Better?
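The nested-while pseudocode above translates almost line for line into runnable Python; this sketch sorts in place and returns the array for convenience (the return is my addition).

```python
def bubble_sort(a):
    """Repeatedly sweep the array, swapping adjacent out-of-order pairs,
    until a full pass makes no swaps. Sorts a in place; also returns it."""
    done = False
    while not done:
        done = True
        for i in range(len(a) - 1):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]  # swap(a, i, i+1)
                done = False
    return a
```

The worst case is a reverse-sorted array, which forces about N passes of N comparisons each, i.e. O(N^2).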
