
CS224 Midterm Exam Study Guide
CS 224, Algorithm Design and Analysis, University of Vermont, Spring 2016. Instructor: Byung Lee. Notes by Emilie Dzwonar, uploaded March 22, 2016.

The emphasis of the exam is on concepts and intuition and on a review of the topics studied in class. Review the lecture slides and reinforce your understanding with the textbook. The textbook sections relevant to each topic are shown in parentheses at the end of the topic.

Chapter 2. Basics of Algorithm Analysis

o Justify the argument that an algorithm is efficient if its running time complexity can be expressed as a polynomial function of the problem size. (2.1)
  o A poly-time algorithm has the desirable scaling property: there exist constants c > 0 and d > 0 such that, on every input of size N, its running time is bounded by c·N^d steps.
  o Justification: using a poly-time algorithm works in practice because the algorithms people develop almost always have small constants (c) and small exponents (d).
  o A problem is "computationally tractable" if there is a worst-case poly-time algorithm that solves it.

o Clarify the distinction among the running time complexities expressed in big-Oh (O), big-Omega (Ω), and Theta (Θ) notation. (2.2)
  o Big-Oh (O): upper bound; the running time grows no faster than O(f(n)).
  o Big-Omega (Ω): lower bound on the running time.
  o Theta (Θ): tight bound; the running time grows no faster and no slower than Θ(f(n)).

o Formal definitions and proofs of running time complexities. (2.2)
  Formal definitions
    o Big-Oh: T(n) = O(f(n)) if there exist constants c > 0 and n0 ≥ 0 such that for all n ≥ n0 we have T(n) ≤ c · f(n).
    o Big-Omega: T(n) = Ω(f(n)) if there exist constants c > 0 and n0 ≥ 0 such that for all n ≥ n0 we have T(n) ≥ c · f(n).
    o Theta: T(n) = Θ(f(n)) if T(n) is both O(f(n)) and Ω(f(n)).
  Proofs
    (2.5) Let k be a fixed constant, and let f1, f2, . . . , fk and h be functions such that fi = O(h) for all i. Then f1 + f2 + . . . + fk = O(h).
    Let f be a polynomial of degree d in which the leading coefficient ad is positive. Then f = O(n^d).
    Proof. Write f = a0 + a1·n + a2·n^2 + . . . + ad·n^d, where ad > 0. The upper bound is a direct application of (2.5). First, notice that the coefficients aj for j < d may be negative, but in any case we have aj·n^j ≤ |aj|·n^d for all n ≥ 1. Thus each term in the polynomial is O(n^d). Since f is a sum of a constant number of functions, each of which is O(n^d), it follows from (2.5) that f is O(n^d).
    (A small numerical sanity check of the big-Oh definition appears at the end of this chapter's notes.)

o Analyze an algorithm to derive its asymptotic running time complexity. (2.2)
  Example: deriving the asymptotic running time of the breadth-first search algorithm; total = O(n) + O(m) = O(m + n).
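As a concrete companion to the formal big-Oh definition above, here is a minimal Python sanity check. The polynomial T(n) = 3n^2 + 5n + 2 and the witnesses c = 4, n0 = 6 are hypothetical values chosen for illustration; they are not from the study guide.

    # Numerical sanity check of the formal big-Oh definition above, using the
    # hypothetical polynomial T(n) = 3n^2 + 5n + 2 and candidate witnesses c = 4, n0 = 6
    # (these particular values are illustrative, not from the study guide).
    def T(n):
        return 3 * n**2 + 5 * n + 2      # degree-2 polynomial with positive leading coefficient

    def f(n):
        return n**2                      # the claimed bound: T(n) = O(n^2)

    c, n0 = 4, 6                         # candidate witnesses

    # A finite check is only evidence, not a proof; the proof is the algebraic argument
    # above (each term a_j * n^j <= |a_j| * n^d for n >= 1).
    assert all(T(n) <= c * f(n) for n in range(n0, 10000))
    print("T(n) <= 4 * f(n) holds for all checked n >= 6, consistent with T(n) = O(n^2)")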
Chapter 3. Graphs

o Graph representation: adjacency list, adjacency matrix, and their comparison.
  Adjacency list
    o Checking whether (u, v) is an edge takes O(deg(u)) time, where deg(u) is the number of neighbors of u.
    o Identifying all edges takes Θ(m + n) time.
  Adjacency matrix
    o Checking whether (u, v) is an edge takes Θ(1) time.
    o Identifying all edges takes Θ(n^2) time.

o Graph connectivity: completely connected graph, minimally connected graph, and the relationship between the number of edges and the number of vertices in each of the two.
  Minimally connected graph: a graph with the smallest number of edges needed to create a path to every node; the removal of any single edge would leave a disconnected graph.
    Number of edges: n - 1, where n is the number of nodes/vertices.
  Completely connected graph: a graph in which each pair of vertices is directly connected by an edge.
    Number of edges:
      o Undirected: n(n - 1)/2, where n is the number of nodes.
      o Directed: n(n - 1).

o Analyze the running time complexity of the breadth-first search algorithm when the graph is represented using an adjacency list or an adjacency matrix. (3.2)
  BFS with an adjacency list: O(m + n)
    o The initialization (the first line) takes O(n).
    o There are at most n lists L[i] to set up, so initializing the empty lists L[i+1] takes O(n) total time over the execution of the While loop.
    o The "If Discovered[v] ..." test takes O(1).
    o Each node u is visited exactly once during the execution of the While loop, and for each node there are deg(u) incident edges to examine.
    o So the total running time of the While loop is proportional to the sum of the degrees over all nodes, which is Σu∈V deg(u) = 2m = O(m).
    o Total: O(m + n).
  BFS with an adjacency matrix: O(n^2)
    o The initialization (the first line) takes O(n).
    o There are at most n lists L[i] to set up, so initializing the empty lists L[i+1] takes O(n) total time over the execution of the While loop.
    o The "If Discovered[v] ..." test takes O(1).
    o Each node u is visited exactly once during the execution of the While loop, and for each node there are n matrix entries to check.
    o So the total running time of the While loop is proportional to n^2.
    o Total: O(n^2) + O(n) = O(n^2).

o Give an argument for or against building a recursive algorithm for BFS. (3.2)
  Because BFS most often uses a queue (FIFO) data structure to keep track of discovered nodes, and it processes one layer of nodes at a time, a recursive algorithm would not be ideal. A recursive algorithm makes more sense for DFS, whose strategy of searching as deep as possible before backing up is inherently recursive.

o Give the key distinction between the queue structures used for BFS and DFS. (3.2)
  BFS uses a FIFO queue, which removes elements in the order they were added, while DFS uses a LIFO queue (i.e., a stack), which removes the most recently added element first.

o Outline the bipartite testing algorithm based on the BFS algorithm and show its correctness. (3.4)
  The color-assignment lines (Set Color[...]) and the final edge scan have been added to the standard BFS algorithm to test for bipartiteness.
  Algorithm:
    // Set Discovered[s] = true and Discovered[v] = false for all other v
    // Set Color[s] = red
    // Initialize L[0] to consist of the single element s
    // Set the layer counter i = 0
    // Set the current BFS tree T = ∅
    // While L[i] is not empty
    //   Initialize an empty list L[i+1]
    //   For each node u in L[i]
    //     For each edge (u, v) incident to u
    //       If Discovered[v] = false, then
    //         Set Discovered[v] = true
    //         Add edge (u, v) to tree T
    //         Add v to list L[i+1]
    //         If i+1 is odd, set Color[v] = blue
    //         Else, set Color[v] = red
    //       End If
    //     End For
    //   End For
    //   Increment layer counter i by one
    // End While
    // Scan all edges and determine whether there is any edge that has the same color on both end-nodes
  Correctness: Assigning blue to odd-numbered layers and red to even-numbered layers means that two adjacent layers never share a color. In BFS, every edge connects two nodes that lie either in the same layer or in adjacent layers, so an edge whose end-nodes have the same color must connect two nodes in the same layer. That edge, together with the BFS-tree paths from its two end-nodes up to their lowest common ancestor, forms an odd-length cycle, which means the graph is not bipartite. If no such edge is found, the red/blue coloring is a proper two-coloring, so the graph is bipartite.
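The following is a minimal Python sketch of the BFS two-coloring test just described. It is an illustrative reimplementation rather than course code; it assumes an undirected, connected graph given as an adjacency-list dict and a starting node s (for a disconnected graph, repeat the BFS from every undiscovered node).

    # Sketch of the BFS two-coloring bipartiteness test described above.
    # Illustrative only: assumes an undirected, connected graph given as an
    # adjacency-list dict, with s as the starting node.
    from collections import deque

    def is_bipartite(adj, s):
        color = {s: 0}                        # 0 = red (even layers), 1 = blue (odd layers)
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in color:            # Discovered[v] = false
                    color[v] = 1 - color[u]   # the next layer gets the opposite color
                    queue.append(v)
        # Scan all edges: an edge with the same color on both end-nodes closes
        # an odd cycle, so the graph is not bipartite.
        return all(color[u] != color[v] for u in adj for v in adj[u])

    # Usage: an odd cycle (a triangle) is not bipartite; an even cycle is.
    triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
    square = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
    print(is_bipartite(triangle, 0))   # False
    print(is_bipartite(square, 0))     # True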
o Outline the strong-connectivity testing algorithm using BFS and show its running time complexity. (3.5)
  Algorithm:
    // Pick any node s.
    // Run BFS from s in G.
    // Build G_rev (reverse the direction of every edge in G).
    // Run BFS from s in G_rev (resetting Discovered[v] to false for every node v first).
    // Return true iff every node is reachable from s both in the BFS on G and in the BFS on G_rev (i.e., neither BFS leaves any node undiscovered).
  Running time complexity: O(m + n)
    o Each BFS takes O(m + n).
    o Building G_rev can be done in O(m + n): traverse all edges in the adjacency list exactly once and construct the new adjacency list with the reversed edges (mark the end of each list so that appending a new node takes constant time).
    o Scanning the Discovered array to find whether any vertex was left undiscovered takes O(n).
    o Total: O(3m + 3n + n) = O(m + n).

o Outline the topological sorting algorithm and show its running time complexity. (3.6)
  Algorithm:
    // Maintain count[w], the remaining number of incoming edges to each node w
    // Maintain a set S of remaining nodes with no incoming edge
    // Initialize count[w] for each node w and initialize S
    // Repeat until S is empty {
    //   Remove one node v from S and append it to the topological order.
    //   Decrement count[w] for every w with an incoming edge from v,
    //   and add w to S if count[w] has become 0.
    // }
  Running time complexity: O(m + n) (with G in an adjacency list)
    o Initializing count[w] for every w requires scanning through the adjacency list, which takes O(m + n).
    o Each node is removed from S exactly once, so the removals take O(n) in total for the n nodes.
    o Decrementing count[w] for every w with an incoming edge from v takes time proportional to the number of outgoing edges of v in each iteration, which sums to O(m), since there are m edges in total.
    o Total: O(2m + 2n) = O(m + n).
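Below is a small Python sketch of the count[]/S procedure above (commonly known as Kahn's algorithm). It is an illustrative reimplementation, not course code; it assumes the DAG is given as an adjacency-list dict mapping every node to its outgoing neighbors.

    # Sketch of the topological sorting procedure above: maintain in-degree counts
    # and a set S of nodes with no remaining incoming edge.
    # Illustrative only: assumes a DAG given as an adjacency-list dict of outgoing
    # edges, with every node present as a key.
    def topological_sort(adj):
        count = {v: 0 for v in adj}            # count[w] = remaining incoming edges to w
        for v in adj:
            for w in adj[v]:
                count[w] += 1
        S = [v for v in adj if count[v] == 0]  # remaining nodes with no incoming edge
        order = []
        while S:
            v = S.pop()                        # remove one node v from S
            order.append(v)                    # append it to the topological order
            for w in adj[v]:                   # "delete" v's outgoing edges
                count[w] -= 1
                if count[w] == 0:
                    S.append(w)
        return order                           # a topological order when adj is a DAG

    # Usage: edges 1->2, 1->3, 2->4, 3->4.
    print(topological_sort({1: [2, 3], 2: [4], 3: [4], 4: []}))   # e.g. [1, 3, 2, 4]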
Chapter 4. Greedy Algorithms

o State why a greedy algorithm does not always guarantee an optimal solution.
  Greedy algorithms are not suited to every situation; they work well when a locally best ("greedy") choice can be made at every stage of a problem and when an optimal solution to part of the problem extends to an optimal solution of the entire problem. For other problems, a locally best choice can exclude the globally best solution, so a different kind of algorithm can outperform any greedy strategy; it depends on the given problem.

o For each of the following problems, state the optimal greedy strategy and justify its optimality, and find an optimal solution given a problem instance; given a greedy strategy, prove or disprove that it is optimal:
  o Interval scheduling (4.1): earliest finish time.
  o Interval partitioning (4.1): earliest start time.
  o Minimum lateness (4.2): earliest deadline.

o State the greedy strategy of Dijkstra's shortest path algorithm and justify why it ensures an optimal solution. (4.5)
  Starting from the source node s, grow a cloud S of explored nodes: at each step add the unexplored node v that minimizes d(u) + l(u, v) over all edges (u, v) with u in S, where d(u) is the shortest-path distance already determined for u, and continue until t (or every node) has been added. This is optimal because, with nonnegative edge lengths, any other path from s to v must leave the cloud through some edge and is therefore at least as long as the chosen one, so d(v) is final when v is added.

o Show the shortest paths as they are constructed during the execution of Dijkstra's algorithm. (4.5)
  Example: (worked example figure omitted.)

o State the similarity and dissimilarity between Dijkstra's shortest path algorithm and Prim's minimum spanning tree algorithm. (4.5)
  Both algorithms greedily grow a cloud of explored nodes one node at a time, always attaching the cheapest available new node. Dijkstra's algorithm finds the shortest path between two nodes s and t and is not necessarily concerned with reaching every node if it does not have to in order to reach t, while Prim's is an MST algorithm that builds a spanning tree of minimum total weight touching every node in the graph.
  In addition, Dijkstra's evaluates the total weighted distance from s to a node when deciding which node to add to the cloud next, while Prim's considers only the individual weight of the edge (u, v) crossing from the cloud to the new node v (which can never create a cycle, since v is still outside the tree).

o For each of Prim's and Kruskal's minimum spanning tree algorithms, state the greedy strategy and justify its optimality.
  Kruskal's greedy strategy: starting with the smallest-weight edge in the graph, repeatedly add the next smallest-weight edge to T if it does not create a cycle.
  Optimality, for each edge e considered:
    Case 1: If adding e to T creates a cycle, then e is discarded. This is justified by the cycle property, because e has the largest weight among all edges in that cycle.
    Case 2: Otherwise, add e = (u, v) to T. This is justified by the cut property, where the cut S is the set of nodes in u's connected component: since adding e to T does not create a cycle, v lies outside S, and e has the smallest weight among all edges in the cutset of S that remain to be considered.
  Prim's greedy strategy: start at any node s and add it to the cut S; repeatedly take the smallest-weight edge in the cutset of S and add the newly reached node to S.
  Optimality: justified by the cut property. Let S be any cut in the graph and let e be the minimum-weight edge in the cutset D of S; then the MST contains e.

o Describe briefly how Kruskal's minimum spanning tree algorithm can be implemented using the union-find data structure. (4.5)
  Kruskal's algorithm can be implemented with a forest implementation of disjoint sets. Initialize every node in the graph as its own singleton set. Each time a new edge (u, v) is considered, check whether u and v are already in the same component (i.e., have the same root) by calling find(u) and find(v). If the calls return different roots, adding the edge will not create a cycle, so add it to T and union the two sets; the two components now form one tree, and the next time find is called on either node the same root is returned. Continue until all nodes belong to the same set; at that point every node has been reached and the MST is complete. (A sketch of this implementation appears at the end of this chapter's notes.)

o Show the minimum spanning tree as it is constructed during the execution of Prim's or Kruskal's algorithm. (4.5)
  Examples: (diagrams omitted; in them, the numbers in red indicate the order in which each edge was added to the MST.)
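The following Python sketch illustrates the union-find implementation of Kruskal's algorithm described above. It is an illustrative sketch, not course code: edges are assumed to be (weight, u, v) tuples, find uses path compression, and union by rank is omitted for brevity.

    # Sketch of Kruskal's algorithm on a forest of disjoint sets, as described above.
    # Illustrative only: edges are (weight, u, v) tuples.
    def kruskal(nodes, edges):
        parent = {v: v for v in nodes}         # every node starts as its own singleton set

        def find(v):                           # root of v's tree, with path compression
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v

        mst = []
        for w, u, v in sorted(edges):          # consider edges in increasing order of weight
            ru, rv = find(u), find(v)
            if ru != rv:                       # different roots: adding (u, v) creates no cycle
                parent[ru] = rv                # union the two components
                mst.append((u, v, w))
        return mst                             # n - 1 edges if the graph is connected

    # Usage: a small weighted graph.
    nodes = ["a", "b", "c", "d"]
    edges = [(1, "a", "b"), (4, "b", "c"), (3, "a", "c"), (2, "c", "d")]
    print(kruskal(nodes, edges))               # [('a', 'b', 1), ('c', 'd', 2), ('a', 'c', 3)]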
Chapter 5. Divide and Conquer. (Do not use the Master theorem when solving a recurrence relation.)

o Describe the divide-and-conquer approach in algorithm design.
  o Break the problem up into several parts.
  o Solve each part recursively.
  o Combine the partial solutions into an overall solution.

o State the approach to deriving the running time complexity of a divide-and-conquer algorithm.
  Build and solve a recurrence relation.

o Outline the mergesort algorithm using divide-and-conquer and derive its running time complexity. (5.1)
  Algorithm:
    // Divide the array into two halves
    // Recursively sort each half
    // Merge the two sorted halves into one sorted whole
  Running time complexity: O(n log n)
    Divide: O(1)
    Sort: 2T(n/2)
    Merge: O(n) (merging the pre-sorted halves using a temporary array)
    T(n) ≤ T(⌊n/2⌋) + T(⌈n/2⌉) + O(n)  ⇒  T(n) = O(n log n)

o Outline the counting-inversions algorithm using divide-and-conquer and derive its running time complexity. (5.3)
  Algorithm:
    // Divide the array into two halves
    // Recursively count the inversions within each half
    // Count the inversions between the two halves while merging them into a sorted whole, and return the sum of all three counts
    // (Make sure the two halves are pre-sorted, and use an auxiliary array)
  Running time complexity: O(n log n)
    Divide: O(1)
    Sort and count: 2T(n/2)
    Merge and count: O(n) (using the pre-sorted halves and an auxiliary array)
    T(n) ≤ T(⌊n/2⌋) + T(⌈n/2⌉) + O(n)  ⇒  T(n) = O(n log n)

o Outline Karatsuba's integer multiplication algorithm using divide-and-conquer and derive its running time complexity. (5.5)
  Multiplying two n-digit integers.
  Algorithm:
    // Add two ½n-digit integers
    // Recursively multiply three ½n-digit integers
    // Add, subtract, and shift ½n-digit integers to obtain the result
  Running time complexity: O(n^1.585)
    Three recursive calls on ½n-digit numbers, plus O(n) work for the additions, subtractions, and shifts:
    T(n) ≤ 3T(⌈n/2⌉) + O(n)  ⇒  T(n) = O(n^(log2 3)) = O(n^1.585)
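To make the three-multiplication scheme above concrete, here is a minimal Python sketch of Karatsuba multiplication. It is an illustrative reimplementation, not course code: it splits the operands on decimal digits at a midpoint m and recurses until an operand is a single digit.

    # Sketch of Karatsuba's multiplication as outlined above: three recursive
    # multiplications of half-size numbers, plus additions, subtractions, and shifts.
    # Illustrative only: splits the operands on decimal digits at midpoint m.
    def karatsuba(x, y):
        if x < 10 or y < 10:                   # base case: a single-digit operand
            return x * y
        m = max(len(str(x)), len(str(y))) // 2
        x_hi, x_lo = divmod(x, 10**m)          # x = x_hi * 10^m + x_lo
        y_hi, y_lo = divmod(y, 10**m)          # y = y_hi * 10^m + y_lo
        a = karatsuba(x_hi, y_hi)              # product of the high parts
        b = karatsuba(x_lo, y_lo)              # product of the low parts
        c = karatsuba(x_hi + x_lo, y_hi + y_lo)   # one product of the summed halves
        # c - a - b recovers the cross terms x_hi*y_lo + x_lo*y_hi with no extra multiplication.
        return a * 10**(2 * m) + (c - a - b) * 10**m + b

    # Usage: agrees with Python's built-in multiplication.
    print(karatsuba(1234, 5678), 1234 * 5678)   # 7006652 7006652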

