# Parallel Algorithms & Architecture (CMPSCI 711)

UMass


This 347-page set of class notes was uploaded by Roman McCullough on Friday, October 30, 2015. The notes belong to CMPSCI 711 at the University of Massachusetts, taught by Andrew McGregor in Fall. Since its upload, it has received 14 views.

Date Created: 10/30/15

## CMPSCI 711 (Really Advanced Algorithms), Lectures 17 & 18: Entropy, Randomness and Information

Andrew McGregor

**Outline.** Definitions; Entropy and Binomial Coefficients; Extracting Random Bits; Pairwise Independent Functions.

### Definitions

**Definition.** Given a discrete random variable $X$, the *entropy* of $X$ is
$$H(X) = -\sum_x \Pr[X=x] \log \Pr[X=x].$$
Given two discrete random variables $X, Y$, the *conditional entropy* of $X$ given $Y$ is
$$H(X \mid Y) = \sum_y \Pr[Y=y]\, H(X \mid Y=y).$$

**Lemma.** For a function $g$: $H(g(X) \mid X) = 0$, and $H(X \mid g(X)) = 0$ iff $g$ is invertible.

**Lemma.** If $X_1, \dots, X_n$ are discrete random variables,
$$H(X_1, \dots, X_n) = \sum_{i \in [n]} H(X_i \mid X_1, \dots, X_{i-1}).$$
If $X_1, \dots, X_n$ are independent, then $H(X_1, \dots, X_n) = \sum_{i \in [n]} H(X_i)$.

**Definition.** Given discrete random variables $X, Y$, the *mutual information* is
$$I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X).$$
Given discrete random variables $X, Y, Z$, the *conditional mutual information* is
$$I(X; Y \mid Z) = \sum_z \Pr[Z=z]\, I(X; Y \mid Z=z).$$

**Lemma.** If $X_1, \dots, X_n, Y$ are discrete random variables,
$$I(X_1, \dots, X_n; Y) = \sum_{i \in [n]} I(X_i; Y \mid X_1, \dots, X_{i-1}).$$
If $X$ and $Y$ are independent, $I(X; Y) = 0$.

### Entropy and Binomial Coefficients

**Lemma.**
$$\frac{2^{nH(r/n)}}{n+1} \le \binom{n}{r} \le 2^{nH(r/n)},$$
where $H(x) = -x \log x - (1-x)\log(1-x)$.

*Proof.*
- Let $q = r/n$.
- RHS: $1 = (q + (1-q))^n = \sum_k \binom{n}{k} q^k (1-q)^{n-k} \ge \binom{n}{qn} q^{qn} (1-q)^{n-qn} = \binom{n}{qn}\, 2^{-nH(q)}$.
- Claim: $\binom{n}{qn} q^{qn}(1-q)^{n-qn} \ge \binom{n}{k} q^k (1-q)^{n-k}$ for $0 \le k \le n$.
- LHS: $1 = \sum_k \binom{n}{k} q^k(1-q)^{n-k} \le (n+1)\binom{n}{qn} q^{qn}(1-q)^{n-qn} = (n+1)\binom{n}{qn}\, 2^{-nH(q)}$. $\square$

**Proof of Claim.**
- Consider the difference of consecutive terms:
$$\binom{n}{k} q^k (1-q)^{n-k} - \binom{n}{k+1} q^{k+1}(1-q)^{n-k-1} = \binom{n}{k} q^k (1-q)^{n-k}\left(1 - \frac{n-k}{k+1}\cdot\frac{q}{1-q}\right).$$
- This is non-negative when $k \ge qn - (1-q)$.
- So the terms are increasing up to $k = qn$ and decreasing afterwards. $\square$

### Extracting Random Bits

**Definition.** An *extraction function* $\mathrm{Ext}$ takes the value of a random variable $X$ and outputs a sequence of bits $y$ such that, whenever $\Pr[|y| = k] \neq 0$,
$$\Pr[\mathrm{Ext}(X) = y \mid |y| = k] = 2^{-k}.$$

**Theorem.** Consider a coin with bias $p > 1/2$. For any constant $\delta > 0$ and $n$ sufficiently large:
- There exists an extraction function that takes $n$ independent coin flips and outputs an average of at least $(1-\delta)\, nH(p)$ unbiased and independent random bits.
- The average number of unbiased and independent bits output by any extraction function on an input sequence of $n$ independent flips is at most $nH(p)$.
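The binomial-coefficient bounds in the lemma above are easy to sanity-check numerically. A small Python sketch (the function names are my own, not from the notes):

```python
import math

def binary_entropy(x: float) -> float:
    """H(x) = -x log2 x - (1-x) log2 (1-x), with H(0) = H(1) = 0."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def entropy_bounds(n: int, r: int):
    """Return (lower, C(n,r), upper) for 2^{nH(r/n)}/(n+1) <= C(n,r) <= 2^{nH(r/n)}."""
    h = binary_entropy(r / n)
    return 2 ** (n * h) / (n + 1), math.comb(n, r), 2 ** (n * h)

for n, r in [(10, 3), (50, 25), (100, 10)]:
    lo, mid, hi = entropy_bounds(n, r)
    assert lo <= mid <= hi
```

The two bounds differ only by the polynomial factor $n+1$, which is why $\log\binom{n}{k} \approx nH(k/n)$ is accurate enough for the extraction arguments.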
### Extracting bits from uniform distributions (1/2)

**Lemma.** Suppose $X$ is uniformly distributed on $\{0, \dots, m-1\}$. Then there is an extraction function for $X$ that outputs on average at least $\lfloor \log m \rfloor - 1$ unbiased and independent bits.

*Proof.*
- Let $\alpha = \lfloor \log m \rfloor$ and define the extraction function recursively.
- If $X \le 2^\alpha - 1$, output the $\alpha$-bit representation of $X$.
- If $X \ge 2^\alpha$, use the extraction function on $X - 2^\alpha$, since this is uniform on $\{0, \dots, m - 2^\alpha - 1\}$.
- For each $k$, conditioned on the output having $k$ bits, we get the uniform distribution over $k$-bit sequences.
- It remains to show that we expect to output at least $\lfloor \log m \rfloor - 1$ unbiased and independent bits.

### Extracting bits from uniform distributions (2/2)

- Let $Y$ be the number of bits output.
- By induction on $m$:
$$\mathbb{E}[Y] = \frac{2^\alpha}{m}\,\alpha + \frac{m - 2^\alpha}{m}\,\mathbb{E}[\text{bits from } \{0,\dots,m-2^\alpha-1\}] \ge \frac{2^\alpha}{m}\,\alpha + \frac{m - 2^\alpha}{m}\left(\lfloor \log(m - 2^\alpha)\rfloor - 1\right).$$
- Some algebra gives that this is at least $\alpha - 1$, completing the induction. $\square$
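The recursive extractor in this proof can be written out directly. A sketch (the function name and the base case for a one-element range are my own choices):

```python
def extract_bits(x: int, m: int) -> str:
    """Recursive extraction function for X uniform on {0, ..., m-1}.

    If x fits below 2^alpha (alpha = floor(log2 m)), emit its alpha-bit
    representation; otherwise recurse on the leftover uniform range.
    Returns '' (no bits) when the range shrinks to a single value.
    """
    if m <= 1:
        return ""                       # a single value carries no randomness
    alpha = m.bit_length() - 1          # floor(log2 m)
    if x <= 2 ** alpha - 1:
        return format(x, f"0{alpha}b")  # alpha unbiased bits
    return extract_bits(x - 2 ** alpha, m - 2 ** alpha)

# m = 6: values 0..3 give 2 bits, values 4..5 recurse into a 1-bit range.
assert [extract_bits(x, 6) for x in range(6)] == ["00", "01", "10", "11", "0", "1"]
```

For $m = 6$ the outputs of length 2 are exactly the four 2-bit strings, one per value of $X$, so conditioned on the output length the bits are uniform, matching the proof.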
### Extracting Bits from a Biased Coin: Upper Bound (1/2)

**Theorem.** Consider a coin with bias $p > 1/2$. For any constant $\delta > 0$ and $n$ sufficiently large, there exists an extraction function that takes $n$ independent coin flips and outputs an average of at least $(1-\delta)\, nH(p)$ independent and unbiased bits.

*Proof.*
- Let $Z$ be the number of heads seen.
- Conditioned on $Z = k$, each of the $\binom{n}{k}$ sequences is equally likely, so we can expect to extract at least $\lfloor \log \binom{n}{k} \rfloor - 1$ bits.
- Let $B$ be the total number of bits extracted:
$$\mathbb{E}[B] = \sum_k \Pr[Z=k]\,\mathbb{E}[B \mid Z=k] \ge \sum_k \Pr[Z=k]\left(\left\lfloor \log \binom{n}{k} \right\rfloor - 1\right).$$

### Extracting Bits from a Biased Coin: Upper Bound (2/2)

- Consider only $k$ such that $n(p-\epsilon) \le k \le n(p+\epsilon)$:
$$\mathbb{E}[B] \ge \sum_{k=n(p-\epsilon)}^{n(p+\epsilon)} \Pr[Z=k]\left(\left\lfloor \log \binom{n}{k} \right\rfloor - 1\right).$$
- Relating binomial coefficients to entropy: for every such $k$, $\binom{n}{k} \ge 2^{nH(p+\epsilon)}/(n+1)$, so
$$\left\lfloor \log \binom{n}{k} \right\rfloor - 1 \ge nH(p+\epsilon) - \log(n+1) - 2.$$
- Appealing to the Chernoff bound:
$$\Pr[n(p-\epsilon) \le Z \le n(p+\epsilon)] \ge 1 - 2e^{-n\epsilon^2/(3p)}.$$
- Putting it together:
$$\mathbb{E}[B] \ge \left(nH(p+\epsilon) - \log(n+1) - 2\right)\left(1 - 2e^{-n\epsilon^2/(3p)}\right) \ge (1-\delta)\,nH(p),$$
where the last inequality holds for a suitable choice of $\epsilon$ and sufficiently large $n$. $\square$

### Extracting Bits from Biased Coin Tosses: Lower Bound

**Theorem.** Consider a coin with bias $p > 1/2$. The average number of bits output by any extraction function on an input sequence of $n$ independent flips is at most $nH(p)$.

*Proof.*
- Consider an extraction function $\mathrm{Ext}$.
- If $x$ occurs with probability $q$, then $|\mathrm{Ext}(x)| \le \log(1/q)$, since $q \cdot 2^{|\mathrm{Ext}(x)|} \le 1$.
- Let $B$ be the number of bits extracted by $\mathrm{Ext}$:
$$\mathbb{E}[B] = \sum_x \Pr[X=x]\,|\mathrm{Ext}(x)| \le \sum_x \Pr[X=x] \log \frac{1}{\Pr[X=x]} = nH(p). \qquad \square$$
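As a concrete (if suboptimal) instance of the extraction-function definition, von Neumann's classic trick, which is not the notes' optimal scheme, turns biased coin flips into unbiased bits; it yields about $np(1-p)$ bits rather than $nH(p)$:

```python
def von_neumann(flips: str) -> str:
    """Von Neumann's extractor: scan disjoint pairs of biased coin flips;
    'HT' emits '0', 'TH' emits '1', 'HH'/'TT' emit nothing. Output bits are
    unbiased and independent because P[HT] = P[TH] = p(1-p) for any bias p.
    """
    out = []
    for i in range(0, len(flips) - 1, 2):
        pair = flips[i:i + 2]
        if pair == "HT":
            out.append("0")
        elif pair == "TH":
            out.append("1")
    return "".join(out)

# Pairs HT, TH, HH, TH produce bits 0, 1, (nothing), 1.
assert von_neumann("HTTHHHTH") == "011"
```

The ratio $p(1-p)/H(p)$ is what the theorem's near-$nH(p)$ construction improves on.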
### Pairwise Independent Functions

- Let $n$ be a prime and $a, b \in_R \{0, 1, \dots, n-1\}$.
- Consider $Z = (R_0, \dots, R_{n-1})$ where $R_i = a i + b \bmod n$.
- Entropy of each $R_i$: $H(R_i) = \log n$.
- Entropy of $Z$: $H(Z) = 2\log n$.

**Lemma.** For a discrete random variable $X$ and function $g$, $H(g(X)) \le H(X)$, with equality iff $g$ is invertible.

*Proof.*
- $H(X, g(X)) = H(X) + H(g(X) \mid X) = H(X)$.
- $H(X, g(X)) = H(g(X)) + H(X \mid g(X)) \ge H(g(X))$.
- $Z = f(a, b)$ where $f$ is invertible. Hence $H(Z) = H(a, b) = H(a) + H(b) = 2\log n$. $\square$

## CMPSCI 711 (Really Advanced Algorithms), Lecture 19: Summary of Research Papers

Andrew McGregor. Last compiled May 12, 2009.

### Basic Data Stream Model

**Definition.** Input is a length-$m$ list $\langle a_1, \dots, a_m \rangle$ where $a_i \in [n]$. You can only access the elements sequentially, and have memory that is sub-linear in $m$ and $n$.

**Problem.** Let $f_i = |\{j : a_j = i\}|$.
1. Compute approximations to all $f_i$.
2. Estimate $F_k = \sum_{i \in [n]} f_i^k$.
3. Estimate $F_0 = \sum_{i \in [n]} f_i^0$, i.e., the number of distinct items.

### Estimating Point Queries

**Problem.** Estimate every $f_j$ up to $\pm \epsilon m$ error.

**Algorithm.**
1. Pick $d$ 2-wise independent hash functions $h_1, \dots, h_d : [n] \to [w]$.
2. Compute $T[i][j] = \sum_{l : h_i(l) = j} f_l$.
3. Let $\tilde{f}_j = \min_i T[i][h_i(j)]$.

**Theorem.** There exists an $O(\epsilon^{-1} \log(n/\delta))$-space algorithm that returns $\tilde{f}_1, \dots, \tilde{f}_n$ where $f_j \le \tilde{f}_j \le f_j + \epsilon m$, with probability at least $1-\delta$.

*Proof.*
1. Set $w = 2/\epsilon$: then $\mathbb{E}[T[i][h_i(j)] - f_j] \le \epsilon m/2$, so by Markov $\Pr[T[i][h_i(j)] > f_j + \epsilon m] \le 1/2$.
2. Set $d = \log(n/\delta)$: then $\Pr[\min_i T[i][h_i(j)] > f_j + \epsilon m] \le 2^{-d} = \delta/n$, and a union bound over $j$ finishes. $\square$

### Estimating $F_k$

**Problem.** Estimate $F_k = \sum_i f_i^k$.

**Algorithm.**
1. Pick $j$ uniformly from $[m]$.
2. Compute $r = |\{j' \ge j : a_{j'} = a_j\}|$.
3. Output $X = m\left(r^k - (r-1)^k\right)$.

**Theorem.** There exists an $O(k\, n^{1-1/k} \epsilon^{-2} \log(1/\delta))$-space algorithm that returns a $(1+\epsilon)$-factor approximation to $F_k$ with probability at least $1-\delta$.

*Proof.* $\mathbb{E}[X] = F_k$ and $\mathbb{V}[X] \le k\, n^{1-1/k} F_k^2$. Run copies of the algorithm, average the results appropriately, and apply Chernoff/Chebyshev. $\square$

### Estimating $F_2$

**Problem.** Estimate $F_2 = \sum_i f_i^2$.

**Algorithm.**
1. Pick $x_1, \dots, x_n \in \{-1, +1\}$, 4-wise independent.
2. Compute $Z = \sum_i x_i f_i$.
3. Return $X = Z^2$.

**Theorem.** There exists an $O(\epsilon^{-2} \log(1/\delta)(\log m + \log n))$-space algorithm that returns a $(1+\epsilon)$-factor approximation to $F_2$ with probability at least $1-\delta$.

*Proof.* $\mathbb{E}[X] = F_2$ and $\mathbb{V}[X] \le 2F_2^2$. Run $O(\epsilon^{-2}\log(1/\delta))$ copies, average the results appropriately, and apply Chernoff/Chebyshev. $\square$
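The unbiasedness claim $\mathbb{E}[X] = F_2$ behind this sketch can be checked exactly on a small vector by enumerating all sign patterns (full enumeration stands in for the 4-wise independent family; the names are mine):

```python
from itertools import product

def f2_sketch_mean(f):
    """Average of (sum_i x_i f_i)^2 over all sign patterns x in {-1,+1}^n.

    Exhaustive enumeration plays the role of the random signs: the cross
    terms f_i f_j E[x_i x_j] cancel for i != j, leaving exactly
    F_2 = sum_i f_i^2.
    """
    n = len(f)
    total = 0
    for signs in product([-1, 1], repeat=n):
        z = sum(s * fi for s, fi in zip(signs, f))
        total += z * z
    return total / 2 ** n

f = [3, 0, 2, 5]  # frequency vector; F_2 = 9 + 0 + 4 + 25 = 38
assert f2_sketch_mean(f) == 38.0
```

Only pairwise cancellation is used for the mean; the 4-wise independence is what controls the variance bound $\mathbb{V}[X] \le 2F_2^2$.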
### Relationship to Johnson-Lindenstrauss

**Algorithm.**
1. Let $x = (X_1, \dots, X_n)/\|X\|_2$, where the $X_i$ are independent $N(0,1)$ variables and $\|X\|_2 = (X_1^2 + \dots + X_n^2)^{1/2}$.
2. Compute $h(v) = \sum_{i \in [n]} x_i v_i$.
3. Repeat $d$ times to get $h(v)_1, \dots, h(v)_d$.

**Theorem.** For any $p$ vectors $v_1, \dots, v_p \in \mathbb{R}^n$, there exists a map $h : \mathbb{R}^n \to \mathbb{R}^d$ with $d = O(\epsilon^{-2} \log p)$ such that for all $i, j$:
$$(1-\epsilon)\|h(v_i) - h(v_j)\|^2 \le \|v_i - v_j\|^2 \le (1+\epsilon)\|h(v_i) - h(v_j)\|^2.$$

### Estimating $F_0$: Problem and First Attempt

**Problem.** Estimate $F_0 = \sum_i f_i^0$, i.e., the number of distinct values in the stream.

**Algorithm.**
1. Pick a random hash function $h : [n] \to [0,1]$.
2. As the stream arrives, compute $h(a_i)$.
3. Keep track of the minimum value $v$ seen so far.
4. Output $1/v - 1$ as the estimate for $F_0$.

**Issues.**
1. If the hash function is totally random, it may be too big to store. Assume $h$ is pairwise independent.
2. Precision issues: hash into $[n^3]$ rather than $[0,1]$.
3. The above estimator varies too much: track the $t$-th smallest value.

### The Algorithm

1. Pick a random 2-wise independent hash function $h : [n] \to [n^3]$.
2. Compute the $t$ smallest hash values $v_1 \le \dots \le v_t$.
3. If $F_0 \le t$, output $\tilde{F}_0 = F_0$.
4. If $F_0 > t$, output $\tilde{F}_0 = t n^3 / v_t$.

**Theorem.** If $t = 20/\epsilon^2$, then $\Pr[|\tilde{F}_0 - F_0| \ge \epsilon F_0] \le 2/5$.

**Corollary.** We can $(1+\epsilon)$-approximate $F_0$ with probability $1-\delta$ in only $O(\epsilon^{-2} \log(1/\delta) \log n)$ space.

### Proof of Theorem

1. Assuming $F_0 > t$, we get a good estimate if $(1-\epsilon)F_0 \le t n^3 / v_t \le (1+\epsilon)F_0$, or equivalently
$$\frac{t n^3}{(1+\epsilon)F_0} \le v_t \le \frac{t n^3}{(1-\epsilon)F_0}.$$
2. Consider $\Pr\!\left[v_t \ge \frac{t n^3}{(1-\epsilon)F_0}\right] = \Pr[Y < t]$, where $Y$ is the number of items hashing to under $\frac{t n^3}{(1-\epsilon)F_0}$. The other case is similar.
3. For each of the $F_0$ distinct items $a$:
$$\Pr\!\left[h(a) \le \frac{t n^3}{(1-\epsilon)F_0}\right] \approx \frac{t}{(1-\epsilon)F_0}.$$
4. By linearity of expectation, $\mathbb{E}[Y] = t/(1-\epsilon) \ge t(1+\epsilon)$.
5. Since $h$ is 2-wise independent, $\mathbb{V}[Y] \le \mathbb{E}[Y]$.
6. By Chebyshev, $\Pr[Y < t] \le \mathbb{V}[Y]/(\epsilon t)^2 \le 1/5$ if $t \ge 20\epsilon^{-2}$; combining the two failure cases gives probability at most $2/5$. $\square$
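A minimal sketch of the $t$-th-smallest-hash estimator (the concrete hash $h(x) = ((ax+b) \bmod p) \bmod n^3$, the prime $p$, and all names are my own choices, not from the notes):

```python
import random

def estimate_f0(stream, t: int, n: int) -> float:
    """Distinct-count estimate from the t smallest hash values.

    If at most t distinct hash values ever appear, the count is exact;
    otherwise the estimate is t * n^3 / v_t, with v_t the t-th smallest.
    """
    p = (1 << 61) - 1                  # prime modulus, far larger than n^3
    a, b = random.randrange(1, p), random.randrange(p)
    M = n ** 3
    smallest = set()                   # the t smallest distinct hash values
    overflowed = False
    for x in stream:
        h = ((a * x + b) % p) % M
        if h in smallest:
            continue
        smallest.add(h)
        if len(smallest) > t:
            smallest.remove(max(smallest))
            overflowed = True
    if not overflowed:
        return float(len(smallest))    # F0 <= t: exact, up to rare collisions
    return t * M / max(smallest)       # v_t is the largest of the t smallest

random.seed(1)
est = estimate_f0([i % 10 for i in range(1000)], t=16, n=100)
assert 9.0 <= est <= 10.0  # exact 10.0 unless two of the ten hashes collide
```

With $t > F_0$ the estimator falls into the exact branch, which is why the assertion is deterministic up to a hash collision in $[n^3]$.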
### $F_0$ lower bound via communication complexity

**Theorem.** Any data stream algorithm that approximates $F_0$ up to a $(1+\epsilon)$ factor with probability $9/10$ needs $\Omega(\epsilon^{-2})$ space.

**Lemma.** Alice has $x \in \{0,1\}^n$ and Bob has $y \in \{0,1\}^n$. Alice needs to send Bob $\Omega(n)$ bits if he is to estimate the Hamming distance $\Delta(x, y)$ up to $\pm\sqrt{n}$ with probability $9/10$.

- Define a multi-set by $S = \{i : x_i = 1\} \cup \{i : y_i = 1\}$. Then
$$F_0(S) = \frac{|x|}{2} + \frac{|y|}{2} + \frac{\Delta(x,y)}{2}.$$
- A $\pm\sqrt{n}$ estimate for $F_0$ gives a $\pm O(\sqrt{n})$ estimate for $\Delta(x, y)$.
- If $n = \Theta(\epsilon^{-2})$, a $(1+\epsilon)$-factor approximation of $F_0$ is a $\pm O(\sqrt{n})$ approximation.

### One-Pass Lower Bound (1/2)

- Reduction from the INDEX problem: Alice knows $z \in \{0,1\}^t$ and Bob knows $j \in [t]$. Let's assume $|z| = t/2$ and that this is odd.
- Alice and Bob pick $r \in_R \{-1,+1\}^t$ using public random bits.
- Alice computes $\mathrm{sign}(r \cdot z)$ and Bob computes $\mathrm{sign}(r_j)$.
- Claim: for some constant $c > 0$,
$$\Pr[\mathrm{sign}(r \cdot z) = \mathrm{sign}(r_j)] = \begin{cases} 1/2 & \text{if } z_j = 0, \\ 1/2 + c/\sqrt{t} & \text{if } z_j = 1. \end{cases}$$
- Repeat $n = O(t)$ times to construct $x = (\mathrm{sign}(r^{(1)} \cdot z), \dots)$ and $y = (\mathrm{sign}(r^{(1)}_j), \dots)$.
- With probability $9/10$, for some constants $c_1 < c_2$:
$$z_j = 0 \implies \Delta(x, y) \ge n/2 - c_1\sqrt{n}, \qquad z_j = 1 \implies \Delta(x, y) \le n/2 - c_2\sqrt{n}.$$

### One-Pass Lower Bound (2/2)

**Claim.** For some constant $c > 0$,
$$\Pr[\mathrm{sign}(r \cdot z) = \mathrm{sign}(r_j)] = \begin{cases} 1/2 & \text{if } z_j = 0, \\ 1/2 + c/\sqrt{t} & \text{if } z_j = 1. \end{cases}$$

- If $z_j = 0$, then $\mathrm{sign}(r \cdot z)$ and $\mathrm{sign}(r_j)$ are independent.
- If $z_j = 1$, let $S = r \cdot z - r_j$ and let $A$ be the event $\mathrm{sign}(r \cdot z) = \mathrm{sign}(r_j)$:
$$\Pr[A] = \Pr[A \mid S = 0]\,\Pr[S = 0] + \Pr[A \mid S \ne 0]\,\Pr[S \ne 0].$$
- $\Pr[A \mid S = 0] = 1$ and $\Pr[A \mid S \ne 0] = 1/2$. (Here $S$ is a sum of $|z| - 1 = t/2 - 1$ independent signs, an even number of them, so $S$ is even; if $S \ne 0$ then $|S| \ge 2$ and $\mathrm{sign}(r \cdot z) = \mathrm{sign}(S)$ is independent of $r_j$.)
- $\Pr[S = 0] = 2c/\sqrt{t}$ for some constant $c > 0$, giving $\Pr[A] = 1/2 + c/\sqrt{t}$.

### $\ell_p$ and Stable Distributions: Problem Definition

**Definition.** Input is a length-$m$ list $S = \langle a_1, \dots, a_m \rangle$ where $a_i = (j_i, \Delta_i) \in [n] \times \mathbb{Z}$. You can only access the elements sequentially, and have memory that is sub-linear in $m$ and $n$.

**Problem.** Let $v_j = \sum_{i : j_i = j} \Delta_i$; note that $v_j$ can be negative. Estimate
$$\ell_p(v) = \|v\|_p = \Big(\sum_j |v_j|^p\Big)^{1/p}.$$

**Theorem.** Using only $O(\epsilon^{-2} \log \delta^{-1})$ space, we can find a $(1+\epsilon)$-error approximation with probability $1 - \delta$.
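For $p = 1$ this space bound can be met with a sketch of Cauchy (1-stable) random variables. A small pure-Python simulation of the idea (the toy stream and all names are mine; Cauchy samples are drawn as $\tan(\pi(U - \tfrac12))$ for uniform $U$):

```python
import math, random, statistics

random.seed(2)
n, k = 4, 1001                       # dimension and number of sketch rows
# A Cauchy sample is tan(pi*(U - 1/2)) for U uniform on (0, 1): 1-stable.
A = [[math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]
     for _ in range(k)]

c = [0.0] * k
for j, delta in [(0, 3), (2, -4), (0, 2), (3, 1)]:  # stream of (j, Delta)
    for i in range(k):
        c[i] += delta * A[i][j]

# Resulting frequency vector is v = (5, 0, -4, 1), so ||v||_1 = 10.
estimate = statistics.median(abs(x) for x in c)
# median(|Cauchy|) = 1, so the estimate concentrates near ||v||_1 = 10.
assert 5.0 <= estimate <= 20.0
```

Each $c_i$ is distributed as $\|v\|_1$ times a Cauchy variable, and with $k = 1001$ rows the sample median of $|c_i|$ is within a factor of 2 of $\|v\|_1$ except with vanishing probability.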
### $p$-stable distributions

**Definition.** A distribution $D$ over $\mathbb{R}$ is called *$p$-stable* if for any $v \in \mathbb{R}^n$ and iid variables $X_1, \dots, X_n$ with distribution $D$, the random variable $\sum_i v_i X_i$ has the same distribution as $\|v\|_p X$, where $X$ is a random variable with distribution $D$.

**Example.** The Normal$(0,1)$ distribution is 2-stable, with density $\frac{1}{\sqrt{2\pi}} e^{-x^2/2}$. The Cauchy distribution is 1-stable, with density $\frac{1}{\pi} \cdot \frac{1}{1 + x^2}$.

### Ideal Algorithm for $p = 1$

**Algorithm.**
- Let $A$ be a $k \times n$ matrix where each $A_{ij}$ is an independent Cauchy random variable. Let $A_j$ be the $j$-th column of $A$.
- Let $c$ be a length-$k$ zero vector. For each stream element $(j, \Delta)$: $c \leftarrow c + \Delta A_j$.
- Return $\mathrm{median}(|c_1|, \dots, |c_k|)$.

**Lemma.**
- $c = Av$, but the algorithm only needs to maintain a length-$k$ vector.
- Each $|c_i|$ is independently distributed as $\|v\|_1 |X|$, where $X$ is a Cauchy random variable.

### Details

**Lemma.** If $X$ is Cauchy distributed, $\mathrm{median}(|X|) = 1$.

*Proof.* Follows since $\tan(\pi/4) = 1$ and $F_{|X|}(z) = \Pr[|X| \le z] = \frac{2}{\pi}\arctan(z)$. $\square$

**Lemma.** Let $Y_1, \dots, Y_k$ be iid and $X = \mathrm{median}(Y_1, \dots, Y_k)$. Then
$$\Pr\!\left[F_Y(X) \in \left(\tfrac12 - \epsilon, \tfrac12 + \epsilon\right)\right] \ge 1 - \delta \quad \text{if } k = \Omega(\epsilon^{-2} \log \delta^{-1}).$$

**Lemma.** For sufficiently small $\epsilon$: let $X$ be Cauchy distributed and let $z$ satisfy $F_{|X|}(z) \in (\tfrac12 - \epsilon, \tfrac12 + \epsilon)$. Then $z \in (1 - 4\epsilon, 1 + 4\epsilon)$.

Thanks!

## CMPSCI 711 (Really Advanced Algorithms), Lectures 6 & 7: Wiring and Maximum Satisfiability

Andrew McGregor. Last compiled February 17, 2009.

**Outline.** Wiring Problem; Maximum Satisfiability.

### Global Wiring in Gate Arrays

- A $\sqrt{n} \times \sqrt{n}$ array of adjacent square gates; $t$ pairs of terminals need to be connected. Add a wire to connect each pair.
- Constraint: wires only travel horizontally and/or vertically, and may only bend once.
- Problem: we want to minimize the maximum number of wires that cross the border between any two adjacent gates.
- (Let's see a picture.) How many borders are there? How many possible wirings?

### Reformulate

Define
$$x_{i0} = \begin{cases} 1 & \text{if wire } i \text{ is first routed horizontally,} \\ 0 & \text{otherwise,} \end{cases} \qquad x_{i1} = \begin{cases} 1 & \text{if wire } i \text{ is first routed vertically,} \\ 0 & \text{otherwise,} \end{cases}$$
$$T_{b0} = \{i : \text{wire } i \text{ is routed through border } b \text{ if } x_{i0} = 1\}, \qquad T_{b1} = \{i : \text{wire } i \text{ is routed through border } b \text{ if } x_{i1} = 1\}.$$
We want to minimize
$$\max_b \Big( \sum_{i \in T_{b0}} x_{i0} + \sum_{i \in T_{b1}} x_{i1} \Big).$$

### Formulate as 0-1 Linear Program

$$\begin{aligned} \text{minimize } & w \\ \text{where } & x_{i0}, x_{i1} \in \{0,1\} \text{ for all } i \in [t], \\ \text{subject to } & x_{i0} + x_{i1} = 1 \text{ for all } i \in [t], \\ & \textstyle\sum_{i \in T_{b0}} x_{i0} + \sum_{i \in T_{b1}} x_{i1} \le w \text{ for all } b. \end{aligned}$$

- Unfortunately, 0-1 linear programming is NP-hard.
- One option is to relax the program to get a linear program: replace $x_{i0}, x_{i1} \in \{0,1\}$ by $0 \le x_{i0}, x_{i1} \le 1$.
- We can efficiently find a solution to the relaxed program, but how can we use it to get a solution to the original problem?

### A factor-2 approximation

- Let $W^*$ be the optimum solution to the relaxed program and let $W_0$ be the optimum solution to the original program: $W^* \le W_0$.
- For $i \in [t]$, let $(\hat{x}_{i0}, \hat{x}_{i1})$ be a solution to the relaxed program; note $\max(\hat{x}_{i0}, \hat{x}_{i1}) \ge 1/2$.
- Round the larger up to 1 and the smaller down to 0.
- Let $(\bar{x}_{i0}, \bar{x}_{i1})$ be the new values, so $\bar{x}_{ij} \le 2\hat{x}_{ij}$.
- Then for all borders $b$:
$$\sum_{i \in T_{b0}} \bar{x}_{i0} + \sum_{i \in T_{b1}} \bar{x}_{i1} \le 2\Big(\sum_{i \in T_{b0}} \hat{x}_{i0} + \sum_{i \in T_{b1}} \hat{x}_{i1}\Big) \le 2W^* \le 2W_0.$$

### Randomized Rounding (1/2)

- For $i \in [t]$, let $(\hat{x}_{i0}, \hat{x}_{i1})$ be a solution to the relaxed program.
- With probability $\hat{x}_{i0}$, set $\bar{x}_{i0} = 1$ and $\bar{x}_{i1} = 0$; otherwise set $\bar{x}_{i0} = 0$ and $\bar{x}_{i1} = 1$.
- For any border $b$, let
$$\bar{W}_b = \sum_{i \in T_{b0}} \bar{x}_{i0} + \sum_{i \in T_{b1}} \bar{x}_{i1}.$$
- Then
$$\mathbb{E}[\bar{W}_b] = \sum_{i \in T_{b0}} \hat{x}_{i0} + \sum_{i \in T_{b1}} \hat{x}_{i1} \le W^* \le W_0.$$
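The factor-2 deterministic rounding step is simple enough to demonstrate directly. A toy sketch with made-up fractional values and border index sets (all names are mine):

```python
def round_deterministic(frac):
    """Round each fractional pair (x_i0, x_i1) with x_i0 + x_i1 = 1 by
    sending the larger value to 1 and the smaller to 0 (ties to x_i0)."""
    return [(1, 0) if x0 >= x1 else (0, 1) for x0, x1 in frac]

def border_load(choices, Tb0, Tb1):
    """Wires crossing border b: sum of x_i0 over T_b0 plus x_i1 over T_b1."""
    return sum(choices[i][0] for i in Tb0) + sum(choices[i][1] for i in Tb1)

# A toy fractional LP solution for 4 wires and one border's index sets.
frac = [(0.7, 0.3), (0.5, 0.5), (0.2, 0.8), (0.9, 0.1)]
Tb0, Tb1 = [0, 1], [2, 3]
rounded = round_deterministic(frac)
# A coordinate is rounded up only when it was >= 1/2, so every rounded
# value is at most twice its fractional value, and each border's load
# at most doubles.
assert border_load(rounded, Tb0, Tb1) <= 2 * border_load(frac, Tb0, Tb1)
```

Here the fractional load is $0.7 + 0.5 + 0.8 + 0.1 = 2.1$ and the rounded load is $3$, within the factor-2 guarantee.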
bound and IE VTb W0 W0 Pv vb 2 16Wo 8 5 1 615 gt Apply the union bound W0 1PVb I7vb 2 16W0 lt 2n 8 5 1 616 Randomized Rounding 22 gt Note that I7vb is the sum of independent poisson trials gt Using the ChernofF bound and IE VTb W0 W0 Pv vb 2 16Wo 8 5 1 615 gt Apply the union bound W0 1PVb I7vb 2 16W0 lt 2n 8 5 1 616 gt With probability at least 17 e mxvquotvb 1 W0 W0 if W0 2 wogne Outline Maximum Satisfiability Maximum Satisfiability gt Input m clauses in n Boolean variables X17 Xn eg X1X237 147 7 X9VX10VX21 where 217 X gt Goal Maximize the number of clauses that are satisfied Maximum Satisfiability gt Input m clauses in n Boolean variables X17 Xn eg X1X237 147 7 X9VX10VX21 where 217 X gt Goal Maximize the number of clauses that are satisfied Notes gt Problem is NP hard so will consider approximation algorithms Maximum Satisfiability gt Input m clauses in n Boolean variables X17 Xn eg X1X237 147 7 X9VX10VX21 where 217 X gt Goal Maximize the number of clauses that are satisfied Notes gt Problem is NP hard so will consider approximation algorithms gt A randomized algorithm is an a approximations for a maximization problem if it returns X with E X OPT 2 a Maximum Satisfiability gt Input m clauses in n Boolean variables X17 Xn eg X1X237 147 7 X9VX10VX21 where 217 X gt Goal Maximize the number of clauses that are satisfied Notes gt Problem is NP hard so will consider approximation algorithms gt A randomized algorithm is an a approximations for a maximization problem if it returns X with E X OPT 2 a gt Call X and x literals Assume X and 2 don39t appear in same clause Easy 12 approximation Theorem A truth assignment chosen uniformly at random is a 12 approx Easy 12 approximation Theorem A truth assignment chosen uniformly at random is a 12 approx Proof gt Independently set each X to O with prob 12 and 1 otherwise Easy 12 approximation Theorem A truth assignment chosen uniformly at random is a 12 approx Proof gt Independently set each X to O with prob 12 and 1 otherwise gt 
- Let Z_i = 1 if the i-th clause is satisfied and Z_i = 0 otherwise.
- If the i-th clause has k literals, P[Z_i = 1] = 1 − 2^(−k) ≥ 1/2.
- The expected number of clauses satisfied is E[Σ_{i∈[m]} Z_i] ≥ m/2. □

Example of the Probabilistic Method: the proof above shows that there always exists an assignment satisfying at least m/2 clauses. This is tight.

Formulate as a Linear Program
Define the following sets:
- C_j = {i : x_i or x̄_i appears in the j-th clause}
- C_j⁺ = {i : x_i appears in the j-th clause}
- C_j⁻ = {i : x̄_i appears in the j-th clause}

Consider the following program:
  maximize Σ_{j∈[m]} z_j
  subject to Σ_{i∈C_j⁺} y_i + Σ_{i∈C_j⁻} (1 − y_i) ≥ z_j for all j ∈ [m],
  where y_i, z_j ∈ {0, 1} for all i ∈ [n], j ∈ [m].

Relax by replacing y_i, z_j ∈ {0, 1} with 0 ≤ y_i, z_j ≤ 1.
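To make the objects above concrete, here is a minimal encoding of clauses and the sets C_j⁺ and C_j⁻, together with the exact expected number of clauses satisfied by the uniform random assignment. The signed-integer clause encoding and the sample formula are my own illustration, not from the notes:

```python
from fractions import Fraction

# Clauses as lists of signed integers: +i stands for x_i, -i for its negation
# (a hypothetical encoding, chosen for this sketch).
clauses = [[1, -2, 3], [2], [-1, -3], [1, 2, 3]]

def C_plus(clause):
    # C_j⁺: indices of variables appearing uncomplemented in the clause
    return {i for i in clause if i > 0}

def C_minus(clause):
    # C_j⁻: indices of variables appearing complemented in the clause
    return {-i for i in clause if i < 0}

# A clause with k literals is unsatisfied by a uniform random assignment
# with probability 2^(-k), so the expected number satisfied is Σ_j (1 - 2^(-k_j)).
expected = sum(1 - Fraction(1, 2 ** len(c)) for c in clauses)
print(C_plus(clauses[0]), C_minus(clauses[0]))  # {1, 3} {2}
print(expected)  # 3, which is ≥ m/2 = 2 as the theorem guarantees
```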
Randomized Rounding
- Let ŷ, ẑ be optimal solutions to the relaxed program.
- With probability ŷ_i, set x_i = 1, and set x_i = 0 otherwise.
- By independence,
  P[Z_j = 1] = 1 − ∏_{i∈C_j⁺} (1 − ŷ_i) · ∏_{i∈C_j⁻} ŷ_i.
- Since Σ_{i∈C_j⁺} ŷ_i + Σ_{i∈C_j⁻} (1 − ŷ_i) ≥ ẑ_j, if C_j has k literals,
  P[Z_j = 1] ≥ 1 − (1 − ẑ_j/k)^k.
- For β_k = 1 − (1 − 1/k)^k:
  P[Z_j = 1] ≥ 1 − (1 − ẑ_j/k)^k ≥ β_k ẑ_j ≥ (1 − 1/e) ẑ_j.
- Hence E[Σ_{j∈[m]} Z_j] ≥ (1 − 1/e) Σ_j ẑ_j ≥ (1 − 1/e) · OPT.

Recap
- First algorithm:
  - Set each variable independently to 0 or 1 with equal probability.
  - Gave a 1/2-approximation.
  - Would have been better if all clauses had many literals.
- Second algorithm:
  - Set each variable independently to 0 or 1 with probability based on the linear program solution.
  - Gave a (1 − 1/e)-approximation.
  - Would have been better if all clauses had few literals.
- What would be a good third algorithm?
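The two inequalities doing the work in the rounding analysis, β_k ≥ 1 − 1/e and 1 − (1 − z/k)^k ≥ β_k·z for z ∈ [0, 1] (the latter by concavity in z), can be spot-checked numerically; a small sketch:

```python
import math

def beta(k):
    # β_k = 1 - (1 - 1/k)^k, the per-clause guarantee for a k-literal clause
    return 1 - (1 - 1 / k) ** k

# β_k decreases toward 1 - 1/e ≈ 0.632 as k grows
for k in (1, 2, 3, 10, 100):
    assert beta(k) >= 1 - 1 / math.e

# The rounding analysis uses 1 - (1 - z/k)^k ≥ β_k · z for z ∈ [0, 1]
# (the left side is concave in z and meets β_k · z at z = 0 and z = 1).
for k in (1, 2, 3, 5, 8):
    for z in [i / 100 for i in range(101)]:
        assert 1 - (1 - z / k) ** k >= beta(k) * z - 1e-12
print("checks pass")
```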
3/4-Approximation
Theorem: Running both algorithms and returning the best solution gives a 3/4-approximation.
Proof:
- Let A and B be the number of clauses satisfied by the first and second algorithms:
  E[max(A, B)] ≥ E[A]/2 + E[B]/2.
- Let S_k = {j : the j-th clause contains exactly k literals}.
- From the analysis of the previous algorithms,
  E[A] ≥ Σ_k Σ_{j∈S_k} (1 − 2^(−k)) and E[B] ≥ Σ_k Σ_{j∈S_k} β_k ẑ_j.
- Therefore, using (1 − 2^(−k)) + β_k ≥ 3/2 for all k:
  E[A]/2 + E[B]/2 ≥ Σ_k Σ_{j∈S_k} (3/4) ẑ_j ≥ 0.75 · OPT. □

Outline
- Puzzle

Puzzle
- You lose the unbiased coin with which you were planning to do the homework.
- A friend lends you a biased coin, but doesn't tell you the bias.
- Without trying to estimate the bias, how can you use the biased coin to simulate a perfectly unbiased coin?
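One classical answer to this puzzle is von Neumann's trick (my suggestion, not spelled out in the notes): flip the coin twice; since P(HT) = P(TH) = p(1 − p) whatever the bias p, report the first flip of the first unequal pair and discard HH/TT. A sketch:

```python
import random

def biased_flip(p, rng):
    # a coin of unknown bias p (assumption: flips are independent)
    return rng.random() < p

def unbiased_flip(flip):
    # von Neumann's trick: HT -> heads, TH -> tails, HH/TT -> retry.
    # P(HT) = P(TH) = p(1-p), so the output is exactly fair.
    while True:
        a, b = flip(), flip()
        if a != b:
            return a

rng = random.Random(0)
flips = [unbiased_flip(lambda: biased_flip(0.7, rng)) for _ in range(10000)]
print(sum(flips) / len(flips))  # close to 0.5 despite the 0.7 bias
```

Each unbiased output costs 1/(2p(1 − p)) biased flips in expectation, so the trick is cheap unless the coin is extremely lopsided.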
CMPSCI 711: Really Advanced Algorithms
Lecture 3: Principle of Deferred Decisions & Stable Matchings
Andrew McGregor (last compiled February 3, 2009)

Outline
- Clock Solitaire and the Principle of Deferred Decisions

Recall Last Week's Puzzle
- Take a standard pack of 52 cards that is randomly shuffled.
- Split it into 13 piles of 4 and label the piles A, 2, …, 10, J, Q, K.
- Take the first card from the K pile.
- Take the next card from pile X, where X is the face value of the previous card taken.
- Repeat until either all cards are removed (you win) or we get stuck (you lose).
- What's the probability you win?

Structural Observations
Lemma: The last card before we terminate (winning or losing) is a K.
Proof:
- Suppose at iteration j we draw card X but pile X is empty.
- If pile X is empty and X ≠ K, then we have already drawn 4 copies of card X prior to iteration j. Contradiction. □

Lemma: We win iff the fourth K is the 52nd card.
Proof:
- When the 1st, 2nd, or 3rd K is seen, we don't terminate, because the K pile is non-empty.
- We terminate when the 4th K is seen; we win iff it's the 52nd card. □
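The lemmas above reduce winning to "every card gets removed"; the notes go on to compute the win probability as 1/13 via deferred decisions. A seeded Monte Carlo sketch of the game (the pile bookkeeping is my own) agrees:

```python
import random

def play_clock_solitaire(rng):
    # piles[v] holds the face-down pile for value v (0..12, with 12 = K)
    deck = [v for v in range(13) for _ in range(4)]
    rng.shuffle(deck)
    piles = [deck[4 * v:4 * v + 4] for v in range(13)]
    removed = 0
    current = 12            # start by drawing from the K pile
    while piles[current]:
        card = piles[current].pop()
        removed += 1
        current = card      # next draw comes from the pile named by the card
    return removed == 52    # win iff every card was removed

rng = random.Random(42)
wins = sum(play_clock_solitaire(rng) for _ in range(20000))
print(wins / 20000)   # close to 1/13 ≈ 0.0769
```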
Principle of Deferred Decisions
- How do we compute the probability that the fourth K is the 52nd card?
  P[fourth K is the 52nd card] = (# game configurations such that a K is the 52nd card revealed) / (# game configurations).
- Principle of Deferred Decisions: let the random choices unfold with the progress of the analysis, rather than fixing the random events up front.
- For clock solitaire, this means we may assume that at each draw every unseen card is equally likely.

Theorem: The probability we win clock solitaire is 1/13.

Outline
- Stable Matching Problem

Stable Matching Problem
Consider a society in which there are n women w_1, …, w_n and n men m_1, …, m_n.
- A matching is a 1-1 correspondence between the men and the women (we assume a monogamous, heterosexual society).
- Each woman has a strict preference list over the men, and each man has a strict preference list over the women.
- The matching is unstable if there exist w_i and m_j such that:
  - w_i and m_j are not matched to each other,
  - w_i prefers m_j to her match, and
  - m_j prefers w_i to his match.
- A matching that is not unstable is stable.

Does a stable matching always exist? Can we find one in polynomial time?
The Gale-Shapley Proposal Algorithm
- Let i be the smallest value such that m_i is unmatched.
- m_i proposes to the most desirable woman (according to his list) that hasn't already rejected him.
- She accepts if either (a) she is currently unmatched, or (b) she finds m_i more desirable than her current match, in which case her current match becomes unmatched.
- Repeat until there are no unmatched men left.

Does the algorithm terminate? Is the resulting matching stable?

Algorithm is Well-Defined
Lemma: Whenever there's an unmatched man m_i, there is some woman he hasn't proposed to.
Proof:
- Once a woman becomes matched, she doesn't become unmatched (although she may change her match).
- All the women to whom m_i has proposed are already matched.
- If m_i had proposed to everyone, then all the women would be matched, and hence all the men would be matched. Contradiction. □

Algorithm is Efficient
Theorem: The algorithm terminates after O(n²) repeats.
Proof:
- At each stage of the algorithm, let t_i be the number of women to whom m_i could still potentially propose.
- At each step, Σ_{i∈[n]} t_i decreases by 1.
- Initially Σ_{i∈[n]} t_i = n², so there can be at most n² steps. □

Algorithm is Correct
Theorem: The matching found by the Gale-Shapley algorithm is stable.
Proof:
- Proof by contradiction: suppose the matching includes (m_i, w_j) and (m_k, w_l), but m_i and w_l prefer to be matched to each other.
- Since m_i prefers w_l to w_j, he must have proposed to w_l before he proposed to w_j.
- But then w_l must prefer her current match to m_i: either she already had a better match when m_i proposed, or she matched m_i initially and then got a better proposal. Contradiction. □

Outline
- Probabilistic Analysis of the Gale-Shapley Algorithm
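A compact implementation of the proposal algorithm, plus a brute-force stability check (a sketch; the free-list and rank-table bookkeeping is mine, not from the notes):

```python
def gale_shapley(men_prefs, women_prefs):
    """men_prefs[m] / women_prefs[w]: preference lists, most desirable first."""
    n = len(men_prefs)
    rank = [{m: r for r, m in enumerate(prefs)} for prefs in women_prefs]
    next_choice = [0] * n          # index into each man's preference list
    match_of_woman = [None] * n
    free = list(range(n))
    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]   # best woman not yet proposed to
        next_choice[m] += 1
        cur = match_of_woman[w]
        if cur is None:
            match_of_woman[w] = m
        elif rank[w][m] < rank[w][cur]:    # w prefers m: cur becomes unmatched
            match_of_woman[w] = m
            free.append(cur)
        else:
            free.append(m)                 # w rejects m
    return {m: w for w, m in enumerate(match_of_woman)}  # man -> woman

def is_stable(matching, men_prefs, women_prefs):
    mrank = [{w: r for r, w in enumerate(p)} for p in men_prefs]
    wrank = [{m: r for r, m in enumerate(p)} for p in women_prefs]
    inv = {w: m for m, w in matching.items()}
    for m, w_matched in matching.items():
        for w in range(len(women_prefs)):
            if mrank[m][w] < mrank[m][w_matched] and wrank[w][m] < wrank[w][inv[w]]:
                return False   # (m, w) is a blocking pair
    return True

men = [[0, 1, 2], [1, 0, 2], [0, 2, 1]]
women = [[1, 0, 2], [0, 1, 2], [2, 1, 0]]
match = gale_shapley(men, women)
print(match, is_stable(match, men, women))  # {0: 0, 1: 1, 2: 2} True
```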
Probabilistic Analysis
- In randomized algorithms, we consider algorithms that make random choices and investigate what happens when they process a fixed input, e.g., the min-cut algorithm from Lecture 1.
- In probabilistic analysis, we consider random input and investigate what happens when it's processed by a fixed algorithm, e.g., the Gale-Shapley algorithm when the preference lists are random.

Theorem: If the preference lists are random, the expected number of iterations of Gale-Shapley is O(n log n).

Probabilistic Analysis of the Gale-Shapley Algorithm (1/2)
- By the principle of deferred decisions, we may assume that at each step m_i proposes to a woman chosen uniformly at random from the women that have not yet rejected him.
- To simplify things, use a modification of the Gale-Shapley algorithm, the "amnesiac" algorithm:
- At each step, m_i proposes to a woman chosen uniformly at random from the set of all n women.
- This doesn't change the outcome of the algorithm, since if m_i was rejected by w_j before, he'll be rejected again.
- The expected running time of the modified algorithm is an upper bound on the running time of the original algorithm.

Probabilistic Analysis of the Gale-Shapley Algorithm (2/2)
Theorem: If the preference lists are random, the expected number of iterations of Gale-Shapley is at most n·H_n = O(n log n).
Proof: Since the algorithm terminates once all women have received at least one proposal, the random process is analogous to the coupon collector problem. □

Outline
- Readings

Readings
For next time, please make sure you've read:
- Appendix C: Basic Probability Theory (9 pages)
- Chapter 1: Introduction, up to Section 1.4 (14 pages)
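The coupon collector argument invoked above says that collecting all n coupon types takes n·H_n draws in expectation; a quick seeded check of that expectation (sketch):

```python
import random

def draws_to_collect_all(n, rng):
    # draw uniform coupons until every one of the n types has appeared
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

n = 50
harmonic = sum(1 / k for k in range(1, n + 1))
rng = random.Random(1)
avg = sum(draws_to_collect_all(n, rng) for _ in range(2000)) / 2000
print(avg, n * harmonic)   # empirical mean vs n·H_n ≈ 224.96
```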
CMPSCI 711: Really Advanced Algorithms
Lecture 8: Probabilistic Method and Lovász Local Lemma
Andrew McGregor (last compiled February 25, 2009)

Outline
- Probabilistic Method

Probabilistic Method
It's obvious that P[X ≥ r] > 0 (or E[X] ≥ r) implies that the event {X ≥ r} can happen. But it's very powerful. Examples we've already seen:
- Any graph has a cut of size at least m/2.
- For any collection of m clauses, it is possible to satisfy (1 − 2^(−k))m of the clauses if each clause has at least k literals.
- For any collection of subsets A_1, …, A_n ⊆ [n], it is possible to partition [n] into B and C such that max_{i∈[n]} ||A_i ∩ B| − |A_i ∩ C|| ≤ √(4n ln n).

k-SAT
- Input: a CNF formula consisting of m clauses over n Boolean variables x_1, …, x_n, e.g.,
  (x_1 ∨ x_2 ∨ x̄_3) ∧ ⋯ ∧ (x_9 ∨ x_10 ∨ x_21), where x̄_i denotes the negation of x_i.
- Problem: is there a satisfying assignment of the formula?

Theorem: If each clause contains exactly k ≥ 3 literals, and each variable appears (complemented or uncomplemented) in at most 2^(k/50) clauses, then the formula can always be satisfied.

Outline
- Lovász Local Lemma

Warm-up
- Suppose we have n bad events B_1, …, B_n.
- To show that it is possible for no bad event to occur, it is sufficient to find some random process where P[∩_{i∈[n]} B̄_i] > 0.
- E.g., if the B_i are independent and max_i P[B_i] < 1, we are good.
- E.g., if Σ_i P[B_i] < 1, we are good.
- What if the probabilities aren't tiny and the events aren't independent?
Lovász Local Lemma
Theorem: Consider events B_1, …, B_n with each B_i independent of {B_j : j ∉ Γ(i)}. Suppose that there exist x_i ∈ (0, 1) for i ∈ [n] such that
  P[B_i] ≤ x_i ∏_{j∈Γ(i)} (1 − x_j).
Then P[∩_{i∈[n]} G_i] ≥ ∏_{i∈[n]} (1 − x_i), where G_i = B̄_i.

Corollary: Let B_1, …, B_n be events with P[B_i] ≤ p and each B_i independent of all but at most d other events. Then P[∩_{i∈[n]} G_i] > 0 if ep(d + 1) ≤ 1.
Proof: Use the Lovász Local Lemma with x_i = 1/(d + 1) for all i ∈ [n]. □
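The corollary's proof needs x_i ∏_{j∈Γ(i)}(1 − x_j) ≥ 1/(e(d+1)) ≥ p when x_i = 1/(d+1), which boils down to (1 − 1/(d+1))^d ≥ 1/e; a quick numeric check of this inequality (sketch):

```python
import math

# With x_i = 1/(d+1), the LLL condition P[B_i] ≤ x_i (1-x_i)^d follows from
# (1 - 1/(d+1))^d ≥ 1/e, since then x_i (1-x_i)^d ≥ 1/(e(d+1)) ≥ p
# whenever ep(d+1) ≤ 1.
for d in range(0, 200):
    x = 1 / (d + 1)
    assert x * (1 - x) ** d >= 1 / (math.e * (d + 1)) - 1e-15
print("inequality holds for d = 0..199")
```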
Proof of the Lovász Local Lemma
- It is sufficient to prove that P[B_i | ∩_{j∈S} G_j] ≤ x_i for any S ⊂ [n] with i ∉ S, since
  P[∩_{i∈[n]} G_i] = (1 − P[B_1]) (1 − P[B_2 | G_1]) ⋯ (1 − P[B_n | ∩_{i∈[n−1]} G_i]).
- Proof by induction on k = |S|. The base case k = 0 is immediate.
- Inductive step: let S_1 = {j ∈ S : j ∈ Γ(i)} and S_2 = S \ S_1. Then
  P[B_i | ∩_{j∈S} G_j] = P[B_i ∩ (∩_{j∈S_1} G_j) | ∩_{j∈S_2} G_j] / P[∩_{j∈S_1} G_j | ∩_{j∈S_2} G_j].
- Numerator: by independence from the events indexed by S_2 and by the assumption,
  P[B_i ∩ (∩_{j∈S_1} G_j) | ∩_{j∈S_2} G_j] ≤ P[B_i | ∩_{j∈S_2} G_j] = P[B_i] ≤ x_i ∏_{j∈Γ(i)} (1 − x_j).
- Denominator: let S_1 = {j_1, …, j_r} and T_k = S_2 ∪ {j_1, …, j_k}. By the induction hypothesis applied to each factor,
  P[∩_{j∈S_1} G_j | ∩_{j∈S_2} G_j] = ∏_{k=0}^{r−1} (1 − P[B_{j_{k+1}} | ∩_{j∈T_k} G_j]) ≥ ∏_{j∈Γ(i)} (1 − x_j).
- Dividing the two bounds gives P[B_i | ∩_{j∈S} G_j] ≤ x_i. □

Return to k-SAT
Theorem: If each clause contains exactly k ≥ 3 literals, and each variable appears (complemented or uncomplemented) in at most 2^(k/50) clauses, then the formula can always be satisfied.
Proof:
- Pick each x_i value uniformly and independently from {0, 1}.
- Let B_j be the event that the j-th clause is unsatisfied.
- By the previous analysis, p = P[B_j] = 2^(−k).
- B_j is independent of all but at most d = k·2^(k/50) − 1 other events.
- Since ep(d + 1) ≤ 1 for k ≥ 3, the Lovász Local Lemma gives P[∩_{j∈[m]} B̄_j] > 0. □

CMPSCI 711: Really Advanced Algorithms
Lecture 4: Lazy Select and Chernoff Bounds
Andrew McGregor (last compiled February 5, 2009)

Outline
- Lazy Select
Lazy Select
We have a set S of n = 2k distinct numbers and want to find the k-th smallest element (the median).

Algorithm:
1. Let R be a multiset of n^(3/4) elements chosen uniformly at random, with replacement, from S.
2. Sort R and find a and b such that rank_R(a) = kn^(−1/4) − √n and rank_R(b) = kn^(−1/4) + √n, where rank_X(x) = t if x is the t-th smallest element of X.
3. Compute rank_S(a) and rank_S(b). Output FAIL if k < rank_S(a) or k > rank_S(b).
4. Let P = {y ∈ S : a ≤ y ≤ b}. Output FAIL if |P| ≥ 4n^(3/4).
5. Return the (k − rank_S(a) + 1)-th smallest element of P.

Lazy Select: Running Time
Theorem: The running time of Lazy Select is O(n).
Proof:
- O(n^(3/4)) steps to define R.
- O(n^(3/4) log n) steps to sort R and find a and b.
- O(n) steps to compute rank_S(a) and rank_S(b).
- O(n^(3/4) log n) steps to sort P and select the element. □
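A runnable sketch of the algorithm above (the index arithmetic and the retry-on-FAIL loop are my own choices; the notes only bound the failure probability):

```python
import math
import random

def lazy_select(S, k, rng):
    # One attempt of Lazy Select; returns None on FAIL (caller retries).
    n = len(S)
    r = int(n ** 0.75)
    R = sorted(rng.choice(S) for _ in range(r))
    pos = k * n ** -0.25
    lo = max(int(pos - math.sqrt(n)), 1)          # rank_R(a)
    hi = min(int(pos + math.sqrt(n)) + 1, r)      # rank_R(b)
    a, b = R[lo - 1], R[hi - 1]
    rank_a = sum(1 for y in S if y < a) + 1       # rank_S(a)
    rank_b = sum(1 for y in S if y <= b)          # rank_S(b)
    if k < rank_a or k > rank_b:
        return None                               # FAIL: k-th element not bracketed
    P = sorted(y for y in S if a <= y <= b)
    if len(P) >= 4 * n ** 0.75:
        return None                               # FAIL: P too large
    return P[k - rank_a]                          # (k - rank_S(a) + 1)-th smallest of P

rng = random.Random(7)
S = rng.sample(range(10 ** 6), 1000)
k = 500
while (ans := lazy_select(S, k, rng)) is None:
    pass                                          # failure probability is O(n^(-1/4))
print(ans == sorted(S)[k - 1])
```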
Lazy Select: Probability of Being Correct (1/3)
Theorem: With probability 1 − O(n^(−1/4)), the algorithm finds the median.
Proof:
- If we don't output FAIL, then we get the answer correct.
- There are only three ways in which we can fail, and we'll show:
  1. P[k < rank_S(a)] ≤ O(n^(−1/4)),
  2. P[k > rank_S(b)] ≤ O(n^(−1/4)),
  3. P[|P| ≥ 4n^(3/4)] ≤ O(n^(−1/4)).

Lazy Select: Probability of Being Correct (2/3)
Claim: P[k < rank_S(a)] = O(n^(−1/4)).
Proof:
- Let u be the k-th smallest element of S.
- Consider choosing R. Let X_i = 1 if the i-th sample is ≤ u and X_i = 0 otherwise, so P[X_i = 1] = k/n and P[X_i = 0] = 1 − k/n.
- X = Σ_{i∈[n^(3/4)]} X_i is the number of elements of R that are at most u.
- k < rank_S(a) implies X < kn^(−1/4) − √n.
- X has a binomial distribution with E[X] = kn^(−1/4) and Var[X] = n^(3/4) (k/n)(1 − k/n) ≤ n^(3/4)/4.
- Apply the Chebyshev bound:
  P[X < kn^(−1/4) − √n] ≤ P[|X − E[X]| ≥ √n] ≤ Var[X]/n ≤ n^(−1/4)/4 = O(n^(−1/4)). □

Lazy Select: Probability of Being Correct (3/3)
Claim: P[|P| ≥ 4n^(3/4)] = O(n^(−1/4)).
Proof:
- If |P| ≥ 4n^(3/4), then either
  rank_S(a) ≤ k − 2n^(3/4) or rank_S(b) ≥ k + 2n^(3/4).
- To bound P[rank_S(a) ≤ k − 2n^(3/4)] and P[rank_S(b) ≥ k + 2n^(3/4)], define X_i's and use Chebyshev along the same lines as the previous claim.
- Apply the union bound. □

Outline
- Chernoff Bounds

Chernoff Bound: Upper Tail (1/3)
Theorem: Let X_1, …, X_r be independent Boolean random variables such that P[X_i = 1] = p_i. Then for X = Σ_i X_i, μ = E[X], and δ > 0:
  P[X > (1 + δ)μ] < [e^δ / (1 + δ)^(1+δ)]^μ.

Chernoff Bound: Upper Tail (2/3)
Proof:
- For any t > 0, P[X > (1 + δ)μ] = P[e^(tX) > e^(t(1+δ)μ)].
- Apply the Markov inequality: P[e^(tX) > e^(t(1+δ)μ)] ≤ E[e^(tX)] / e^(t(1+δ)μ).
- By independence, E[e^(tX)] = E[e^(t Σ_i X_i)] = E[∏_i e^(tX_i)] = ∏_i E[e^(tX_i)].
- We will prove ∏_i E[e^(tX_i)] ≤ e^((e^t − 1)μ) in a second.
- For t = ln(1 + δ):
  ∏_i E[e^(tX_i)] / e^(t(1+δ)μ) ≤ e^((e^t − 1)μ) / e^(t(1+δ)μ) = [e^δ / (1 + δ)^(1+δ)]^μ. □

Chernoff Bound: Upper Tail (3/3)
Lemma: ∏_i E[e^(tX_i)] ≤ e^((e^t − 1)μ).
Proof:
- Using 1 + x ≤ e^x:
  E[e^(tX_i)] = p_i e^t + (1 − p_i) = 1 + p_i(e^t − 1) ≤ exp(p_i(e^t − 1)).
- Using μ = E[X] = Σ_{i∈[r]} p_i:
  ∏_i exp(p_i(e^t − 1)) = exp((e^t − 1) Σ_i p_i) = exp((e^t − 1)μ). □

Chernoff Bound: Upper Tail, Simplifications
Theorem: Let X_1, …, X_r be independent Boolean random variables such that P[X_i = 1] = p_i. Let X = Σ_i X_i and μ = E[X].
- If δ > 2e − 1: P[X > (1 + δ)μ] < 2^(−δμ).
- If 0 < δ ≤ 2e − 1: P[X > (1 + δ)μ] < e^(−μδ²/4).
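The upper-tail bound can be compared against the exact binomial tail (a numeric sketch; the parameters r, p, and δ are arbitrary choices of mine):

```python
import math

def binom_upper_tail(r, p, threshold):
    # exact P[X > threshold] for X ~ Binomial(r, p)
    return sum(math.comb(r, j) * p ** j * (1 - p) ** (r - j)
               for j in range(threshold + 1, r + 1))

r, p = 100, 0.1
mu = r * p
for delta in (0.5, 1.0, 2.0):
    exact = binom_upper_tail(r, p, math.floor((1 + delta) * mu))
    bound = (math.e ** delta / (1 + delta) ** (1 + delta)) ** mu
    assert exact <= bound   # the Chernoff bound dominates the true tail
    print(f"delta={delta}: exact={exact:.3e}  bound={bound:.3e}")
```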
Chernoff Bound: Lower Tail (1/2)

Theorem: Let X_1, …, X_n be independent boolean random variables with P(X_i = 1) = p_i. Then for X = Σ_i X_i, μ = E[X], and 1 > δ > 0,

  P(X < (1−δ)μ) < exp(−μδ²/2).

Chernoff Bound: Lower Tail (2/2)

Proof:
- For any t > 0, P(X < (1−δ)μ) = P(e^{−tX} > e^{−t(1−δ)μ}).
- Apply the Markov inequality: P(e^{−tX} > e^{−t(1−δ)μ}) ≤ E[e^{−tX}] / e^{−t(1−δ)μ}.
- Similarly to before, E[e^{−tX}] = Π_i E[e^{−t X_i}] ≤ e^{(e^{−t} − 1)μ}.
- For t = −ln(1−δ):
  E[e^{−tX}] / e^{−t(1−δ)μ} ≤ e^{(e^{−t} − 1)μ} / e^{−t(1−δ)μ} = [ e^{−δ} / (1−δ)^{1−δ} ]^μ.
- Simplify using (1−δ)^{1−δ} > exp(−δ + δ²/2), which holds since δ ∈ (0,1), to get the bound exp(−μδ²/2). ∎

Outline: Set Balancing

Set Balancing

Let A_1, …, A_n be subsets of [n] such that |A_i| = n/2. We want to partition [n] into B and C such that max_i | |A_i ∩ B| − |A_i ∩ C| | is minimized.
Hint: use P(|X − E[X]| ≥ δμ) ≤ 2 exp(−E[X] δ²/4).

Outline: Readings

Readings

For next time please make sure you've read:
- Chapter 3, "Moments and Deviations" (20 pages).

Outline: Puzzle

(Too Easy) Puzzle
- There are 3 coins in a bag: the first coin has two heads, the second coin has two tails, and the third coin has one head and one tail.
- You draw a coin at random without looking and toss it in the air. It lands heads up.
- What's the probability that the other side of the coin is heads?

CMPSCI 711: Really Advanced Algorithms
Lecture 11, 12 & 13: Probability Amplification and Expanders
Andrew McGregor
Last Compiled: March 30, 2009

Outline: Probability Amplification with Two-Point Sampling

Probability Amplification
- Consider a language L and a randomized algorithm A such that, for random r ∈ {0, …, p−1} (p prime):
  - If x ∈ L then P(A(x, r) = 1) ≥ 1/2.
  - If x ∉ L then P(A(x, r) = 0) = 1.
- If the algorithm has worst-case polynomial running time in |x|, we call it an RP algorithm.
- We say r is a witness for x ∈ L if A(x, r) = 1.
- It is easy to boost the probability of returning 1 for x ∈ L up to 1 − 2^{−t} using O(t log p) random bits.
- What if we only have O(log p) bits?

Two-Point Sampling
- Let p be a prime and let a, b be chosen uniformly at random from {0, 1, …, p−1}.
- Define r_0, …, r_{t−1} where r_i = a·i + b mod p.
- Exercise: the r_i are pairwise independent, but not three-wise independent.
- Let Y = Σ_i A(x, r_i).
- Exercise: if x ∈ L, then P(Y = 0) ≤ 1/t.
- Hence with O(log p) random bits we can boost the success probability to 1 − 1/t given t trials.

Outline: Probability Amplification with Expanding Graphs

Probability Amplification with Expanding Graphs
- We'll show an approach with error probability ≈ 1/n^{log n} using log² n random bits, in a setting where there are O(n) potential witnesses.
- The approach is based on sparse expanding graphs.

Theorem: For sufficiently large n there is a bipartite graph G = (L, R, E) with |L| = n and |R| = 2^{log² n} such that:
1. Every subset of n/2 vertices from L has at least 2^{log² n} − n neighbors in R.
2. No vertex of R has more than 12 log² n neighbors in L.
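The two-point sampling construction above can be sketched in Python (`two_point_sample` and `is_pairwise_uniform` are hypothetical helper names for this illustration):

```python
import random
from itertools import product

def two_point_sample(p, t, rng=random):
    # Stretch 2 log p truly random bits (a, b uniform in Z_p) into t
    # pairwise-independent samples r_i = a*i + b mod p.
    a, b = rng.randrange(p), rng.randrange(p)
    return [(a * i + b) % p for i in range(t)]

def is_pairwise_uniform(p, i, j):
    # For i != j (mod p), the map (a, b) -> (r_i, r_j) is a bijection of
    # Z_p^2, so every pair of values occurs with probability exactly 1/p^2.
    counts = {}
    for a, b in product(range(p), repeat=2):
        key = ((a * i + b) % p, (a * j + b) % p)
        counts[key] = counts.get(key, 0) + 1
    return len(counts) == p * p and all(c == 1 for c in counts.values())
```

For example, `is_pairwise_uniform(7, 1, 2)` holds, while any three samples are constrained by r_{i+1} − r_i ≡ a (mod p), which is exactly the failure of three-wise independence.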
Proof of Theorem:
- By the probabilistic method: for each vertex in L, pick d = 2^{log² n} · (4 log² n)/n neighbors in R with replacement.
- The probability that there is a set of n/2 vertices in L with fewer than 2^{log² n} − n neighbors in R is at most
  C(n, n/2) · C(2^{log² n}, n) · (n · 2^{−log² n})^{dn/2}.
- The usual tricks show this is < 1/2.
- For the second condition: the expected number of neighbors of v ∈ R is d·n / 2^{log² n} = 4 log² n.
- A Chernoff bound shows that the probability there exists a v ∈ R with more than 12 log² n neighbors is at most |R| · e^{−Ω(log² n)} ≪ 1/2. ∎

Back to Probability Amplification
- Pick v ∈_R R. This uses O(log² n) random bits.
- Consider the neighbors of v.
- At least n/2 of the nodes in L are witnesses if x ∈ L.
- The probability that v is adjacent to a witness is at least 1 − n · 2^{−log² n}.

Outline: Probability Amplification with Random Walks on Expanders

Expanders

Definition: An (n, d, c)-expander is a d-regular bipartite multigraph G = (X, Y, E) with |X| = |Y| = n/2 such that for any S ⊆ X,
  |Γ(S)| ≥ [1 + c(1 − 2|S|/n)] |S|.

Example (Gabber-Galil expanders): For a positive integer m, let n = 2m². Each vertex in X has a distinct label (a, b) with a, b ∈ {0, …, m−1}; similarly for Y. Vertex (x, y) ∈ X has neighbors in Y with labels
  (x, y), (x, 2x+y), (x, 2x+y+1), (x, 2x+y+2), (x+2y, y), (x+2y+1, y), (x+2y+2, y),
where addition is modulo m. The resulting graph is an (n, 7, α)-expander where α = (2 − √3)/4.

Eigenvalues and Algebraic Graph Theory

Fact: For a symmetric n × n matrix A there exist n orthonormal eigenvectors e_1, …, e_n with eigenvalues λ_1 ≥ λ_2 ≥ … ≥ λ_n, i.e., e_i · e_j = 0 if i ≠ j, e_i A = λ_i e_i, and e_i · e_i = 1. If A is the adjacency matrix of a graph G, then:
- G connected implies λ_2 < λ_1.
- If G is d-regular and bipartite, then λ_1 = d and λ_n = −d.

Theorem (Alon '86): If G is an (n, d, c)-expander then λ_2 ≤ d − Ω(c²). Conversely, if λ_2 ≤ d − ε then G is an (n, d, c)-expander with c = Ω(ε/d).
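The Gabber-Galil neighborhood rule above is mechanical enough to write down directly. A sketch (the exact offsets are as reconstructed in these notes, so they are worth double-checking against the original construction):

```python
def gabber_galil_neighbors(x, y, m):
    # The 7 neighbor labels of vertex (x, y) in X, arithmetic mod m,
    # following the neighbor list stated in the notes above.
    return [(x, y),
            (x, (2 * x + y) % m),
            (x, (2 * x + y + 1) % m),
            (x, (2 * x + y + 2) % m),
            ((x + 2 * y) % m, y),
            ((x + 2 * y + 1) % m, y),
            ((x + 2 * y + 2) % m, y)]

# Each of the 7 maps above is a bijection of Z_m x Z_m, so counting with
# multiplicity every vertex on the Y side also has degree exactly 7.
m = 5
degree = {}
for x in range(m):
    for y in range(m):
        for v in gabber_galil_neighbors(x, y, m):
            degree[v] = degree.get(v, 0) + 1
```

The degree count is a quick consistency check that the graph really is 7-regular on both sides, as the definition of an (n, d, c)-expander requires.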
Expanders are Rapidly Mixing

Definition: Let G be an (n, d, c)-expander and let Q be the transition matrix on G with Q_{uu} = 1/2 and Q_{uv} = 1/(2d) if v is a neighbor of u. Let λ_1 ≥ … ≥ λ_n be the eigenvalues of G and μ_i = 1/2 + λ_i/(2d) the eigenvalues of Q.

Theorem: Consider the (aperiodic) random walk on G defined by Q. The distribution at time t satisfies
  ‖q^{(t)} − π‖_1 ≤ √n · μ_2^t.
This is at most √n (1 − ε/(2d))^t if λ_2 ≤ d − ε.

Lemma: For any vector v ∈ ℝⁿ, ‖v‖_2 ≤ ‖v‖_1 ≤ √n ‖v‖_2, where ‖v‖_1 = Σ_i |v_i| and ‖v‖_2 = (Σ_i v_i²)^{1/2}.

Proof of Theorem:
- The distribution at time t is q^{(t)} = q^{(0)} Q^t.
- Choose orthonormal eigenvectors e_1, …, e_n for Q.
- Express q^{(0)} as q^{(0)} = Σ_{i ∈ [n]} c_i e_i.
- Orthonormality implies 1 = ‖q^{(0)}‖_1 ≥ ‖q^{(0)}‖_2, so Σ_i c_i² = ‖q^{(0)}‖_2² ≤ 1.
- Then q^{(t)} = Σ_{i ∈ [n]} c_i e_i Q^t = Σ_{i ∈ [n]} c_i μ_i^t e_i.
- Let x = c_1 e_1 (the component along the stationary distribution, so x = π). Using the previous lemma,
  ‖q^{(t)} − x‖_1 ≤ √n ‖q^{(t)} − x‖_2 = √n ‖Σ_{i ≥ 2} c_i μ_i^t e_i‖_2 ≤ √n μ_2^t (Σ_{i ≥ 2} c_i²)^{1/2} ≤ √n μ_2^t. ∎

Outline: BPP and Probability Amplification

BPP and Probability Amplification
- Consider a language L and a randomized algorithm A such that, for random r ∈ {0,1}ⁿ:
  - If x ∈ L then P(A(x, r) = 1) ≥ 99/100.
  - If x ∉ L then P(A(x, r) = 0) ≥ 99/100.
- If the algorithm has worst-case polynomial running time in |x|, we call it a BPP algorithm.

Probability Amplification via Expanders
- Let G be an (N, 7, α) Gabber-Galil expander where N = 2ⁿ and the nodes are labelled by {0,1}ⁿ.
- Consider a random walk on G with probability 1/2 of not moving in a given step, started at a random position: X_0, X_1, X_2, …, X_{β·7k}, where the constant β = O(1) satisfies μ_2^β ≤ 1/10.
- The total number of random bits required is n + O(k).
- For 0 ≤ i ≤ 7k, let r_i be the label of X_{β·i}.

Theorem: The majority of A(x, r_0), …, A(x, r_{7k}) are correct with probability at least 1 − 1/2^k.

Proof of Theorem:
- Let W ∈ {0,1}^{N×N} be the diagonal matrix with W_{uu} = 1 iff node u is good (i.e., its label is a correct random string for A on x).
- Let B = Q^β. For A ⊆ {0, …, 7k}, the probability that r_i is good for every i ∈ A and r_i is not good for every i ∉ A is
  ‖q^{(0)} B_1 B_2 ⋯ B_{7k}‖_1, where B_i = BW if i ∈ A and B_i = B(I − W) if i ∉ A.
Claim: For any p ∈ ℝᴺ, ‖p B W‖_2 ≤ ‖p‖_2 and ‖p B (I − W)‖_2 ≤ ‖p‖_2 / 5.

- Applying the claim repeatedly,
  ‖q^{(0)} B_1 ⋯ B_{7k}‖_1 ≤ √N ‖q^{(0)} B_1 ⋯ B_{7k}‖_2 ≤ √N (1/5)^{7k − |A|} ‖q^{(0)}‖_2 = (1/5)^{7k − |A|},
  since ‖q^{(0)}‖_2 = 1/√N for the uniform starting distribution.
- Summing over the choices of A, the probability that |A| ≤ 7k/2 is at most 2^{7k} (1/5)^{7k/2} < 1/2^k; when |A| > 7k/2 the majority vote is correct. ∎
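The rapid-mixing bound ‖q^{(t)} − π‖_1 ≤ √n μ_2^t from the earlier theorem can be checked with exact rational arithmetic on a toy graph. K_5 is only a stand-in for an expander (it is not a Gabber-Galil graph), but its spectrum is computable by hand — adjacency eigenvalues 4 and −1, hence walk eigenvalues μ_1 = 1 and μ_2 = 1/2 − 1/8 = 3/8 — so the bound can be verified step by step; a minimal sketch:

```python
from fractions import Fraction

def lazy_walk_step(q, neighbors, d):
    # One step of the lazy walk Q: stay put with probability 1/2,
    # otherwise move to a uniform neighbor (Q_uu = 1/2, Q_uv = 1/(2d)).
    out = [Fraction(0)] * len(q)
    for u, mass in enumerate(q):
        out[u] += mass / 2
        for v in neighbors[u]:
            out[v] += mass / (2 * d)
    return out

# Toy graph: K_5, which is 4-regular, with mu_2 = 3/8 for the lazy walk.
n, d = 5, 4
neighbors = [[v for v in range(n) if v != u] for u in range(n)]
pi = Fraction(1, n)
q = [Fraction(1)] + [Fraction(0)] * (n - 1)   # walk starts at vertex 0

dist = []                                      # l1 distance to uniform
for t in range(11):
    dist.append(sum(abs(x - pi) for x in q))
    q = lazy_walk_step(q, neighbors, d)
```

Here `dist[t]` shrinks by exactly a factor μ_2 = 3/8 each step and stays below √n μ_2^t throughout, matching the theorem.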
