# Digital Image Processing ECE 6258

Cassidy Effertz

These Class Notes were uploaded by Cassidy Effertz on Monday, November 2, 2015. They belong to ECE 6258 at Georgia Institute of Technology - Main Campus, taught by Staff in Fall. Since their upload, they have received 11 views. For similar materials see /class/233897/ece-6258-georgia-institute-of-technology-main-campus in ELECTRICAL AND COMPUTER ENGINEERING at Georgia Institute of Technology - Main Campus.

Date Created: 11/02/15
## Lecture 17: Markov Source Models for Images

*ECE 6258, 9/29/2003, Russell M. Mersereau*

Quiz 1 has been graded: mean 68, median 69, low 48, high 82. Solutions are attached to your returned quiz.

### Arithmetic coding: working example

Symbol probabilities: $P_1 = 0.4$, $P_2 = 0.3$, $P_3 = 0.2$, $P_4 = 0.1$, with cumulative values $F_0 = 0$, $F_1 = 0.4$, $F_2 = 0.7$, $F_3 = 0.9$, $F_4 = 1.0$. Sequence to encode: $s_2\,s_1\,s_2\,s_1\,s_1\,s_4$.

The first symbol $s_2$ selects the interval $[0.4, 0.7)$; the second symbol $s_1$ selects the first subinterval of that, whose limits are $0.52, 0.61, 0.67$; continuing through the sequence yields the final interval $[0.453184, 0.45376)$.

### Recursive update equations

In the general case each interval is defined by its left point $L_k$ and its width $W_k$, starting from $L_0 = 0$, $W_0 = 1$. When symbol $s_i$ is encoded, the interval is updated as

$L_{k+1} = L_k + W_k F_{i-1}, \qquad W_{k+1} = W_k P_i.$

### Decoding the number

Assume that the decoded value is $r = 0.4532$. The receiver must synthesize the intervals and perform the appropriate comparisons, starting from $L_0 = 0$, $W_0 = 1$:

- The interval limits are $0.4, 0.7, 0.9$; since $0.4 \le r < 0.7$, the decoded symbol is $s_2$.
- The new interval is $L_1 = 0.40$, $W_1 = 0.3$. The interval limits are $0.52, 0.61, 0.67$; since $r < 0.52$, the decoded symbol is $s_1$.
- The new interval is $L_2 = 0.40$, $W_2 = 0.12$. The interval limits are $0.448, 0.484, 0.508$; since $0.448 \le r < 0.484$, the decoded symbol is $s_2$.

### How do we know when to stop?

- Use an EOF symbol or a symbol count.
- The width of the final interval is equal to the probability of the whole sequence.
- It contains at least one binary number that is not the prefix of another codeword and that can be represented using $N = \lceil \log_2(1/W) \rceil + 1$ bits.

### Coding efficiency

$H(S^M) \le L < H(S^M) + 2 \quad\Rightarrow\quad H(S) \le R = \frac{L}{M} < H(S) + \frac{2}{M},$

where $M$ is the input length.

As the intervals become narrower, it becomes increasingly likely that the interval will be forever confined to either the upper or the lower half-interval. All points in the interval then have the same leading bit: send the bit and rescale the interval.

$E_1\colon [0, 0.5) \to [0, 1), \quad E_1(x) = 2x$

$E_2\colon [0.5, 1) \to [0, 1), \quad E_2(x) = 2x - 1$

### Notes

- The code that was described is called an Elias code.
- Arithmetic coding refers to a finite-precision implementation (patents held by IBM).
- The code can be made adaptive by updating its estimates of the probabilities based on the input sequence.
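The worked example above can be sketched in Python. This is a minimal illustration of the interval recursion, not the finite-precision implementation the notes mention; plain floating point stands in for the rescaling machinery:

```python
# Elias/arithmetic coding sketch for the worked example:
# alphabet s1..s4 with probabilities P = (0.4, 0.3, 0.2, 0.1).
P = [0.4, 0.3, 0.2, 0.1]
F = [0.0, 0.4, 0.7, 0.9, 1.0]   # cumulative probabilities F_0..F_4

def encode(seq):
    """Return (L, W): the final interval [L, L+W) for a symbol sequence."""
    L, W = 0.0, 1.0
    for s in seq:                # s is a symbol index 1..4
        L, W = L + W * F[s - 1], W * P[s - 1]
    return L, W

def decode(r, n):
    """Decode n symbols from a number r lying inside the final interval."""
    L, W = 0.0, 1.0
    out = []
    for _ in range(n):
        for s in range(1, 5):    # find the subinterval containing r
            lo, hi = L + W * F[s - 1], L + W * F[s]
            if lo <= r < hi:
                out.append(s)
                L, W = lo, W * P[s - 1]
                break
    return out

L, W = encode([2, 1, 2, 1, 1, 4])   # the sequence s2 s1 s2 s1 s1 s4
print(L, L + W)                     # final interval [0.453184, 0.45376)
print(decode(0.4532, 6))            # r = 0.4532 recovers [2, 1, 2, 1, 1, 4]
```

Any number inside the final interval decodes to the same sequence, which is why the encoder only needs to transmit enough bits to single out one such number.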
### Where we are going

Up to this point we have assumed a zero-memory source model (independent symbols):

- Huffman codes
- Vectorized Huffman codes
- Arithmetic codes

We can reduce the bit rate further by exploiting the correlation between symbols:

- Vector sequences
- Markov sequences (spatially correlated)

### First-order statistics of color components

Consider the 512 × 512 image Lena, with 8 bpp per color plane:

| RGB plane | Entropy | YCbCr plane | Entropy |
|---|---|---|---|
| $H_R$ | 7.25 bpp | $H_Y$ | 7.23 bpp |
| $H_G$ | 7.59 bpp | $H_{Cb}$ | 5.47 bpp |
| $H_B$ | 6.97 bpp | $H_{Cr}$ | 5.42 bpp |
| Total | 21.81 bpp | Total | 18.12 bpp |

### Joint entropy

Consider random vectors with discrete, finite-alphabet components $\mathbf{X} = (X_0, X_1, \ldots, X_{m-1})$. The joint entropy is

$H(\mathbf{X}) = -\sum_{\mathbf{x}} P(\mathbf{x}) \log_2 P(\mathbf{x}).$

**Shannon's noiseless source coding theorem:** for a vector source generating IID random vectors $\mathbf{X}$, the joint entropy $H(\mathbf{X})$ is an achievable lower bound on the bit rate for encoding $\mathbf{X}$.

### Joint entropy and statistical dependence

**Theorem:** $H(X_0, X_1, \ldots, X_{m-1}) \le H(X_0) + H(X_1) + \cdots + H(X_{m-1})$, with equality if and only if $X_0, X_1, \ldots, X_{m-1}$ are statistically independent.

- Exploiting statistical dependence can reduce the bit rate.
- Statistically independent components can be compressed and decompressed separately without loss.

### Statistical dependence between color components

Fixed-rate coding costs $3 \times 8 = 24$ bpp in either color space. For the Lena image (512 × 512 pixels, 8 bpp):

| | YCbCr | RGB |
|---|---|---|
| Joint entropy | $H(Y, Cb, Cr) = 15.01$ bpp | $H(R, G, B) = 16.84$ bpp |
| Sum of marginal entropies | $H_Y + H_{Cb} + H_{Cr} = 18.12$ bpp | $H_R + H_G + H_B = 21.82$ bpp |
| $\Delta H$ | 3.11 bpp | 4.98 bpp |

The statistical dependence between R, G, and B is stronger. If the joint sources are treated as sources with memory, the possible gain by joint coding is much smaller.

### Markov sources

- IID source models are generally inadequate, even when vectorized, particularly for images.
- Usually the occurrence of symbol $s_i$ depends on the $M$ previous symbols, so we need to know $P(s_i \mid s_{j_1}, s_{j_2}, \ldots, s_{j_M})$ for $i = 1, 2, \ldots, q$ and $j_p = 1, 2, \ldots, q$.
- The values of the $M$ previous symbols $s_{j_1}, s_{j_2}, \ldots, s_{j_M}$ define the state (context) of the process.
- There are $q^M$ possible states.
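The inequality between joint entropy and the sum of marginal entropies can be checked numerically. The joint distribution below is a hypothetical two-bit example chosen for illustration, not the Lena statistics from the slides:

```python
import math
from collections import Counter

def entropy(probs):
    """Entropy in bits of a probability assignment."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution of two correlated binary components (X, Y):
# matching values are four times as likely as mismatching ones.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

H_joint = entropy(joint.values())          # H(X, Y)

px, py = Counter(), Counter()
for (x, y), p in joint.items():            # marginalize out each component
    px[x] += p
    py[y] += p
H_sum = entropy(px.values()) + entropy(py.values())   # H(X) + H(Y)

print(H_joint)   # about 1.722 bits
print(H_sum)     # 2.0 bits: the dependent pair is cheaper to code jointly
```

Here each marginal is uniform (1 bit), yet the joint entropy is below 2 bits, so coding the pair together saves rate — exactly the gap the $\Delta H$ column above measures for color planes.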
### Second-order Markov example

A binary source ($q = 2$) with memory $M = 2$ has $q^M = 4$ states:

$P(0 \mid 00) = P(1 \mid 11) = 0.8$

$P(1 \mid 00) = P(0 \mid 11) = 0.2$

$P(0 \mid 01) = P(0 \mid 10) = 0.5$

$P(1 \mid 01) = P(1 \mid 10) = 0.5$

### Steady-state distribution of the states

$P(00) = 0.8\,P(00) + 0.5\,P(10)$

$P(01) = 0.2\,P(00) + 0.5\,P(10)$

$P(10) = 0.5\,P(01) + 0.2\,P(11)$

$P(11) = 0.5\,P(01) + 0.8\,P(11)$

$\Rightarrow\; P(00) = P(11) = \tfrac{5}{14}, \qquad P(01) = P(10) = \tfrac{2}{14}$

### Average information in a Markov source

The information we obtain if $s_i$ occurs while we are in state $(s_{j_1}, s_{j_2}, \ldots, s_{j_M})$ is

$I(s_i \mid s_{j_1}, \ldots, s_{j_M}) = \log_2 \frac{1}{P(s_i \mid s_{j_1}, \ldots, s_{j_M})},$

and the average information while we are in this state is the conditional entropy

$H(S \mid s_{j_1}, \ldots, s_{j_M}) = \sum_i P(s_i \mid s_{j_1}, \ldots, s_{j_M}) \log_2 \frac{1}{P(s_i \mid s_{j_1}, \ldots, s_{j_M})}.$

To get the total information in the source, we average this quantity over all $q^M$ possible states:

$H(S) = \sum_{j_1, \ldots, j_M} P(s_{j_1}, \ldots, s_{j_M})\, H(S \mid s_{j_1}, \ldots, s_{j_M}) = \sum_{j_1, \ldots, j_M} \sum_i P(s_i, s_{j_1}, \ldots, s_{j_M}) \log_2 \frac{1}{P(s_i \mid s_{j_1}, \ldots, s_{j_M})}.$

**Shannon's noiseless source coding theorem:** $H(S)$ is an achievable lower bound for the bit rate.
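The steady-state distribution and the entropy $H(S)$ of the second-order binary example can be verified numerically. This sketch uses power iteration of the state distribution rather than solving the balance equations in closed form:

```python
import math

# P(next bit = 1 | state), where the state is the last two bits.
# From the example: P(0|00) = P(1|11) = 0.8, P(0|01) = P(0|10) = 0.5.
p1 = {(0, 0): 0.2, (0, 1): 0.5, (1, 0): 0.5, (1, 1): 0.8}
states = list(p1)

def step(pi):
    """One update of the state distribution: emitting c moves (a, b) -> (b, c)."""
    new = {s: 0.0 for s in states}
    for (a, b), prob in pi.items():
        for c in (0, 1):
            pc = p1[(a, b)] if c == 1 else 1.0 - p1[(a, b)]
            new[(b, c)] += prob * pc
    return new

pi = {s: 0.25 for s in states}       # start uniform, iterate to steady state
for _ in range(200):
    pi = step(pi)

def h2(p):
    """Binary entropy in bits."""
    return -sum(q * math.log2(q) for q in (p, 1.0 - p) if q > 0)

# Average the per-state conditional entropy over the steady-state weights.
H = sum(pi[s] * h2(p1[s]) for s in states)
print(pi)   # approaches P(00) = P(11) = 5/14, P(01) = P(10) = 2/14
print(H)    # about 0.801 bits/symbol
```

The result matches the hand calculation: states 00 and 11 contribute $h(0.8) \approx 0.722$ bits each and states 01 and 10 contribute 1 bit each, so $H(S) = \tfrac{10}{14}(0.722) + \tfrac{4}{14}(1) \approx 0.801$ bits/symbol, well below the 1 bit/symbol of a memoryless binary source.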
