# Computer Vision CS 4495

These Class Notes for CS 4495 at Georgia Institute of Technology - Main Campus (taught by Staff in Fall) were uploaded by Alayna Veum on Monday, November 2, 2015. Since the upload they have received 20 views. For similar materials see /class/234152/cs-4495-georgia-institute-of-technology-main-campus in Computer Science at Georgia Institute of Technology - Main Campus.

Date Created: 11/02/15

## Fitting and Factorization (slide fragments)

(Slide residue: nonlinear least squares; centroid; mapping from 3D to 8D space for 4 cameras; 5-dimensional nullspace; economy SVD.)

## Boosting (Freund & Schapire)

For t = 1, ..., T:

- Train the weak learner using distribution D_t.
- Get a weak hypothesis h_t : X -> {-1, +1} with error eps_t = Pr_{i ~ D_t}[h_t(x_i) != y_i].
- Set alpha_t = (1/2) ln((1 - eps_t) / eps_t) and update D_{t+1}(i) = D_t(i) exp(-alpha_t y_i h_t(x_i)) / Z_t, where Z_t is a normalization factor chosen so that D_{t+1} remains a distribution.

Output the final hypothesis H(x) = sign(sum_t alpha_t h_t(x)).

(Slide residue: face / non-face classifier; roughly 60,000 x 100 = 6,000,000 unique binary features.)

## Shadows

A point source with an occluder casts a shadow with an umbra (fully blocked) and a penumbra (partially blocked); with area sources we see more soft shadows. (Figure residue; graphics by Philippe Bekaert.)

## Radiosity

The form factor between patches i and j depends on cos(theta_i) cos(theta_j) / (pi r^2); total radiosity: B_i = E_i + rho_i sum_j F_ij B_j.

## Light and Surfaces

Image plane; direction of projection. At a surface, light is absorbed, transmitted, or scattered; fluorescence (Miller and Hoffman, 1984). (Figure residue: for comparison, two photos of the moon under the same illumination conditions.) Image irradiance: E = L (pi/4) (d/f)^2 cos^4(alpha).

## Projector-Camera Systems

Cameras are used to orient one or more projectors in relation to each other and to elements in the world, such as projection surfaces. (Diagram residue: screen, computer, projector.) With multiple projectors, shadows are muted; note that the projectors are aligned sufficiently well that the overlap between their outputs appears visually seamless (display screen, projectors P1 and P2; Sukthankar, Cham & Sukthankar, CVPR 2001).

## Finding Planes

- Lines fall in planes; discontinuities fall at plane boundaries.
- Identify the discontinuities, then fit lines to them.

(Photos from Mark Ashdown & Rahul Sukthankar, HP Labs CRL TR 2003701.)

## Line Fitting and the Hough Transform

Line fitting is used to find calibration points. Fitting: choose a parametric object to represent a set of tokens. Three main questions:

- What object best represents this set of tokens?
- Which of several objects does a token belong to?
- How many objects are there?

The most interesting case is when the criterion is not local: you can't tell whether a set of points lies on a line by looking only at each point and the next.
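The boosting loop earlier in this section can be sketched in code. This is a toy illustration under stated assumptions, not the Viola-Jones detector: the `adaboost` and `predict` names, the fixed pool of weak learners, and the 1-D decision stumps in the test are all invented for illustration.

```python
import math

def adaboost(samples, labels, weak_learners, rounds):
    """Toy AdaBoost (Freund & Schapire) over a fixed pool of weak learners.

    samples: feature values; labels: +1/-1; weak_learners: functions mapping
    a sample to +1/-1. Returns (alpha, learner) pairs defining
    H(x) = sign(sum_t alpha_t * h_t(x)).
    """
    n = len(samples)
    D = [1.0 / n] * n                       # initial distribution D_1
    ensemble = []
    for _ in range(rounds):
        # pick the weak hypothesis with the smallest weighted error eps_t
        errs = [sum(D[i] for i in range(n) if h(samples[i]) != labels[i])
                for h in weak_learners]
        t = min(range(len(weak_learners)), key=lambda k: errs[k])
        eps = max(errs[t], 1e-10)           # avoid log(0) for a perfect learner
        alpha = 0.5 * math.log((1 - eps) / eps)
        h = weak_learners[t]
        # reweight: D_{t+1}(i) proportional to D_t(i) * exp(-alpha * y_i * h(x_i))
        D = [D[i] * math.exp(-alpha * labels[i] * h(samples[i])) for i in range(n)]
        Z = sum(D)                          # normalization so D stays a distribution
        D = [d / Z for d in D]
        ensemble.append((alpha, h))
    return ensemble

def predict(ensemble, x):
    """Final strong hypothesis H(x) = sign of the weighted vote."""
    s = sum(alpha * h(x) for alpha, h in ensemble)
    return 1 if s >= 0 else -1
```

With 1-D threshold stumps as the weak-learner pool, a few rounds separate two well-spaced clusters.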
### Fitting and the Hough transform

The Hough transform purports to answer all three questions. We explain it for lines (you could read "circle" or "ellipse" for "line" throughout). One representation of a line is the set of points (x, y) such that x cos(theta) + y sin(theta) = r. Different choices of theta and r >= 0 give different lines. For any token (x, y) there is a one-parameter family of lines through that point. Each point gets to vote for every line in its family; if some line passes through many points, it collects many votes, and that should be the line passing through the points.

(Example residue: tokens and votes; r from 0 to 1.55; theta = 45 degrees = 0.785 rad; theta range 0 to pi = 3.14 rad; r = sqrt(2)/2 = 0.707; brightest point: 20 votes.)

### Mechanics of the Hough transform

- Construct an accumulator array representing (theta, r).
- For each point, render the curve (theta, r) into this array, adding one to each cell it passes through.
- How many lines? Count the peaks in the Hough array.
- Who belongs to which line? Tag the votes.

Difficulties:

- How big should the cells be? Too big and we cannot distinguish between quite different lines; too small and noise causes lines to be missed.
- The method is hardly ever satisfactory in practice, because problems with noise and cell size defeat it.

(Figure residue: brightest point 26 votes; plots showing that noise lowers the peaks and increases the votes in spurious accumulator elements, as functions of noise level and number of noise points.)

### Optimizations to the Hough transform

- Noise: if the orientation of tokens (pixels) is known, only accumulator elements for lines with that general orientation are voted on. Most edge detectors give orientation information.
- Speed: the accumulator array can be coarse at first, then refined in areas of interest at a finer scale.

### Real-world example

Original image, edge detection, found lines, and the corresponding parameter space.
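The voting mechanics above can be sketched as follows. The `hough_lines` helper, the accumulator sizes, and the peak-picking are illustrative choices, not part of the notes.

```python
import math

def hough_lines(points, n_theta=180, n_r=200, r_max=None):
    """Vote in a (theta, r) accumulator for the family of lines
    x*cos(theta) + y*sin(theta) = r through each point, then return the
    peak's vote count and its (theta, r) parameters."""
    if r_max is None:
        r_max = max(math.hypot(x, y) for x, y in points)
    acc = [[0] * n_r for _ in range(n_theta)]
    for x, y in points:
        for ti in range(n_theta):
            theta = math.pi * ti / n_theta
            r = x * math.cos(theta) + y * math.sin(theta)
            # quantize r in [-r_max, r_max] into n_r cells
            ri = int(round((r + r_max) / (2 * r_max) * (n_r - 1)))
            if 0 <= ri < n_r:
                acc[ti][ri] += 1
    # the brightest accumulator cell is the best line hypothesis
    votes, ti, ri = max((acc[t][c], t, c)
                        for t in range(n_theta) for c in range(n_r))
    theta = math.pi * ti / n_theta
    r = 2 * r_max * ri / (n_r - 1) - r_max
    return votes, theta, r
```

Twenty points on the vertical line x = 5 should put all twenty votes in a single cell near theta = 0, r = 5.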
### Fitting other objects

The Hough transform can be used to fit points to any object that can be parameterized, e.g. a circle or an ellipse. Objects of arbitrary shape can be parameterized by building an R-table; this assumes orientation information is available for each token. Candidate reference points are obtained as x = x_0 + r cos(beta), y = y_0 + r sin(beta), where the r and beta values are looked up in the R-table based upon the token's orientation omega.

### Circle example

With no orientation, each token (point) votes for all possible circles; with orientation, each token can vote for a much smaller number of circles.

### Real-world circle example

A crosshair indicates the result of the Hough transform; the bounding box was found via motion differencing.

### Finding coins

Original image, then edges (note the noise), then the penny and quarters found. Note that because the quarters and the penny are different sizes, a different Hough transform, with separate accumulators, was used for each circle size. (Coin-finding sample images from Vivik Kwatra; figure residue: gradient illustrations.)

## Affine Structure from Motion

These notes are meant to clarify the slides, which are the primary reference. (Slide residue: rank 3; economy SVD.)

### Affine fundamental matrix

Affine cameras model the case where:

- the camera centers are at infinity (the scene is far away and depth variation is small);
- all lines to the camera are parallel;
- all parallel lines in the scene map to parallel lines in the image, i.e. 3D points at infinity map to 2D points at infinity.

The general case is the 3x4 matrix

[[M11, M12, M13, M14], [M21, M22, M23, M24], [0, 0, 0, 1]]

and the calibrated case is

[[R11, R12, R13, t1], [R21, R22, R23, t2], [0, 0, 0, 1]].

This assumes that we know the skew and stretching, so you end up with rotation matrices. Note that the t's are the projection of the world origin. To find the epipoles, we need to ask where C is projected onto C'. The affine epipolar matrix has a special form:

[x' y' 1] [[0, 0, a], [0, 0, b], [c, d, e]] [x, y, 1]^T = a x' + b y' + c x + d y + e = 0

There are only 4 degrees of freedom (five parameters, up to scale). Hence, to recover F only 4 correspondences are needed.
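As a sketch of why a handful of correspondences suffice: stacking the constraint a x' + b y' + c x + d y + e = 0 for each correspondence gives a linear system whose one-dimensional nullspace (found here via SVD) yields the parameters up to scale. The `affine_F_params` helper is an illustrative name, and NumPy is assumed to be available.

```python
import numpy as np

def affine_F_params(corresp):
    """Recover the affine epipolar parameters (a, b, c, d, e), up to scale,
    from >= 4 correspondences ((x, y), (x', y')) by taking the nullspace of
    the stacked constraints a*x' + b*y' + c*x + d*y + e = 0."""
    A = np.array([[xp, yp, x, y, 1.0] for (x, y), (xp, yp) in corresp])
    _, _, Vt = np.linalg.svd(A)
    # right singular vector for the smallest singular value spans the nullspace
    return Vt[-1]
```

On synthetic correspondences built to satisfy a known (a, b, c, d, e) = (1, 2, 3, 4, 5), the recovered vector matches up to scale.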
This is not surprising: given two affine pictures of a building, one can take the building itself as a calibration object and easily figure out the transformation to the second image.

## RANSAC

Suppose we have two images as above and want to estimate a transformation from one image to the other. The biggest problem when extracting features and matching them based on appearance is the existence of outliers, which make the estimation process impossible. RANSAC is a way to get rid of these outliers. It is best explained with a simple line-fitting example:

1. Randomly sample the minimal set needed to estimate a line, i.e. 2 points.
2. Threshold on error and count the inliers.
3. Pick the model with the most inliers.

The number of data points (correspondences) needed depends on the degrees of freedom of the transformation and on the algorithm available to estimate it. Using SVD we need:

- For the 8-DOF homography: >= 4 corresponding points.
- For the 4-DOF affine fundamental matrix: >= 4 corresponding points.
- For the 7-DOF fundamental matrix: >= 8 corresponding points.

The number of times N we are required to sample to obtain a minimal set with no outliers with probability p is

N > log(1 - p) / log(1 - w^s)

where w = 1 - eta, eta is the fraction of outliers, and s is the size of the minimal set. If eta is increased, the number of required samples increases; if we increase s, there is a more significant, nonlinear increase. The cool thing about RANSAC: N depends only on the percentage of mismatches eta, not on the number of data points.

(Slide residue: 5-6 DOF; the 3x3 fundamental matrix F with entries F11 ... F33 and the epipolar constraint u'^T F u = 0.)

## Classification

(Slide residue: Bayes risk; P(1|x), P(2|x), and loss terms.) Three density models: (1) histograms, (2) hypercubes, (3) Gaussians. For classification we need P(C|x). (Figure residue: the unconditional density P(x); the conditional densities P(x|C_a) and P(x|C_b); the posterior probabilities P(C_a|x) and P(C_b|x), plotted over roughly x = 10 to 90.) Note: the area under the posterior curves is not 1; they are probabilities, not densities.
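The RANSAC sample-count bound and the line-fitting loop above can be sketched as follows; the `ransac_samples` and `ransac_line` names, the threshold, and the iteration count are illustrative choices.

```python
import math
import random

def ransac_samples(p, outlier_frac, s):
    """N > log(1-p)/log(1-w^s): minimal samples needed to draw at least one
    outlier-free set with probability p, where w is the inlier fraction."""
    w = 1.0 - outlier_frac
    return math.ceil(math.log(1 - p) / math.log(1 - w ** s))

def ransac_line(points, n_iter, thresh):
    """RANSAC line fit: sample 2 points, count inliers within `thresh`
    perpendicular distance, keep the model with the most inliers."""
    best_inliers, best_model = [], None
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = random.sample(points, 2)
        # line through the pair as a*x + b*y + c = 0, then normalize
        a, b, c = y2 - y1, x1 - x2, x2 * y1 - x1 * y2
        norm = math.hypot(a, b)
        if norm == 0:
            continue
        inliers = [(x, y) for x, y in points
                   if abs(a * x + b * y + c) / norm < thresh]
        if len(inliers) > len(best_inliers):
            best_inliers, best_model = inliers, (a / norm, b / norm, c / norm)
    return best_model, best_inliers
```

With p = 0.99, half the data outliers, and a 2-point minimal set, the bound gives the classic N = 17; on ten points along y = 2x plus three outliers, the loop recovers exactly the ten inliers.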
### Histograms

(Figure residue: histograms with 10, 20, 40, and 90 bins.)

- Non-parametric; the smoothing parameter is the number of bins.
- We can discard the data afterwards.
- Discontinuous boundaries.
- In d dimensions: M^d bins, the curse of dimensionality.

(Slide residue: a skin / not-skin decision rule comparing p(x|skin)P(skin) against p(x|not skin)P(not skin).)

### Density estimation

- Assume P(x) is slowly varying and continuous.
- Q: What is the probability Pr of landing in a volume V around x? A: Pr = P(x) V.
- Q: How many out of N samples fall into the volume V? A: E[k] = Pr N, so k = Pr N.
- Q: What is an estimator for P(x)? A: P(x) = k / (N V).

### Parzen windows

Hypercube kernel (Parzen window): K(u) = 1 if |u_j| <= 1/2 for all j, and 0 otherwise. Trick: think about only one box, not many boxes.

(Figure residue: Parzen box estimates for several box sizes h.)

- Non-parametric; the smoothing parameter is the size of the box.
- Need the data always.
- Still discontinuous boundaries, but data-driven; a bit less of a curse.

Gaussian kernel (figure residue: Parzen estimates for several kernel widths):

- Non-parametric; the smoothing parameter is the standard deviation sigma.
- Need the data always.
- Continuous; no boundaries.
- More expensive.

### Locally weighted averaging

Another classification technique, much simpler and memory-based: simply keep a big instance database, form a locally weighted average at the query point, and take that as an estimate for P(C|x). Assign a real value f_i = 1 to positives and f_i = 0 to negatives, then form the weighted average

P(C|x) = sum_i G(x; x_i, sigma) f_i / sum_i G(x; x_i, sigma).

Very cool fact: locally weighted averaging is equivalent to Parzen window classification. Remember the Parzen window approximations P(x|C)P(C) = (1/N) sum_{i in C} G(x; x_i, sigma) and P(x) = (1/N) sum_i G(x; x_i, sigma); dividing them gives P(C|x) = sum_{i in C} G(x; x_i, sigma) / sum_i G(x; x_i, sigma), which is exactly the weighted average above. (Lecture example: assign the real value f_i = 1 to graduate students and 0 to undergrads.) Properties:

- Very cool classification tool; memory-based ("lazy learning").
- Global vs. local; can use any old kernel.
- Probabilistic interpretation.
- Can be slow; curse of dimensionality.

(Figure residue: two-class example data.)
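The Gaussian Parzen estimate and the locally weighted average above can be sketched in one dimension; the `parzen_density` and `lwa_posterior` names are hypothetical helpers.

```python
import math

def parzen_density(x, data, sigma):
    """Parzen-window density estimate with a Gaussian kernel of width sigma:
    p(x) = (1/N) * sum_i G(x - x_i; sigma)."""
    norm = 1.0 / (math.sqrt(2 * math.pi) * sigma)
    return sum(norm * math.exp(-0.5 * ((x - xi) / sigma) ** 2)
               for xi in data) / len(data)

def lwa_posterior(x, data, labels, sigma):
    """Locally weighted average, equivalent to Parzen-window classification:
    P(C|x) = sum_i G(x - x_i) f_i / sum_i G(x - x_i), with f_i in {0, 1}."""
    ws = [math.exp(-0.5 * ((x - xi) / sigma) ** 2) for xi in data]
    return sum(w * f for w, f in zip(ws, labels)) / sum(ws)
```

Querying near the positive cluster should give a posterior well above 1/2, and near the negative cluster well below.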
(Figure residue: scatter plots of the two-class data and the LWA posterior over roughly x in [-15, 20].)

### K-nearest neighbors

A problem with a fixed kernel width: where the data are sparse we would like to enlarge the volume, and where they are dense, decrease it. Idea: fix K, not V, giving K-nearest neighbors. Very cool fact: the corresponding classification rule is simply a majority vote of the K nearest neighbors.

## Probability and Particle Filters

Frank Dellaert, CS 4495 Computer Vision, Fall 2004.

(Figure 1: Inference is the process of upgrading a prior distribution P(x) to a posterior distribution P(x|z) by conditioning on the observed data z; estimation produces point estimates such as x_MAP or x_PM.)

### Inference and Estimation

In the Bayesian probability framework we encode knowledge about a random variable x by means of a probability distribution P(x) over the space of possible x values. If we now observe some evidence or data z, the knowledge we have about x after this observation is captured in the posterior distribution P(x|z), which by Bayes rule is proportional to

P(x|z) proportional to L(x; z) P(x)    (1)

Here the function L(x; z) is the likelihood of x given z, defined as any function proportional to P(z|x). The likelihood measures how likely the parameters x are given that we have seen the data z, and is typically derived from a measurement model (see below). The notation L(x; z) is used to emphasize that the likelihood is a function of x, while the data z is given and fixed. If you're lost now, for a lucid and insightful introduction to these concepts you might want to check out Andrew Moore's tutorials on the web (http://www-2.cs.cmu.edu/~awm/tutorials), especially "Probability for Data Miners" and "Probability Density Functions". The process of upgrading the prior distribution P(x) to a posterior distribution P(x|z) is called inference and is illustrated schematically in Figure 1.
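Over a discrete hypothesis space, equation (1) reduces to multiply-and-normalize; a minimal sketch, with a hypothetical two-hypothesis robot example in the test:

```python
def posterior(prior, likelihood):
    """Bayes rule over a discrete hypothesis space:
    P(x|z) proportional to L(x; z) * P(x), normalized to sum to 1."""
    unnorm = {x: prior[x] * likelihood[x] for x in prior}
    total = sum(unnorm.values())
    return {x: v / total for x, v in unnorm.items()}
```

With a uniform prior, the posterior simply follows the likelihood ratio.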
Estimating a variable x is closely related to inference: while the posterior P(x|z) captures all knowledge about x given z, we are frequently interested in summarizing this knowledge, for example using a single typical value for x. Such a value is called a point estimate for x.

### Probabilistic Modeling

Inference using Bayes law (1), where we simply multiply a prior P(x) with a likelihood function L(x; z), is a special case of inference in which we only have two random variables x and z, governed by the simple generative model P(x, z) = P(x) P(z|x). For example, x could be the position of a robot and z a measurement of distance to the nearest wall. It is essential to have a good model P(z|x). The behavior of z conditioned on x is often modeled via a measurement model z = h(x) + n, where h : R^n -> R^m is a continuous measurement function and the noise n is often zero-mean and normally distributed with covariance matrix R. In that case the conditional density P(z|x) of z given x is a Gaussian centered on h(x):

P(z|x) = 1 / sqrt((2 pi)^m |R|) exp(-(1/2) ||z - h(x)||^2_R)

The function above is a function of z for a given x, and is a multivariate Gaussian on the measurement space R^m. However, when the data z is given (and hence fixed to a certain value), we obtain a function of x in R^n alone, the likelihood of x given z:

L(x; z) proportional to exp(-(1/2) ||z - h(x)||^2_R)    (2)

Note that L(x; z) : R^n -> R+ does not in general have Gaussian shape and is not a proper probability function of x. In general h(x) can be very nonlinear, and L(x; z) can be of arbitrary shape.

### Importance Sampling

In the case of arbitrary densities, inference can be quite challenging to do analytically. However, in many cases we can still visualize the densities involved and perform approximate inference through the use of sampling. Importance sampling is a technique for estimating the expected value of a function of a random variable, E[f(x)] = integral of f(x) p(x) dx. Let us assume we have a density p(x) that we can evaluate up to a multiplicative constant (i.e. we do not need to know or evaluate the normalizing constant), but that it is impossible to sample from this density. Yet assume there is another density q(x) from which we can sample easily.
As with p(x), you only need to know and evaluate q(x) up to a constant. In addition, q(x) needs to be more spread out than p(x): the support of q(x) should contain the whole support of p(x). First we generate R samples x^(r) from this proposal distribution q(x). Now consider this: if we could actually sample from p(x), then the samples we got from the proposal distribution q(x) would contain too many samples from regions where q(x) is larger than p(x), and too few samples from regions where p(x) is larger than q(x). To correct for this, we can simply accord each sample x^(r) from q(x) an importance weight equal to

w^(r) = p(x^(r)) / q(x^(r))

Note that samples where p(x^(r)) is larger than q(x^(r)) get more weight. We can use this simple reweighting to approximate the expected value of a function of a random variable, simply by taking the weighted mean over all of the samples from q(x):

E[f(x)] = sum_r w^(r) f(x^(r)) / sum_r w^(r)

### Inference through Importance Sampling

A particularly easy way to approximate the posterior P(x|z) for the simple x -> z model is to use the prior P(x) as the proposal density:

1. Sample x^(r) from P(x).
2. Calculate the importance weight w^(r) proportional to P(x^(r)|z) / P(x^(r)) proportional to P(z|x^(r)), i.e. the importance weight is simply the likelihood.

### The Particle Filter

A Bayes filter recursively updates the posterior P(x_t | z_{1:t}) of the current state x_t conditioned on all measurements so far. The marginal posterior P(x_t | z_{1:t}) is often called the filtering distribution. We can derive a recursive expression for it by first applying Bayes law:

P(x_t | z_{1:t}) = k_t P(z_t | x_t) P(x_t | z_{1:t-1})    (3)

The distribution P(x_t | z_{1:t-1}) is called the predictive distribution over the state x_t given all previous measurements z_{1:t-1}, excluding the most recent measurement z_t. It can be obtained from the posterior P(x_{t-1} | z_{1:t-1}) at the previous time t - 1 by marginalizing over x_{t-1}:

P(x_t | z_{1:t-1}) = integral over x_{t-1} of P(x_t | x_{t-1}) P(x_{t-1} | z_{1:t-1})    (4)

where we applied the chain rule and P(x_t | x_{t-1}) is the motion model. Particle filters [3, 2, 1] take an importance-sampling approach to deal with both nonlinear measurements and nonlinear system dynamics.
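The importance-sampling estimator described above can be sketched as follows; the `importance_mean` name and the uniform proposal in the test are arbitrary illustrative choices.

```python
import math
import random

def importance_mean(f, log_p_unnorm, sample_q, q_pdf, n):
    """Estimate E_p[f(x)] with samples from a proposal q:
    weights w_r = p(x_r)/q(x_r), with p known only up to a constant,
    and estimate = sum_r w_r f(x_r) / sum_r w_r."""
    xs = [sample_q() for _ in range(n)]
    ws = [math.exp(log_p_unnorm(x)) / q_pdf(x) for x in xs]
    total = sum(ws)
    return sum(w * f(x) for w, x in zip(ws, xs)) / total
```

With an unnormalized Gaussian target centered at 2 and a wide uniform proposal, the estimated mean converges to 2.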
The basic idea is to approximate the posterior P(x_t | z_{1:t}) at each time t by means of a weighted sample set {(x_t^(j), w_t^(j))} for j = 1..N. This yields the following Monte Carlo approximation to the posterior:

P(x_t | z_{1:t}) = sum_{j=1}^N w_t^(j) delta(x_t - x_t^(j))

Our goal is now to recursively update this Monte Carlo approximation, which can be done through importance sampling. In particular, let us make the inductive assumption that the posterior P(x_{t-1} | z_{1:t-1}) at time t - 1 is approximated in the same way:

P(x_{t-1} | z_{1:t-1}) = sum_{i=1}^M w_{t-1}^(i) delta(x_{t-1} - x_{t-1}^(i))    (5)

Substituting this in the Bayes filter (3), we obtain the following approximate filter:

P(x_t | z_{1:t}) = k_t P(z_t | x_t) sum_{i=1}^M w_{t-1}^(i) P(x_t | x_{t-1}^(i))

A particularly easy way to approximate this distribution is by means of importance sampling, using as the proposal distribution Q(x_t) the empirical predictive distribution:

Q(x_t) = P(x_t | z_{1:t-1}) = sum_{i=1}^M w_{t-1}^(i) P(x_t | x_{t-1}^(i))    (6)

Note that (6) is a Monte Carlo approximation to the actual predictive distribution (4), based on the approximation (5). In addition, it has a very intuitive interpretation as a mixture motion model, where the mixture coefficients are exactly the sample weights w_{t-1}^(i) and each mixture component P(x_t | x_{t-1}^(i)) is the motion model for an individual particle x_{t-1}^(i). The importance sampling is done by iterating the following procedure N times:

1. Choose a sample mixture component index i according to i ~ w_{t-1}.
2. Sample from the chosen motion model mixture component: x_t^(j) ~ P(x_t | x_{t-1}^(i)).
3. Calculate the importance weight w_t^(j), which upgrades the unweighted sample x_t^(j) from the empirical predictive distribution to a weighted sample from the approximate posterior:

w_t^(j) = P(x_t^(j) | z_{1:t}) / P(x_t^(j) | z_{1:t-1}) = k_t P(z_t | x_t^(j))    (7)

The resulting approximation for the posterior P(x_t | z_{1:t}) is the newly obtained set of weighted samples {(x_t^(j), w_t^(j))}.

(Figure residue: tracking image sequences. Slide residue: a MAP estimate via argmax (8); no need to know the formulas by heart.)
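Steps 1-3 above can be sketched for a hypothetical 1-D state with Gaussian motion and measurement models; the function name and all parameters in the test are illustrative assumptions, not part of the notes.

```python
import math
import random

def particle_filter_step(particles, weights, z, motion_sigma, meas_sigma,
                         motion=0.0):
    """One update of a 1-D particle filter:
    1) pick a mixture component i according to the weights w_{t-1};
    2) sample from that component's motion model;
    3) weight the new sample by the likelihood P(z_t | x_t)."""
    n = len(particles)
    new_particles, new_weights = [], []
    for _ in range(n):
        i = random.choices(range(n), weights=weights)[0]           # step 1
        x = random.gauss(particles[i] + motion, motion_sigma)      # step 2
        w = math.exp(-0.5 * ((z - x) / meas_sigma) ** 2)           # step 3
        new_particles.append(x)
        new_weights.append(w)
    total = sum(new_weights)
    return new_particles, [w / total for w in new_weights]
```

Starting from particles spread uniformly over [-10, 10] and repeatedly observing z = 3, the weighted mean concentrates near 3.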
## Segmentation and Grouping

Tokens; bottom-up segmentation vs. top-down segmentation.

### Gestalt grouping factors

(Figure residue:) not grouped; similarity; common fate; common region; parallelism; symmetry; continuity; closure.

### Background subtraction

- The background is assumed to be mostly static.
- Each pixel is modeled by a Gaussian distribution in YUV space.
- The model mean is usually updated using a recursive low-pass filter.
- Given a new image, generate a silhouette by marking those pixels that are significantly different from the background value.

### 2D head and hand localization

- Contour analysis: mark extremal points (highest curvature, or greatest distance from the center of the body) as hand features.
- Use a skin color model when a region of hand or face is found; the color model is independent of flesh-tone intensity.

### Clustering

Clustering approaches: agglomerative vs. divisive. (Slide residue: dendrograms.) A different route is eigenvector-based segmentation, which leads to the normalized-cut criterion of Shi & Malik.
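The background-subtraction model above can be sketched for a single row of gray-level pixels; the function names, the update rate alpha, and the threshold factor k are illustrative assumptions.

```python
def update_background(bg, frame, alpha=0.05):
    """Recursive low-pass update of the per-pixel background mean."""
    return [(1 - alpha) * b + alpha * f for b, f in zip(bg, frame)]

def silhouette(bg, frame, sigma, k=2.5):
    """Mark pixels that differ from the background model by more than
    k standard deviations (per-pixel Gaussian model)."""
    return [abs(f - b) > k * sigma for b, f in zip(bg, frame)]
```

A pixel that jumps far from its modeled mean is marked as foreground, while small fluctuations are absorbed by the model.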
### Normalized cuts (Shi & Malik)

Given a partition of the nodes of a graph V into two sets A and B, the normalized cut is

Ncut(A, B) = cut(A, B) / assoc(A, V) + cut(B, A) / assoc(B, V).

Let d(i) = sum_j w(i, j), let D be the N x N diagonal matrix with d on its diagonal, and let W be the N x N symmetric matrix with W(i, j) = w(i, j). Let x be an indicator vector with x(i) = 1 if node i is in A and -1 otherwise, and let 1 be the vector of all ones. Setting k = (sum over x_i > 0 of d_i) / (sum of all d_i) and b = k / (1 - k), and defining y = (1 + x) - b(1 - x), some algebra rewrites the partition criterion as a Rayleigh quotient:

min_x Ncut(x) = min_y [ y^T (D - W) y ] / [ y^T D y ]

subject to the constraints that y(i) takes one of the two discrete values {1, -b} and that y^T D 1 = 0. Unfortunately, minimizing normalized cut exactly is NP-complete, even for the special case of graphs on grids (a proof, due to Papadimitriou, can be found in Appendix A of the paper). However, when y is relaxed to take on real values, the minimization can be solved efficiently: substituting z = D^(1/2) y turns it into the standard eigensystem

D^(-1/2) (D - W) D^(-1/2) z = lambda z,

equivalently the generalized eigenvalue system (D - W) y = lambda D y. Here z_0 = D^(1/2) 1 is an eigenvector with eigenvalue 0, and since D^(-1/2) (D - W) D^(-1/2) is symmetric positive semidefinite, z_0 is the smallest eigenvector and all other eigenvectors are orthogonal to it. The constraint y^T D 1 = 0 is therefore automatically satisfied by the second smallest eigenvector, which is the real-valued solution to the normalized-cut problem. Note that this only approximates the optimal partition, since the relaxed y is not constrained to two discrete values; the third and higher eigenvectors can similarly be used to sub-partition, though with decreasing reliability.
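The relaxed solution can be sketched with a dense eigensolver; this assumes NumPy, small graphs, and the simplest possible discretization (thresholding the second eigenvector at zero). The `ncut_partition` name is an illustrative choice.

```python
import numpy as np

def ncut_partition(W):
    """Spectral relaxation of normalized cuts (Shi & Malik): solve
    (D - W) y = lambda * D * y via the symmetric system
    D^{-1/2} (D - W) D^{-1/2} z = lambda * z, take the eigenvector of the
    second-smallest eigenvalue, and split the nodes by its sign."""
    W = np.asarray(W, dtype=float)
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L_sym = D_inv_sqrt @ (np.diag(d) - W) @ D_inv_sqrt
    vals, vecs = np.linalg.eigh(L_sym)        # eigenvalues in ascending order
    y = D_inv_sqrt @ vecs[:, 1]               # map back: y = D^{-1/2} z
    return y >= 0                             # boolean group labels
```

On a graph made of two tight cliques joined by one weak edge, the second eigenvector separates the cliques.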
## Color

Often we are more interested in relative spectral composition than in overall intensity, so the spectral BRDF computation simplifies to a wavelength-by-wavelength multiplication of relative energies: the color signal is C(lambda) = rho(lambda) E(lambda), the product of the surface reflectance and the illuminant's relative energy at each wavelength. (Figure residue: reflectance and relative-energy curves over roughly 400-700 nm, from Foundations of Vision by Brian Wandell, Sinauer Associates, 1995.)

(Figure: spectral albedos for several different leaves, with color names attached.) Notice that different colours typically have different spectral albedos, but that the difference can be subtle; compare the two whites. Spectral albedos are typically quite smooth functions. (Measurements by E. Koivisto.)
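The wavelength-by-wavelength multiplication can be written as a one-line sketch over sampled spectra; the `color_signal` name and the two-sample spectra in the test are illustrative.

```python
def color_signal(albedo, illumination):
    """Relative spectral color signal: wavelength-by-wavelength product of
    surface reflectance (albedo) and the illuminant's relative energy."""
    return [r * e for r, e in zip(albedo, illumination)]
```

For example, a surface with albedo samples (0.5, 0.2) under relative illuminant energies (100, 200) reflects relative energies (50, 40).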
