UCLA - HHS 120 - Class Notes - Week 10



Description: These notes cover all of the lecture material in Week 10

Week 10, Day 1


The Mathematical Description of Sound

Basic Unit: Sine Wave

- Cycle: the time it takes for the wave to start and repeat itself (the period)

- Frequency: cycles per second, measured in Hertz (Hz)

- Increasing the frequency creates a shorter wave cycle and more cycles per second

- Amplitude: amount of displacement from the baseline (0)

- Amount of energy that the wave contains in any one particular cycle

- Phase: where in the cycle the wave is at time 0

- 2 waves may have the same amplitude and wavelength but be at different points in their cycles at time 0
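For concreteness, the four parameters above (cycle/period, frequency, amplitude, phase) can be sketched in a few lines of Python; the 440 Hz default is just an example value:

```python
import math

def sine_wave(t, frequency=440.0, amplitude=1.0, phase=0.0):
    """Value of a sine wave at time t (seconds).

    frequency: cycles per second (Hz); 1/frequency is the period
    amplitude: peak displacement from the baseline (0)
    phase:     where in the cycle the wave is at t = 0 (radians)
    """
    return amplitude * math.sin(2 * math.pi * frequency * t + phase)

period = 1 / 440.0        # one full cycle takes about 2.27 ms
print(sine_wave(0.0))     # starts at the baseline when phase is 0
print(sine_wave(period))  # back near the baseline after one complete cycle
```

Increasing `frequency` shortens the period, which is exactly the "shorter wave cycle, more cycles per second" relationship in the notes.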

Simple and Complex Sounds


- Simple Sounds:

- Sine waves are rare to encounter because not many vibrations in the world are so pure

- Complex sounds:

- Most sounds in the world

- (ex) voices, birds, cars

- All sound waves can be described as a combo of sine waves
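Since all sound waves can be described as a combo of sine waves, a complex sound can be sketched as a plain sum of sine components; the frequencies and amplitudes below are made-up illustration values:

```python
import math

def complex_sound(t, components):
    """Sum sine-wave components, each given as (frequency_hz, amplitude, phase)."""
    return sum(a * math.sin(2 * math.pi * f * t + p) for f, a, p in components)

# Hypothetical voice-like mix: a 200 Hz tone plus two of its harmonics
voice_like = [(200, 1.0, 0.0), (400, 0.5, 0.0), (600, 0.25, 0.0)]
sample = complex_sound(0.001, voice_like)  # one instant of the combined waveform
```

Decomposing a recorded waveform back into such components is what a Fourier analysis does, and it is the direction the auditory system works in.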

Auditory Pathway

- Machinery that helps transform these sine waves into something we can process

Structure of the Human Ear

- Pinna: properties for locating/funneling sound

- Middle ear: contains 3 tiny bones that vibrate against each other

- Tympanic Membrane

- Oval Window



- Round Window

- Amplification provided by ossicles = essential for hearing faint sounds

- Inner ear made up of a collection of fluid-filled chambers

- 3 tiny bones push on the inner ear creating waves of pressure in the fluid

- Fine changes in sound pressure then translated into neural signals

- Cochlear canals and Membranes

- Cochlea: spiral, snail-like structure of the inner ear containing the organ of Corti; filled with watery fluids in 3 parallel canals

- Basilar Membrane: plays a crucial role in hearing

- Stereocilia rooted here and sandwiched between the basilar membrane and the tectorial membrane, which they brush against in response to fluid motion

- Efferent fibers: adjust how responsive the organ of Corti is

- Inner hair cells: convey almost all info about sound waves to brain

- Outer hair cells: receive info from the brain via efferent fibers - an elaborate feedback system

- Firing of auditory nerve fibers into pattern of neural activity finally completes process of translating sound waves into patterns of neural activity

- Coding of amplitude and frequency in the cochlea

- Place code: tuning of different parts of the cochlea to different frequencies (a particular frequency of incoming sound wave is coded by its place within the cochlea)

- Cochlea tunes to different frequencies

- Low frequencies activate the apex more, while the base responds more to high frequencies
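A common way to put numbers on this place code is the Greenwood function, which maps position along the basilar membrane to its characteristic frequency. The constants below are the frequently cited human fit and should be treated as approximate:

```python
def greenwood_frequency(x):
    """Characteristic frequency (Hz) at position x along the basilar membrane,
    where x = 0 is the apex and x = 1 is the base.
    Human constants A = 165.4, a = 2.1, k = 0.88 (approximate fit).
    """
    return 165.4 * (10 ** (2.1 * x) - 0.88)

print(round(greenwood_frequency(0.0)))  # apex: low frequencies (~20 Hz)
print(round(greenwood_frequency(1.0)))  # base: high frequencies (~20,700 Hz)
```

The endpoints land near 20 Hz and 20 kHz, matching the human hearing range discussed below.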

Psychological Dimensions of Hearing

- Loudness is related to sound pressure (physical)

- Pitch is related to sound frequency (physical)

- Human sensitivity to frequency ranges from 20-20,000 Hz

- Children have greatest range; loss of high freq with age, especially with people exposed to loud noises

- Effects are delayed - hearing loss shows up in 40’s and 50’s

- Auditory system interested in speech

Auditory Space Perception

- Ability to localize sound; helps us determine the direction of a sound relative to us

- Multiple ways direction is computed by the auditory system - relates to certain differences among sounds and situations - perceived through a combo of processes

- Space info depends on both of our ears working together as a system (binaural hearing)

Interaural Time Difference (ITD)

- Difference in time between sound arriving in one ear versus the other → onset of sound can be used to localize it

- Head acts as a barrier

- Azimuth

- Imaginary circle that extends around us in a horizontal plane

- For maximum possible ITD: sound source would need to be directly to the right or left (90 degree angle)

- For minimum possible ITD: sound source directly in front or back of us (0 degree or 180 degree angle)

- Intermediate locations: can be ambiguous as to where sound is
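A simplified version of this geometry can be sketched as follows; the head width of 0.21 m and speed of sound of 343 m/s are assumed round numbers, and real ITDs also depend on how sound diffracts around the head:

```python
import math

def itd_seconds(azimuth_deg, head_width=0.21, speed_of_sound=343.0):
    """Approximate interaural time difference for a source at the given azimuth.

    The extra path to the far ear is roughly head_width * sin(azimuth);
    dividing by the speed of sound gives the arrival-time difference.
    """
    return head_width * math.sin(math.radians(azimuth_deg)) / speed_of_sound

print(f"{itd_seconds(90) * 1e6:.0f} us")  # directly to the side: maximum ITD
print(f"{itd_seconds(0) * 1e6:.0f} us")   # straight ahead: zero ITD
```

With these assumed numbers the maximum ITD comes out around 600 microseconds, which is why intermediate azimuths with smaller, shared ITD values can be ambiguous.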

Phase differences

- arriving at both ears can be used for localization

- If the difference is small relative to the wavelength, it can help localize sound

- Tones with higher frequencies are harder to localize; low frequencies are less ambiguous
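The low/high-frequency asymmetry follows from wavelength: a phase difference is only informative while the wavelength is large relative to the head, since otherwise a whole cycle can fit between the ears and the phase comparison becomes ambiguous. A small check, with the same assumed 0.21 m head width:

```python
SPEED_OF_SOUND = 343.0  # m/s, assumed round number
HEAD_WIDTH = 0.21       # m, assumed interaural distance

def wavelength_m(frequency_hz):
    """Wavelength of a tone in air: wavelength = speed of sound / frequency."""
    return SPEED_OF_SOUND / frequency_hz

print(wavelength_m(500) > HEAD_WIDTH)   # low frequency: phase is informative
print(wavelength_m(3000) > HEAD_WIDTH)  # high frequency: phase is ambiguous
```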

Interaural Level Difference (ILD)

- Difference in intensity in one ear versus the other

- Sounds are more intense at the ear closer to sound source

- Largest at 90 and -90 degrees; nonexistent at 180 and 0 degrees

- Correlates with angle of sound source (not as strong a correlation as ITD)

- Works best with high frequency sounds

Combination of Phase differences and ILD

- Allows us to localize most sounds

- Below 1,000 Hz: make more errors using ILD

- Above 1,000 Hz: make more errors using ITD

Cone of Confusion

- A set of points in space that produce identical interaural differences (onset, phase, or intensity) due to symmetries of being in front/behind/above/below the head

- It's difficult to distinguish these sounds on the basis of time differences alone

- Head movements can be used to resolve this

- Shape of pinnae also helps resolve ambiguities in sound location, primarily for high-frequency sounds

- Funnels certain sound frequencies better than others, but the intensity of each frequency varies depending on the direction of the sound

Auditory Distance Perception

- Simplest cue: relative intensity of sound

- How loud a sound is

- Inverse-square Law:

- As you get further from the source, the intensity of the sound decreases with the square of the distance
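The inverse-square law is a one-line calculation; the distances here are arbitrary illustration values relative to a 1 m reference:

```python
def relative_intensity(distance, reference_distance=1.0):
    """Intensity relative to the reference distance, per the inverse-square law."""
    return (reference_distance / distance) ** 2

print(relative_intensity(2.0))   # twice as far  -> 1/4 the intensity
print(relative_intensity(10.0))  # ten times as far -> 1/100 the intensity
```

This is what makes relative intensity a usable distance cue: a familiar sound that arrives quieter than expected is judged to be further away.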

- Spectral Composition of Sounds:

- Higher frequencies lose energy faster than low frequencies as sound travels to your ears (based on how much has disappeared, you can tell how far away a source is)

- Direct vs. Reverberant Energy

- Energy of sound goes directly to you vs. bouncing off other surfaces before reaching you

Week 10, Day 2

Complex Sounds

- Harmonics (physical characteristic)

- Fundamental Frequency: the lowest frequency of the harmonic spectrum; the higher harmonics are integer multiples of it (its Fourier components)

- Auditory system sensitive to relationships between harmonics

- Missing Fundamental Effect

- Even if you take out the fundamental frequency, we are still able to determine what the lowest frequency is

- Perception of the pitch isn't altered (even if only a few other harmonics are available)

- When various frequencies are combined, the periodicity of their combination corresponds to the fundamental frequency (their common divisor)
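Finding that common divisor of the remaining harmonics is just a greatest-common-divisor computation; the 200 Hz example below is hypothetical:

```python
from functools import reduce
from math import gcd

def implied_fundamental(harmonics_hz):
    """Fundamental implied by a set of harmonic frequencies (their GCD).

    Missing fundamental effect: even when the lowest component is removed,
    the perceived pitch matches this common divisor.
    """
    return reduce(gcd, harmonics_hz)

# A 200 Hz fundamental removed, leaving only higher harmonics:
print(implied_fundamental([400, 600, 800]))  # pitch still heard at 200 Hz
```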

- Timbre (psychological sensation)

- A listener can judge that 2 sounds with the same loudness and pitch are dissimilar - the difference is conveyed by harmonics

- We are sensitive to the pattern of harmonics

- Sound Onsets (Attacks)

- Violin (pluck) vs. Violin (bow)

- Natural environments usually involve multiple sound sources

- Cocktail Party Problem

- All the source’s waveforms get summed into one waveform by the eardrum

- Solution = Source Segregation

- Brain has to decompose combo of sounds and infer the different sources

- Spatial separation between sound sources (harder for sounds coming from the same location)

- Separation on basis of sounds' spectral or temporal qualities (similar harmonics/sounds)

- Auditory stream segregation: perceptual organization of a complex acoustic signal into separate auditory events, each stream heard as a separate event

- Similar to gestalt principles: color denotes sound segregation, similarity principles help with grouping streams, continuity ("pop" out), timbre

- Grouping by onset

- Harmonics of speech sound/music

- Grouping different harmonics into a single complex tone

- It's much easier to distinguish two notes from one another when the onset of one precedes the onset of the other by a very short time

- Gestalt law of common fate

Continuity and Restoration Effects

- Principle of good continuation: in spite of interruptions, one can still “hear” sound as continuous

- Experiments using signal detection tasks suggest that at some point restored missing sounds are encoded in the brain as if they were actually present

- Restoration of complex sound (music, speech)

- Higher-order sources of info, not just auditory info

- Brain fills in what is missing (like vision)

Crossmodal Interactions in Perception

- Multisensory perception: brain has evolved to process input from different modalities all at once

- Compartmental Brain: different modalities have different brain areas and different compartments in those areas for specific functions

- Pathways involved in processing

- Somatosensory

- Visual (Retina, LGN, V1, V2, V4)

- Auditory (Ear, cochlear n., IC, MGN, A1)

- Modular - each area does its own thing and they don't communicate with each other (color, motion, etc.)

- Bottom-up (hierarchical [V1, V2, V3, etc.])

- Vision is the dominant modality

- Integrative network of all kinds of interactions among brain areas and functions

- Non-hierarchical - feedback and feed-forward projections

- Multisensory - vision isn’t dominant; processing is multisensory (what dominates depends on the task/stimulus)

Vision Alters Other Modalities

- People think vision is the dominant sensory modality because:

- When conflict between modalities:

- Vision dominates

- We have come to rely on vision more and more

- (ex) Ventriloquism Illusion

- Movement of puppet's mouth (visual info) and sound from puppeteer (no visual info, only auditory info)

- Causes us to perceive sound coming from the puppet - location of sounds captured by visual event

- (ex) Visual Capture

- Artificially shifted image causes our bimodal perception to believe we are where our vision shows us and not where we feel we actually are

- Spatial conflict between vision and proprioception (info we get from our muscles)

- Vision captures body position and dominates touch

- Vision modifies speech perception

- (ex) McGurk Effect

- “Ba” (hear) + “Ga” (see) = we think “Da”

- Visual processing influences speech perception

- (ex) Pluck and Bow

- Musical sound never changes - either paired with plucking image or bow image

- When presented with the bow image, the sound is rated as lower

- Visual info influences the quality of sound perceived

- Vision influenced by other modalities

- Change in temporal characteristics

- Enhancement of spatial attention by auditory stimuli

- Visual intensity enhancement

- Disambiguate direction of motion

- Stream-bounce Illusion

- Visual stimuli identical (two blue dots crossing)

- When a click is inserted (auditory) at the moment the 2 dots coincide, it gives the illusion that they bounce off each other

- Changes the quality of the perception of motion (visual modality) by inserting sound

- Effect is robust to many stimuli... doesn't weaken even with trial-by-trial feedback

- Vision isn't dominant and is strongly influenced by other modalities like sound

- Likely multisensory modalities that interact with no specific order (emerging view)

fMRI: Visual vs. Visual-auditory

- When a visual stimulus is paired with sounds, activity in primary visual cortex is increased

- When they perceive the illusion (A+V), activity in V1 is much higher than when they don’t see the illusion (V)-- illusion correlated with enhanced activity in V1

- If the number of flashes perceived is lower, so is activity in V1... activity in V1 mirrors the number of flashes we see

- This illusion correlated with decreased activity in V1

- Not based on modulation of attention, but the way our auditory and visual brain areas interact (AV integration, not attention in V1)

- Suggest interaction between auditory and visual areas

Summary

- Sound can modulate activity in visual cortical areas... primary cortex is communicating with other modalities

- Multisensory integration occurs in “sensory-specific” areas

- Happens as early as V1 processing

- Host of studies using different techniques (fMRI) have shown: cross-connectivity across sensory cortices and crossmodal interactions in a number of functions

Benefits of Multisensory Processing

- Improved Accuracy

- Improved Precision

- Improved Reaction Time

- More complete info
