About this Document

Class Notes

This 5-page set of class notes was uploaded by Whitney Lakin on Saturday, September 12, 2015, and has received 46 views since then. The notes belong to Human 24 at the University of California, Irvine, taught by Staff in Fall. For similar materials, see /class/201948/human-24-university-of-california-irvine in Humanities at the University of California, Irvine.

Review of General Psychology, 2003, Vol. 7, No. 2, 183–188
Copyright 2003 by the Educational Publishing Foundation. DOI: 10.1037/1089-2680.7.2.183

Whatever Happened to Information Theory in Psychology?

R. Duncan Luce
University of California, Irvine

Although Shannon's information theory is alive and well in a number of fields, after an initial fad in psychology during the 1950s and 1960s it no longer is much of a factor, beyond the word "bit," in psychological theory. The author discusses what seems to him and others to be the root causes of an actual incompatibility between information theory and the psychological phenomena to which it has been applied.

Author note: This article was prepared at the instigation of Alan Boneau as a contribution to a Division 1 session at the 2001 Annual Meeting of the American Psychological Association in San Francisco. I have benefited from interchanges on the topic with Peter Killeen and Donald Laming. Correspondence concerning this article should be addressed to R. Duncan Luce, School of Social Sciences, Social Science Plaza 2133, University of California, Irvine, California 92697-5100. E-mail: rdluce@uci.edu

Claude Shannon, the creator of information theory (or communication theory, as he preferred to call it), died on February 24, 2001, at age 84. So I would like to dedicate this brief piece to his memory and, in particular, to recall his seminal contribution, "A Mathematical Theory of Communication," which was published in two parts in the Bell System Technical Journal in 1948 and rendered more accessible in the short monograph by Shannon and Weaver (1949).

Let me begin by saying that information theory is alive and well in biology, engineering, physics, and statistics, although my conclusion is that, for quite good reasons, it has had little long-range impact in psychology. One rarely sees Shannon's information theory in contemporary psychology articles, except to the extent of the late John W. Tukey's term "bit," which is now a permanent word of our vocabulary. If we look at the table of contents of J. Skilling's (1989) Maximum Entropy and Bayesian Methods, we find the pattern of chapters on applications shown in Table 1. You will note none are in psychology.

Because it is doubtful that many young psychologists are learning the subject, a few words are necessary to set the stage. As mathematical expositor extraordinaire Keith Devlin (2001, p. 21) stated, Shannon's theory does not deal with "information" as that word is generally understood; instead, it deals with data, the raw material out of which information is obtained. Now, that makes it sound akin to what we normally think to be the role of statistics, which is correct. It begins with an abstract finite set of elements and a probability distribution over it. Let the elements of the set be identified with the first m integers, and let p_i, where i = 1, ..., m, be the probabilities, which are assumed to cover all possibilities, that is, Σ_i p_i = 1. Uncertainty is a summary number that is a function of the probabilities, that is, U(p_i) = U(p_1, ..., p_m). What function?

Shannon's Measure of Information

To get at that, Shannon imposed a number of plausible properties that he believed such a number should satisfy. Others subsequently proposed alternatives, among them Aczél and Daróczy (1975), Aczél, Forte, and Ng (1974), and Luce (1960). Probably the best alternative was that offered by Aczél et al. (1974), which, with less than total precision, was as follows (the third condition is not recoverable from this copy):

1. The labeling of the elements is totally immaterial; all that counts is the set of m probabilities.
2. For m = 2, U(1/2, 1/2) = 1. This defines the unit of uncertainty, the bit.
4. U(p_1, ..., p_m, 0) = U(p_1, ..., p_m).
5. If P and Q are two distributions and P × Q denotes their joint distribution, then U(P × Q) ≤ U(P) + U(Q), with equality holding if P and Q are independent.

The mathematical conclusion is that

    U(p_1, ..., p_m) = -Σ_{i=1}^{m} p_i log_2 p_i.

Table 1. Pattern of Application Topics in Skilling's (1989) Table of Contents. [Recoverable rows: thermodynamics and quantum mechanics, 5 articles; physical measurement; crystallography; time series and power spectrum; statistical fundamentals. The remaining topics and article counts are illegible in this copy.]
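As a concrete numerical illustration of this measure (my own sketch, not part of the original article, with arbitrary example distributions), the following Python snippet computes U for a finite probability distribution:

```python
import math

def uncertainty(p):
    """Shannon's uncertainty U(p_1, ..., p_m) = -sum_i p_i * log2(p_i), in bits.

    Zero-probability elements contribute nothing, matching the condition
    that U(p_1, ..., p_m, 0) = U(p_1, ..., p_m).
    """
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# The unit: two equiprobable alternatives carry exactly 1 bit.
print(uncertainty([0.5, 0.5]))          # 1.0

# Adding a zero-probability element changes nothing.
print(uncertainty([0.5, 0.5, 0.0]))     # 1.0

# A skewed distribution over the same two elements is less uncertain.
print(uncertainty([0.9, 0.1]))
```

For m equiprobable elements the function returns log_2 m, the maximum possible value, consistent with the uniform-distribution result stated below.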
Maximal uncertainty about what will be selected occurs when all probabilities are equally likely, that is, p_i = 1/m, in which case U(1/m, ..., 1/m) = log_2 m. Uncertainty measured in this way is thus a single number associated with a finite probability distribution. In some physical contexts U is identifiable with thermodynamic entropy, and some authors call it that and use the physical notation H rather than U. Other authors use I, for information.

This measure is to be distinguished sharply from familiar statistics such as the mean and variance, which are associated not with a probability distribution over a set of elements but with a random variable that maps the set into numbers and, in the process, introduces an ordering of the elements not available in Shannon's context.

The role of information transmission is to reduce uncertainty. Suppose p_i is the distribution before information is transmitted, and p_{i,j} denotes the probability of the joint state of i being transmitted and j received. The conditional probability is defined in the usual way as p_{i|j} = p_{i,j}/p_j, for p_j > 0. The posterior uncertainty U(p_{i|j}) can be shown to be the original uncertainty less the transmitted information, that is, U(p_{i|j}) = U(p_i) - U(p_{i:j}), where U(p_{i:j}) denotes the information transmitted.

There clearly is at least a conceptual relation to Bayes's theorem, which was really worked out in the form we know it by P. S. Laplace (1812/1820). In fact, the relation is far more than just conceptual and has been well developed for a variety of statistical concepts in many papers and books. I return to this linkage later.

Shannon's theory then went on to consider the limitations of channels to transmit information, for which he defined a concept of channel capacity, and, given a noisy transmission line, the often quite elaborate coding necessary to achieve near-to-perfect transmission. As noted earlier, Shannon strongly preferred the term communication theory to information theory, but in psychology, at least, "information" became the standard term.
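The transmitted-information decomposition described above can be sketched numerically. This is my own illustration with a made-up joint distribution, not an example from the article; it uses the standard identity that the transmitted information U(p_{i:j}) equals U(p_i) + U(p_j) minus the uncertainty of the joint distribution:

```python
import math

def u(p):
    """Shannon uncertainty, in bits, of a sequence of probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical noisy channel: joint[i][j] is the probability that
# element i is transmitted and element j is received (made-up numbers).
joint = [[0.4, 0.1],
         [0.1, 0.4]]

p_i = [sum(row) for row in joint]          # marginal of the transmitted element
p_j = [sum(col) for col in zip(*joint)]    # marginal of the received element
u_joint = u([x for row in joint for x in row])

# Information transmitted: U(p_i) + U(p_j) - U(joint distribution).
transmitted = u(p_i) + u(p_j) - u_joint

# Posterior uncertainty about i once j is received: the original
# uncertainty less the transmitted information, as in the text.
posterior = u(p_i) - transmitted

print(round(transmitted, 3))   # ≈ 0.278 bits conveyed per transmission
print(round(posterior, 3))     # ≈ 0.722 bits of uncertainty remain
```

With a noiseless channel (joint = [[0.5, 0], [0, 0.5]]) the posterior uncertainty drops to zero, and when i and j are independent the transmitted information is zero, consistent with the equality condition in property 5 above.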
Introduction of Information Theory in Psychology

During graduate school at the Massachusetts Institute of Technology, and during a brief stint (1950–1953) at the Research Laboratory of Electronics (RLE), I was surrounded by a flurry of ideas that had matured in research laboratories during World War II and shortly thereafter and that have, in fact, all played a significant scientific role since then. They were as follows: information theory; feedback and cybernetics; networks of various sorts and automata theory; the beginnings of artificial intelligence and of both analogue and digital computers; Bayesian statistics and the theory of signal detectability; Chomsky's developing theory of linguistics; and game and utility theory. Some psychologists had been close to one or another of these developments, and so different ones latched onto different ideas.

A substantial group of psychologists and engineers at RLE focused on information theory and held a very well attended weekly seminar on the topic, which, among its incidental consequences, provided me with an education on the topic. Later, in 1954, some of these and related developments were reported at the Conference on the Estimation of Information Flow, organized by Henry Quastler and held in Monticello, Illinois. The conference was summarized in Quastler's (1955) edited volume, Information Theory in Psychology.

Perhaps, for psychologists, the two defining articles to come out of those meetings were the late William J. McGill's (1954) "Multivariate Information Transmission" and George Miller's (1956) "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information." The latter article addressed several phenomena that seem to exhibit information capacity limitations, including absolute judgments of unidimensional and multidimensional stimuli and short-term memory. Comprehensive, but quite different, summaries of the ideas and experiments were given by Attneave (1959) and Garner (1962). Later articles expanded the interest to the relation between mean response times and the uncertainty of the stimuli to which participants were responding.
In early experiments, mean response time appeared to grow linearly with uncertainty, but glitches soon became evident. The deepest push in the response time direction was Donald Laming's (1968) subtle Information Theory of Choice-Reaction Times, although he later stated: "This idea does not work. While my own data (1968) might suggest otherwise, there are further unpublished results that show it to be hopeless" (Laming, 2001, p. 642).

The enthusiasm, nay faddishness, of the times is hard to capture now. Many felt that a very deep truth of the mind had been uncovered. Yet Shannon was skeptical. He is quoted as saying, "Information theory has perhaps ballooned to an importance beyond its actual accomplishments" (as cited in Johnson, 2001). And Myron Tribus (1979, p. 1) wrote: "In 1961 Professor Shannon, in a private conversation, made it quite clear to me that he considered applications of his work to problems outside of communication theory to be suspect and he did not attach fundamental significance to them." These skeptical views strike me as appropriate.

Why Limited Application in Psychology?

The question remains: Why is information theory not very applicable to psychological problems, despite apparent similarities of concepts? Laming (2001) provided a very detailed critique, far more specific than this one, of a variety of attempts to use information, in particular the appealing concept of channel capacity. He pointed out (p. 639), however, that Shannon's way of defining the concept requires that not individual signals be transmitted, but rather very long strings of them, so as to be rid of redundancies. That is rarely possible within psychological experiments. This is definitely part of the answer.

But, in my opinion, the most important answer lies in the following incompatibility between psychology and information theory. The elements of choice in information theory are absolutely neutral and lack any internal structure; the probabilities are on a pure, unstructured set whose elements are functionally interchangeable.
That is fine for a communication engineer, who is totally unconcerned with the signals communicated over a transmission link; interchanging the encoding matters not at all. By and large, however, the stimuli of psychological experiments are to some degree structured, and so, in a fundamental way, they are not in any sense interchangeable. If one is doing an absolute judgment experiment of pure tones that vary in intensity or frequency, the stimuli have a powerful and relevant metric structure, namely, differences or ratios of intensity and frequency measures between pairs of stimuli. And that structure has been shown to matter greatly in the following sense: Substantial sequential effects exist between a stimulus and at least the immediately preceding stimulus-response pair, but with the magnitude of the correlation dropping from close to one for small signal separation, in either decibels or frequency, to about zero for large separations (Green, Luce, & Duncan, 1977; Luce, Green, & Weber, 1976). Similarly, if one does a memory test, one has to go to very great pains to avoid associations among the stimuli. Stimulus similarity, although still ill understood and under active investigation, is a powerful structural aspect of psychology.

Gradually, as the importance of this reality began to set in, one saw fewer, although still a few, attempts to understand global psychological phenomena in simple information theory terms. Of course, the word "information" has been almost seamlessly transformed into the concept of information-processing models, in which information theory per se plays no role. The idea of the mind being an information-processing network with capacity limitations has stayed with us, but in far more complex ways than pure information theory. Much theorizing in cognitive psychology is of this type, now being more or less well augmented by brain imaging techniques.
Before going on, let me note that this incompatibility, between either a summary measure of a probability distribution or the distribution itself and the psychological structure of sets of related stimuli, is an issue not only for information theory. The unstructured elements of probability theory and the structure of psychological stimuli have made it very difficult indeed to add, in a principled fashion, probabilistic aspects to models of behavior of any complexity at all. The melding, for example, of utility theory and randomness has proved to be as elusive as it is important.

Information Theory and Statistics

As noted earlier, there is a close linkage between Shannon's information measures and statistics that has been very well explored (Diamond, 1959; Jaynes, 1979, 1986, 1988; Justice, 1986; Levine & Tribus, 1979; Mathai, 1975; Skilling, 1989; Zellner, 1988). As Jaynes (1988, p. 281) remarked, commenting on Zellner's linking the information measure directly to Bayes's theorem, "entropy has been a recognized part of probability theory since the work of Shannon 40 years ago. But now we see that there is, after all, a close connection between entropy and Bayes' theorem." Laming (2001) also emphasized the tight connection.

But, in contrast to their wide use in economics, Bayesian statistics have not yet been a roaring success in psychology, despite many years of being promoted, notably by Ward Edwards. For more than 40 years he has run an annual February Bayesian conference that in recent years has been held at the Sportsman's Lodge in Studio City, California. The major place where Bayesian ideas have flowered quite well in psychology is in decision theory, in which there is quite a natural melding of utility theory ideas with Bayesian updating of the probabilities underlying subjective expected utility. But as an inferential engine in psychology proper, which continues to be haunted by the hypothesis testing paradigm of agriculture and clinical trials, Bayesian methods have so far played very little role.
This may well be unfortunate, given their successes in other fields.

There is one notable exception to what I have just stated about the incompatibility of psychological structure and information theory. This is when the number of stimuli, m, in information theory and the number of hypotheses being considered in a Bayesian analysis is two. In that case, the role of stimulus structure largely disappears, and one can sometimes get away with treating the two stimuli as without structure beyond probabilities of choice. Here Bayesian ideas have played a major role in the widely used theory of signal detectability (Egan, 1975; Green & Swets, 1966; Macmillan & Creelman, 1991; Swets, 1964). Recall that one has, among the variables, a probability distribution over signal presentations (i.e., p and 1 - p); one has two conditional probabilities of response, one for each of the possible stimuli; and one has payoffs that can be modeled by the simplest of utility theories. The main role of the theory is to provide a decomposition of the process into two parts: one attributable to inherent sensory properties and the other to decision-making criteria. This is similar in some ways to the information theory decomposition. So it comes as no great surprise that people have approached the problem from the Bayesian perspective, in the form of classical signal detectability theory, and from that of information theory, and have attempted to relate the two.

Perhaps the most elaborate effort in this direction has been that of Kenneth H. Norwich (1993), who summarized his approach in Information, Sensation and Perception. Although he states his hypothesis quite generally, that subjective sensation is basically the measure of uncertainty reduction that occurs when the stimulus is presented, his detailed explorations are largely confined to binary situations, wherein his style closely resembles that of classical 19th-century physical thermodynamics. A more recent comparison of signal detection and information theories is a manuscript authored by Peter Killeen and Thomas J. Taylor (2001), titled "Bits of the ROC: Signal Detection as Information Transmission."
And, as I suggested earlier, most decision theory in which Bayesian ideas play a role involves binary decisions based on binary data, with distinct sources of information being dealt with independently. Laming (2001, pp. 642–643) pointed out, however, that careful analyses of the criterion in signal detection show systematic shifts depending on previous outcomes, which rules out any simple use of information or Bayesian theory in this context; indeed, simple probability matching of responses to presentations is far more predictive.

Conclusions

Laming (2001), despite his detailed critique of almost all attempts to apply information theory in psychology, ended on a surprisingly optimistic note. He wrote: "Information theory provides, as it were, a nonparametric technique for the investigation of all kinds of systems without the need to understand the machinery, to model the brain without modelling the neural responses" (Laming, 2001, p. 645).

I am not so optimistic. The fact is that generalizations of either Bayesian or information theory ideas to situations with more than two hypotheses or stimuli remain unfulfilled, and my conjecture is that this is not likely to change soon. This is my brief answer to the question: Whatever happened to information theory in psychology?

References

Aczél, J., & Daróczy, Z. (1975). On measures of information and their characterizations. New York: Academic Press.
Aczél, J., Forte, B., & Ng, C. T. (1974). Why the Shannon and Hartley entropies are "natural." Advances in Applied Probability, 6, 131–146.
Attneave, F. (1959). Applications of information theory to psychology. New York: Holt.
Devlin, K. (2001). Claude Shannon, 1916–2001. Focus: The Newsletter of the Mathematical Association of America, 21, 20–21.
Diamond, S. (1959). Information and error: An introduction to statistical analysis. New York: Basic Books.
Egan, J. P. (1975). Signal detection theory and ROC analysis. New York: Academic Press.
Garner, W. R. (1962). Uncertainty and structure as psychological concepts. New York: Wiley.
Green, D. M., Luce, R. D., & Duncan, J. E. (1977). Variability and sequential effects in magnitude production and estimation of auditory intensity. Perception & Psychophysics, 22, 450–456.
Green, D. M., & Swets, J. A. (1966). Signal detection theory and psychophysics. New York: Wiley.
Jaynes, E. T. (1979). Where do we stand on maximum entropy? In R. D. Levine & M. Tribus (Eds.), The maximum entropy formalism (pp. 15–118). Cambridge, MA: MIT Press.
Jaynes, E. T. (1986). Bayesian methods: An introductory tutorial. In J. H. Justice (Ed.), Maximum entropy and Bayesian methods in applied statistics (pp. 1–25). Cambridge, England: Cambridge University Press.
Jaynes, E. T. (1988). Discussion. American Statistician, 42, 280–281.
Johnson, G. (2001, February 27). Claude Shannon, mathematician, dies at 84 [Obituary]. New York Times.
Justice, J. H. (Ed.). (1986). Maximum entropy and Bayesian methods in applied statistics. Cambridge, England: Cambridge University Press.
Killeen, P. R., & Taylor, T. J. (2001). Bits of the ROC: Signal detection as information transmission. Unpublished manuscript.
Laming, D. R. J. (1968). Information theory of choice-reaction times. New York: Academic Press.
Laming, D. (2001). Statistical information, uncertainty, and Bayes' theorem: Some applications in experimental psychology. In S. Benferhat & P. Besnard (Eds.), Symbolic and quantitative approaches to reasoning with uncertainty (pp. 635–646). Berlin: Springer-Verlag.
Laplace, P. S. (1820). Théorie analytique des probabilités [Analytic theory of probability] (2 vols., 3rd ed. with supplements). Paris: Courcier. (Original work published 1812; reprints available from Éditions Culture et Civilisation, 115 Avenue Gabriel Lebron, 1160 Brussels, Belgium)
Levine, R. D., & Tribus, M. (Eds.). (1979). The maximum entropy formalism. Cambridge, MA: MIT Press.
Luce, R. D. (1960). The theory of selective information and some of its behavioral applications. In R. D. Luce (Ed.), Developments in mathematical psychology (pp. 5–119). Glencoe, IL: Free Press.
Luce, R. D., Green, D. M., & Weber, D. L. (1976). Attention bands in absolute identification. Perception & Psychophysics, 20, 49–54.
Macmillan, N. A., & Creelman, C. D. (1991). Detection theory: A user's guide. Cambridge, England: Cambridge University Press.
Mathai, A. M. (1975). Basic concepts in information theory and statistics: Axiomatic foundations and applications. New York: Wiley.
McGill, W. J. (1954). Multivariate information transmission. Psychometrika, 19, 97–116.
Miller, G. A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63, 81–97.

[The remaining entries of the reference list are not present in this copy.]

