cognitive science - information theory

Last updated 10:05 PM on 4/16/26

46 Terms

1
New cards

who revolutionized the way we think about and quantify information?

Claude Shannon

2
New cards

what is information theory?

the field of study concerned with how we think about and quantify information

3
New cards

what term did Claude Shannon coin?

bit

4
New cards

what is a bit?

a way of quantifying information regardless of its form

5
New cards

what does Shannon say information is?

the resolution of uncertainty

6
New cards

do surprising or ordinary events contain more info?

surprising events

7
New cards

how much info do we receive when we are told that the outcome of a fair coin flip (P(heads) = 0.5) is heads?

1 bit

8
New cards

how much info do we receive when we are told that the outcome of an unfair coin flip (P(heads) = 1) is heads?

0 bits

9
New cards

how much info do we receive when we are told the outcome of an unfair coin flip (P(heads) = 0.9) is heads?

more than 0 but less than 1 bit

10
New cards

what is a binary memoryless source?

a source with two possible outcomes, represented by the alphabet {0, 1}, with symbol probabilities of 0.5 each

11
New cards

what is Shannon’s equation for calculating the amount of self-information contained in a symbol?

I(a_x) = -log2(p_x)

12
New cards

what is I(0) when p_x = 0.5?

-log2(0.5) or 1 bit

13
New cards

what is I(0) when p_x = 1?

-log2(1) or 0 bits

14
New cards

what is I(0) when p_x = 0.9?

-log2(0.9), or about 0.15 bits
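The three cards above can be checked numerically. A minimal sketch of Shannon's self-information formula (the function name is mine, not from the deck):

```python
import math

def self_information(p):
    """Self-information in bits: I = -log2(p)."""
    return -math.log2(p)

print(self_information(0.5))            # fair coin: 1 bit
print(self_information(1.0))            # certain outcome: 0 bits
print(round(self_information(0.9), 2))  # likely outcome: ~0.15 bits
```

Note how a certain outcome (p = 1) resolves no uncertainty and so carries no information.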

15
New cards

what is information entropy?

the average information contained in a source: a weighted average of the self-information of each of the possible symbols

16
New cards

what is the equation for information entropy?

H(X) = -Σ p_x log2(p_x)
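A sketch of that entropy formula, with the weighted-average structure made explicit (function name is mine):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits per symbol.
    Terms with p = 0 contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))            # fair binary source: 1 bit/symbol
print(round(entropy([0.9, 0.1]), 2))  # biased source: ~0.47 bits/symbol
```

The biased source has lower entropy: its outcomes are more predictable, so on average each symbol resolves less uncertainty.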

17
New cards

what is channel capacity?

theoretical maximum rate at which a message can be reliably transmitted in bits per second over a communication channel

18
New cards

do languages have the same amount of information per syllable?

no, because they differ in the number of possible syllables allowed

19
New cards

do languages with many possible syllables, like English, or languages with few possible syllables, like Japanese, contain more information in one syllable of speech?

languages with many possible syllables, like English

20
New cards

is the effective information rate different across languages?

no, it is relatively similar because of natural variation in speaking rates

21
New cards

what natural variations in speaking rates allow for effective information rate to be similar across languages?

languages conveying more info per syllable are spoken more slowly and languages conveying less information per syllable are spoken faster

22
New cards

what happens when you speak too quickly?

info processing overload

23
New cards

what happens when you speak too slowly?

inefficient use of channel

24
New cards

what is the uniform information density hypothesis?

language users prefer to distribute info uniformly over a message

25
New cards

according to the uniform information density hypothesis, what do speakers do when they are about to produce something very surprising?

they slow down, allowing themselves or the listener more time to process the information

26
New cards

what is Hick’s law?

the more possible responses there are, the longer it will take to choose the correct response

27
New cards

what does Hick’s law predict?

reaction time as a function of the number of choices

28
New cards

what is the equation for Hick’s law?

RT = a + b(log2(n))
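A sketch of Hick's law as code; the values of a and b here are made-up illustrative constants, not empirically fitted ones:

```python
import math

def hicks_rt(n, a=0.2, b=0.15):
    """Predicted reaction time in seconds: RT = a + b * log2(n).
    a and b are illustrative placeholders; in practice they are fit to data."""
    return a + b * math.log2(n)

print(round(hicks_rt(2), 2))  # 2 choices: 0.35 s
print(round(hicks_rt(8), 2))  # 8 choices: 0.65 s
```

Doubling the number of choices adds a constant b seconds, which is the linear-in-log2(n) pattern the later cards describe.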

29
New cards

what does a refer to in Hick’s law?

the constant time it takes to perceive stimulus and execute response, the intercept

30
New cards

what does b refer to in Hick’s law?

the rate of gain of information, the slope

31
New cards

what kind of increase in choice RT is seen as the log of the number of stimulus-response alternatives increases?

a linear increase

32
New cards

according to the Hick-Hyman law, when designing an interface where the user may need to make rapid decisions, do you want more or fewer interface elements?

fewer

33
New cards

what does Fitts’ law tell you?

how long it should take to execute a movement to a target

34
New cards

what is the equation for Fitts’ law?

MT = a + b(log2(2D/W))
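A sketch of Fitts' law as code; as with Hick's law above, the a and b values are illustrative placeholders rather than fitted constants:

```python
import math

def fitts_mt(D, W, a=0.1, b=0.1):
    """Predicted movement time: MT = a + b * log2(2D/W).
    log2(2D/W) is the index of difficulty; a and b are fit to data in practice."""
    return a + b * math.log2(2 * D / W)

# Equal D/W ratios give equal predicted movement times
print(round(fitts_mt(10, 2), 3))  # near, narrow target (D/W = 5)
print(round(fitts_mt(20, 4), 3))  # far, wide target (same D/W = 5)
```

This also previews a later card: two targets with the same D/W ratio have the same predicted movement time.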

35
New cards

what does MT refer to in Fitts’ law?

movement time

36
New cards

what do a and b refer to in Fitts’ law?

the intercept and slope, which are empirically determined

37
New cards

what does D refer to in Fitts’ law?

distance

38
New cards

what does W refer to in Fitts’ law?

width of the target

39
New cards

what does a smaller W mean for how long it should take to execute a movement to a target under Fitts’ law?

a smaller W means more precision is required to make the movement, so movement time increases

40
New cards

what does a longer D mean for how long it should take to execute a movement to a target under Fitts’ law?

a longer D makes precise movements harder, so movement time increases

41
New cards

what does movement time depend on?

relative precision: the ratio D/W

42
New cards

under Fitts’ law, for which target would you expect a slower movement time, one closer but narrower, or one farther but wider?

movement time is the same because the ratio of D/W is the same in both cases

43
New cards

according to the Zheng & Meister reading, what is the information throughput of human behavior?

about 10 bits per second

44
New cards

according to the Zheng & Meister reading, what is the ratio of rate of sensory information input to rate of information throughput of human behavior?

about 100,000,000

45
New cards

according to Zheng & Meister, how many bits per second can a single neuron transmit?

about 10

46
New cards

what does Zheng & Meister say is the reason for the difference in peripheral and central information rates?

the peripheral system largely uses parallel processing while the central system seems to be strictly serial