CogPsych 1a


65 Terms

1
New cards

What is the goal of cognitive psychology?

To develop formal theories about cognition that can be tested

2
New cards

What is a theory

a principle that explains a body of facts

a set of sentences written in a formal language from which things can be derived

specifies causal relations between states

must make predictions

must specify the structures of interest and how they interact (eg X exists because of Y; the relation must be notable)

(eg evolutionary theory: how a biological state arises as a function of a genetic sequence)

3
New cards

What is a theory NOT

a description

a set of data

a diagram

4
New cards

how do we develop good theories of mind

good science = testable theories

understand intelligent systems

5
New cards

what is an intelligent system

a system that responds to its environment under some constraint or set of goals

must be able to adapt to changes in environment (eg lancet fluke therefore isn’t intelligent)

6
New cards

What are Marr's three levels for understanding how a system works?

1) computational theory

2) representation and algorithm

3) implementation

7
New cards

What is Marr’s computational level theory

describes what problem the system is solving (and why)

what constraints there are on the solution the system can use

what problem is being solved / what function is being computed

8
New cards

What is Marr’s representation and algorithm theory

how does the system get from one state to another

identifies what the system is representing and how it represents that information

how it manipulates these representations to turn an input state into an output state

what is the algorithm that the system is running?

given an input to the system, what is the output, and what are the stages in between?

9
New cards

What is Marr’s physical implementation theory

how are these representations and algorithms realised in the hardware of the device itself (eg neurons, a hard drive)

10
New cards

What is the cash register example for Marr’s 3 levels

computational = what problem is it solving? arithmetic

Rep and Alg = representation by Arabic numerals; need to specify an operation

Implementation = ten-notch metal wheels turned by a control structure

11
New cards

Does Marr argue that it is better to move down the levels or up?

Down

eg washing machine in space

easier to know function first

12
New cards

What is the computational theory model?

some sort of functional state or functional transformation

describes the mathematical transformation that takes the system from input to output

eg your memory gets worse by 10% for every hour of sleep missed

13
New cards

What is a computational theory model example

learning via association - we are describing how this happens

Rescorla-Wagner model

describes the associative strength between the US (i) and the CS (j)

ΔVij is the change in the strength of the association between i and j

α = salience of i, ie how easy (high = easier) it is for the system to detect i

β = salience of j

λ = learnability of Vij

Vik = the associative strength between the US (i) and the other CSs (k) which also predict i

how much the association Vij will change as a function of the co-occurrence of i and j in the environment; this depends on how salient each is, how they are seen to be associated in the world, and how learnable Vij is relative to the other Vik (a subtractive relation)

the more competing associations Vik there are, the harder it is for Vij to grow (see the sketch below)
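
A minimal sketch of this update rule in Python (not from the lecture; the function name, variable names and example values are illustrative), assuming the standard form ΔVij = α × β × (λ − ΣVik):

def rescorla_wagner_step(V, present_cs, alpha, beta, lam):
    """Update V[j] for every CS j present on this trial.

    V          -- dict: CS name -> current associative strength Vij with the US i
    present_cs -- list of CS names present on the trial
    alpha      -- salience of the US (i)
    beta       -- dict: CS name -> salience of that CS (j)
    lam        -- lambda, how learnable the association is (its asymptote)
    """
    total = sum(V[k] for k in present_cs)          # summed Vik over the CSs present
    for j in present_cs:
        delta_v = alpha * beta[j] * (lam - total)  # subtractive relation
        V[j] += delta_v
    return V

# Example: a light and a tone are both paired with food (the US).
V = {"light": 0.0, "tone": 0.0}
beta = {"light": 0.5, "tone": 0.3}
for _ in range(10):
    V = rescorla_wagner_step(V, ["light", "tone"], alpha=0.8, beta=beta, lam=1.0)
print(V)  # the more competing Vik there are, the slower each Vij grows

Because every CS present contributes to the summed prediction subtracted from λ, competing associations slow each other's growth, which is the subtractive relation the card describes.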

14
New cards

What is the rep and alg theory model?

something that specifies the relationship between input and output

content (what you are representing, eg a chair has a certain list of features)

and format (how those features should be organised)

specifies an algorithm for transforming input representations into output representations

a precise series of operations

15
New cards

What is a rep and alg model example

Rescorla-Wagner model

what is the content of α: what does it mean to represent the salience of something, and how is it represented?

what is i? is it a symbol, a holistic pattern, etc?

what processes operate on α, and how do these interact with other representations?

16
New cards

What is a physical implementation model example

how do neurons behave, learn connections, learn associations

17
New cards

What does Newell argue?

cog psych deals with phenomena

we discover that people do X

we deal with phenomena as a set of oppositions (eg is the way we do X affected by x or ~x; eg is a memory effect changed by whether people wear a red hat or a blue hat?)

we need better constraints on how we ask these questions; we need formal frameworks to tell us what the good questions are, ie strong constraining theories

18
New cards

what is a model

a logical representation of a system, in a formal language, that describes the actual system but is simpler than it

19
New cards

what is cognition

a set of processes that give rise to intelligent behaviour

20
New cards

a good theory of cognition will

account for the problems the mind is solving

account for how the mind solves those problems

21
New cards

Turing’s Machine

conceptually allows you to solve any problem

anything that can be done in a finite number of precise steps can be done by a Turing machine

22
New cards

4 components of a Turing Machine

1) machine table

2) machine state

3) tape of unbounded length

4) read-write head

23
New cards

Turing machine- machine table

a list of instructions of the form “if in state x, and input is y, then do z”

program or algorithm

24
New cards

Turing machine- machine state

a current state of the machine, represented as x

25
New cards

Turing machine- tape of unbounded length

a tape divided into discrete parts, each part containing or able to contain a 0 or a 1

unbounded in length

input and output of the machine

26
New cards

Turing machine-read-write head

a device that can read the current slot on the tape, move the tape forwards or backwards, and erase or write a 1 or 0 to the slot on the tape

the part that interacts with the input and output of the machine (see the sketch below)
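
A minimal sketch of the four components in Python (not from the lecture; the machine table below is a toy machine that flips bits until it reads a blank, chosen purely for illustration):

def run_turing_machine(table, tape, state="start", halt_state="halt"):
    """table: dict (state, symbol) -> (new_symbol, move, new_state)  # machine table
    tape : dict position -> symbol                                   # tape of unbounded length
    """
    head = 0                                  # read-write head position
    while state != halt_state:                # machine state
        symbol = tape.get(head, "_")          # read the current slot ("_" = blank)
        new_symbol, move, state = table[(state, symbol)]
        tape[head] = new_symbol               # write to the slot
        head += 1 if move == "R" else -1      # move the head forwards or backwards
    return tape

# Toy machine table: flip 0s and 1s, halt on a blank.
table = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
tape = dict(enumerate("1011"))
print(run_turing_machine(table, tape))  # {0: '0', 1: '1', 2: '0', 3: '0', 4: '_'}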

27
New cards

what does mathematically random mean

the thing cannot be shortened

shortest description of thing is itself

eg 10/3 = 3.3333… = 3.3 recurring: it can be shortened, so it is not random

pi is random because its shortest description is itself

28
New cards
29
New cards

Universal Turing Machine (UTM)

can imitate any Turing machine

eg programming languages, computers etc are UTMs

30
New cards

are minds UTMs?

minds are weaker than UTMs

mind only carries out actions that can be carried out in principle (tautological)

mind may be able to be imitated by a UTM

could in principle simulate mind as computer program

31
New cards

How do we ‘solve’ unsolvable problems

we use our own representations to guess what is going on - we use our solvable methods (previous ideas) to approximate solutions (heuristics) to problems

32
New cards

what is the computational theory of mind

the view that the human mind is best understood as a computational system, since it operates by performing computations

we represent and process information all the time; the algorithm is a specification for representing and manipulating that information

33
New cards

what are computations

any type of information processing that can be represented mathematically

34
New cards

what is an information processing system

takes input information and transforms it into another state (the output) via an algorithmic process

35
New cards

algorithm

a formally describable procedure (one you can write in the language of mathematics)

or a set of instructions for performing an operation in a finite number of steps

36
New cards

what is the goal of computational theory of mind

to figure out the mind's algorithms and the representations they operate on

37
New cards

what is an algorithmic account of mind

it specifies the means by which the mind represents information and the processes by which it manipulates these representations

38
New cards

what is the mind if not computational?

algorithmic as it still takes input and produces output

but then you believe the mind is not understandable (if the mind is irregular/random, we cannot shorten/compress it and understand it in a systematic way)

39
New cards

Turing’s test of machine intelligence

3 rooms

in room 3 there is a tester who sits at an input-output device, asks questions through a terminal, and receives answers

the 2 other rooms contain a human and a computing device respectively

if the tester can't tell which is the human and which is the computer, the machine is as intelligent as a human

40
New cards

Issues with Turing’s test of machine intelligence

it is dependent on the tester and how well they can decipher who is the human

is being able to fool a tester a good test of intelligence? it ignores other aspects of intelligence

tests performance (what the machine does), not competence (what machine is capable of doing)

it is a behaviourist test

smart programmers can anticipate questions and invent answers

41
New cards

What is the Chinese Room (Searle) procedure

Against a computational theory of mind

there is a room with a monolingual English speaker in it

they have an ‘in’ slot: squiggles on paper come in

an ‘out’ slot: person can write squiggles out

blackboard where they write numbers

set of English instructions eg:

if X on blackboard + input: W —> output: draw Y —> replace X on blackboard with Z

42
New cards

What is the Chinese Room (Searle) conclusions

from outside it seems like there is a very smart Chinese speaker in the room —> but neither the person nor the room is intelligent, yet they still pass the Turing Test

the room is a replication of a Turing machine; the claim is that neither computers nor the person in the Chinese room has human intelligence

43
New cards

what is an issue with the chinese room

Searle's operational definition of intelligence is speaking Mandarin, but the room does speak Mandarin, so the room is technically intelligent

eg it is like saying the read-write head doesn't do computation, when actually it is the whole machine that does the computation

there is no operational definition of consciousness; we can only assume something is conscious, because consciousness has no observable consequences and so can't be observed directly

44
New cards

different types of models in psych

mathematical models (eg Rescorla-Wagner), models of personality (computational)

process models in cognitive and behavioural neuroscience (eg the Turing machine)

45
New cards

2 types of process models

1) symbolic

2) connectionist

46
New cards

what is a symbolic process model /representations

-represents knowledge as symbolic data structures

-basic or atomic elements

-rules for composing elements to make complex structures (eg languages)

-manipulate data structures with variabilised rules

eg variables S, T, U, V; operators & and |; rules for propositions (any variable is a legal proposition; any two legal propositions can be combined by an operator) —> infinite options (see the sketch below)
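
A minimal sketch of these composition rules in Python (not from the lecture; function and variable names are illustrative):

import random

VARIABLES = ["S", "T", "U", "V"]
OPERATORS = ["&", "|"]

def random_proposition(depth=2):
    """Build a legal proposition by applying the two rules recursively."""
    if depth == 0 or random.random() < 0.5:
        return random.choice(VARIABLES)        # rule 1: any variable is a legal proposition
    left = random_proposition(depth - 1)
    right = random_proposition(depth - 1)
    op = random.choice(OPERATORS)
    return f"({left} {op} {right})"            # rule 2: combine two legal propositions

print([random_proposition() for _ in range(3)])  # eg ['(S & (T | V))', 'U', '(V | S)']

Because rule 2 can be applied to its own outputs, the two rules generate infinitely many distinct propositions from four variables and two operators.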

47
New cards

what is a connectionist process model

represent knowledge as patterns of activation within a set of units (nodes) in a network

processing is carried out by passing activation between nodes

48
New cards

what are the symbolic models processes

there are symbolic operations on the data structures

there is an application of symbolic rules

eg if larger-than(x, y) then can-occlude(x, y)

49
New cards

what are the 3 production system components (this is a prototypical symbolic model)

*these are mathematical rules, but that doesn't mean they are actually true; they just have to match cognition

1) a base set of known facts, represented as data structures within a database (eg an elephant is larger than a housecat)

2) a set of inference rules (eg if x is larger than z, then larger-than(x, z))

3) an executive control system that matches data to rules and randomly chooses which one fires (check this: lec 3, 4228)

50
New cards

what are the operations of a production system

current state= current contents of data base (eg known knowledge)

state space= the set of all possible states (all propositions that could exist by having its knowledge base interact with its rule set)

goal state= the state you want the data base to be in (what you want to know about the world)

state transition rules= moving from one state to another (your inference rules that fire and allow you to change your data base)

search = an algorithm for travelling through the state space; it lets you find the best path from the current state to the goal state (see the sketch below)
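
A minimal sketch of a production system in Python (not from the lecture; the facts, the single transitivity rule, and the simple search loop are illustrative only):

facts = {("larger", "elephant", "housecat"),
         ("larger", "housecat", "mouse")}        # current state = contents of the database

goal = ("larger", "elephant", "mouse")           # goal state

def transitivity_rule(db):
    """Inference rule: if larger(x, y) and larger(y, z), infer larger(x, z)."""
    new = set()
    for (_, x, y1) in db:
        for (_, y2, z) in db:
            if y1 == y2:
                new.add(("larger", x, z))
    return new - db

# Search: keep firing rules (state transitions) until the goal state is reached.
while goal not in facts:
    inferred = transitivity_rule(facts)
    if not inferred:
        break                                    # no rule can fire, so the search fails
    facts |= inferred

print(goal in facts)  # True: a path was found from the current state to the goal state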

51
New cards

what are the advantages of symbolic models

computational power

you can define variabilised and universally quantified rules; it doesn't matter whether you have ever seen x, y, or z before

52
New cards

what are the disadvantages of symbolic models

might be too rigid to capture human behaviour (we don’t always apply rules, fail to capture shades of meaning, not very automatic processing)

we don't know how they're learned (we know new representations are built from existing ones, but where do these structures come from in the first place?)

no graceful degradation with damage- if you take out one rule, the whole thing collapses

not immediately obvious how you would take a symbolic representation and assign it to the brain- how does our brain represent it?

53
New cards

what is the connectionist model?

models composed of networks of interconnected nodes

like neurons in your brain, simple processors

they mimic populations of neurons in your brain

representation: pattern of activation on nodes or neurons

54
New cards

what is an example of a connectionist model

encode "lion" —> feline, wild, carnivore

"octopus" —> cephalopod, carnivore

you teach the network that lions eat antelope, so there is a positive connection between the units representing "lion" and "lion eats antelope"

55
New cards

how does processing work in the connectionist model

nodes pass activation over weighted connections

positive weights = excitatory connections
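
A minimal sketch of this activation passing in Python (not from the lecture; the nodes and weights reuse the lion example above and are purely illustrative):

nodes = ["lion", "feline", "wild", "carnivore", "lion eats antelope"]

# weights[(i, j)] = strength of the excitatory connection from node i to node j
weights = {
    ("lion", "feline"): 1.0,
    ("lion", "wild"): 1.0,
    ("lion", "carnivore"): 1.0,
    ("lion", "lion eats antelope"): 0.8,
}

def step(activation):
    """One processing step: pass activation over the weighted connections."""
    new = dict(activation)
    for (i, j), w in weights.items():
        new[j] = min(1.0, new[j] + w * activation[i])  # bounded, excitatory update
    return new

# Representation = a pattern of activation: switch on the "lion" node and propagate.
activation = {n: 0.0 for n in nodes}
activation["lion"] = 1.0
print(step(activation))  # the lion's features and "lion eats antelope" become active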

56
New cards

need to review!!!!

connectionist model

57
New cards

what are the 7 sins of memory

transience

absent-mindedness

blocking

misattribution

suggestibility

bias

persistence

58
New cards

transience

memories fade over time

59
New cards

absent-mindedness

forget what we are trying to remember

60
New cards

blocking

tip of the tongue, can’t retrieve it

61
New cards

misattribution

misremember the source of something

62
New cards

suggestibility

can create false memories (eg using misleading questions)

63
New cards

bias

memories that are consistent with self image rather than facts

64
New cards

persistence

events from your life you wish you could forget

eg PTSD

65
New cards