
The Rise of Cognition

Cognition and cognitive psychology are tricky to map out chronologically; the actual starting point of the field and its founders are contested. This plan may jump around in time for certain events. This should be seen as enriching arguments in the exam, rather than weakening them.

The mid-19th century gave rise to one of the most important discoveries for cognitive psychology. Franciscus Donders was an ophthalmologist and a professor of physiology in the Netherlands. In 1868, he was experimenting with reaction times and choice reactions to visual stimuli, and discovered a method that gave very accurate (for the time) temporal resolution for any observable function. In his control condition, the participant was instructed to press a button upon seeing a specific visual stimulus (a light). Donders took the reaction time from the light being switched on to the button being pressed. He then introduced a second stimulus (a light of a different colour) and instructed the participant to press the button for only one of the two stimuli (press when the green light is shown, not the blue one), again taking the reaction times. What Donders found was that the difference between the first (T1) and second (T2) conditions must be the time taken for the participant to discriminate the colour of the light. From this, Donders developed a general mathematical method:

T2 - T1 = time taken to perform the studied cognitive function (or T_discrimination)
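
To make the subtractive logic concrete, here is a minimal sketch in Python. The numbers are invented for illustration and are not Donders' data; the point is simply that the mean choice reaction time minus the mean simple reaction time estimates the duration of the discrimination stage.

```python
# Donders' subtractive method, sketched with made-up reaction times (seconds).
simple_rts = [0.19, 0.21, 0.20, 0.22, 0.18]  # T1: press for any light
choice_rts = [0.28, 0.31, 0.29, 0.30, 0.27]  # T2: press for the green light only

def mean(xs):
    return sum(xs) / len(xs)

# The extra time in the choice condition is attributed to colour discrimination.
t_discrimination = mean(choice_rts) - mean(simple_rts)
print(f"T2 - T1 = {t_discrimination:.3f} s spent discriminating colour")
```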

This was the first method that allowed for the scientific study of a mental function, and subtraction logic of this kind is still used today in fMRI studies. Franciscus Donders and his method were the starting point of experimental and cognitive psychology. However, Donders remained an ophthalmologist: he never took his method further than visual discrimination tasks and did not see how it could be applied to study the mind. Because of this, we tend not to see Donders as a founding father of psychology, as he never fully embraced the discipline, regardless of his enormous contributions. Some could argue that Donders and Gustav Fechner are very similar in this way, given that both contributed to, but never embraced, the psychological discipline.

Late 19th century, baby. Hermann Ebbinghaus was studying human memory and wanted to work out what objective factors determined its capacity and the process of memory encoding. Ebbinghaus had a structuralist point of view: smaller, modular nodes execute functions that contribute to cognition as a whole. He used nonsense syllables as items in his experiments, which allowed him to test memory without the confounding variable of a participant's previous knowledge. Ebbinghaus suggested that the hard limit of short-term memory was about 7 items; however, the madman only tested himself. STM aside, Ebbinghaus' contribution to the study of LTM was much more helpful, including his discovery of the practice effect.
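
For flavour, here is a rough sketch of how consonant-vowel-consonant nonsense syllables in the style of Ebbinghaus' items could be generated. This is purely my own illustration; Ebbinghaus composed and filtered his lists by hand.

```python
import random

# Pools for consonant-vowel-consonant (CVC) trigrams.
CONSONANTS = "bcdfghjklmnpqrstvwz"
VOWELS = "aeiou"

def nonsense_syllable() -> str:
    """Return a random CVC trigram, e.g. 'zup' or 'mel'."""
    return (random.choice(CONSONANTS)
            + random.choice(VOWELS)
            + random.choice(CONSONANTS))

# Real words occasionally slip through; Ebbinghaus discarded those by hand.
random.seed(42)
print([nonsense_syllable() for _ in range(7)])  # a 7-item study list
```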

Georg Müller (1850-1934) followed in Ebbinghaus' footsteps. Müller set out to replicate Ebbinghaus' memory capacity experiments and improve on them. Importantly, Müller added a qualitative element to studying memory: introspection. Müller asked participants to analyse their own metacognitive ability when it came to memory, a method that diverged from strict structuralism. Introspection is not part of modern scientific practice, but it gave Müller some strong insights into the mechanisms of memory; he was able to establish that subjects believed they could remember more items if they attributed semantic meaning to them. This came to be known as 'chunking' in the cognitive study of memory.

Müller was now doing the most. He was working on the reasons why people forget when he created a series of conditions that allowed the learning of one task to transfer to a second one (transfer-appropriate processing). This finding led him to develop a tool that would improve the experimental accuracy of memory testing: the memory drum. The drum allowed experimenters to programme the exact time at which list items would be shown to participants during a memory experiment. With this more scientific method, phenomena like retroactive interference could be studied and understood.

Most study of cognition at the turn of the 20th century was based on vision. Tasks like reading lists or looking at fucked up shapes on a board all used the eyes. However, while Müller was getting his cognitive rocks off, Carl Stumpf was experimenting with reasoning and auditory perception. This gave way to the idea that perception was not just visual; perception spanned all the senses.

Good ol' William James (1842-1910). Serial lab dodger and probably liked the sound of his own voice. One of his most famous works was the first ever psychology textbook, The Principles of Psychology (1890). James also looked at the structure and function of memory, attention and many other forms of cognition. He was one of the first people to distinguish between two different types of memory: primary and secondary (or 'memory proper') memory. Primary memory can hold a limited amount of information while it is held in consciousness, whereas secondary memory allows for the implicit storage and recall of a vast amount of information. This was the precursor to the STM and LTM memory models. James also thought that Wundt was too reductionist in his approach; he believed in more of a stream of consciousness: functionalism over structuralism.

James also theorised some other concepts that still hold merit to this day. His view on consciousness lines up with common beliefs in cognitive neuroscience: James saw consciousness as a continuous flowing process rather than a sequence of separate ideas and events. Around this time in James's life (1884), he began working on a theory of emotion. James saw that most thinkers held that the physiological response was triggered by the emotional experience itself, and he didn't like this theory. One conjecture he used: if you sighted a bear, your heart would pound, you would sweat and you would be inclined to run away; it is only after you have run away from the bear that you feel the emotional reaction of fear. Carl Lange was developing his own theory of emotion (1885) with a similar idea; whilst James looked at visceral and somatic responses, Lange focused on cardiovascular responses to emotion. Their theories were retrospectively married to give the James-Lange theory of emotion. The combined theory diluted both men's work, but the principle was the same: physiological response first, then affective response. The James-Lange theory has come under scrutiny in contemporary spheres: it offers no explanation for the experience of emotion without arousal, nor for the role of learning and cognition in emotional responses.

The biggest contender against William James for founding father of psychology would have to be Wilhelm Wundt. Wundtian psychology was based on the principle of structuralism, which sought to break down complex mental concepts and analyse their individual parts, using methods like introspection to better understand the sequence of events that leads to the execution of a specific function. Functionalism developed as a school of thought in response to structuralism: it focused on understanding the root cause or function of a specific behaviour or mental state, and wished to find links between internal states (being happy) and external behaviours (laughing). This approach was embodied by James and influenced a lot of his work.

James also wrote himself into the history books by formally training Mary Calkins, who became the first female president of the APA. Despite meeting resistance from Harvard University and his male students, James persisted in aiding Calkins in her studies of cognitive psychology. Calkins aided James on many of his theories, with one of her biggest contributions being the development of the paired-associates technique for studying associative learning, which became her thesis.

At this point in time, case studies had been established as an essential tool for understanding cognition. One of the most famous, which ultimately led to the discovery of a double dissociation, was by Paul Broca (1824-1880). Broca's patient, 'Tan', had suffered epilepsy-related damage to the lower left hemisphere of his brain. This left Tan with severe language-production impairments, and led Broca to theorise that the damaged area of brain tissue was related to that function. Then, in the late 19th century, Carl Wernicke discovered an area that caused impairment to language comprehension when damaged, completing the double dissociation. The patient lesion study was a popular method of conducting research in countries like Italy and the UK during the 1970s and 80s.

In a similar timeframe, the psychologist Karl Lashley (1890-1958) was attempting to do the impossible: find neurological evidence for stored memories, or memory engrams. Ultimately his search was unsuccessful, but he did develop an important neurological principle, mass action, which sees learning as a cognitive process that executes functions all over the brain, as opposed to one specific store being responsible for the process.

Frederic Bartlett (1886-1969) was not a fan of Ebbinghaus' work. He thought it was cringe. Bartlett didn't like the use of nonsense syllables to test memory capacity; he suggested that by using them you are measuring laboratory habits and excluding an important part of remembering items: semantic value. This led Bartlett to develop his own experiments for testing memory, the most famous being 'The War of the Ghosts'. Bartlett tasked his students with remembering a short story. To prevent previous knowledge from being a confounding variable, he picked a story that would be culturally alien to all his British students: a Native American tale called 'The War of the Ghosts'. After the students had learned the story, Bartlett would surprise them around campus, asking them to recall as much of it as they could. He then analysed these recalled versions of the story and found that cognition and memory had social and cultural influences too. Bartlett's work went slightly underappreciated at the time, due to the mack daddy of behaviour studies brewing in the US…

Fifty years after William James, in the mid-20th century, behaviourism was beginning to take over the brain-science landscape. Behaviourism measures only the objective stimulus-response relationship: fuck the cognitive process, the brain is a 'black box' to us behaviourists. The movement was spearheaded by B. F. Skinner and his work on operant conditioning. For Skinner, the theory went beyond the papers; he believed the behaviourist line of thought could be used to change humanity for the better. It must be said that a lot of people shit on behaviourism, but operant conditioning still has many helpful applications in modern science (e.g. it underlies the basic methods of cognitive behavioural therapy).

Now then, the cognitive revolution. Many events triggered this shift in thinking back towards cognitive study. One was the legendary academic drama between B. F. Skinner and Noam Chomsky. In 1957 Skinner published Verbal Behavior, suggesting that the acquisition of language was driven by parents reinforcing children's behaviour to form appropriate sentences. Chomsky clapped back (in his 1959 review) with two convincing counterarguments:

1.      Children make novel/inappropriate sentences; if children were reinforced into learning vocabulary, then their parents' grammar would be ingrained in them too. This is not the case: children often make grammatical errors that their parents never model when forming new phrases. Furthermore, phrases whose content would offend or upset parents, like 'I hate Mum', would be filtered out of children's vernacular through parental conditioning. However, any parent will tell you that this is not the case.

2.      Deep and surface syntax; sentences can share a surface form while differing in underlying (deep) structure, so reinforcement of surface word strings alone cannot explain how children grasp the difference (compare 'John is easy to please' with 'John is eager to please').

Another event that put cognition back on the map was the rise of computational science in the 1960s. Psychologists were beginning to realise that a computer's processing is a powerful analogue for cognition. We began to see the implementation of computer-influenced models describing human function, and this created a new school of cognitive thought: the information processing approach. Secondly, computers allowed for more accurate testing in experiments; computer displays allowed the creation of experiments that had previously been fiscally and physically impossible.

1956: George Miller was getting his jiggy on. Doing some tests on short-term memory, he arrived at a legendary number for the capacity of short-term memory: STM can hold 7±2 items at any given time. This number was influential in the development of experiments and models studying memory and attention.

Time for some more patient-study action. At the 1955 meeting of the American Neurological Association, William Scoville, a neurosurgeon by trade, presented a unique patient. HM had recently undergone a bilateral medial temporal lobe resection (performed by Scoville), a surgery successful in treating his seizures. However, HM described to Scoville some interesting effects regarding his memory. This interested Brenda Milner (a psychologist at the event). When Milner first met HM, she was able to establish that he had reduced recognition memory while his STM and motor skills were preserved: HM could learn a motor task and repeat it, as if it were committed to memory, but when asked about it the next day he could not recall ever learning the task. This provided further evidence for a separation between long-term and short-term memory, and suggested that the medial temporal lobes were responsible for recognition-based memory. Milner's work came under scrutiny when there were attempts to create an animal lesion model in monkeys. Scoville himself performed an identical surgery on the monkeys, removing the bilateral medial temporal lobes. However, upon testing, the monkeys were seen completing memory tasks that HM had failed when Milner last visited him. It was much later that Mortimer Mishkin established that such memory tasks are learned in structurally different areas in monkeys compared to humans (the basal ganglia in monkeys, versus the bilateral medial temporal lobe area in humans), and he was able to make a monkey HM analogue in 1978.

A decade after the first meeting of Milner and HM, Elizabeth Warrington stumbled across a seminal deficit patient. After a motorcycle crash, KF had severe damage to the parieto-occipital region of his brain. Upon her first visit to KF in 1969, Warrington established that he had a reduced digit span and a reduced recency effect, suggesting a poor short-term memory; however, his long-term memory system appeared to be in perfect working order. One of Warrington's most exceptional discoveries was that KF had a poorer memory for auditory information than for visual information (1970). This discovery directly conflicted with the then-current model of working memory, the STS (or short-term store). Furthermore, Warrington's 1970 discovery prompted Alan Baddeley to investigate a multimodal working memory system, feeding into the Baddeley and Hitch model of working memory. Patient KF and Elizabeth Warrington had an impact on psychology that is still relevant today.

In 1968, Atkinson and Shiffrin began to develop a computational model of memory, the multi-store 'modal model': information flows from sensory memory into a limited-capacity short-term store, and rehearsal transfers it into the long-term store.
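
A toy sketch of that multi-store idea in Python (my own illustration, not Atkinson and Shiffrin's formal mathematical model): items enter a capacity-limited short-term store that displaces its oldest contents, and only rehearsed items are copied into the long-term store.

```python
from collections import deque

STS_CAPACITY = 7  # a nod to Miller's 7±2
short_term = deque(maxlen=STS_CAPACITY)  # oldest item displaced when full
long_term = set()  # effectively unlimited

def perceive(item: str) -> None:
    """New input enters the short-term store, bumping the oldest item out."""
    short_term.append(item)

def rehearse(item: str) -> None:
    """Rehearsal transfers an item from the short-term to the long-term store."""
    if item in short_term:
        long_term.add(item)

for word in ["cat", "dog", "fox", "owl", "bee", "ant", "elk", "ram"]:
    perceive(word)  # 'cat' is displaced once the eighth word arrives

rehearse("dog")
print(list(short_term))  # only the last 7 items survive
print(long_term)         # {'dog'}
```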

Atkinson and Shiffrin's model came under scrutiny, however, when deficit patients like KF (1969-70) started to suggest that short-term memory might be multimodal.

Around the same time, Broadbent's filter model of attention (1958) was developing. This included the academically contested 'selective filter', which supposedly cherry-picked one stream of important stimuli for the subject's consciousness to comprehend. However, the model was heavily criticised for being unable to explain the cocktail party effect. It was eventually trumped by Anne Treisman's model, which replaced the filter with an attenuator: a conceptual module that places emphasis on one message over the others, rather than picking one message and blocking the rest from recognition.
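
The difference between the two models is easy to caricature in code. This is my own toy sketch, not either author's formal model: Broadbent's filter is all-or-nothing, while Treisman's attenuator merely turns unattended channels down, which is why a salient message (your own name at a cocktail party) can still break through.

```python
def broadbent_filter(channels: dict, attended: str) -> dict:
    """All-or-nothing: only the attended channel passes."""
    return {ch: sig for ch, sig in channels.items() if ch == attended}

def treisman_attenuator(channels: dict, attended: str, gain: float = 0.2) -> dict:
    """Graded: unattended channels are attenuated, not removed."""
    return {ch: sig if ch == attended else sig * gain
            for ch, sig in channels.items()}

channels = {"left_ear": 1.0, "right_ear": 1.0}
print(broadbent_filter(channels, "left_ear"))     # {'left_ear': 1.0}
print(treisman_attenuator(channels, "left_ear"))  # {'left_ear': 1.0, 'right_ear': 0.2}
```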

Alan Baddeley and Graham Hitch proposed the first rendition of their model of working memory in their 1974 paper, in which they set out to find evidence for such a system. The '74 paper used techniques like phonemic similarity, articulatory suppression, and memory load, and recorded the impact these had on verbal reasoning, comprehension, and free recall. The results allowed Baddeley and Hitch to establish that a working memory system could exist, given that these functions were affected by the same experimental manipulations. Revisions and further experimentation over the following years refined the famous model that included the central executive and its 'slave systems', the phonological loop and the visuo-spatial sketchpad.

1967 - Ulric Neisser published the first ever cognitive psychology textbook, Cognitive Psychology.
