Prehistory of Cognitive Science
Reaction Against Behaviorism
Behaviorism: This school of thought posited that psychology should primarily concern itself with observable behavior, rather than delving into speculative, unobservable mental states. Influential figures such as John B. Watson and B.F. Skinner championed this approach, emphasizing the importance of stimulus-response relationships in understanding behavior. Behaviorism sought to establish psychology as a rigorous, empirical science by focusing on what could be directly observed and measured.
Landmark Papers Challenged Behaviorism:
Tolman and Honzik (1930): In a groundbreaking experiment, rats demonstrated the ability to learn mazes without any explicit reinforcement. This phenomenon, known as latent learning, suggested that cognitive processes were actively at play during learning, even in the absence of immediate rewards or punishments. This finding directly contradicted the behaviorist tenet that learning occurs only when behavior is explicitly reinforced.
Tolman, Ritchie, and Kalish (1946): Further solidifying the challenge to behaviorism, this study revealed that rats employ cognitive maps for spatial learning. Rather than simply memorizing a sequence of movements, the rats formed internal representations of their environment, enabling them to navigate efficiently. This discovery provided compelling evidence for the existence of internal cognitive representations, a concept that was largely dismissed by behaviorists.
Lashley (1951): Lashley's research delved into the complexities of behavior, arguing that intricate actions are not merely chains of stimulus-response associations. Instead, he proposed that complex behaviors are hierarchically organized and involve sophisticated planning mechanisms. His work highlighted the inadequacies of behaviorism in elucidating the intricacies of complex actions, such as playing a musical instrument, which necessitates intricate cognitive orchestration.
Algorithmic Computation
Turing (1936-7): Alan Turing introduced the Turing machine, an abstract device that formalizes the concept of an algorithm: a step-by-step mechanical procedure for solving a problem. This theoretical framework and model of computation was foundational to both computer science and cognitive science, suggesting that mental processes could be understood in computational terms.
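The core idea can be made concrete with a minimal simulator (an illustrative sketch in modern Python, not Turing's original formalism): a machine is just a finite transition table plus a tape, and running it is a purely mechanical loop.

```python
# A minimal Turing machine simulator. The example machine below flips every
# bit on the tape and halts at the first blank cell. (The transition table
# and function names are our own illustrative choices.)

def run_turing_machine(transitions, tape, state="start", blank="_"):
    """Run a Turing machine until it reaches the 'halt' state.

    transitions maps (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (left), +1 (right), or 0 (stay).
    """
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = tape.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move
    # Read back the non-blank portion of the tape, left to right.
    return "".join(tape[i] for i in sorted(tape) if tape[i] != blank)

# Transition table for a bit-flipping machine.
FLIP = {
    ("start", "0"): ("start", "1", +1),  # flip 0 -> 1, move right
    ("start", "1"): ("start", "0", +1),  # flip 1 -> 0, move right
    ("start", "_"): ("halt", "_", 0),    # blank cell: stop
}

print(run_turing_machine(FLIP, "1011"))  # -> 0100
```

Everything the machine "knows" lives in the finite transition table; this is what makes the Turing machine a fully explicit model of a mechanical procedure.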
Church-Turing Thesis: This thesis states that any function that can be computed by an effective, step-by-step procedure can be computed by a Turing machine. Its link to cognition is indirect but powerful: if mental processes are algorithmic, then the Turing machine provides a fully general model of them. Note that the thesis itself is a claim about computation, not about the brain; treating cognition as algorithmic computation is an additional hypothesis that cognitive science adopted.
Halting Problem: Turing proved that no algorithm can decide, for every possible program and input, whether that program will eventually halt. (The result answered the Entscheidungsproblem, Hilbert's question of whether a mechanical procedure could decide every mathematical statement; Turing's answer was no.) This demonstrated principled limits on computation and, potentially, cognition, suggesting that not all cognitive processes may be fully captured by computational models.
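The practical face of undecidability can be sketched as follows (an illustrative example; the function names and the choice of the Collatz iteration are our own): a mechanical checker can confirm that a program halts by running it, but when a step budget runs out it can only say "don't know," never a definitive "never halts."

```python
# No single procedure decides halting for every program. The best a
# mechanical checker can do in general is run a program for a bounded
# number of steps and report "halted" or "don't know."

def halts_within(program, state, steps):
    """Run `program` (a state -> state function, where None means 'halted')
    for at most `steps` iterations. Returns True if it halts within the
    budget, or None ("don't know") otherwise -- never a definitive False."""
    for _ in range(steps):
        if state is None:
            return True
        state = program(state)
    return None  # inconclusive: the program might halt later, or never

# Example program: the Collatz iteration, which reaches 1 for every starting
# value ever tested, though no general proof is known.
def collatz_step(n):
    if n == 1:
        return None          # treat reaching 1 as halting
    return n // 2 if n % 2 == 0 else 3 * n + 1

print(halts_within(collatz_step, 27, 10))   # -> None (budget too small)
print(halts_within(collatz_step, 27, 200))  # -> True (halts within budget)
```

The asymmetry in the return values (True or None, never False) is the point: halting is semi-decidable, non-halting is not.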
Formal Analysis of Language
Chomsky (1957): Noam Chomsky, in his seminal work Syntactic Structures, introduced transformational grammar, which derives the sentences of a language by applying transformations to underlying kernel structures (a distinction later elaborated as deep structure versus surface structure). This marked a significant departure from behaviorist approaches to language. Chomsky's work had a revolutionary impact on the field of linguistics and exerted a profound influence on cognitive psychology, particularly in the study of language acquisition and processing.
Transformational Grammar: This approach employs algorithms to transform deep structures into surface structures, reflecting a hierarchical organization of language. It suggests that language is not simply a collection of learned associations, but rather a complex system governed by underlying rules and structures. This framework provides insights into how humans generate and understand an infinite number of sentences from a finite set of grammatical rules.
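A toy version of one such rule can be written out explicitly (a deliberately simplified sketch: real transformational grammar operates on phrase-structure trees, not flat tuples, and the rule below is our own minimal rendering of subject-auxiliary inversion, which maps a declarative underlying structure onto a question):

```python
# Subject-auxiliary inversion as a toy transformation: the underlying
# declarative structure is reordered to yield a question on the surface.

def question_transformation(deep):
    """Deep structure (subject, auxiliary, verb_phrase) -> surface question."""
    subject, aux, verb_phrase = deep
    # Transformation: move the auxiliary in front of the subject.
    return f"{aux.capitalize()} {subject.lower()} {verb_phrase}?"

deep_structure = ("The rat", "will", "find the shortcut")
print(question_transformation(deep_structure))
# -> Will the rat find the shortcut?
```

Even this caricature makes the key point: the question is not a separately memorized stimulus-response pair but the output of a rule applied to an underlying structure, so one rule yields unboundedly many question-declarative pairs.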
Information Processing Models
Miller (1956): George Miller's influential paper, "The Magical Number Seven, Plus or Minus Two," proposed that our capacity for processing information in short-term memory is limited to approximately seven chunks of information. This paper highlighted the constraints on human information processing and spurred considerable research into the nature of memory and attention.
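Miller's central move was to count chunks rather than raw items: recoding items into larger familiar units fits more material into the same roughly seven slots. A small sketch of this recoding (the grouping scheme and function name are our own illustration):

```python
# Recoding raw items into chunks: 12 digits exceed the 7 +/- 2 span,
# but regrouped as 3 familiar years they occupy only 3 chunks.

def chunk(items, size):
    """Group a flat sequence into consecutive chunks of the given size."""
    return [items[i:i + size] for i in range(0, len(items), size)]

digits = "149217761066"        # 12 raw digits: beyond the 7 +/- 2 span
years = chunk(digits, 4)       # recoded as familiar 4-digit years
print(years)                   # -> ['1492', '1776', '1066']
print(len(digits), "items ->", len(years), "chunks")
```

The capacity limit stays fixed; what changes with expertise is how much raw information each chunk carries.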
Broadbent (1958): Donald Broadbent introduced an information-processing model that included a selective filter, a short-term store, and a limited-capacity channel. This model aimed to explain the mechanisms underlying attention and memory, suggesting that attention acts as a filter, selecting which information is processed further. Broadbent's model significantly influenced early cognitive psychology research on attention and memory, paving the way for subsequent models and theories.
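The flow of Broadbent's model can be sketched as a three-stage pipeline (our own simplified rendering, not Broadbent's formalism): parallel inputs enter a sensory buffer, a selective filter passes only the attended channel, chosen by a physical cue such as which ear a message arrives at, and a limited-capacity channel processes the survivors serially.

```python
from collections import deque

def broadbent_filter(inputs, attended_channel, capacity=7):
    """inputs: list of (channel, message) pairs arriving in parallel."""
    # Stage 1: sensory buffer briefly holds everything, in parallel.
    buffer = deque(inputs)
    # Stage 2: selective filter passes only the attended channel,
    # selecting on a physical attribute (here, which ear).
    filtered = [msg for chan, msg in buffer if chan == attended_channel]
    # Stage 3: limited-capacity channel processes a bounded number serially.
    return filtered[:capacity]

inputs = [("left", "seven"), ("right", "cat"), ("left", "two"),
          ("right", "dog"), ("left", "nine")]
print(broadbent_filter(inputs, attended_channel="left"))
# -> ['seven', 'two', 'nine']
```

The design choice that made the model testable is that filtering happens early, on physical attributes, before meaning is analyzed; later "late selection" theories challenged exactly this stage ordering.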
Key Concepts
Information: A fundamental concept in cognitive science, information is central to phenomena such as latent learning, linguistic analysis, and perceptual systems. It serves as the currency of cognitive processes, representing knowledge about the environment and enabling organisms to make informed decisions.
Representation: Representation refers to stored information about the environment. This information is actively manipulated and transformed by organisms as they interact with their surroundings. Representations serve as the fundamental building blocks of thought, enabling organisms to reason, plan, and solve problems.
Information Processing: At its core, the cognitive approach posits that organisms actively adapt to their environment by modifying and utilizing information. This perspective emphasizes the dynamic and adaptive nature of cognition, highlighting how organisms extract, process, and use information to navigate their surroundings effectively.
Algorithms: Algorithms are mechanical procedures utilized for solving problems and processing information. Exemplified by Turing machines and transformational grammar, algorithms provide a formal framework for understanding cognitive processes, offering insights into the step-by-step procedures underlying various cognitive functions.
Specialized Systems: Information processing is carried out by dedicated and specialized systems within the organism. These systems, such as those described in Broadbent’s model of selective attention, are specifically designed to handle particular types of information or cognitive tasks.