1. COMPUTER MODELS OF INFORMATION
PROCESSING & HUMAN COGNITION
PRESENTED BY: RABIA JAVED IQBAL
2. INTRODUCTION
Information processing theory is a cognitive theory that
uses the computer as a metaphor for the workings of the
human mind. Initially proposed by George A. Miller and
other American psychologists in the 1950s, the theory
describes how people attend to information and encode it
into memory.
3. ORIGINS OF INFORMATION PROCESSING THEORY
• Around the 1950s, computers came into wide use, giving psychologists a metaphor to
explain how the human mind functions. The metaphor helped psychologists describe the
different processes the brain engages in: attention and perception could be compared to
inputting information into a computer, and memory could be compared to a computer’s
storage space.
• This was referred to as the information processing approach and is still fundamental to
cognitive psychology today.
• The information processing approach is especially concerned with how people select, store and
retrieve memories.
• In 1956, psychologist George A. Miller advanced the theory with the idea that a person can
hold only a limited number of pieces of information in short-term memory.
• Miller specified this number as seven plus or minus two (or five to nine chunks of
information), but more recently other scholars have suggested the number may be smaller.
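Miller’s point is that capacity is measured in chunks, not raw items, so regrouping information raises how much fits within the five-to-nine limit. A minimal sketch of this idea (the digit string and chunk size are invented for illustration):

```python
def chunk(digits, size):
    """Group a flat string of digits into larger units ("chunks"),
    reducing the number of items held in short-term memory."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

raw = "1776181219411969"       # 16 separate digits: well beyond 7 +/- 2
chunked = chunk(raw, 4)        # 4 meaningful chunks: within capacity
print(chunked)                 # ['1776', '1812', '1941', '1969']
```

Sixteen unrelated digits exceed short-term capacity, but the same digits recoded as four familiar years fall comfortably inside it.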
4. SIGNIFICANT MODELS
Atkinson and Shiffrin’s Stage Theory
• In 1968, Atkinson and Shiffrin developed the stage theory model. The model was later modified by
other researchers but the basic outline of stage theory continues to be a cornerstone of information
processing theory. The model concerns how information is stored in memory and presents a sequence
of three stages, as follows:
1. Sensory memory
2. Short-term memory (STM)
3. Long-term memory (LTM)
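The three stores and the transfers between them can be sketched as a toy pipeline. This is only an illustration of the model’s architecture: the class, capacity value, and transfer rules (attention moves items into STM, rehearsal moves them into LTM, overflow displaces the oldest item) are simplifications invented for the example.

```python
from collections import OrderedDict

class StageMemory:
    """Toy sketch of Atkinson and Shiffrin's three memory stores."""
    def __init__(self, stm_capacity=7):
        self.sensory = []          # large but very short-lived buffer
        self.stm = OrderedDict()   # limited-capacity short-term store
        self.ltm = set()           # effectively unlimited long-term store
        self.stm_capacity = stm_capacity

    def perceive(self, stimuli):
        """Everything registers briefly in sensory memory."""
        self.sensory = list(stimuli)

    def attend(self, item):
        """Attention moves an item from sensory memory into STM;
        overflow displaces the oldest item."""
        if item in self.sensory:
            self.stm[item] = True
            if len(self.stm) > self.stm_capacity:
                self.stm.popitem(last=False)

    def rehearse(self, item):
        """Rehearsal transfers an STM item into LTM."""
        if item in self.stm:
            self.ltm.add(item)

mem = StageMemory(stm_capacity=2)
mem.perceive(["cat", "dog", "bird"])
for item in ["cat", "dog", "bird"]:
    mem.attend(item)       # "cat" is displaced once capacity is exceeded
mem.rehearse("bird")       # only "bird" reaches long-term memory
```

The one-directional flow (sensory → STM → LTM) is exactly the linear sequence that the parallel-distributed and connectionist models later pushed back against.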
5. CRAIK AND LOCKHART’S LEVELS OF PROCESSING MODEL
• Craik and Lockhart’s levels of processing theory (1972) states that the ability
to access information in long-term memory is affected by how
much it was elaborated upon. Elaboration is the process of making
information meaningful so that it is more likely to be remembered.
• People process information with different levels of elaboration
that will make the information more or less likely to be retrieved
later.
• Craik and Lockhart specified a continuum of elaboration that starts
with perception, continues through attention and labeling, and
ends at meaning.
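The continuum can be sketched as a simple monotonic relationship between depth of processing and later retrievability. The numeric depths and the scoring rule below are invented for illustration; the theory itself is qualitative.

```python
# Depth of processing along Craik and Lockhart's continuum
# (the numeric values are hypothetical).
DEPTH = {"perception": 1, "attention": 2, "labeling": 3, "meaning": 4}

def recall_probability(level, base=0.2):
    """Hypothetical rule: each step deeper along the continuum
    raises the chance of later retrieval, capped at certainty."""
    return min(1.0, base * DEPTH[level])

for level in DEPTH:
    print(level, recall_probability(level))
```

The only claim the sketch encodes is the ordering: material processed for meaning is more retrievable than material that is merely perceived.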
6. PARALLEL-DISTRIBUTED PROCESSING MODEL AND
CONNECTIONIST MODEL
• The parallel-distributed processing model and the connectionist model
contrast with the linear three-stage sequence specified by stage
theory. The parallel-distributed processing model, a precursor
to connectionism, proposed that information is processed by
multiple parts of the memory system at the same time.
• This was extended by Rumelhart and McClelland’s connectionist
model in 1986, which holds that information is stored in various
locations throughout the brain that are connected through a
network. Information with more connections is easier for an
individual to retrieve.
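The claim that better-connected information is easier to retrieve can be sketched as a tiny association network. The class, the `associate`/`retrieval_ease` names, and the counting rule are invented for the example; real connectionist models use weighted links and distributed activation rather than a simple edge count.

```python
from collections import defaultdict

class ConnectionistSketch:
    """Toy network: each piece of information is a node, and retrieval
    ease grows with its number of connections (hypothetical rule)."""
    def __init__(self):
        self.links = defaultdict(set)

    def associate(self, a, b):
        """Store an undirected association between two concepts."""
        self.links[a].add(b)
        self.links[b].add(a)

    def retrieval_ease(self, concept):
        """More connections -> easier retrieval, per the sketch's rule."""
        return len(self.links[concept])

net = ConnectionistSketch()
net.associate("dog", "bark")
net.associate("dog", "pet")
net.associate("dog", "loyal")
net.associate("quark", "physics")
# "dog" has three connections, "quark" only one, so "dog"
# is easier to retrieve in this sketch.
```

Note how this differs structurally from the stage model: there is no fixed sequence of stores, only a web of simultaneously accessible associations.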
7. LIMITATIONS
• Computers are not influenced by emotions or motivation when
they learn and remember information, but these factors can
have a powerful impact on human memory.
• In addition, while computers tend to process information
sequentially, evidence shows that humans are capable of
parallel processing.