PERCEPTIONS AND INTERPRETATIONS
PERCEPTIONS -
DEFINITION
In philosophy, psychology, and cognitive science,
perception is the process of attaining awareness or understanding of sensory
information. The word "perception" comes from the Latin perceptio and
percipio, meaning "receiving, collecting, action of taking possession,
apprehension with the mind or senses."
INTERPRETATIONS - INTRODUCTION
Interpretation, or interpreting, is an activity
that consists of establishing, either simultaneously (known as simultaneous
interpretation) or consecutively (known as consecutive interpretation), oral or
gestural communication between two or more speakers who are not able to use
the same set of symbols. By definition, interpretation applies only in
those cases where there is a need for it: if an object (of art, of
speech, etc.) is obvious to begin with, it cannot draw an interpretation. In
any case the term interpretation is ambiguous, as it may refer to both an
ongoing process and a result.
MEMORY
In psychology, memory is an organism's ability to
store, retain, and recall information and experiences. Traditional studies of
memory began in the fields of philosophy, including techniques of artificially
enhancing memory. The late nineteenth and early twentieth century put memory
within the paradigms of cognitive psychology. In recent decades, it has become
one of the principal pillars of a branch of science called cognitive
neuroscience, an interdisciplinary link between cognitive psychology and
neuroscience.
OUR
PRECONCEPTIONS CONTROL OUR INTERPRETATIONS AND MEMORIES:
Our preconceptions guide how our minds perceive and
interpret information. People fail to realize how great the effect of
preconceptions is. Preconceptions can also be manipulated: in an experiment at
the University of Oregon, students were asked to assess the facial expressions
of a man. The students who judged his expressions as cruel had first been told he
was responsible for cruel acts performed at concentration camps during WWII. Others,
who assessed the man as warm and kind, had previously been told he was an anti-Nazi
who saved numerous Jewish lives. This experiment strongly demonstrates
that our preconceptions do control the way we view issues and people, and that beyond
preconception lies the ability to manipulate and construe the way we see
things (Ross and Lepper). This seems to hold true in everyday life, from assuming
that someone is shy to wondering whether someone feels the same way that you do. If
you construe an idea one way and continue to perceive it that way, there will be
little room for change or reconsideration.
Experiments show that a significant fact about the
human mind is the extent to which preconceived notions guide how we perceive,
interpret, and remember information. People will grant that preconceptions
matter, yet fail to realize how great the effect is. Let’s look at some recent
experiments. Some examine how prejudgments affect the way people perceive and
interpret information. Others plant a judgment in people’s minds after they
have been given information, to study how after-the-fact ideas bias
people’s recall.
Key words: “As I am, so I see”
Preconceptions sway scientists too. We noted that
beliefs and values penetrate science. Philosophers of science remind us that
our observations are "theory-laden." There is an objective reality
out there, but we view it through the spectacles of our beliefs, attitudes, and
values. This is one reason our beliefs are so important; they shape our
interpretation of everything else. Often, this is justifiable. Your
preconceptions of the editorial standards of certain tabloid newspapers
probably justify your disregarding headlines proclaiming, "Computers talk
with the dead." So preconceptions can be useful: the occasional biases
they create are the price we pay for their helping us filter and organize vast
amounts of information efficiently.
Key words: “Once you have a belief, it influences how you perceive all other
relevant information. Once you see a country as hostile, you are likely to
interpret ambiguous actions on its part as signifying its hostility.”
Key words: “The error of our eye directs our mind: what error leads must err.”
Perceiving and Interpreting Events: The effects of prejudgments and expectations
are standard fare for psychology’s introductory course.
A
BIRD
IN THE
THE HAND
Did you notice anything wrong with it? There is more to perception than meets the
eye. This was tragically demonstrated in 1988, when the crew of the USS
Vincennes took an Iranian airliner to be an F-14 fighter plane and then
proceeded to shoot it down. As social psychologist Richard Nisbett (1988)
noted in a congressional hearing on the incident, “The effects of expectations
on generating and sustaining mistaken hypotheses can be dramatic.”
The same is true of social perception. Because
social perceptions are very much in the eye of the beholder, even a simple
stimulus may strike two people quite differently. Saying Canada’s Brian
Mulroney is “an okay prime minister” may sound like a put-down to one of his
ardent admirers and as positively biased to someone who regards him with
contempt. When social information is ambiguous (subject to multiple
interpretations), preconceptions matter (Hilton & von Hippel, 1990).
An experiment by Robert Vallone, Lee Ross, and Mark
Lepper (1985) reveals just how powerful preconceptions can be. They showed
pro-Israeli and pro-Arab students six network news segments describing the
1982 killing of civilian refugees at two camps in Lebanon. Each group
perceived the news coverage as biased against its own side. The phenomenon is
commonplace. Presidential candidates and their supporters nearly always view
the news media as partial to the other side. People in conflict (married
couples, labor and management, opposing racial groups) see impartial mediators
as biased against them.
In another experiment, proponents and opponents of capital punishment each
evaluated two purported research studies. One study confirmed, and the other
disconfirmed, the students’ beliefs about the deterrent effect of the death
penalty. Both proponents and opponents readily accepted evidence that confirmed
their belief but were sharply critical of disconfirming evidence.
Showing the two sides an identical body of mixed evidence had therefore
not narrowed their disagreement but increased it. Each side perceived the
evidence as supporting its belief and now believed even more strongly. Is this
why, in politics, religion, and science, ambiguous evidence often fuels
conflict? Presidential TV debates in the United States have mostly reinforced
predebate opinions. By nearly a 10-to-1 margin, those who already favored one
candidate or the other in the 1960, 1976, and 1980 debates perceived their
candidate as having won (Kinder & Sears, 1985).
Thomas Hill, Pawel Lewicki, and their colleagues
(1989) have shown how these interpretative biases become
self-perpetuating. They first showed University of Tulsa students
two-minute films of three men and three women, each of whom was heard
“thinking out loud” while in some situation. Students in the “male sadness”
condition heard each of the happy-looking men confess his inner anxiety and
unhappiness and heard each of the women mention some minor frustration. Other
students heard the females verbalize hidden sadness. When the students
afterward rated certain male and female acquaintances on traits related to
sadness, there was little effect of the filmed hints that happy-looking males
(or females) may actually be harboring an inner sadness.
Two weeks later the students were unexpectedly
asked again to rate their friends. Now,
after many fresh opportunities to notice and interpret their friends’
behaviors, those who’d seen the film clips of sad men did perceive their male
friends as sadder, while those who’d seen the film clips of sad women perceived
their female friends as sadder. Supplied
with the idea that cheerful-seeming men (or women) may actually be hurting
inside, they had perceived their friends accordingly.
Belief
Perseverance
If a false idea biases information processing, will
later discrediting it erase its effects?
Imagine a baby-sitter who decides, during an evening with a crying
infant, that bottle feeding produces colicky babies: “Come to think of it, cow’s milk obviously
better suits calves than babies.” If the
infant turns out to be suffering a high fever, will the sitter nevertheless
persist in believing that bottle feeding causes colic (Ross & Anderson,
1982)? To find out, Lee Ross, Craig Anderson, and their colleagues planted a
falsehood in people’s minds and then tried to discredit it.
Their experiments reveal that it is surprisingly
difficult to demolish a falsehood once a person has conjured up a rationale for
it. Each experiment first implanted a
belief, either by proclaiming it was true or by inducing people to come to that
conclusion after inspecting two sample cases.
Then the people were asked to explain why it is true. Finally, the researchers
totally discredited the initial information by telling the person the truth:
the information had been manufactured for the experiment, and half the people
in the experiment had received the opposite information. Nevertheless, the new
belief survived about 75 percent intact, presumably because the people still retained
their invented explanations for the belief.
This phenomenon, named belief perseverance, shows that beliefs can take
on a life of their own, surviving the discrediting of the evidence that gave
them birth.
For instance, Anderson, Lepper, and Ross (1980)
asked people, after giving them two concrete cases to inspect, to decide
whether people who take risks make good or bad fire fighters. One group
considered a risk-prone person who was a successful fire fighter and a cautious
person who was an unsuccessful one. The
other group considered cases suggesting the opposite conclusion. After forming
their theory that risk-prone people make better or worse fire fighters, the
people wrote an explanation for
it—for example, that risk-prone people are brave or that cautious people are
careful. Once formed, each explanation
could exist independently of the information that initially created the
belief. Thus when that information was
discredited, the people still held the self-generated explanations and therefore
continued to believe that risk-prone people really do make better or worse
firefighters.
These experiments also show that the more we
examine our theories and explain how they might be true, the more closed we
become to information that challenges our belief. Once we consider why an accused person might
be guilty, why someone of whom we have a negative first impression acts that
way, or why a favored stock might rise in value, our explanations may survive
challenging evidence to the contrary (Jelalian & Miller,1984). Once we attribute our failure at the hands of
an incompetent teacher to our own incompetence, our low self-image may
persevere (Lepper & others,. 1986).
The evidence is compelling: Our beliefs and
expectations powerfully affect how we notice and interpret events. We usually
benefit from our preconceptions, just as scientists benefit from creating
theories that guide them in noticing and interpreting events. But the
benefits sometimes entail a cost: we become prisoners of our own thought
patterns. Thus the “canals” so often seen on Mars turned out indeed to be the
product of intelligent life, an intelligence on earth’s side of the telescope.
Is there a way we can restrain belief perseverance? There is a simple remedy:
Explain the opposite. Charles Lord, Mark Lepper, and Elizabeth Preston (1984)
repeated the capital punishment study described earlier and added two
variations. First, they asked some of their subjects, when evaluating the
evidence, to be “as objective and unbiased as possible.” It was to no avail;
whether for or against capital punishment, those who received this plea made
evaluations as biased as those who did not.
The researchers asked a third group of subjects to consider the opposite, to
ask themselves “whether you would have made the same high or low evaluations
had exactly the same study produced results on the other side of the issue.”
After imagining an opposite result, these subjects were much less biased in
their evaluations of the evidence for and against their views. In his
experiments, Craig Anderson (1982; Anderson & Sechler, 1986) consistently
found that explaining why an opposite theory might be true (why a cautious
rather than a risk-taking person might be a better fire fighter) reduces or
eliminates belief perseverance. So, to counteract belief perseverance, force
yourself to explain why the opposite belief might be true.
We Are More Swayed By Memorable Events Than By Facts
Because people assume that something is commonplace simply because it is
easily available in memory, they are often more persuaded by a powerful
anecdote than by statistics (Allison & others, 1992). This is known as the
availability heuristic, and it is often why people overestimate the likelihood
of dramatic events. For example, because the 9/11 attacks are vivid and
readily available in memory, many people believe they are more at risk during
commercial air travel than they actually are. Psychologist Daniel Kahneman has
studied how we judge the probability of an event by the way our minds imagine
and retrieve information: when we vividly picture an unlikely event, the image
in our mind leads us to overestimate the likelihood that it will happen
(Kahneman).
Our Beliefs Can
Generate Their Own Confirmation
A study conducted by Robert Rosenthal found that people sometimes live up to what is expected of them. When participants knew they were expected to give high ratings to the photos they viewed, they did so more than others who were expected to see the photos as failures. This study demonstrated the self-fulfilling prophecy.
Do teachers have the same expectations? Do teachers’ expectations affect student performance? Teachers do hold some students to higher standards and think highly of those who do well (Jussim & others, 1996). Certainly a low expectation may not always discourage an average child, but a teacher’s high expectation of one child does not guarantee that child’s success.
An article in the New York Times demonstrated the self-fulfilling prophecy. Men and women between the ages of 48 and 62 were gathered and divided into groups. One group was assigned to complete a memory test against another group averaging 70 years of age or older. The second group was to complete the same test against a group averaging 20 years of age. The third group completed the test unaware of any competition. When the results were in, the group competing against younger participants retrieved 14 words on average, and the results were the same for the group that had no competition. The group tested against the "older" crowd retrieved the fewest words on average. Perhaps the mental perception of being included with an "older" group brought about an unconscious thought that advancing age automatically impairs memory, and this stereotype may have been reflected in the test results (Carey).
Holding high standards does seem to boost confidence. Teachers who see high potential in a student have been reported to look at, smile at, and nod to those students more often, as well as call on them and allow them more time to answer questions (Cooper, 1983). On the flip side, a student’s expectations of an instructor can affect the way the student perceives the class. For example, a student who goes into class having heard positive feedback about the instructor is more likely to find the class interesting than a student with lower expectations (Feldman & Theiss, 1982).