12/20/2009

Terry Pratchett: Rather be a rising monkey than a fallen angel.

My favourite author just did some interviews for the Guardian. If you have not read the Discworld novels, you are missing out. They are the only books that have made me laugh so hard while reading that someone thought I was weeping.
More interviews can be found at:
 http://www.guardian.co.uk/books/video/2009/dec/18/book-club-terry-pratchett


High Voltage Cattle Prod

I made an impulse purchase at Chapters the other week: a book called "What We Believe but Cannot Prove: Today's Leading Thinkers on Science in the Age of Certainty". I was shopping for other people, but the price was right and it seemed interesting. The entries vary in length and tone, but I'm going to reproduce the one that induced the purchase.

"Seth Lloyd"
"Seth Lloyd is a quantum mechanical engineer and a professor in the Department of Mechanical Engineering at the Massachusetts Institute of Technology, where he specializes in the design of quantum computers and quantum communications systems. He is the author of Programming the Universe."

"I believe in science. Unlike mathematical theorems, scientific results can't be proved. They can only be tested again and again until only a fool would refuse to believe them.
I cannot prove that electrons exist, but I believe fervently in their existence. And if you don't believe in them, I have a high-voltage cattle prod I'm willing to apply as an argument on their behalf. Electrons speak for themselves."

The "high-voltage cattle prod" line made me laugh out loud.

12/07/2009

Last Comment

I couldn't watch this whole video; the American makes it unbearable. The last comment on the video is pretty funny and makes it worth the pain. I saw the video via Pharyngula.


12/05/2009

Cognitive Biases

I just finished A. J. Jacobs's book, "The Guinea Pig Diaries". It was a good, easy little read, and I would recommend it. His experiment in trying to live rationally was interesting, and at the end of the book he had a list of cognitive biases that I found fascinating.
Cognitive biases are interesting for many reasons; I especially like the bias blind spot, which I am probably very guilty of. Just knowing that I probably commit several of these a day is, in a weird way, helpful. It's a long list, which is also kind of depressing. It does help to explain why there is so much woo-woo in the world, though...
I thought it would be helpful to list them, so I have cut and pasted the entire list from the Wikipedia article.

"Many of these biases are studied for how they affect belief formation, business decisions, and scientific research.
  • Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behaviour.
  • Base rate fallacy — ignoring available statistical data in favour of particulars.
  • Bias blind spot — the tendency not to compensate for one's own cognitive biases.[1]
  • Choice-supportive bias — the tendency to remember one's choices as better than they actually were.
  • Confirmation bias — the tendency to search for or interpret information in a way that confirms one's preconceptions.
  • Congruence bias — the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
  • Contrast effect — the enhancement or diminishing of a weight or other measurement when compared with a recently observed contrasting object.
  • Déformation professionnelle — the tendency to look at things according to the conventions of one's own profession, forgetting any broader point of view.
  • Denomination effect — the tendency to spend more money when it is denominated in small amounts (e.g. coins) than large amounts (e.g. bills).[2]
  • Distinction bias — the tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.[3]
  • Endowment effect — "the fact that people often demand much more to give up an object than they would be willing to pay to acquire it".[4]
  • Experimenter's or Expectation bias — the tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.[5]
  • Extraordinarity bias — the tendency to value an object more than others in the same category as a result of an extraordinarity of that object that does not, in itself, change the value.
  • Focusing effect — prediction bias occurring when people place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.
  • Framing — using an approach or description of the situation or issue that is too narrow. Also framing effect — drawing different conclusions based on how data is presented.
  • Hyperbolic discounting — the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, where the tendency increases the closer to the present both payoffs are (see the sketch after this list).
  • Illusion of control — the tendency for human beings to believe they can control or at least influence outcomes that they clearly cannot.
  • Impact bias — the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
  • Information bias — the tendency to seek information even when it cannot affect action.
  • Irrational escalation — the tendency to make irrational decisions based upon rational decisions in the past or to justify actions already taken.
  • Just-world phenomenon - witnesses of an "inexplicable injustice . . . will rationalize it by searching for things that the victim might have done to deserve it."
  • Loss aversion — "the disutility of giving up an object is greater than the utility associated with acquiring it".[6] (see also sunk cost effects and Endowment effect).
  • Mere exposure effect — the tendency for people to express undue liking for things merely because they are familiar with them.
  • Money illusion — the tendency of people to concentrate on the nominal (face value) of money rather than its value in terms of purchasing power.
  • Moral credential effect — the tendency of a track record of non-prejudice to increase subsequent prejudice.
  • Need for Closure — the need to reach a verdict in important matters; to have an answer and to escape the feeling of doubt and uncertainty. The personal context (time or social pressure) might increase this bias.[7]
  • Neglect of probability — the tendency to completely disregard probability when making a decision under uncertainty.
  • Not Invented Here — the tendency to ignore that a product or solution already exists, because its source is seen as an "enemy" or as "inferior".
  • Omission bias — the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
  • Outcome bias — the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
  • Planning fallacy — the tendency to underestimate task-completion times.
  • Post-purchase rationalization — the tendency to persuade oneself through rational argument that a purchase was a good value.
  • Pseudocertainty effect — the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
  • Reactance — the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.
  • Restraint bias - the tendency to overestimate one's ability to show restraint in the face of temptation.
  • Selective perception — the tendency for expectations to affect perception.
  • Semmelweis reflex — the tendency to reject new evidence that contradicts an established paradigm.[8]
  • Status quo bias — the tendency for people to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification).[9]
  • Von Restorff effect — the tendency for an item that "stands out like a sore thumb" to be more likely to be remembered than other items.
  • Wishful thinking — the formation of beliefs and the making of decisions according to what is pleasing to imagine instead of by appeal to evidence or rationality.
  • Zero-risk bias — preference for reducing a small risk to zero over a greater reduction in a larger risk.
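
Some of these have actual math behind them. Hyperbolic discounting, for instance, is usually written as V = A / (1 + kD): the present value V of a reward A delayed by D periods. Here is a minimal sketch in Python (my own toy illustration, not from the book or the Wikipedia article; the discount rate k = 0.1, the dollar amounts, and the delays are all made up) showing the preference reversal the entry describes:

import math

def hyperbolic(amount, delay, k=0.1):
    # Present value under hyperbolic discounting: V = A / (1 + k*D)
    return amount / (1 + k * delay)

def exponential(amount, delay, k=0.1):
    # Present value under exponential discounting: V = A * e^(-k*D)
    return amount * math.exp(-k * delay)

# Choose between $100 sooner and $110 seven days later, first from
# today (delay 0) and then with both payoffs pushed 90 days out.
for t in (0, 90):
    for name, value in (("hyperbolic", hyperbolic), ("exponential", exponential)):
        sooner = value(100, t)
        later = value(110, t + 7)
        pick = "$100 sooner" if sooner > later else "$110 later"
        print(f"{name:11s}, delay {t:2d} days: pick {pick}")

With these numbers the hyperbolic chooser takes the $100 today but switches to the $110 once both payoffs are 90 days away, while the exponential chooser never flips; that flip-flop is the bias.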

Biases in probability and belief

Many of these biases are often studied for how they affect business and economic decisions and how they affect experimental research.
  • Ambiguity effect — the avoidance of options for which missing information makes the probability seem "unknown".
  • Anchoring effect — the tendency to rely too heavily, or "anchor," on a past reference or on one trait or piece of information when making decisions (also called "insufficient adjustment").
  • Attentional bias — neglect of relevant data when making judgments of a correlation or association.
  • Authority bias — the tendency to value an ambiguous stimulus (e.g., an art performance) according to the opinion of someone who is seen as an authority on the topic.
  • Availability heuristic — estimating what is more likely by what is more available in memory, which is biased toward vivid, unusual, or emotionally charged examples.
  • Availability cascade — a self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or "repeat something long enough and it will become true").
  • Belief bias — an effect where someone's evaluation of the logical strength of an argument is biased by the believability of the conclusion.
  • Clustering illusion — the tendency to see patterns where actually none exist.
  • Capability bias — the tendency to believe that the closer average performance is to a target, the tighter the distribution of the data set.
  • Conjunction fallacy — the tendency to assume that specific conditions are more probable than general ones.
  • Disposition effect — the tendency to sell assets that have increased in value but hold assets that have decreased in value.
  • Gambler's fallacy — the tendency to think that future probabilities are altered by past events, when in reality they are unchanged. Results from an erroneous conceptualization of the normal distribution. For example, "I've flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads." (See the simulation after this list.)
  • Hawthorne effect — the tendency of people to perform or perceive differently when they know that they are being observed.
  • Hindsight bias — sometimes called the "I-knew-it-all-along" effect, the inclination to see past events as being predictable.
  • Illusory correlation — beliefs that inaccurately suppose a relationship between a certain type of action and an effect.[10]
  • Ludic fallacy — the analysis of chance-related problems according to the belief that the unstructured randomness found in life resembles the structured randomness found in games, ignoring the non-gaussian distribution of many real-world results.
  • Neglect of prior base rates effect — the tendency to neglect known odds when reevaluating odds in light of weak evidence.
  • Observer-expectancy effect — when a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
  • Optimism bias — the systematic tendency to be over-optimistic about the outcome of planned actions.
  • Ostrich effect — ignoring an obvious (negative) situation.
  • Overconfidence effect — excessive confidence in one's own answers to questions. For example, for certain types of question, answers that people rate as "99% certain" turn out to be wrong 40% of the time.
  • Positive outcome bias — a tendency in prediction to overestimate the probability of good things happening to oneself (see also wishful thinking, optimism bias, and valence effect).
  • Pareidolia — a vague or random stimulus (often an image or sound) is perceived as significant; e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing hidden messages on records played in reverse.
  • Primacy effect — the tendency to weigh initial events more than subsequent events.
  • Recency effect — the tendency to weigh recent events more than earlier events (see also peak-end rule).
  • Disregard of regression toward the mean — the tendency to expect extreme performance to continue.
  • Selection bias — a distortion of evidence or data that arises from the way that the data are collected.
  • Stereotyping — expecting a member of a group to have certain characteristics without having actual information about that individual.
  • Subadditivity effect — the tendency to judge probability of the whole to be less than the probabilities of the parts.
  • Subjective validation — perception that something is true if a subject's belief demands it to be true. Also assigns perceived connections between coincidences.
  • Telescoping effect — the effect that recent events appear to have occurred more remotely and remote events appear to have occurred more recently.
  • Texas sharpshooter fallacy — the fallacy of selecting or adjusting a hypothesis after the data is collected, making it impossible to test the hypothesis fairly. Refers to the concept of firing shots at a barn door, drawing a circle around the best group, and declaring that to be the target.
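
Of this batch, the gambler's fallacy is the easiest to check for yourself. Here is a minimal simulation in Python (again my own illustration; the seed and trial count are arbitrary) that flips a fair coin in groups of six and looks at the sixth flip only when the first five all came up heads:

import random

random.seed(42)  # fixed seed so the run is reproducible

trials = 1_000_000
streaks = 0       # groups where the first five flips were all heads
sixth_heads = 0   # of those, how many sixth flips were heads

for _ in range(trials):
    flips = [random.random() < 0.5 for _ in range(6)]
    if all(flips[:5]):
        streaks += 1
        sixth_heads += flips[5]

print(f"five-heads streaks: {streaks}")
print(f"P(heads on 6th | 5 heads): {sixth_heads / streaks:.3f}")

The conditional probability comes out within noise of 0.5: the streak carries no information about the next flip, which is exactly what the entry says.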

Social biases

Most of these biases are labeled as attributional biases.
  • Actor-observer bias — the tendency for explanations of other individuals' behaviours to overemphasize the influence of their personality and under-emphasize the influence of their situation (see also fundamental attribution error). However, this is coupled with the opposite tendency for the self in that explanations for our own behaviours overemphasize the influence of our situation and under-emphasize the influence of our own personality.
  • Egocentric bias — occurs when people claim more responsibility for themselves for the results of a joint action than an outside observer would.
  • Forer effect (aka Barnum effect) — the tendency to give high accuracy ratings to descriptions of one's personality that supposedly are tailored specifically for the individual, but are in fact vague and general enough to apply to a wide range of people. For example, horoscopes.
  • False consensus effect — the tendency for people to overestimate the degree to which others agree with them.
  • Fundamental attribution error — the tendency for people to over-emphasize personality-based explanations for behaviours observed in others while under-emphasizing the role and power of situational influences on the same behaviour (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).
  • Halo effect — the tendency for a person's positive or negative traits to "spill over" from one area of their personality to another in others' perceptions of them (see also physical attractiveness stereotype).
  • Herd instinct — the common tendency to adopt the opinions and follow the behaviors of the majority to feel safer and to avoid conflict.
  • Illusion of asymmetric insight — people perceive their knowledge of their peers to surpass their peers' knowledge of them.
  • Illusion of transparency — people overestimate others' ability to know them, and they also overestimate their ability to know others.
  • Illusory superiority — overestimating one's desirable qualities, and underestimating undesirable qualities, relative to other people. Also known as superiority bias, the "Lake Wobegon effect", the "better-than-average effect", or the Dunning-Kruger effect.
  • Ingroup bias — the tendency for people to give preferential treatment to others they perceive to be members of their own groups.
  • Just-world phenomenon — the tendency for people to believe that the world is just and therefore people "get what they deserve."
  • Notational bias — a form of cultural bias in which a notation induces the appearance of a nonexistent natural law.
  • Outgroup homogeneity bias — individuals see members of their own group as being relatively more varied than members of other groups.
  • Projection bias — the tendency to unconsciously assume that others share the same or similar thoughts, beliefs, values, or positions.
  • Self-serving bias (also called "behavioural confirmation effect") — the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias).
  • Self-fulfilling prophecy — the tendency to engage in behaviours that elicit results which will (consciously or not) confirm existing attitudes.[11]
  • System justification — the tendency to defend and bolster the status quo. Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged sometimes even at the expense of individual and collective self-interest. (See also status quo bias.)
  • Trait ascription bias — the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable.
  • Ultimate attribution error — similar to the fundamental attribution error, in this error a person is likely to make an internal attribution to an entire group instead of the individuals within the group.

Memory errors

  • Consistency bias — incorrectly remembering one's past attitudes and behaviour as resembling present attitudes and behaviour.
  • Cryptomnesia — a form of mis-attribution where a memory is mistaken for imagination.
  • Egocentric bias — recalling the past in a self-serving manner, e.g. remembering one's exam grades as being better than they were, or remembering a caught fish as being bigger than it was.
  • False memory — confusion of imagination with memory, or the confusion of true memories with false memories.
  • Hindsight bias — filtering memory of past events through present knowledge, so that those events look more predictable than they actually were; also known as the 'I-knew-it-all-along effect'.
  • Reminiscence bump — the effect that people tend to recall more personal events from adolescence and early adulthood than from other lifetime periods.
  • Rosy retrospection — the tendency to rate past events more positively than one actually rated them when they occurred.
  • Self-serving bias — perceiving oneself responsible for desirable outcomes but not responsible for undesirable ones.
  • Suggestibility — a form of mis-attribution where ideas suggested by a questioner are mistaken for memory.