
Judgment in Managerial Decision Making

Humans are not perfect decision makers. Not only are we not perfect, but we depart from perfection or rationality in systematic and predictable ways. The modern world is often described as VUCA, an acronym – first used in 1987, drawing on the leadership theories of Warren Bennis and Burt Nanus – for the volatility, uncertainty, complexity, and ambiguity of general conditions and situations. VUCA conflates four distinct types of challenges that demand four distinct types of responses.

VOLATILITY: We live in a world that’s constantly changing, becoming more unstable each day, where changes big and small are becoming more unpredictable, more dramatic, and ever faster. As events unfold in completely unexpected ways, it’s becoming impossible to determine cause and effect.

UNCERTAINTY: It’s becoming more difficult to anticipate events or predict how they’ll unfold; historical forecasts and past experiences are losing their relevance and are rarely applicable as a basis for predicting the shape of things to come. It’s becoming nearly impossible to plan for investment, development, and growth as it becomes increasingly uncertain where the route is heading.

COMPLEXITY: Our modern world is more complex than ever. What are the causes? What are the effects? Problems and their repercussions are more multi-layered and harder to understand. The different layers intermingle, making it impossible to get an overview of how things are related. Decisions are reduced to a tangled mesh of reaction and counter-reaction, and choosing the single correct path is almost impossible.

AMBIGUITY: “One size fits all” and “best practice” have been relegated to yesterday; in today’s world it’s rare for things to be completely clear or precisely determinable. Not everything is black and white; grey is also an option. The demands on modern organizations and management are more contradictory and paradoxical than ever, challenging our personal value systems to the core. In a world where the “what” takes a back seat to the “why” and the “how”, making decisions requires courage, awareness, and a willingness to make mistakes.

Furthermore, the number of people, the amount of knowledge, and the degree of complexity are all expanding rapidly. Despite the sophistication of our corporations and the speed of our technological development, the capabilities of the human brain have not changed dramatically in the last ten thousand years. Consequently, individuals rely on rules of thumb, or heuristics, to lessen the information-processing demands of making decisions. Heuristics reduce this effort by allowing people to examine fewer pieces of information, simplify the weighting of that information, and consider fewer alternatives. By providing managers with efficient ways to deal with complex problems, heuristics frequently produce effective decisions. However, heuristics can also lead managers to make systematically biased judgments. Biases result when an individual inappropriately applies a heuristic. The sketch below illustrates the trade-off.
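
A minimal sketch, assuming hypothetical candidates, cues, and weights: it contrasts a full weighted evaluation (examine and weigh every cue) with a “take-the-best” heuristic that inspects one cue at a time and stops at the first cue that discriminates. The two rules can disagree, which is exactly where systematic bias can enter.

```python
# Hypothetical hiring data: two candidates scored on three cues.
candidates = {
    "A": {"experience": 7, "test_score": 82, "interview": 6},
    "B": {"experience": 5, "test_score": 91, "interview": 8},
}

def full_evaluation(c1, c2, weights):
    """Effortful baseline: weigh and sum every available cue."""
    score = lambda c: sum(w * candidates[c][k] for k, w in weights.items())
    return c1 if score(c1) >= score(c2) else c2

def take_the_best(c1, c2, cue_order):
    """Heuristic: check cues in an assumed order of validity and decide on
    the first cue that distinguishes the candidates, ignoring the rest."""
    for cue in cue_order:
        if candidates[c1][cue] != candidates[c2][cue]:
            return c1 if candidates[c1][cue] > candidates[c2][cue] else c2
    return c1  # no cue discriminates; default to the first option

weights = {"experience": 0.5, "test_score": 0.3, "interview": 0.2}
print(full_evaluation("A", "B", weights))                                  # B
print(take_the_best("A", "B", ["experience", "test_score", "interview"]))  # A
```

Here the heuristic picks candidate A on experience alone, while the full evaluation picks B; the saving in effort is real, and so is the potential for bias.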

 

Understanding these systematic and predictable departures is at the core of the field of judgment and decision making. By understanding these limitations, we can also identify strategies for making better and more effective decisions.

The terms and definitions below introduce this managerial topic and help us recognize the systematic biases that affect our judgment and decision making.

 

Cognitive biases

 

Cognitive bias

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.

Some cognitive biases are presumably adaptive. Cognitive biases may lead to more effective actions in a given context. Furthermore, allowing cognitive biases enables faster decisions, which can be desirable when timeliness is more valuable than accuracy, as illustrated by heuristics. Other cognitive biases are a "by-product" of human processing limitations, resulting from a lack of appropriate mental mechanisms (bounded rationality), the impact of an individual's constitution and biological state (see embodied cognition), or simply from a limited capacity for information processing. Although this field of research overwhelmingly involves human subjects, some findings that demonstrate bias have been found in non-human animals as well.

Explanations include information-processing rules (i.e., mental shortcuts), called heuristics, that the brain uses to produce decisions or judgments. Biases come in a variety of forms and appear as cognitive ("cold") bias, such as mental noise, or motivational ("hot") bias, such as when beliefs are distorted by wishful thinking.

 

List of cognitive biases

-          Abilene paradox: A group of people collectively decide on a course of action that is counter to the preferences of many or all of the individuals in the group. It involves a common breakdown of group communication in which each member mistakenly believes that their own preferences are counter to the group's and therefore does not raise objections. A common phrase relating to the Abilene paradox is a desire not to "rock the boat". This differs from groupthink in that the Abilene paradox is characterized by an inability to manage agreement.

-          Agent detection: The inclination to presume the purposeful intervention of a sentient or intelligent agent.

-          Ambiguity effect: The tendency to avoid options for which the probability of a favorable outcome is unknown.

-          Anchoring or focalism: The tendency to rely too heavily, or "anchor", on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject).

-          Anthropocentric thinking: The tendency to use human analogies as a basis for reasoning about other, less familiar, biological phenomena.

-          Anthropomorphism or personification: The tendency to characterize animals, objects, and abstract concepts as possessing human-like traits, emotions, and intentions. The opposite bias, of not attributing feelings or thoughts to another person, is dehumanised perception, a type of objectification.

-          Attentional bias: The tendency of perception to be affected by recurring thoughts.

-          Attribute substitution: Occurs when a judgment has to be made (of a target attribute) that is computationally complex, and instead a more easily calculated heuristic attribute is substituted. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system.

-          Automation bias: The tendency to depend excessively on automated systems which can lead to erroneous automated information overriding correct decisions.

-          Availability heuristic: The tendency to overestimate the likelihood of events with greater "availability" in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.

-          Backfire effect: The reaction to disconfirming evidence by strengthening one's previous beliefs. Note: the existence of this bias as a widespread phenomenon has been disputed in empirical studies.

-          Base rate fallacy or base rate neglect: The tendency to ignore general information and focus only on information pertaining to the specific case, even when the general information is more important (see the worked example after this list).

-          Belief bias: An effect where someone's evaluation of the logical strength of an argument is biased by the believability of the conclusion.

-          Berkson's paradox: The tendency to misinterpret statistical experiments involving conditional probabilities.

-          Bystander effect or bystander apathy: The tendency for individuals to be less likely to offer help to a victim when other people are present.

-          Clustering illusion: The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).

-          Compassion fade: The predisposition to behave more compassionately towards a small number of identifiable victims than to a large number of anonymous ones.

-          Confirmation bias: The tendency to search for, interpret, focus on and remember information in a way that confirms one's preconceptions.

-          Congruence bias: The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses.

-          Conjunction fallacy: The tendency to assume that specific conditions are more probable than a more general version of those same conditions. For example, subjects in one experiment perceived the probability of a woman being both a bank teller and a feminist as higher than the probability of her being a bank teller alone (see the worked example after this list).

-          Conservatism bias: The tendency to revise one's belief insufficiently when presented with new evidence.

-          Continued influence effect: The tendency to believe previously learned misinformation even after it has been corrected. Misinformation can still influence inferences one generates after a correction has occurred. cf. Backfire effect.

-          Contrast effect: The enhancement or reduction of a certain stimulus' perception when compared with a recently observed, contrasting object.

-          Curse of knowledge: When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.

-          Declinism: The predisposition to view the past favorably (rosy retrospection) and the future negatively.

-          Decoy effect: Preferences for either option A or B change in favor of option B when option C is presented, which is completely dominated by option B (inferior in all respects) and partially dominated by option A.

-          Default effect: When given a choice between several options, the tendency to favor the default one.

-          Denomination effect: The tendency to spend more money when it is denominated in small amounts (e.g., coins) rather than large amounts (e.g., bills).

-          Disposition effect: The tendency to sell an asset that has accumulated in value and resist selling an asset that has declined in value.

-          Distinction bias: The tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.

-          Dread aversion: Just as losses yield double the emotional impact of gains, dread yields double the emotional impact of savouring.

-          Dunning–Kruger effect: The tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability.

-          Duration neglect: The neglect of the duration of an episode in determining its value.

-          Empathy gap: The tendency to underestimate the influence or strength of feelings, in either oneself or others.

-          End-of-history illusion: The age-independent belief that one will change less in the future than one has in the past.

-          Endowment effect: The tendency for people to demand much more to give up an object than they would be willing to pay to acquire it.

-          Exaggerated expectation: The tendency to expect or predict more extreme outcomes than those that actually happen.

-          Experimenter's or expectation bias: The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.

-          Forer effect or Barnum effect: The observation that individuals will give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. This effect can provide a partial explanation for the widespread acceptance of some beliefs and practices, such as astrology, fortune telling, graphology, and some types of personality tests.

-          Form function attribution bias: In human–robot interaction, the tendency of people to make systematic errors when interacting with a robot. People may base their expectations and perceptions of a robot on its appearance (form) and attribute functions which do not necessarily mirror the true functions of the robot.

-          Framing effect: Drawing different conclusions from the same information, depending on how that information is presented.

-          Frequency illusion or Baader–Meinhof phenomenon: The frequency illusion is the tendency, once something has been noticed, to notice every subsequent instance of it, leading to the belief that it occurs with high frequency (a form of selection bias). The Baader–Meinhof phenomenon is the illusion in which something that has recently come to one's attention suddenly seems to appear with improbable frequency shortly afterwards. It was named after an incident of frequency illusion in which the Baader–Meinhof Group was mentioned.

-          Functional fixedness: Limits a person to using an object only in the way it is traditionally used.

-          Gambler's fallacy: The tendency to think that future probabilities are altered by past events, when in reality they are unchanged. The fallacy arises from an erroneous conceptualization of the law of large numbers. For example, "I've flipped heads with this coin five times consecutively, so the chance of tails coming up on the sixth flip is much greater than heads." (See the worked example after this list.)

-          Gender bias: A widely held set of implicit biases that discriminate against a gender. For example, the assumption that women are less suited to jobs requiring high intellectual ability. Or the assumption that people or animals are male in the absence of any indicators of gender. Or the assumption that academia discriminates against women even as they outnumber men in college and graduate school in the US, and earn the majority of undergraduate and graduate degrees.

-          Hard–easy effect: The tendency to overestimate one's ability to accomplish hard tasks, and underestimate one's ability to accomplish easy tasks.

-          Hindsight bias: Sometimes called the "I-knew-it-all-along" effect, the tendency to see past events as being predictable at the time those events happened.

-          Hot-hand fallacy: The belief that a person who has experienced success with a random event has a greater chance of further success in additional attempts. Also known as the "hot hand phenomenon" or simply the "hot hand".

-          Hyperbolic discounting: The tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs. Hyperbolic discounting leads to choices that are inconsistent over time – people make choices today that their future selves would prefer not to have made, despite using the same reasoning. Also known as current-moment bias or present bias, and related to dynamic inconsistency. A good example: a study showed that when making food choices for the coming week, 74% of participants chose fruit, whereas when the choice was for the current day, 70% chose chocolate (see the worked example after this list).

-          IKEA effect: The tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end product.

-          Illicit transference: Occurs when a term in the distributive (referring to every member of a class) and collective (referring to the class itself as a whole) sense are treated as equivalent. The two variants of this fallacy are the fallacy of composition and the fallacy of division.

-          Illusion of control: The tendency to overestimate one's degree of influence over other external events.

-          Illusion of validity: Overestimating the accuracy of one's judgments, especially when available information is consistent or inter-correlated.

-          Illusory correlation: Inaccurately perceiving a relationship between two unrelated events.

-          Illusory truth effect: A tendency to believe that a statement is true if it is easier to process, or if it has been stated multiple times, regardless of its actual veracity. These are specific cases of truthiness.

-          Impact bias: The tendency to overestimate the length or the intensity of the impact of future feeling states.

-          Implicit association: The speed with which people can match words depends on how closely they are associated.

-          Information bias: The tendency to seek information even when it cannot affect action.

-          Insensitivity to sample size: The tendency to under-expect variation in small samples (see the worked example after this list).

-          Interoceptive bias: The tendency for sensory input about the body itself to affect one's judgement about external, unrelated circumstances. (As for example, in parole judges who are more lenient when fed and rested.)

-          Irrational escalation or Escalation of commitment: The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. Also known as the sunk cost fallacy.

-          Law of the instrument: An over-reliance on a familiar tool or method, ignoring or under-valuing alternative approaches. "If all you have is a hammer, everything looks like a nail."

-          Less-is-better effect: The tendency to prefer a smaller set to a larger set judged separately, but not jointly.

-          Loss aversion: The perceived disutility of giving up an object is greater than the utility associated with acquiring it.

-          Mere exposure effect: The tendency to express undue liking for things merely because of familiarity with them.

-          Money illusion: The tendency to concentrate on the nominal value (face value) of money rather than its value in terms of purchasing power.

-          Moral credential effect: Occurs when someone who does something good gives themselves permission to be less good in the future.

-          Neglect of probability: The tendency to completely disregard probability when making a decision under uncertainty.

-          Non-adaptive choice switching: After experiencing a bad outcome with a decision problem, the tendency to avoid the choice previously made when faced with the same decision problem again, even though the choice was optimal. Also known as "once bitten, twice shy" or "hot stove effect".

-          Normalcy bias: The refusal to plan for, or react to, a disaster which has never happened before.

-          Observer-expectancy effect: When a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it.

-          Omission bias: The tendency to judge harmful actions (commissions) as worse, or less moral, than equally harmful inactions (omissions).

-          Optimism bias: The tendency to be over-optimistic, underestimating greatly the probability of undesirable outcomes and overestimating favorable and pleasing outcomes (see also wishful thinking, valence effect, positive outcome bias).

-          Ostrich effect: Ignoring an obvious (negative) situation.

-          Outcome bias: The tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.

-          Overconfidence effect: Excessive confidence in one's own answers to questions. For example, for certain types of questions, answers that people rate as "99% certain" turn out to be wrong 40% of the time.

-          Pareidolia: A vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing non-existent hidden messages on records played in reverse.

-          Pessimism bias: The tendency for some people, especially those suffering from depression, to overestimate the likelihood of negative things happening to them.

-          Plan continuation bias: Failure to recognize that the original plan of action is no longer appropriate for a changing situation or for a situation that is different than anticipated.

-          Planning fallacy: The tendency to underestimate one's own task-completion times.

-          Pluralistic ignorance: A situation in which a majority of group members privately reject a norm but go along with it because they assume, incorrectly, that most others accept it. This is also described as "no one believes, but everyone thinks that everyone believes".

-          Preference falsification: The act of communicating a preference that differs from one's true preference.

-          Present bias: The tendency of people to give stronger weight to payoffs that are closer to the present time when considering trade-offs between two future moments.

-          Plant blindness: The tendency to ignore plants in their environment and a failure to recognize and appreciate the utility of plants to life on earth.

-          Probability matching: Sub-optimal matching of the probability of choices with the probability of reward in a stochastic context (see the worked example after this list).

-          Pro-innovation bias: The tendency to have an excessive optimism towards an invention or innovation's usefulness throughout society, while often failing to identify its limitations and weaknesses.

-          Projection bias: The tendency to overestimate how much our future selves share one's current preferences, thoughts and values, thus leading to sub-optimal choices.

-          Proportionality bias: Our innate tendency to assume that big events have big causes; it may also explain our tendency to accept conspiracy theories.

-          Pseudocertainty effect: The tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.

-          Recency illusion: The illusion that a phenomenon one has noticed only recently is itself recent. Often used to refer to linguistic phenomena; the illusion that a word or language usage that one has noticed only recently is an innovation when it is, in fact, long-established.

-          Systematic bias: Judgement that arises when targets of differentiating judgement become subject to effects of regression that are not equivalent.

-          Restraint bias: The tendency to overestimate one's ability to show restraint in the face of temptation.

-          Rhyme as reason effect: Rhyming statements are perceived as more truthful.

-          Risk compensation / Peltzman effect: The tendency to take greater risks when perceived safety increases.

-          Salience bias: The tendency to focus on items that are more prominent or emotionally striking and ignore those that are unremarkable, even though this difference is often irrelevant by objective standards.

-          Scope neglect or scope insensitivity: The tendency to be insensitive to the size of a problem when evaluating it. For example, being willing to pay as much to save 2,000 children or 20,000 children.

-          Selection bias: The tendency to notice something more once something has made us more aware of it, such as when we buy a car and then notice similar cars more often than we did before. They are not suddenly more common; we are just noticing them more. Also called observational selection bias.

-          Selective perception: The tendency for expectations to affect perception.

-          Semmelweis reflex: The tendency to reject new evidence that contradicts a paradigm.

-          Status quo bias: The tendency to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification).

-          Stereotyping: Expecting a member of a group to have certain characteristics without having actual information about that individual.

-          Subadditivity effect: The tendency to judge the probability of the whole to be less than the probabilities of the parts.

-          Subjective validation: Perception that something is true if a subject's belief demands it to be true. Also assigns perceived connections between coincidences.

-          Surrogation: Losing sight of the strategic construct that a measure is intended to represent, and subsequently acting as though the measure is the construct of interest.

-          Survivorship bias: Concentrating on the people or things that "survived" some process and inadvertently overlooking those that didn't because of their lack of visibility.

-          System justification: The tendency to defend and bolster the status quo. Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged, sometimes even at the expense of individual and collective self-interest.

-          Time-saving bias: Underestimation of the time that could be saved (or lost) when increasing (or decreasing) from a relatively low speed, and overestimation of the time that could be saved (or lost) when increasing (or decreasing) from a relatively high speed (see the worked example after this list).

-          Parkinson's law of triviality: The tendency to give disproportionate weight to trivial issues. Also known as bikeshedding, this bias explains why an organization may avoid specialized or complex subjects, such as the design of a nuclear reactor, and instead focus on something easy to grasp or rewarding to the average participant, such as the design of an adjacent bike shed.

-          Unconscious bias or implicit bias: The underlying attitudes and stereotypes that people unconsciously attribute to another person or group of people, which affect how they understand and engage with them. Many researchers suggest that unconscious bias occurs automatically as the brain makes quick judgments based on past experiences and background.

-          Unit bias: The standard suggested amount of consumption (e.g., food serving size) is perceived to be appropriate, and a person would consume it all even if it is too much for this particular person.

-          Weber–Fechner law: Difficulty in comparing small differences in large quantities (see the worked example after this list).

-          Well travelled road effect: Underestimation of the duration taken to traverse oft-travelled routes and overestimation of the duration taken to traverse less familiar routes.

-          Women are wonderful effect: A tendency to associate more positive attributes with women than with men.

-          Zero-risk bias: Preference for reducing a small risk to zero over a greater reduction in a larger risk (see the worked example after this list).

-          Zero-sum bias: A bias whereby a situation is incorrectly perceived to be like a zero-sum game (i.e., one person gains at the expense of another).
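
Worked examples

Several of the biases above are quantitative at heart. The short Python sketches below are illustrative only; every number, name, and parameter in them is a hypothetical assumption, not data from the studies the definitions refer to.

Base rate fallacy: a minimal sketch of Bayes' rule with an assumed 1% disease prevalence, 90% test sensitivity, and a 9% false-positive rate. Attending to the test's accuracy while neglecting the base rate leads people toward answers near 90%, when the correct figure is about 9%.

```python
# All rates are hypothetical.
prevalence = 0.01        # base rate: P(disease)
sensitivity = 0.90       # P(positive | disease)
false_positive = 0.09    # P(positive | no disease)

p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"P(disease | positive) = {p_disease_given_positive:.1%}")  # ~9.2%
```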
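
Conjunction fallacy: a minimal check of the rule the fallacy violates: for any events A and B, P(A and B) ≤ P(A). The probabilities below are invented for illustration.

```python
# Hypothetical probabilities for the "Linda problem".
p_teller = 0.05                  # P(bank teller)
p_feminist_given_teller = 0.40   # assumed P(feminist | bank teller)

p_both = p_teller * p_feminist_given_teller
assert p_both <= p_teller        # holds for any choice of numbers

print(p_teller, p_both)          # 0.05 vs 0.02
```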
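
Gambler's fallacy: a small simulation showing that after five consecutive heads, a fair coin still lands heads about half the time on the sixth flip; the streak does not make tails "due".

```python
import random

random.seed(0)
sixth_flips = []
while len(sixth_flips) < 10_000:
    flips = [random.random() < 0.5 for _ in range(6)]  # True = heads
    if all(flips[:5]):                 # sequence opened with five heads
        sixth_flips.append(flips[5])   # record the sixth flip

print(sum(sixth_flips) / len(sixth_flips))  # ~0.5
```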
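
Hyperbolic discounting: a sketch of the preference reversal described above, using the common one-parameter hyperbolic form V = A / (1 + kD) with an assumed discount rate k and delays D in days.

```python
def v(amount, delay_days, k=0.25):
    """Hyperbolic discounted value V = A / (1 + k*D); k is assumed."""
    return amount / (1 + k * delay_days)

# Choosing today: $50 now beats $60 tomorrow...
print(v(50, 0), v(60, 1))    # 50.0 vs 48.0
# ...but the identical trade-off 30 days out reverses the preference.
print(v(50, 30), v(60, 31))  # ~5.9 vs ~6.9
```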
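
Insensitivity to sample size: a simulation in the spirit of Tversky and Kahneman's hospital problem, with hypothetical numbers. Days on which more than 60% of births are boys are far more common at a small hospital than at a large one, simply because small samples vary more.

```python
import random

random.seed(1)

def share_of_extreme_days(births_per_day, days=10_000):
    """Fraction of days on which more than 60% of births are boys."""
    hits = 0
    for _ in range(days):
        boys = sum(random.random() < 0.5 for _ in range(births_per_day))
        if boys / births_per_day > 0.60:
            hits += 1
    return hits / days

print(share_of_extreme_days(15))  # small hospital: ~0.15
print(share_of_extreme_days(45))  # large hospital: ~0.07
```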
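
Probability matching: a simulation comparing the optimal strategy (always choose the option that pays off more often) with "matching" one's choice rate to the options' payoff rates; matching earns a markedly lower reward rate.

```python
import random

random.seed(2)
p_a = 0.70            # option A pays off 70% of the time (assumed)
trials = 100_000

# Maximizing: always pick A.
maximize = sum(random.random() < p_a for _ in range(trials))

# Matching: pick A on ~70% of trials, B on ~30%.
match = 0
for _ in range(trials):
    pick_a = random.random() < p_a
    rewarded = random.random() < (p_a if pick_a else 1 - p_a)
    match += rewarded

print(maximize / trials)  # ~0.70
print(match / trials)     # ~0.58 (= 0.7*0.7 + 0.3*0.3)
```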
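
Time-saving bias: the underlying arithmetic. Over a fixed distance, the time saved is D/v_old - D/v_new, so the same 10 km/h increase saves far more time at a low base speed than at a high one.

```python
def minutes_saved(distance_km, v_old_kmh, v_new_kmh):
    """Actual time saved over a fixed distance when speeding up."""
    return 60 * distance_km * (1 / v_old_kmh - 1 / v_new_kmh)

print(minutes_saved(100, 40, 50))  # 30.0 minutes
print(minutes_saved(100, 80, 90))  # ~8.3 minutes
```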
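
Weber–Fechner law: a sketch using Fechner's logarithmic form, S = k·ln(I/I0), with assumed constants. The same absolute difference registers as a much smaller perceived change on top of a larger quantity.

```python
import math

def perceived(intensity, i0=1.0, k=1.0):
    """Fechner's law S = k * ln(I / I0); k and I0 are assumed constants."""
    return k * math.log(intensity / i0)

print(perceived(20) - perceived(10))    # ~0.69: 10 vs 20 feels substantial
print(perceived(110) - perceived(100))  # ~0.10: 100 vs 110 barely registers
```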
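
Zero-risk bias: a minimal expected-value comparison with a hypothetical exposed population. Eliminating a 1% risk entirely prevents fewer expected cases than cutting a 5% risk to 2%, yet the "risk to zero" option often feels more attractive.

```python
exposed = 10_000  # hypothetical people exposed to each independent risk

saved_by_zeroing_small = exposed * (0.01 - 0.00)   # 100 expected cases avoided
saved_by_cutting_large = exposed * (0.05 - 0.02)   # 300 expected cases avoided

print(saved_by_zeroing_small, saved_by_cutting_large)
```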
