
  Behavioural Law and Economics: Introduction

  • Traditionally, people are expected to maximise their utility under several conditions.
    • – First, they should be well informed about their choice.

    • – Second, they should be fully aware of the consequences of the choice they have made.
    • – And finally, in determining such a choice, they should pursue their wishes in a way that is logically consistent (for example, if A is preferred to B, and B is preferred to C, then A must be preferred to C).

    Bounded rationality, bounded willpower, and bounded self-interest

  • Bounded rationality occurs because people often use heuristics to make their judgements, which, although they help people to make fast decisions, lead to errors in some circumstances.
  • Bounded willpower occurs because people often make decisions that they know to be in conflict with their long-term interests.
  • Bounded self-interest occurs because people wish to be treated fairly and will treat others fairly as long as those others treat them fairly.

  • People may disregard the probability of future events when they have to make a judgement about such events.
    • – Hence, in the face of uncertainty, people tend to base their judgement on “rules of thumb”, referred to as “heuristics”, namely shortcuts that help people to make a decision when information is incomplete or uncertain.
    • – Although much of the time such shortcuts provide roughly correct answers, they also often lead people to erroneous decisions.

  

Availability heuristics

  • “Retrievability of instances”, namely that familiar and salient instances are easier to retrieve than unfamiliar and less salient ones. Biases occur since instances that are faster and easier to recall from memory will appear to be more numerous than instances that are less retrievable.
  • The imaginability of events: the occurrence of an event will appear to be more likely when it can be easily imagined.
    • – Therefore, it is argued that activities whose risks can easily be portrayed will appear to be more dangerous than activities whose risks are difficult to imagine.

  • The “illusory correlation” factor may also contribute to the availability heuristic. With such a bias, one considers that two things will occur simultaneously, although they will not, because they are seen as associates.
    • – When people think two events are closely associated, they expect that one event will take place as a consequence of the occurrence of another, or that the two strongly associated events will occur simultaneously.

  

“Representativeness”

  • People consider an instance as representative of a population based on the similarity of this instance to the population.
    • – “insensitivity to prior probability of outcomes”.

  • For example, people tend to assess that a person belongs to a certain occupation, for instance an engineer, rather than others, say a lawyer, because the similarity of this person to a stereotype description is considered as representing this occupation. In this case, information about probability, namely that there are more lawyers than engineers, does not significantly alter people’s judgement (see the Bayesian sketch at the end of this section).
    • – “Gambler’s fallacy”: the gambler feels that the fairness of the coin entitles him to expect that any deviation in one direction will soon be cancelled by a corresponding deviation in the other.
    • – “insensitivity to predictability”

  • People use information that is actually not a good predictor, without questioning the accuracy and relevance of this information for their prediction.
    • – Tversky and Kahneman give an empirical example where subjects were presented with several paragraphs describing the performance of a teacher during a particular period of teaching. At the end, subjects were asked to evaluate the teacher’s performance based on the provided description. They were also asked to predict the teacher’s career for the next 5 years. Tversky and Kahneman found that the judgements for these two conditions were identical.
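
  The base-rate neglect described above can be stated precisely with Bayes’ rule. In the minimal sketch below, the 30%/70% occupational split and the 3:1 likelihood ratio are illustrative assumptions, not figures from this text:

$$
P(\text{engineer}\mid d)=\frac{P(d\mid \text{engineer})\,P(\text{engineer})}{P(d\mid \text{engineer})\,P(\text{engineer})+P(d\mid \text{lawyer})\,P(\text{lawyer})}
$$

  With a prior of 30% engineers and a description d that is three times as likely to fit an engineer as a lawyer, the posterior is (3)(0.3) / [(3)(0.3) + (1)(0.7)] ≈ 0.56; a subject who ignores the prior behaves as if the two occupations were equally likely and judges the probability to be about 0.75.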

  

“Adjustment and anchoring”

  • If people are given a value and then are asked to give their judgement, they often adjust their judgements to this initial value (referred to as an “anchor”), even when this value is irrelevant to their judgement. In short, people’s judgements are biased toward their initial values, which are used as a starting point for their judgement.
    • – Tversky and Kahneman present an experiment where subjects were asked to estimate the number of African countries in the UN. A number between 0 and 100 was determined randomly by spinning a wheel of fortune. Subjects were first asked to estimate whether the number of the countries was higher or lower than the randomly chosen number (the “anchor”). This experiment showed that the median estimates were 25 when the anchor was 10 and 45 when the anchor was 65.

Prospect Theory, Endowment Effect, and Status Quo Bias

  • Respondents were asked to imagine that the U.S. was threatened by an unusual Asian disease, which was expected to kill 600 people. A choice had to be made between two alternative programs with different consequences as follows:

    – If Program A is adopted, 200 people will be saved.

    – If Program B is adopted, there is a 1/3 probability that 600 people will be saved, and a 2/3 probability that no people will be saved.

  • Most respondents in this experiment (72%) preferred Program A to Program B.
  • A second group of respondents was given the same story with the following consequences:

    – If Program C is adopted, 400 people will die.

    – If Program D is adopted, there is a 1/3 probability that nobody will die, and a 2/3 probability that 600 people will die.
    • In this experiment, the majority of the respondents (78%) preferred Program D to Program C.
    • Kahneman and Tversky conclude that respondents tended to be risk averse in the “lives saved” version and risk-seeking in the “lives lost” version.

  • – It means that the frame of the outcomes will determine people’s preferences.
    • If the outcomes are framed in terms of gains (such as the number of lives saved), people will be risk averse and prefer the prospect that offers more certain outcomes.
    • If the outcomes are described in terms of losses, people tend to be risk-seeking and prefer the prospect that offers more uncertain outcomes.
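
  In expected terms the two pairs of programs are identical, so the reversal of choices can only come from the frame:

$$
\begin{aligned}
\text{Program A: } & 200 \text{ saved} && \Longleftrightarrow \text{ Program C: } 600-200=400 \text{ die}\\
\text{Program B: } & \tfrac{1}{3}(600)+\tfrac{2}{3}(0)=200 \text{ saved on average} && \Longleftrightarrow \text{ Program D: } \tfrac{1}{3}(0)+\tfrac{2}{3}(600)=400 \text{ die on average}
\end{aligned}
$$

  Yet 72% chose Program A over B, while 78% chose Program D over C.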

  • Kahneman and Tversky also argue that people’s evaluations are influenced by the way they weight probabilities, referred to as the “certainty effect”, a phenomenon indicating that people overweight outcomes that are considered certain, relative to outcomes that are merely “probable”.
    • – 80% of the respondents preferred a certain win of $3000 rather than a 0.80 chance to win $4000, although the latter has a higher expected monetary value.
    • – If the probabilities of winning are “possible but not probable” (e.g. a 0.001 chance to win $6000 and a 0.002 chance to win $3000), most people will choose the prospect that offers the larger gain (namely the 0.001 chance to win $6000).

  • Thus, in the positive domain (gains) people prefer a sure gain to a larger gain that is considered as merely probable. In the negative domain, people prefer a loss that is merely probable to a smaller loss that is certain
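
  A quick check of the expected monetary values in the two gambles above shows that the pattern cannot be explained by expectation alone:

$$
\begin{aligned}
0.80\times \$4000 &= \$3200 > \$3000 && \text{yet the certain \$3000 is preferred;}\\
0.001\times \$6000 &= \$6 = 0.002\times \$3000 && \text{yet the larger, less likely prize is preferred.}
\end{aligned}
$$

  Certainty receives extra weight, while very small probabilities are treated as roughly interchangeable.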
  • Loss aversion, namely that the response to losses is more extreme than the response to gains.

  • – changes that make things worse (such as losses) loom larger than improvements or gains
    • An important implication of loss aversion is the “status quo bias”.
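
  The strength of this asymmetry is often summarised with the value function of prospect theory. As a minimal sketch: the functional form and the parameter estimates below are the standard Tversky and Kahneman (1992) figures, not numbers given in this text:

$$
v(x)=\begin{cases} x^{\alpha}, & x\ge 0\\ -\lambda(-x)^{\alpha}, & x<0 \end{cases}
\qquad \alpha\approx 0.88,\quad \lambda\approx 2.25
$$

  With these values a $100 loss is felt roughly 2.25 times as strongly as a $100 gain, which is why the disadvantages of leaving the status quo loom larger than the advantages of the alternatives.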

  • – people’s strong tendency to remain at the status quo because the disadvantages of leaving it loom larger than advantages.
    • an alternative becomes significantly more popular when it is designated as the status quo.
    • People may have an exaggerated preference for whichever option is set as the default choice.

  • – Examples: Organ donors in USA vs. Europe; insurance schemes

   LIBERTARIAN PATERNALISM!!!

  • “Endowment effect”: people ascribe more value to things merely because they own them
  • In one experiment, Kahneman, Knetsch, and Thaler asked students to trade among themselves. The objects of trade were “induced value tokens” (as a control) and coffee mugs. Half of the students were made the owners of mugs (students knew that a similar mug was sold at a price of $6.00), and the other half were not, so that supply of and demand for mugs were created.
  • After several trials, the median owner was unwilling to sell for less than $5.25, while the median buyer was unwilling to pay more than $2.25–$2.75. This experiment was also repeated for various goods such as pens, chocolate bars, and binoculars, leading to the conclusion that the value an individual assigns to an object appears to increase substantially as soon as that individual is given the object.
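
  Taken at face value, these numbers imply a willingness-to-accept to willingness-to-pay ratio of roughly two:

$$
\frac{\text{WTA}}{\text{WTP}}\approx \frac{\$5.25}{\$2.25\text{ to }\$2.75}\approx 1.9\text{ to }2.3,
$$

  far above the ratio close to one that standard theory would predict for a low-priced good with negligible income effects.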

  

Public Perception of Risk v. Experts’ Opinion

  • Risk perception or risk attitude
  • Biases:
    • – Status quo
    • – Availability Heuristics
    • – The Mythical Benevolence of Nature
    • – Probability neglect
    • – System neglect (Sunstein)
      • Representativeness bias, anchoring, etc.
      • Breyer: compared to experts, the public’s reactions to risks reflect a different understanding of the underlying risk-related facts, such as:

  • – Rules of thumb (heuristics)
  • – Prominence
  • – Ethics, the strength of which will diminish with distance
  • – Mathematics:
    • Framing
    • Slovic: admits some problems in public perception of risk

  • – Heuristics
  • – Amplification of risk (the ripple effect)
  • – Fixed belief
    • However, there are qualitative attributes of risk (the psychometric approach):

  • – Slovic: Risk perception is influenced by:
    • Perceived “severity” of risk, consisting of 12 characteristics, namely:

  • – not controllable, dread, globally catastrophic, hard to prevent, certain to be fatal, risks and benefits inequitable, catastrophic, threatens future generations, not easily reduced, risks increasing, involuntary, and affects the respondent personally.
    • Perceived familiarity: not observable, unknown to those exposed, delayed effects, new (unfamiliar), unknown to science.
    • Perceived number of people exposed to the risk.
    • Fischhoff et al. show that the level of acceptable risk is not only a function of benefit and voluntariness, but also of other characteristics such as perceived control, familiarity, knowledge, and immediacy.

  • – Technologies that are controversial, such as nuclear technology, tend to be deemed high-dread and high-unknown technologies.