Saturday, December 15, 2012

Fast and Behavioral vs Slow and "Rational"

Two papers seem to lead to the conclusion that making decisions quickly strengthens "behavioral" behavior: quick decisions produce more rejections of unfair offers in ultimatum games and more donations in public good games than when subjects are forced to take time to decide. However, the first paper suggests the result may be somewhat fragile, in that a "pre-choice" may eliminate the impact of a delayed decision (in this case, a delayed possibility to revise choices).

For the Ultimatum Game see: Veronika Grimm and Friederike Mengel, "Let me sleep on it: Delay reduces rejection rates in ultimatum games" Economics Letters, Volume 111, Issue 2, May 2011, Pages 113–115.
The Abstract reads: "Delaying acceptance decisions in the Ultimatum Game drastically increases acceptance of low offers. While in treatments without delay less than 20% of low offers are accepted, 60–80% are accepted as we delay the acceptance decision by around 10 min."

For the Public Good Game see: David G. Rand, Joshua D. Greene & Martin A. Nowak: "Spontaneous giving and calculated greed" Nature 489, 427–430 (20 September 2012).
The abstract reads:
"Cooperation is central to human social behaviour. However, choosing to cooperate requires individuals to incur a personal cost to benefit others. Here we explore the cognitive basis of cooperative decision-making in humans using a dual-process framework. We ask whether people are predisposed towards selfishness, behaving cooperatively only through active self-control; or whether they are intuitively cooperative, with reflection and prospective reasoning favouring ‘rational’ self-interest. To investigate this issue, we perform ten studies using economic games. We find that across a range of experimental designs, subjects who reach their decisions more quickly are more cooperative. Furthermore, forcing subjects to decide quickly increases contributions, whereas instructing them to reflect and forcing them to decide slowly decreases contributions. Finally, an induction that primes subjects to trust their intuitions increases contributions compared with an induction that promotes greater reflection. To explain these results, we propose that cooperation is intuitive because cooperative heuristics are developed in daily life where cooperation is typically advantageous. We then validate predictions generated by this proposed mechanism. Our results provide convergent evidence that intuition supports cooperation in social dilemmas, and that reflection can undermine these cooperative impulses."

I think there is room for a better understanding of this issue: how it would work for other biases, and whether the findings so far are robust. It would also be interesting to combine this with an environment where the tendency can be exploited... Please let me know if you're aware of other papers on this issue.


  1. Have you seen this? They find that people who are made to think more intuitively are _less_ ambiguity- and risk-averse.

  2. David Rand and co-authors did some work on response times and behavior in dilemma games.
    They also have a recent (and controversial) paper in Nature.