Imagine that (for some reason involving cultural tradition, family pressure, or a shotgun) you suddenly have to get married.
Fortunately, there are two candidates. One is charming and a lion in bed but an idiot about money. The other has a reliable income and fantastic financial sense but is, on those other fronts, kind of meh.
Which would you choose?
Sound like six of one, half-dozen of the other? Many would say so. But that can change when a third person is added to the mix. Suppose candidate number three has a meager income and isn’t as financially astute as choice number two.
For many people, what was once a hard choice becomes easy: They’ll pick the better moneybags, forgetting about the candidate with sex appeal. On the other hand, if the third wheel is a schlumpier version of attractive number one, then it’s the sexier choice that wins in a landslide. This is known as the “decoy effect”—whoever gets an inferior competitor becomes more highly valued.
The decoy effect is just one example of people being swayed by what mainstream economists have traditionally considered irrelevant noise. After all, their community has, for a century or so, taught that the value you place on a thing arises from its intrinsic properties combined with your needs and desires.
It is only recently that economics has reconciled with human psychology. The result is the booming field of behavioral economics, pioneered by Daniel Kahneman, a psychologist at Princeton University, and his longtime research partner, the late Amos Tversky, who was at Stanford University.
It has created a large and growing list of ways that humans diverge from economic rationality. Researchers have found that all sorts of logically inconsequential circumstances—rain, sexual arousal (induced and assessed by experimenters with Saran-wrapped laptops), or just the number “67” popping up in conversation—can alter the value we assign to things. For example, with “priming effects,” irrelevant or unconsciously processed information prompts people to assign value by association (seeing classrooms and lockers makes people slightly more likely to support school funding).
With “framing effects,” the way a choice is presented affects people’s evaluation: Kahneman and Tversky famously found that people prefer a disease-fighting policy that saves 400 out of 600 people to a policy that lets 200 people die, though logically the two are the same. While mainstream economists are still wrestling with these ideas, outside of academe there is little debate: The behaviorists have won.
Yet for all their revolutionary impact, even as the behaviorists have overturned the notion that our information processing is economically rational, they still suggest that it should be economically rational. When they describe human decision-making processes that don’t conform to economic theory, they speak of “mistakes”—what Kahneman often calls “systematic errors.” Only by accepting that economic models of rationality lead to “correct” decisions can you say that human thought processes lead to “wrong” ones.
But what if the economists—both old-school and behavioral—are wrong? What if our illogical and economically erroneous thinking processes often lead to the best possible outcome? Perhaps our departures from economic orthodoxy are a feature, not a bug. If so, we’d need to throw out the assumption that our thinking is riddled with mistakes. The practice of sly manipulation, based on the idea that the affected party doesn’t or can’t know what’s going on, would need to be replaced with a rather different, and better, goal: self-knowledge.