Why Decision Theory Tells You to Eat ALL the Cupcakes

Imagine that you have a big task coming up that requires an unknown amount of willpower – you might have enough willpower to finish, you might not. You’re gearing up to start when suddenly you see a delicious-looking cupcake on the table. Do you indulge in eating it? According to psychology research and decision-theory models, the answer isn’t simple.

If you resist the temptation to eat the cupcake, current research indicates that you deplete your store of willpower (psychologists call this ego depletion), which makes you less likely to have enough willpower left to finish your big task. So maybe you should save your willpower for the big task ahead and eat the cupcake!

…But if you’re convinced already, hold on a second. How easily you give in to temptation gives evidence about your underlying strength of will. After all, someone with weak willpower will find the reasons to indulge more persuasive. If you end up succumbing to the temptation, it’s evidence that you’re a person with weaker willpower, and are thus less likely to finish your big task.

How can eating the cupcake cause you to be more likely to succeed while also giving evidence that you’re more likely to fail?

Conflicting Decision Theory Models

The strangeness lies in the difference between two conflicting models of how to make decisions. Luke Muehlhauser describes them well in his Decision Theory FAQ:

This is not a “merely verbal” dispute (Chalmers 2011). Decision theorists have offered different algorithms for making a choice, and they have different outcomes. Translated into English, the [second] algorithm (evidential decision theory or EDT) says “Take actions such that you would be glad to receive the news that you had taken them.” The [first] algorithm (causal decision theory or CDT) says “Take actions which you expect to have a positive effect on the world.”

The crux of the matter is how to handle the fact that we don’t know how much underlying willpower we started with.

Causal Decision Theory asks, “How can you cause yourself to have the most willpower?”

It focuses on the fact that, in any state, spending willpower resisting the cupcake causes ego depletion. Because of that, it says our underlying amount of willpower is irrelevant to the decision. The recommendation stays the same regardless: eat the cupcake.

Evidential Decision Theory asks, “What will give evidence that you’re likely to have a lot of willpower?”

We don’t know whether we’re starting with strong or weak will, but our actions can reveal that one state or another is more likely. It’s not that we can change the past – Evidential Decision Theory doesn’t look for that causal link – but our choice indicates which possible version of the past we came from.

Yes, seeing someone undergo ego depletion would be evidence that they lost a bit of willpower.  But watching them resist the cupcake would probably be much stronger evidence that they have plenty to spare.  So you would rather “receive news” that you had resisted the cupcake.
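To make the disagreement concrete, here’s a toy calculation for the cupcake dilemma. Every probability below is an invented number for illustration, not anything from the research: CDT averages outcomes over your prior on your own willpower (since eating can’t change the past), while EDT first updates that prior on the “news” of your own action.

```python
# Toy CDT-vs-EDT calculation for the cupcake dilemma.
# All probabilities are illustrative assumptions.

PRIOR_STRONG = 0.5  # prior probability that you started with strong willpower

# P(finish the big task | willpower state, action).
# Resisting depletes willpower, so it lowers your chances in *both* states --
# that's the causal effect CDT cares about.
P_FINISH = {
    ("strong", "eat"): 0.9, ("strong", "resist"): 0.7,
    ("weak", "eat"): 0.4, ("weak", "resist"): 0.2,
}

# P(resist | willpower state): strong-willed agents resist more often,
# so the action is *evidence* about the state -- that's what EDT cares about.
P_RESIST = {"strong": 0.8, "weak": 0.2}

def cdt_value(action):
    """CDT: keep the prior over states fixed; acting can't change the past."""
    return (PRIOR_STRONG * P_FINISH[("strong", action)]
            + (1 - PRIOR_STRONG) * P_FINISH[("weak", action)])

def edt_value(action):
    """EDT: condition on the action as news (via Bayes' rule), then average."""
    def likelihood(state):
        return P_RESIST[state] if action == "resist" else 1 - P_RESIST[state]
    joint_strong = PRIOR_STRONG * likelihood("strong")
    joint_weak = (1 - PRIOR_STRONG) * likelihood("weak")
    p_strong = joint_strong / (joint_strong + joint_weak)
    return (p_strong * P_FINISH[("strong", action)]
            + (1 - p_strong) * P_FINISH[("weak", action)])

for name, value_of in [("CDT", cdt_value), ("EDT", edt_value)]:
    best = max(["eat", "resist"], key=value_of)
    print(f"{name}: eat={value_of('eat'):.2f}, resist={value_of('resist'):.2f} -> {best}")
```

With these made-up numbers the two algorithms split exactly as described: CDT prefers eating (0.65 vs. 0.45), while EDT prefers resisting (0.60 vs. 0.50), because resisting is strong evidence of being the kind of pony – er, person – who finishes the task.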

A Third Option

Each of these models has strengths and weaknesses, and a number of thought experiments – especially the famous Newcomb’s Paradox – have sparked ongoing discussions and disagreements about what decision theory model is best.

One attempt to improve on standard models is Timeless Decision Theory, a method devised by Eliezer Yudkowsky of the Machine Intelligence Research Institute.  Alex Altair recently wrote up an overview, stating in the paper’s abstract:

When formulated using Bayesian networks, two standard decision algorithms (Evidential Decision Theory and Causal Decision Theory) can be shown to fail systematically when faced with aspects of the prisoner’s dilemma and so-called “Newcomblike” problems. We describe a new form of decision algorithm, called Timeless Decision Theory, which consistently wins on these problems.

It sounds promising, and I can’t wait to read it.

But Back to the Cupcakes

For our particular cupcake dilemma, there’s a way out:

Precommit. You need to promise – right now! – to always eat the cupcake when it’s presented to you. That way you don’t spend any willpower resisting temptation, and your indulgence doesn’t give any evidence of a weak underlying will.

And that, ladies and gentlemen, is my new favorite excuse for why I ate all the cupcakes.

How has Bayes’ Rule changed the way I think?

People talk about how Bayes’ Rule is so central to rationality, and I agree. But given that I don’t go around plugging numbers into the equation in my daily life, how does Bayes actually affect my thinking?
A short answer, in my new video below:

(This is basically what the title of this blog was meant to convey — quantifying your uncertainty.)
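For anyone who does want to see numbers plugged into the equation just once, here’s a minimal sketch of a single Bayes’ Rule update. The scenario and every probability are invented for illustration:

```python
# One Bayes' rule update, with invented numbers.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) from the prior and the two likelihoods."""
    joint_true = prior * p_evidence_if_true
    joint_false = (1 - prior) * p_evidence_if_false
    return joint_true / (joint_true + joint_false)

# Hypothesis: a friend's cold remedy actually works.
# Evidence: they took it and felt better the next day.
# Feeling better is likely if the remedy works (0.8), but colds often
# clear up on their own anyway (0.5), so the evidence is weak.
posterior = bayes_update(prior=0.2, p_evidence_if_true=0.8, p_evidence_if_false=0.5)
print(round(posterior, 3))  # prints 0.286: a nudge toward belief, not proof
```

The point isn’t the arithmetic – it’s the habit: evidence should move your confidence by an amount that depends on how surprising that evidence would be if the hypothesis were false.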

What Would a Rational Gryffindor Read?

In the Harry Potter world, Ravenclaws are known for being the smart ones. That’s their thing. In fact, that was really all they were known for. In the books, each house could be boiled down to one or two words: Gryffindors are brave, Ravenclaws are smart, Slytherins are evil and/or racist, and Hufflepuffs are ~~pathetic~~ loyal. (Giving rise to this hilarious Second City mockery.)

But while reading Harry Potter and the Methods of Rationality, I realized that there’s actually quite a lot of potential for interesting reading in each house. Ravenclaws would be interested in philosophy of mind, cognitive science, and mathematics; Gryffindors in combat, ethics, and democracy; Slytherins in persuasion, rhetoric, and political machination; and Hufflepuffs in productivity, happiness, and the game theory of cooperation.

And so, after much thought, I found myself knee-deep in my books recreating what a rationalist from each house would have on his or her shelf. I tried to match the mood as well as the content. Here they are in the appropriate proportions for a Facebook cover image so that you can display your pride both in rationality and in your chosen house (click to see each image larger, with a book list on the left):

Rationality Ravenclaw Library

Rationality Gryffindor Library

Rationality Slytherin Library

Rationality Hufflepuff Library

What do you think? I’m always open to book recommendations and suggestions for good fits. Which bookshelf fits you best? What would you add?

Spirituality and “skeptuality”

Is “rational” spirituality a contradiction in terms? In the latest episode of the Rationally Speaking podcast, Massimo and I try to pin down what people mean when they call themselves “spiritual,” what inspires spiritual experiences and attitudes, and whether spirituality can be compatible with a naturalist view of the world.

Are there benefits that skeptics and other secular people could possibly get from incorporating some variants on traditional spiritual practices — like prayer, ritual, song, communal worship, and so on — into their own lives?

We examine a variety of attempts to do so, and ask: how well have such attempts worked, and do they come with any potential pitfalls for our rationality?

http://www.rationallyspeakingpodcast.org/show/rs55-spirituality.html

How to want to change your mind

New video blog: “How to Want to Change your Mind.”

This one’s full of useful tips to turn off your “defensive” instincts in debates, and instead cultivate the kind of fair-minded approach that’s focused on figuring out the truth, not on “winning” an argument.

A rational view of tradition

In my latest video blog I answer a listener’s question about why rationalists are more likely to abandon social norms like marriage, monogamy, standard gender roles, having children, and so on. And then I weigh in on whether that’s a rational attitude to take:

You’re such an essentialist!

My latest video blog is about essentialism, and why it’s damaging to your rationality — and your happiness.

My Little Pony: Reality is Magic!

(Cross-posted at 3 Quarks Daily)

You probably won’t be very surprised to hear that someone decided to reboot the classic ’80s My Little Pony cartoon based on a line of popular pony toys. After all, sequels and shout-outs to familiar brands have become the foundation of the entertainment industry. The new ‘n improved cartoon, called My Little Pony: Friendship is Magic, follows a nerdy intellectual pony named Twilight Sparkle, who learns about the magic of friendship through her adventures with the other ponies in Ponyville.

But you might be surprised to learn that My Little Pony: Friendship is Magic’s biggest accolades have come not from its target audience of little girls and their families, but from a fervent adult fanbase. I first heard of My Little Pony: Friendship is Magic from one of my favorite sources of intelligent pop culture criticism, The Onion’s AV Club, which gave the show an enthusiastic review last year. (I had my suspicions at first that the AV Club’s enthusiasm was meant to be ironic, but they insisted that the show wore down their defenses, and that it was “legitimately entertaining and lots of fun.” So either their appreciation of My Little Pony: Friendship is Magic is genuine, or irony has gotten way more poker-faced than I realized.)

And you might be even more taken aback to learn that many, if not most, of those adult My Little Pony: Friendship is Magic fans are men and that they’ve even coined a name for themselves: “Bronies.” At least, I was taken aback. In fact, my curiosity was sufficiently piqued that I contacted Purple Tinker, the person in charge of organizing the bronies’ seasonal convention in New York City. Purple Tinker was friendly and helpful, saying that he had read about my work in the skeptic/rationalist communities, and commended me as only a brony could: “Bravo – that’s very Twilight Sparkle of you!”

But when I finally sat down and watched the show, I realized that while Purple Tinker may be skeptic-friendly, the show he loves is not. The episode I watched, “Feeling Pinkie Keen,” centers on a pony named Pinkie Pie, who interprets the twitches in her tail and the itches on her flank as omens of some impending catastrophe, big or small. “Something’s going to fall!” Pinkie Pie shrieks, a few beats before Twilight Sparkle accidentally stumbles into a ditch. The other ponies accept her premonitions unquestioningly, but empirically-minded Twilight Sparkle is certain that Pinkie Pie’s successes are either a hoax or a coincidence. She’s determined to get to the bottom of the matter, shadowing Pinkie Pie in secret to observe whether the premonitions disappear when there’s no appreciative audience around, and hooking Pinkie Pie up to what appears to be a makeshift MRI machine which Twilight Sparkle apparently has lying around her house, to see whether the premonitions are accompanied by any unusual brain activity.

Meanwhile, Twilight Sparkle is being more than a little snotty about how sure she is that she’s right, and how she just can’t wait to see the look on Pinkie Pie’s face when Pinkie Pie gets proven wrong. Which, of course, is intended to make it all the more enjoyable to the audience when — spoiler alert! — Twilight Sparkle’s investigations yield no answers, and Pinkie Pie’s premonitions just keep coming true. Finally, Twilight Sparkle admits defeat: “I’ve learned that there are some things you just can’t explain. But that doesn’t mean they’re not true. You just have to choose to believe.”

Nooo, Twilight Sparkle, no! You are a disgrace to empirical ponies everywhere. And I’m not saying that because Twilight Sparkle “gave in” and concluded that Pinkie Pie’s premonitions were real. After all, sometimes it is reasonable to conclude that an amazing new phenomenon is more likely to be real than a hoax, or a coincidence, or an exaggeration, etc. It depends on the strength of the evidence. Rather, I’m objecting to the fact that Twilight Sparkle seems to think that because she was unable to figure out how premonitions worked, that therefore science has failed.

Twilight Sparkle is an example of a Straw Vulcan, a character who supposedly represents the height of rationality and logic, but who ends up looking like a fool compared to other, less rational characters. That’s because the Straw Vulcan brand of rationality isn’t real rationality. It’s a gimpy caricature, crafted that way either because the writers want to make rationality look bad, or because they genuinely think that’s what rationality looks like. In a talk I gave at this year’s Skepticon IV conference, I described some characteristic traits of a Straw Vulcan, such as an inability to enjoy life or feel emotions, and an unwillingness to make any decisions without all the information. Now I can add another trait to my list, thanks to Twilight Sparkle: the attitude that if we can’t figure out the explanation, then there isn’t one.

Do you think it’s possible that anyone missed the anti-inquiry message?  Hard to imagine, given the fact that the skeptical pony seems mainly motivated by a desire to prove other people wrong and gloat in their faces, and given her newly-humbled admission that “sometimes you have to just choose to believe.” But just in case there was anyone in the audience who didn’t get it yet, the writers also included a scene in which Twilight Sparkle is only able to escape from a monster by jumping across a chasm – and she’s scared, but the other ponies urge her on by crying out, “Twilight Sparkle, take a leap of faith!”

And yes, of course, My Little Pony: Friendship is Magic is “just” a kids’ cartoon, and I can understand why people might be tempted to roll their eyes at me for taking its message seriously. I don’t know to what extent children internalize the messages of the movies, TV, books, and other media they consume. But I do know that there are plenty of messages that we, as a society, would rightfully object to if we found them in a kids’ cartoon – imagine if one of the ponies played dumb to win the favors of a boy-pony and then they both lived happily ever after. Or if an episode ended with Twilight Sparkle chirping, “I’ve learned you should always do whatever it takes to impress the cool ponies!” So why aren’t we just as intolerant of a show that tells kids: “You can either be an obnoxious skeptic, or you can stop asking questions and just have faith”?

How rationality can make your life more awesome

(Cross-posted at Rationally Speaking)

Sheer intellectual curiosity was what first drew me to rationality (by which I mean, essentially, the study of how to view the world as accurately as possible). I still enjoy rationality as an end in itself, but it didn’t take me long to realize that it’s also a powerful tool for achieving pretty much anything else you care about. Below, a survey of some of the ways that rationality can make your life more awesome:

Rationality alerts you when you have a false belief that’s making you worse off.

You’ve undoubtedly got beliefs about yourself – about what kind of job would be fulfilling for you, for example, or about what kind of person would be a good match for you. You’ve also got beliefs about the world – say, about what it’s like to be rich, or about “what men want” or “what women want.” And you’ve probably internalized some fundamental maxims, such as: When it’s true love, you’ll know. You should always follow your dreams. Natural things are better. Promiscuity reduces your worth as a person.

Those beliefs shape your decisions about your career, what to do when you’re sick, what kind of people you decide to pursue romantically and how you pursue them, how much effort you should be putting into making yourself richer, or more attractive, or more skilled (and skilled in what?), more accommodating, more aggressive, and so on.

But where did these beliefs come from? The startling truth is that many of our beliefs became lodged in our psyches rather haphazardly. We’ve read them, or heard them, or picked them up from books or TV or movies, or perhaps we generalized from one or two real-life examples.

Rationality trains you to notice your beliefs, many of which you may not even be consciously aware of, and ask yourself: where did those beliefs come from, and do I have good reason to believe they’re accurate? How would I know if they’re false? Have I considered any other, alternative hypotheses?

Rationality helps you get the information you need.

Sometimes you need to figure out the answer to a question in order to make an important decision about, say, your health, or your career, or the causes that matter to you. Studying rationality reveals that some ways of investigating those questions are much more likely to yield the truth than others. Just a few examples:

“How should I run my business?” If you’re looking to launch or manage a company, you’ll have a huge leg up over your competition if you’re able to rationally determine how well your product works, or whether it meets a need, or what marketing strategies are effective.

“What career should I go into?” Before committing yourself to a career path, you’ll probably want to learn about the experiences of people working in that field. But a rationalist also knows to ask herself, “Is my sample biased?” If you’re focused on a few famous success stories from the field, that doesn’t tell you very much about what a typical job is like, or what your odds are of making it in that field.

It’s also an unfortunate truth that not every field uses reliable methods, and so not every field produces true or useful work. If that matters to you, you’ll need the tools of rationality to evaluate the fields you’re considering working in. Fields whose methods are controversial include psychotherapy, nutrition science, economics, sociology, consulting, string theory, and alternative medicine.

“How can I help the world?” Many people invest huge amounts of money, time, and effort in causes they care about. But if you want to ensure that your investment makes a difference, you need to be able to evaluate the relevant evidence. How serious of a problem is, say, climate change, or animal welfare, or globalization? How effective is lobbying, or marching, or boycotting? How far do your contributions go at charity X versus charity Y?

Rationality shows you how to evaluate advice.

Learning about rationality, and how widespread irrationality is, sparks an important realization: You can’t assume other people have good reasons for the things they believe. And that means you need to know how to evaluate other people’s opinions, not just based on how plausible their opinions seem, but based on the reliability of the methods they used to form those opinions.

So when you get business advice, you need to ask yourself: What evidence does she have for that advice, and are her circumstances relevant enough to mine? The same is true when a friend swears by some particular remedy for acne, or migraines, or cancer. Is he repeating a recommendation made by multiple doctors? Or did he try it once and get better? What kind of evidence is reliable?

In many cases, people can’t articulate exactly how they’ve arrived at a particular belief; it’s just the product of various experiences they’ve had and things they’ve heard or read. But once you’ve studied rationality, you’ll recognize the signs of people who are more likely to have accurate beliefs: People who adjust their level of confidence to the evidence for a claim; people who actually change their minds when presented with new evidence; people who seem interested in getting the right answer rather than in defending their own egos.

Rationality saves you from bad decisions.

Knowing about the heuristics your brain uses and how they can go wrong means you can escape some very common, and often very serious, decision-making traps.

For example, people often stick with their original career path or business plan for years after the evidence has made clear that it was a mistake, because they don’t want their previous investment to be wasted. That’s thanks to the sunk cost fallacy. Relatedly, people often allow cognitive dissonance to convince them that things aren’t so bad, because the prospect of changing course is too upsetting.

And in many major life decisions, such as choosing a career, people envision one way things could play out (“I’m going to run my own lab, and live in a big city…”) – but they don’t spend much time thinking about how probable that outcome is, or what the other probable outcomes are. The narrative fallacy is that situations imagined in high detail seem more plausible, regardless of how probable they actually are.

Rationality trains you to step back from your emotions so that they don’t cloud your judgment.

Depression, anxiety, anger, envy, and other unpleasant and self-destructive emotions tend to be fueled by what cognitive therapy calls “cognitive distortions,” irrationalities in your thinking such as jumping to conclusions based on limited evidence; focusing selectively on negatives; all-or-nothing thinking; and blaming yourself, or someone else, without reason.

Rationality breaks your habit of automatically trusting your instinctive, emotional judgments, encouraging you instead to notice the beliefs underlying your emotions and ask yourself whether those beliefs are justified.

It also trains you to notice when your beliefs about the world are being colored by what you want, or don’t want, to be true. Beliefs about your own abilities, about the motives of other people, about the likely consequences of your behavior, about what happens after you die, can be emotionally fraught. But a solid training in rationality keeps you from flinching away from the truth – about your situation, or yourself — when learning the truth can help you change it.

The Straw Vulcan: Hollywood’s illogical approach to logical decisionmaking

I gave a talk at Skepticon IV last weekend about Vulcans and why they’re a terrible example of rationality. I go through five principles of Straw Vulcan Rationality(TM), give examples from Star Trek and from real life, and explain why they’re mistaken:

  1. Being rational means expecting everyone else to be rational too.
  2. Being rational means you should never make a decision until you have all the information.
  3. Being rational means never relying on intuition.
  4. Being rational means eschewing emotion.
  5. Being rational means valuing only quantifiable things — like money, productivity, or efficiency.

In retrospect, I would’ve streamlined the presentation more, but I’m happy with the content – I think it’s an important and under-appreciated topic. The main downside was just that everyone wanted to talk to me afterwards, not about rationality, but about Star Trek. I don’t know the answer to your obscure trivia questions, Trekkies!

UPDATE: I’m adding my diagrams of the Straw Vulcan model of ideal decisionmaking, and my proposed revisions to it, since those slides don’t appear in the video:

The Straw Vulcan view of the relationship between rationality and emotion.

After my revisions.
