Saturday, July 12, 2014

Systems 1?

[These ideas were inspired by/stolen from Nate Soares, aka So8res, in an ongoing email conversation about the Dark Arts of Rationality.]

Summary: There's more than one thing we might mean by "System 1", and the different referents require different rationality techniques.
___________________________________________


I went skydiving once. On the way up, I was scared. Not as scared as I expected to be, but more scared than I thought I should have been. I believed at the time that there was about a 0.0007% chance of dying in a skydiving accident.* In other words, if I and around 150,000 other people all went skydiving, about one of us would die. And that's before taking into account that I was jumping with an expert.

Part of me knew this. Otherwise, I wouldn't have gotten into the plane. But part of me didn't seem to know it, and I knew part of me didn't know it, because I was seriously wondering whether I'd have the guts to jump on my own.** I wanted all of me to understand, so I could experience the excitement without the fear.

So I tried picturing 150,000 people all jumping out of planes. (Dictionary of Numbers helpfully informs me that that's about the population of Guam.) It helped. Then I called to mind the people I knew who had been in car crashes, and remembered how willing I was to climb into a car anyway. My methods weren't as sophisticated then, but I was basically seeking what I've recently been calling a "System 1 handle" to arm System 2's abstract understanding with a clear emotional impact. It was enough, though, to calm my nerves.

We lined up. I was calm. The door opened, and the wind roared. I was calm. The pair in front of me jumped. I was calm. 

The floor disappeared from beneath me. It took me about two seconds to regain enough composure to scream.

Dual process theory is mostly just a quick-and-dirty way of framing cognitive processes. Speaking as though "System 1" and "System 2" are people in my head with very different personalities lets me apply a bunch of useful heuristics more intuitively. I've been fairly good about keeping track of the reality that they aren't *people*. I've been less good about guarding against a false dilemma.

The framing tracks something like "degrees of deliberation". But there's a lot more on that continuum than "very deliberative" and "very not deliberative". I think I've been treating everything below some point on the line as "System 1 processing", when really that's just "everything I don't think of as System 2".

There seem to be (at least) two natural clusters in the "not System 2" part of the spectrum that might call for different treatment. During the skydive, one cluster responded to vivid, concrete examples. The other cluster was too simple, too instinctual, to get a grip on even that. The link between "ground disappears" and "freeze in terror" was too basic to manipulate with the kind of technique I was using. The "oh shit I'm falling" process is a different animal than the one responsible for "this is dangerous and therefore I'm going to die".

The "System 1 translation" techniques I've been writing about are meant to deal with the part-of-yourself-that-you-argue-with-when-you're-trying-to-convince-yourself-to-do-something-difficult, and the part-of-yourself-that-needs-to-remember-important-details-but-doesn't-care-about-numbers-or-other-abstractions. The part that's anxious about the jump and doesn't understand the odds.

But I'm not sure S1 translation does much of anything for the part that panics when you pull the ground out from under it. To deal with that part, I think you probably need tools more along the lines of exposure therapy.

When you're in a car driving on icy roads and you start to slide, the best way to regain control is to steer into the skid and accelerate slightly. But most people's instinctive reaction is to slam on the brakes. I've tried to understand why the steer-into-the-skid method works so I could translate that understanding into the language of System 1, and while I've not thought of anything I expect would work for most people, I've got something that makes sense to me: When I'm on icy roads, I can imagine that I'm in a Flintstones car, with my feet scrambling against the ice. If I were in a Flintstones car, my immediate reaction to sliding would be to run a little faster in the direction of the skid in order to gain control. I figure this is probably because I spent some of my childhood playing on frozen ponds, so I wouldn't suggest that translation to just anyone.

But I doubt it would work no matter how robust the translation. The part of my brain that panics and slams on the brakes is more basic than the part that's scared of the whole idea of skydiving, or that resists checking the balance of my bank account. I'm not sure it can be reasoned with, no matter what language I use.

To ensure I do the right thing when driving on icy roads, a much better plan would be to find some way to practice. Find an icy parking lot, and actually expose myself to the experience over and over, until the right response is automatic.

I'm not sure about this, but I'd be at least a little more surprised if S1 translation worked for ice driving than if it didn't. If I'm right, lumping together all the "System 1 techniques" and using them on anything that's "not System 2" can be dangerous. If this is a real distinction, it's an important one for applied rationality.

___________________________________________


*I still believe that, but with less confidence 'cause I'm better calibrated and recognize I haven't done enough research.
**With our setup, I was actually hanging from a harness attached to the expert by the time we were about to leave the plane, so my feet weren't on the floor and I didn't get to jump on my own. Still kinda pissed about that.