February 08, 2010
Wearing rationality badges, popularizing neutrality and saying "I don't know" to politics: Colin Marshall talks to economist, blogger and rationalist Robin Hanson

Robin Hanson is a professor of economics at George Mason University, research associate at Oxford's Future of Humanity Institute and chief scientist at Consensus Point. He's also the thinker behind Overcoming Bias, a popular blog about issues of honesty, signaling, disagreement, forecasting and the far future, around which a large rationality-centric community has developed on the internet. Colin Marshall originally conducted this conversation on the public radio show and podcast The Marketplace of Ideas. [MP3] [iTunes link]
If we are both honest truthseekers, we should not, over the course of this discussion, disagree. Is that correct?
It's more than this discussion. It would be any discussion between any two
people who are honest truthseekers on any matter of fact, and it
wouldn't have to be by the end of the discussion. It would be at any
point in time. I should be able to pick a topic now and guess your next
opinion on it. My guess of your next opinion should be the opinion I'm
stating to you right now. If I say, "I think this interview will last
an hour," my best guess of what you'll say for the interview lasting
should be an hour.
For somebody not familiar with what you've written, this will be hard to get their mind around. They'll say, "But people disagree all the time. Humanity is here, essentially, to disagree with one another." How do you quickly get across to someone like that why there shouldn't theoretically be disagreement?
The whole reason this is interesting is that you have a theory that differs
from behavior. It's a normative theory; it says what you should do. It doesn't say what you do do,
but it gives you some idea of how we're going wrong. The key idea is
that we should be respecting each other's opinions. That is, I don't
know how you came to your opinion, I don't know what evidence it's
based on, I don't know what reasoning you went through or analysis. I'm
sure there's lots of noise and errors in the whole process, but
nevertheless I think you were trying to estimate the truth, and that's
the key point. When you tell me your opinion, I take that very
seriously as a summary of all the things you know and think and have
analyzed up to this point on that topic.
What gets in the way of the reality matching the theory? I could probably come out stating an opinion of mine as fact, I could be overstating the probability of some guess I'm making. That's one way this could go off the rails. Why else?
The first thing to notice is that theory and reality do match up on lots of ordinary topics we don't care about. It's when our pride or enthusiasm gets hyped up that we start to disagree. If you and I were walking down the street and I said, "I think there's a tree around the corner," you probably wouldn't disagree with me there. If you said, "No there isn't," I would say, "Oh, okay." When our pride isn't on the line, or we're working together on a project and we need to achieve something — maybe our job is at stake — we're much more likely to be reasonable. But when we talk about politics or religion or whatever we talk about on these radio shows, that's when we're much more likely to not be reasonable, to find it more enjoyable to speak than to listen.
Politics, religion — these are topics where people can hold opinions, but when they hold them, they don't actually act on them much of the time, is that correct?
That's true, although it also applies to topics that you do act on but where your pride is on the line. A CEO versus a subordinate director might disagree, or we might disagree about some actual business decision we're making, or about which restaurant we should go to and which is likely to be open and tasty. We disagree on things where we'd rather think that we're right. It's very pleasant and affirming to think we're right and they're wrong. We might rather indulge that feeling than actually be right.
In life, when you can foresee a disagreement coming down the pike, do you have an easy way of testing whether the belief to be disagreed about is one of these — I don't know if volatile is the right word, but — pride- or identity-based beliefs?
There's not much need to, because it's pretty easy to see when you're tempted to disagree. It happens all the time, constantly, in our lives. Take pretty much any two people, give me five minutes with them and I can probably identify half a dozen topics on which they disagree. I imagine you could do this as well.
When you're observing this, in any context — you mentioned religion, you mentioned politics — is there a typical suite of defined areas of belief that are always or very often going to be conflict points?
You can just look inside yourself; you can see how people are reacting when they get emotional, when they get energetic, when it really seems to matter to them. Again, this phenomenon is so widespread that there's not much of a need for a diagnosis to see when it's happening. It's almost everywhere.
It does tie up with one of the main themes of your blog, Overcoming Bias. It's not solely about what the title says — or maybe you could frame it that it is — but there's so much to do with the gap between people's beliefs that they actually live by and people's beliefs that they, to use your own words, "wear like clothes." Is that a good summation of a decent chunk of what you cover?
Yes, although in a sense it's worse than that. I think, to a good first approximation, our conscious minds are something like the P.R. department. Our minds have this vast depth that we're usually not aware of. When we want to present ourselves to other people, we talk and we pay attention to how other people perceive us. That part of our mind, that manages that, is the part we're most aware of. A lot of that is done with an eye to how it looks and how it will make us seem to other people, whether we'll seem loyal and intelligent and thoughtful and submissive or dominant as we wish.
Our whole conscious mind is designed to manage that process, and sometimes it's in the P.R. department's interest to be reasonable and accurate and careful, but many other times it's not. That's the hard thing, looking inside ourselves to realize that, even when we're very sincere, we're rarely honest, in the sense that we're not being very careful to be accurate. But it sure feels sincere to us. We look inside and say, "Yeah, that's what I really think."
This sincerity-honesty gap, sincerity being the side where you think you're being true to your own thoughts, but if you were being honest — honesty is what you get when you correct for the mental P.R. department?
Honesty is when you're really trying hard to be accurate. To be honest, you
have to think about possible criticisms and take them seriously. You
have to ask what the evidence on the other side would be. You have to
wonder who has supported which sides of a position. If you're going to
be honest about something, there's a set of considerations you're
supposed to look at. We all pretty much know what they are. But when
we're sincere, that doesn't mean we've done those sorts of things.
If you just want to be successful in the way your ancestors were
successful, if the environment you're in hasn't changed substantially
in relevant ways since then, you should do what your intuitions tell
you to do. You should go along with the P.R. department and sincerely
think that you're honest about whatever you say. On the other hand, if
you really think it's important to be right about something, if, for
example, you're actually concerned about the future or global warming
and you think it matters more than a convenient talking point for
parties or showing your loyalty to some side of some coalition, you
need to go beyond that sort of intuitive, easy sincerity to try to be honest.
How far does this effort toward truthseeking go back in your own life? I recall on your web site a few questions you wrote at a very young age asking about the nature of certain things. How long have you thought of yourself as on a mission of truthseeking?
I actually think most people think they are, unfortunately. It's a common
point in almost every person's life when they see people around them
who have beliefs they don't quite understand and they decide for
themselves that, "Well, I must just be more honest than those
other people. I must be trying harder." That's the easiest way to
explain your disagreement with other people. We do disagree, and it
does bother us; we know, at some level, that something's not right
about that, and we're eager to find explanations. The easiest
explanation that usually comes to mind is just our own superior
sincerity or honesty. It's just quick and easy. We're not very honest
about considering that explanation.
People don't just go to their own honesty as an explanation; they also go to, "Well, maybe everyone's got a different truth," something I would call more pernicious, but not more widespread.
Well, people don't usually mean that. It's a nice diplomatic way to settle a
temporary dispute, but if you really mean that we all have our own
truth, at some level you don't really think you have a truth. What you
think you have is a way of thinking that comforts you and helps you
deal with things, that sort of attitude. You might think it's a good
way of framing your life in order to help you get through the day, et cetera. But you don't really believe it's true unless you believe things that contradict it are false.
I can tell this is going to be a theme of this discussion. I'm going to bring up a lot of things people say that I assume they mean, but I can think about it a little and realize they probably don't.
There's a sense in which they mean it sincerely. They just don't mean it
honestly enough to follow it through two steps of inference to
conclusions that follow from it. There's that level of dishonesty, in a
sense, of not bothering to notice what implications a belief has in
order to check whether it's actually the sort of belief you want to hold.
I derailed what I was asking, but how closely can you pin down the moment where you said, "I've got to dedicate myself to truthseeking as a life mission"?
I'm not actually sure I ever did that. I'm not even sure I'm doing it now. I'm trying to be honest. It's certainly a fun hobby. It's a nice way of organizing the kinds of things I'm interested in, and it certainly pushes people's buttons to tell them that's what I'm doing. But if I actually look at all the different things I do, I'm not sure I can put together a strong case that I'm actually paying high costs to do that. I'll have to be honest there and say I'm trying, somewhat. I'm not going to put myself up as an exemplar.
But at the same time, with Overcoming Bias and the constellation of rationality communities that have popped up on the internet as a result of that whole thing, there seem to be a lot of people who respond to the truthseeking ideals that you state. I've called it an "unquenched thirst" up to this point. There's a current of truthseeking that people didn't seem to be able to define until they started reading your blog and others. Does that line up with your observations at all?
There's always been a current of people who thought in those terms. Certainly a large chunk of professional academic philosophy is in those terms, and there's been an undercurrent in many areas of the world. But as the blogosphere has become larger, it allows more refined topics, it allows more refined publication. I think we're just seeing a niche being filled out as the blogosphere has fragmented into a wide range of topics.
But since so much of the rationality sub-blogosphere has developed around the sort of thing you do on the net, what do you think of how it's developed? You've seen it come up around you, this specific community. There has to be something quite fascinating about the emergence of a rationality community in that setting.
There are easy ways and hard ways to deal with it. Unfortunately, the hard ways are the more honest ways.
If you recall, during the Bush administration many Democrats were calling themselves the "Reality-based community" by contrast to the Bush administration. They thought the Bush folk were delusional, and self-admittedly so, and they were, of course, trying to be realistic. Honestly, this goes way back: in almost any long-standing dispute, usually both sides think of themselves as the realists and the other side as delusional. People have long wanted not just to claim that but have liked to have little things they can point to as justifying evidence. There was a whole literature on heuristics and biases that came out of psychology forty years ago — Daniel Kahneman won the Nobel Prize recently. A lot of people love that literature, because it shows why the other guys are wrong.
There's long been a fascination with that sort of thing. Many academics who can, say, master mathematics look down on people who don't use mathematics as obviously delusional because they don't have access to the formal methods that are the key to being accurate. People who study vast amounts of detail in history or sociology think of themselves as the people who are honest because they're actually paying attention to detail in reality as opposed to those delusional theorists lost in their abstract thoughts.
Again, a lot of what we
have is a community with people who like to pat themselves on the back
and say, "I'm rational, those people aren't," and they feel a little
more able to say that sincerely to themselves if they've tracked a few
rules of how to tell the difference between rationality and irrationality. But the key question is, how hard are you really trying to be honest, as opposed to wanting a badge of honesty you can use to beat the people you disagree with over the head? Again, we're back to disagreement.
Given that, is it more frustrating to look for honest rationality in the sphere of politics or academia?
Well, they overlap substantially. We're more familiar with politics. Obviously, politics is a really hard area for people to be honest in. The emotions and the confidence just well up so quickly and fully that it's hard to knock down. It took a lot of effort for me to put on my web page, years ago, that my main opinion about politics was "I don't know." Too many people are too confident in what they think they know. But that doesn't win you many friends or allies in the great world of politics. If you were on the other side, they might think that was good for you to have admitted your uncertainty, but not for people on their side.
Academia's interestingly different in the sense that the main divisions are mostly over method as opposed to conclusions. Conclusions are secondary. What academics usually invest themselves in is how to study something. By the time they've spent five or ten years learning how to study something a certain way, they are very sensitive about the idea that that's the best way. Which topics they apply the method to they're much more flexible on. If their method came up with a conclusion different from the one they originally held, they're willing to embrace it, because it vindicates their method and shows they're the best sort of academics, using the right method.
So we might say a politician can get hung up easily on a conclusion and find any way possible to stick to it, whereas an academic is more likely to stick to the method they have come up with, no matter where that method leads.
In some sense, the academic isn't that interested in the conclusions. At
some level, academia is about just showing that you are an impressive
person via the methods you have mastered.
We talked about the rationality communities you can find online, but those are directed specifically at rationality. It sounds to me as if there's no sphere in the public world where you can find much more rationality going on than in other places.
It's hard to say,
because so many people would like to appropriate the label for
themselves. As soon as any group started to acquire a reputation for
being more rational, that reputation would suddenly be contested by
people who would challenge their rationality. There's remarkably little
positive energy toward wanting some sort of neutral analysis, but
there's an enormous amount of energy in wanting to deflect other
peoples' claims to be more rational.
I think back to a brief post on Overcoming Bias, the title of which was "Neutrality Isn't Popular." You were talking specifically about history, but it sounds like you could apply that pretty much anywhere. Actual neutrality isn't popular, though it sounds as if the word neutrality is extremely popular.
If you created a political think tank and started to publish white papers, and you consistently did very neutral analyses in these white papers, you would get almost no donations. There would be nobody paying you to do this; pretty soon it would be gone. You wouldn't get many people signing up for membership or any of those sorts of things. There would be almost complete indifference.
On the other hand, if you take a side in some battle, there's certainly a
niche for people who take a side but nevertheless try to give the
outward appearance of neutrality. They will take a neutral academic
tone and use numbers and analyses and they'll try to, however possible,
seem to be a neutral analyst. But they will, of course, only get
money and support and attention to the extent they are thought to
identify with a side. Similarly, even governmental agencies that offer
neutral analysis are relatively unpopular.
This may be going a little bit down the rabbit hole, but how popular is neutrality versus what one might expect, even within the specifically rationality-oriented communities?
They're too small and diverse to easily characterize. People like to collect methods, certainly, and there's some sense to that. That's somewhat different than endorsing a neutral institution. For example, I'm a big advocate of this concept of a prediction market. This is a betting market which would be on an interesting, important question so that one way we could produce a neutral, reliable estimate on a topic would be to create a betting market and then invite many people to bet — or even subsidize them — and see what the betting odds are. A great many people nod and say, "Yes, that sounds like a great idea for how we, as the world, should be rational and produce rational beliefs."
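[To make the mechanism concrete, here is a minimal, illustrative sketch of how a prediction market can turn bets into a probability estimate. It uses the logarithmic market scoring rule, a subsidized market-maker mechanism Hanson himself proposed, though he does not name it in this conversation; all function names and numbers here are hypothetical.]

```python
import math

# Sketch of the logarithmic market scoring rule (LMSR).
# A sponsor subsidizes an automated market maker; traders who think
# the current odds are wrong buy shares, and the resulting prices can
# be read as the market's probability estimate for each outcome.
# "b" is a liquidity parameter chosen by the sponsor.

def cost(q, b):
    # Market maker's cost function: C(q) = b * ln(sum_i exp(q_i / b)),
    # where q_i is the number of outstanding shares on outcome i.
    return b * math.log(sum(math.exp(qi / b) for qi in q))

def prices(q, b):
    # Instantaneous prices sum to 1, so they read as probabilities.
    z = sum(math.exp(qi / b) for qi in q)
    return [math.exp(qi / b) / z for qi in q]

def buy(q, b, outcome, shares):
    # A trader pays the change in the cost function; buying moves the price.
    new_q = list(q)
    new_q[outcome] += shares
    return new_q, cost(new_q, b) - cost(q, b)

q, b = [0.0, 0.0], 100.0      # two outcomes: "yes" / "no"
print(prices(q, b))           # market opens at [0.5, 0.5]
q, paid = buy(q, b, 0, 50.0)  # a trader who believes "yes" buys shares
print(prices(q, b))           # the "yes" price rises above 0.5
```

The point of the subsidy is that the sponsor's cost function bounds their loss while guaranteeing traders can always find a counterparty, so anyone who thinks the posted probability is wrong has a standing financial incentive to correct it.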
On the other hand, people who are big fans of rationality are not usually
interested in spending much time creating or participating in such
markets. They would rather talk more about rationality, which makes
sense if what you're really doing is just talking in order to achieve
the other sorts of things everybody else achieves by talking. At a
basic level, as you've probably noticed from Overcoming Bias, I try to
be very honest about the nature of the social behaviors we have and
what their origins are, what functions they serve, and many of them are
said, supposedly, to be trying to achieve truth. That's usually not
very plausible and the other explanations we have are not especially
noble, but at least they are more illuminating. Conversation is one of
those things we do because we like to pretend we're acquiring truth,
but that's not very plausible.
One of these ways behavior might be less noble than we would initially think is the concept of "signaling," which a lot of people read about on your blog. That's what hooks them: the way you describe how normal things people do turn out to have this signaling function. How best to describe what signaling is?
Again, a lot of what we do has a P.R. department function. We're trying to look good to other people. It's very important to humans now and to our distant ancestors to look good to each other. We are a large social species. We have large tribes. Our large brains are probably that large mainly in order to handle the social complexities of figuring out how to deal with each other in a large tribe. A great deal of what we had to figure out is who we are impressed with, so that we want to affiliate with them, and who we're loyal to, or who is loyal to us. It was extremely important for them, and it continues to be important for each of us to both figure out in other people how impressive and loyal they are and also in ourselves, to try to give that impression to other people.
The idea is, a lot of our behaviors that don't make much sense in terms of apes trying to feed themselves and stay warm make a lot more sense in terms of that sort of function. If you look at the things humans do that other apes don't, like talk for all hours about strange, abstract things or draw art or throw balls through hoops, they make a lot more sense as ways we can show off to each other. This concept of signaling, or showing off, or showing our characteristics has some details that help you make more sense of it. There's this key idea that you need to do something that can't easily be faked as a signal of either your ability or your loyalty. Obviously you can just say you're able or you're loyal, but other people should be skeptical about mere words. You have to figure out how to do something that actually seems more impressive and loyal.
For example, academics are trying to signal that they are very smart, very
well-read, very knowledgeable and have mastered difficult techniques,
so when they write articles for journals, the text of the article says,
"Here we are, discovering some new insight about the world and isn't
this great, we're all helping learn more about the world," but of
course the subtext is, "Look how impressive we are and this complicated
analysis we've done." If you actually look at how academics choose each
other, in terms of hiring or publications, they focus almost entirely
on that impressiveness of the effort, and very little on the actual
content of the conclusion. Academia isn't largely about the content or
the conclusion; it's largely about showing how impressive you are
through the methods you can master or display, which makes perfect sense for animals descended from ancestors who were trying to impress each other. Why should such people be trying to understand the nature of the universe?
It does seem like there's quite a few reasons, evolutionarily, why the deck is stacked against our figuring out the nature of the universe. When I think about this, I'm actually surprised humans have discovered as much truth as they have.
There's something really remarkable about the human brain that allowed it to
have an amazing breadth of topics it could consider and set of
abstractions it could master. Those abstractions have gone remarkably
far in helping us understand a wide range of things. Of course, we are
tempted to overrate how much we've done. Truth be told, the main reason
we are rich and powerful has relatively little to do with our ability
to do abstract reasoning. We're very proud of how much we understand
about stars or mathematics or things like that, but economic growth is
largely driven by lots of small modifications and improvements that we
slowly accumulate over time, for which this more abstract reasoning
isn't especially useful.
Thinking about signaling a little more, I want to turn it back on myself. I think, okay, on one level I'm asking you a question because I feel that I am curious to know the answer and that it would be a more interesting world if listeners were exposed to Robin Hansonian ideas. But on the other hand, it's plausible that I'm signaling my interviewing skill to gain higher interviewing status, correct?
It's very plausible. Of course, the problem is that it's hard to consciously
accept that. This is in general true about all sorts of theories of
human behavior that aren't very pretty. They're much easier theories to
accept about other people than about yourself. We have this whole P.R.
spiel that our minds are all set up to give about ourselves. We're also
relatively willing to be cynical and negative about other people who
aren't at least our close allies. But turning that around on ourselves
is really hard.
I don't find it all that unpalatable, though, now that I think about it. I can accept that I am trying to gain more interviewing status.
In the abstract you can accept it, but in each particular thing you're doing it's going to be much harder to think about that particular thing as serving those particular functions. We understand a bit about why we are this way. You might say, "Why doesn't the P.R. department just be honest about how it's trying to push P.R.? Everybody knows it anyway." Or you could say, "Why aren't we honest about the fact that we're really just trying to impress each other?" But we understand a bit about why, say, bragging looks bad, and why people who are too eager to show off show an insecurity that's not very impressive. It's actually moderately low-status and unimpressive to be too eager to impress, and so we're better off impressing people indirectly and unconsciously because we evolved that way. It's related to the fact that it's easier to sincerely be a good salesman if you actually believe in your product. Not necessarily honestly believe in your product, but sincerely believe in your product. You don't give off as many clues about the reservations you might have if you were more honest.
Would it work to say to yourself, "Okay, I'm seeking status. But maybe I'll try to bump myself into status-seeking endeavors that happen to generate the most positive by-products for humanity"? Is that possible, or is that delusional?
It's certainly possible on a broad, crude level. The question might be how fine-grained you could take that. In general, there are a wide range of things you can do to be impressive and a great many of them honestly don't do that much for humanity in the long run. It's not that hard to analyze the things you do in order to estimate those consequences. If you're trained like I am, as an economist, you'll be familiar with a wide range of ways we analyze behavior to see what are called "externalities," side effects that people aren't taking into account. We can identify where there are positive and negative externalities, and in some sense status-seeking has this negative externality of putting other people down when you're bringing yourself up.
But there are other sorts
of ways that behavior can have positive externalities, and innovation
that has long-term legs is one of those big positive externalities. To
the extent that you are achieving status through the right kind of
innovation, you might be doing something better for the world. Don't
fool yourself into thinking that you wanted to do something for the
world because you really cared that much about the world, but perhaps
even so, as long as you could move yourself in one direction or the
other, maybe you can at least show that you like to appear more
altruistic and concerned about the world by choosing to show off in
ways that actually do help the world.
It does remind me of a post on Overcoming Bias; it was part of a blog-to-blog discussion about charities and how they seem to serve more of a signaling function than anything. Someone replied, "Well, there's this one service that finds out which charities actually generate the most measurable benefits." I believe you replied, "Yes, but then you can put out the signal that you're investing in the effective charity and thus are smarter and higher-status than everybody else who is putting their money into charity." You can always make a signaling explanation by going one level up, can't you?
There's a difference between the ability to always offer an explanation, and the ability to actually
marshal evidence in support of it. I don't want to offer this as
something that equally explains every possible observation, because
then you can't ever confirm it by your observations. You could offer it
as an explanation for a wide range of things, but it becomes less
plausible depending on the details of the behavior. But I do think
that, in fact, it's just not very plausible that humans evolved to care
about the universe or the world, but it is plausible that we evolved to
appear caring, the sort of person that, if you were to be married to us
or work with us in an office, that we would be considerate and
observant of you and take you into account. Those are nice signals to
send out. We also evolved to show ourselves as being reasonably smart
and thoughtful, and if we offer an argument for something that's
transparently silly, then that makes us look silly or stupid, and we
don't like to do that.
When I talk to people who occasionally read your blog, or don't follow it closely but kind of do, they'll sometimes say things like, "I like it, but he's always bringing up the status and the signaling. It's always about the status and the signaling." I think that's an interesting framework to view humanity in, but at the same time, I suppose there's the danger of looking like you're leaning very heavily on a small number of factors. Is that ever something you feel you have to act to make it seem like you're not doing?
It's actually more the other direction. By nature, you might notice that I'm
interested in a very wide range of things. Intuitively, I easily pursue
a wide range of topics, a wide range of thoughts. But I know that I'm
in a very large world, and that if I want to make a contribution or
even just get attention and respect, I need to specialize. That's hard,
because it's more natural to be broad and cover a wide range of topics.
I try to make myself specialize. Obviously I'm not specializing on just
one thing, but I'm specializing on a more narrow range of things. It
seems to me that, because people are more interested in these signaling
and status topics, I should make myself try to focus more on them.
That's the nature of the world of specialists.
But it seems like, status and signaling, you can take those to anything humanity is doing and profitably, or at least interestingly, examine it through that framework. You'll still be able to get the variety; it's just a specialized method of looking.
It's a degree of specialization. If I specialized in signaling in the arts, that would be more specialized. If I focused on signaling in the dance arts, that would be even more specific. You have to choose how specific to become. I don't think it's obvious that all explanations are equally valid. Let's take conversation, for instance. One theory of conversation people like to present is, it's about sharing information, sharing knowledge. I talk and share my knowledge and insights with you, and you talk and share your knowledge and insights with me, and it's an exchange, it's a trade, we're offering each other our insights and knowledge in a mutually beneficial trade.
Another theory of conversation is that we're trying to show off. You could say
either of these theories explains the fact that people talk, so you
might think they were equally valid explanations. How would you ever
know the difference? But a very simple prediction that's different
between the two theories is that the information theory says I'm eager
to listen and reluctant to talk. If I was going to cheat on the deal, I
would be most likely to cheat by not offering you as much information
and trying to get as much as possible from listening and not so much
talking. On the other hand, the signaling theory, where I'm trying to
show off my various characteristics, says I should be eager to talk and
reluctant to listen. And of course we all know the answer here.
In this specific conversation, I've essentially asked you to come on the program and show off. I've put my thumb on the scale and said, "Signal for me."
But you, as a radio host, show your thoughtfulness and your taste by the kind of people you choose to bring on. You also show that they're a specialist, and you're a generalist across many topics — you can still hold your own. You're showing your generality and robustness by the ability to hold your own in conversations with dozens of specialists in very different topics.
"Showing my robustness." That should've been the title of the show. To go back into your own life story, there is a term you've used several times — the idea of "viewquakes." What are these?
That's just a way of talking about an insight that really changes how you think about a wide range of things, that goes to the heart of important things. It's the sort of news that's big news, in a fundamental intellectual sense. You might think it's trivially the sort of thing that most people would be eager to find, that is, if you have a simple model of intellectuals as people who are trying to understand the world and make sense of things. This would be the very best thing you could ever get: people would treasure these viewquakes, these great insights that really change how they thought. Their intellectual history would be composed with those as the high points, and those are the kind of contributions they might hope to make to the rest of the world. They might hope to do something in that direction, and if they could ever produce something like a viewquake for others, they might think they had achieved an enormous intellectual accomplishment beyond the hope of most people.
In fact, people are not very
interested in such things, surprisingly. That has to help you revise
your sense of what most intellectuals are up to or trying to do.
But you enjoy these viewquakes. What are some of the ones most memorable to you?
Big sets of viewquakes are around physics, because I first learned physics in a solid, deep way. Relativity, quantum mechanics, thermodynamics: these have some deep insights that are extremely counterintuitive, but once you come to appreciate them, they make a lot of sense and help you understand a great many things. When I left physics and went into computer science, I really had no idea of how to deal with complicated systems and how to manage them. I didn't even realize there was a need to deal with that. Physicists have been taught to think that they know most everything important, and I of course thought that as well. As I learned about modularity in software and abstraction and those kinds of concepts that helped me manage complexity in software, I was struck by how powerful those insights were.
In social science, I came across a range of insights that were extremely powerful
and useful, and still counterintuitive. The very simplest concepts of
supply and demand, of incentives, of rationality. In my early education
as an economist, I also came across this standard result about
disagreement, which was quite a viewquake to me: the simplest version
of the result, that people who are trying to be honest would not
knowingly disagree. I also came across a range of viewquakes on
how the future might be different from the present to a degree
as dramatic as how the present is different from the past. That's
always striking, to see how very different other parts of the universe
are from where you are.
You moved from physics to computer science to social science, economics. That doesn't happen very often. Why is that so rare?
Thinkers as far back as Aristotle have nice things to say about that. I think it's quite
common to think that young people can have a good grasp of something
like physics or mathematics, but it takes many years of experience with
the social world to have enough of a sense of how it works to really
study it competently. On some level, politics is even harder. In order
to appreciate the difficulty of politics, it helps to understand other
human organizations and interactions before you try to work at that level.
There's a sense in which there are just a lot of prerequisites for that kind of study.
Here's the thing that I notice is different about you, as opposed to others I've met who have studied physics or computer science: a lot of those people get into the physics, the computer science, and those become the things that matter. The social world is something ephemeral, surface-level, insubstantial, to be ignored as much as possible in favor of physics, in favor of computer science, and so on and so forth.
There are tendencies that cut both ways here. Honestly, a big tendency cuts the other way, which is that, as I indicated before, human minds are naturally built to be relatively general. This is the remarkable power of the human brain, that it's so flexible to apply to a wide range of things. That also means we are built to expect to apply our minds to a wide range of topics. We aren't built to expect the enormous division of labor we find in our world, which is a recent addition, an enormously powerful change to the world, but also somewhat at odds with our minds' expectations.
We have to adapt ourselves to this
specialization, and so there's a degree of adaptation. Most of the
intellectual failures I know, in the sense of very smart people who
still fail to achieve intellectual development, insight or a career,
largely it's just because they find it hard to make themselves
specialize. They're so interested in such a wide range of topics that
it's really hard to focus enough to get a job or fame or things like
that. I'd actually say, what you see in terms of people who have some
degree of fame, attention or achievement in some intellectual area, is
those people having become an exception to the rule. That is, they've
managed to make themselves focus enough to do that, but a side effect
is going to be that they continue to focus because that's what's going
to give them their success.
Might this create the public mirage that there are more obsessives in these fields than there actually are, because the obsessives tend to rise to prominence?
But it's also that we all learn how to adapt ourselves to the world we
live in. We slowly do adapt, so in some sense, relative to what we
could have become in a different world, we are all more specialized
because we have made ourselves become more specialized. With some
internal resistance, but still, we saw where the winds blew and decided
that's what we needed to do. We can't all specialize in everything, and
there's some sense in which we do differ in our innate abilities.
There's some truth to the nerd scenario, which is: a nerd is somebody
who intuitively understands social relations less well than they understand
some other things. It makes sense for them to focus on those other
things, to go with their strengths rather than their weaknesses.
There's another concept you've thought a lot about on Overcoming Bias recently: the idea of "near" versus "far" modes of thought. What are those? How do they differ?
There are dozens of ways to divide up the brain into two parts: conscious/unconscious, left/right, intuitive/logical, et cetera.
Most of these ways to divide up the mind seem to have some relation to
something we see, but this near/far relation seems to be relatively
deeply embodied in the mind at an architectural level. It actually
seems that our minds have a near mode and a far mode for processing
things. These are different modes, and they produce different kinds of
thoughts. When you look out upon a scene, what you see is some things
that are near to you and other things that are far away. Literally, the
things near to you are being processed in near mode, and the things
that are far are being processed in far mode. When you pause and think
about people living in China or Haiti versus the person in the office
next door, you're also thinking in far mode about people who are far
away. Similarly, if you think about the distant future, you start to
think more in far mode. The main reason is that things that are closer
to us have more detail. We just need a different way of thinking to
manage that detail. Things that are farther away, we focus less on the
detail and more on abstract categories that they're in.
What problems does this create, the fact that we have a near and far mode? When I read your posts about it, it seems this helps but causes trouble as well when we're trying to think about complicated things.
Obviously we need to be able to handle detail, and if we thought about everything far away as if it had lots of detail, we would run out of space in our heads to think about things. Clearly we need to have these different modes, at some level. But it also seems that our minds have adjusted the parameters of how we think in far mode to take into account that far mode thoughts usually make less personal difference, in terms of our decisions, and more social difference, in terms of how people think about us. When we think about things that are farther away in time, space or social distance, other people are more likely to hear our thoughts and judge us based on them, relative to how much those thoughts matter for what we actually do.
A concrete example is love versus sex. Love is a far mode thought; sex is
a near mode thought. The way we think about love or the way we feel
love is designed, in large part, as P.R. to present to other people
this image of ourselves as a caring, thoughtful, committed sort of
person. We do achieve that; we are that sort of person in far mode when
we think about love, but those love thoughts often don't actually
influence concrete behaviors in ways that would be expensive relative
to the near mode thoughts of, say, sex. When you are focused on
sex, you are in a very near mode where you're focused on details of
concrete things very close in space and time, and more abstract
thoughts go away. Near mode thoughts are extremely important because
who you have sex with is extremely important. You are much more
concretely focused on, well, "Is this a good person to have sex with?"
As far as far mode's effect even on, let's say, the future — one of what seems to be your favorite topics on the blog — how does far mode affect the way people in general think about the future, maybe versus the way you yourself do?
I don't know if it's about me personally versus other people. It's more that all the other things that come with far mode come into thinking about the future. In fact, the literature that focused on this near-far analysis actually started with observations about how people think differently about the distant future. Researchers noticed that when you ask the same question about something farther away in time, people think about it very differently. They explored that further to uncover this whole near-far structure. When we think about things farther away in time, we also tend to think about things farther away in space. So space colonization makes so much sense to us as part of the future, not because we have any good evidence that it's useful or interesting; it just makes perfect sense that things far away in time would be far away in space as well. It also makes sense to us that the far future will be populated with creatures who are very far from us socially. We love the idea of visiting aliens and all sorts of odd, strange cultures, and of course that would be the kind of thing that would happen in a far future: we'd have far acquaintances.
Similarly, we tend to think that simple trends or theories about human
behavior or society would clearly and obviously follow through to their
logical conclusion in the far future, whereas they don't today. When we
see the world today, we see that it's a complicated mess, and we
acknowledge that it's hard to tell which way things are about to go.
But when we think about the far future, all that fades away. We think
there are these clear trends toward increasing niceness or increasing
wealth or being more logical or whatever. It's also true that we're
just less careful about thoughts about the future. This is another mark
of far mode. We're more into loose associations and vague analogies,
and that's enough for us in far mode because we're not actually trying
to avoid getting squashed by something or dying in the far, distant
future. It's more an entertaining place to spin yarns, like talking
about a dragon in a fantasy story. You tell those stories and you're
moderately careful not to be obviously inconsistent, but you're not
very careful to think the whole thing through. It's just a story, after all.
When we think about the future, how much can we counterbalance this drop in resolution that far mode gives us, or is it important to do that?
It's important to do that to
the extent that we actually make choices now that affect the future. If
you were concerned about global warming and you were actually going to
impose a large carbon tax or a cap-and-trade system today because you
were trying to prevent problems in the distant future, it suddenly
becomes a lot more important that you've actually thought through the
details of this distant future rather than projecting onto this future
various concerns you have about feeling guilty that we're consuming too
much or that we're not spiritual enough or we're not unified enough at
a global level — these sorts of vague concerns people like to
express through concern about the future. I don't mean to pick on them;
it's also true of the hard-techie science optimists about the future.
They're also more interested in using the future to talk about why
science and tech is great than to be honest and thoughtful about what
actually is likely to happen.
How do you go about being more honest and thoughtful about the future, then? How do you avoid subconsciously putting money on your favorite horse when you're discussing what's going to go on a hundred, a thousand years down the line?
I think it's more how you make yourself do what you know you already need to do, rather than how do you do it. If you looked at something on the horizon and I asked, "Paint a detailed picture of what it looks like," you would move into a nearer mental mode because you needed to paint that extra detail. I think that's similarly true about thinking about the far future. I'm an economist, and I learned a variety of mathematical and analytic tools for analyzing social behavior and social situations. I think that, for the distant future, you should pull all those tools off the shelf and apply them directly there. A mathematical and analytical frame of mind is a nearer frame of mind, even if it's more abstract in some sense. It's more focused on certain kinds of details. There's the equation and the equals sign and you have to make the symbols work out right, and that's, in a sense, a nearer mode of trying to be careful about those things.
It takes not just an ability, but a willingness. Part of what goes on with the future, unfortunately, is that there really are quite a few people out there with the ability to think carefully and analytically about the future, but they're not in much demand as inspirational speakers. We have the larger problem of this huge selection effect in who we choose as the people we quote on the future. Any one person can choose to be more honest, more analytic, more careful and more near-mode about the future, and they can then come up with more reasonable projections. There are people who do that, but they're usually not very popular.
Before there was a web, there were people thinking about and forecasting and
projecting the web, and they did successfully predict many important
aspects of the web. When the web showed up, people weren't very
interested. Those people didn't win very much, they didn't become rich
or something because they had forecast the web. At the time they were
forecasting the web, people weren't very interested in those forecasts
either, except to the extent that they spoke to concerns about the
little people or the big people or math versus English. There was a
range of ways people framed that issue as speaking to their concerns at
the time, but they really weren't interested in actually forecasting.
Another casualty of the unpopularity of honesty?
All feedback welcome at colinjmarshall at gmail.
Posted by Colin Marshall at 12:04 AM