Russ Roberts, Robin Hanson EconTalk notes
You can find the link to the conversation here if you're interested, but I think I've covered the major stuff. I did skip the first 28 minutes on a suggestion from a commenter on Overcoming Bias, which was apparently just Russ Roberts giving a monologue on the various ways that economists disagree. There were lots of nuggets, and Robin always gives the necessary framework behind his ideas. I'll summarize the main points:
If there is an incentive for argument to occur, as in a trial or in politics, economists can be disproportionately rewarded for taking contrary positions, which leads to more disagreement than there would be if everyone were solely interested in truth.
If you think that everybody on the other side is biased, you have to ask yourself: why am I different? Robin thinks you have to "bite that bullet." Unless you have hard evidence that you are less susceptible to those same temptations, you have to assume that you are biased too.
The only way to say something untrue and get away with it is to believe it yourself, because humans "leak" when they are lying. This is the evolutionary-psychology explanation for self-deception and many cognitive biases.
Russ: Pragmatist philosophy does not have a lot of romance about it. "Your grandmother is right," even if your grandmother doesn't have a reason. It's hard to analyze your reasons objectively, so you can't fully trust them; sometimes it can be better to go with intuition. Robin: "Certainly there's some truth in that, the question is how much." That is a golden response.
Problems in our hypothesis-forming procedures: 1) The vast majority of your brain does its thinking outside of your awareness. I'm not exactly sure how I got from A to B; my mind has left out a lot of the details. 2) Other people are aware of things that you're not. "They could have thought some stuff that I'm not aware of." You have to ask yourself, "How likely is it that they know something I don't?"
Weak signs that you are willing to be honest: 1) You go against somebody on "your own side" because you don't think their argument is true, even if they are influential. 2) You are not willing to change your argument just to get it published.
However, these are only weak signs, and most people preferentially hold this kind of evidence in memory in order to think of themselves as honest and truth-seeking. Even if you are not at the extreme, you should still be skeptical.
The media selects for the confident people on both sides of a debate in order to get better sound bites.
It's hard for thinking, curious people who have their opinions and intellect as a part of their identity to consistently say "I don't know." It's much more fun and stimulating to have opinions, plus you are more impressive that way. Maybe individuals don't have sufficient incentives to be honest. Robin makes his case for prediction markets again, which in my opinion do not suck.
Calling Bush a "free market ideologue" is the most ridiculous thing anybody could ever say. If anybody says this to my face, they will be mocked publicly and summarily shunned.
What is the least quixotic strategy? Promoting prediction markets, promoting skepticism of theory, or promoting statistical knowledge? The best way to convince people to be skeptical is twofold. You have to show them, with empirical studies, that the newspapers and the consensus are often wrong. And you have to show that those who take that message too much to heart are wrong too, because the average theory will probably be closer to the truth than the outliers.
When you talk about something specific, all the abstract theorizing about skepticism is forgotten and you once again become biased. Robin's solution is to have multiple selves: one that puts its head down and creates a theory, and one that steps back, corrects, and is skeptical of the results.
Right now I think Robin Hanson is the smartest person on the internet. Prove me wrong! Who's better? I'll read their stuff too.