
Bi-Weekly "Metapolitics" Discussion in Fishtown

Hosted By
Brian B.

Details

This discussion, like the one back on September 17th, is a bit more philosophical and explores some of the political essays from the Less Wrong forum, particularly the "Politics is the Mind-Killer" sequences. (Note: Less Wrong has several series of connected essays which they call "Sequences".)

The last set of Less Wrong essays looked at why we don't typically see all "truth" converge around one political philosophy. This time, we look at whether we can expect "truth" to converge around the expert consensus on a specific political issue. This raises the question of whether "argument from authority" and "argument from majority" - normally considered informal logical fallacies that cannot prove a case definitively - can still generate "strong Bayesian evidence" and be used to construct "expert consensus", a useful tool for laymen to resolve political issues. The essays also address what level of resolution expert consensus gives us on a particular issue - is it the end of the discussion or just a good starting point? And what should we make of experts who eschew the consensus in their field - how do we separate the crackpots from the lone geniuses who are ahead of their time?

Less Wrong started as an offshoot of Robin Hanson's blog, "Overcoming Bias", and many Less Wrong regulars became acquainted with the unique brand of "libertarian-ish" ideas promoted by Hanson and the other economists at George Mason University (GMU). The GMU economists - particularly Bryan Caplan - have heavily debated the reliability of expert consensus. Caplan made a strong argument for favoring expert consensus in his book, The Myth of the Rational Voter (discussed in past meetups), and it appears that Less Wrong members picked up his arguments & engaged with them, and that Eliezer Yudkowsky developed his own unique interpretation that favors contrarianism in certain cases. (Caplan seems to have struck his own balance between favoring expert consensus & contrarianism - he is, after all, an anarcho-capitalist, an uncommon stance among economists.)

Each of the following articles is very short, except for essays #14 & #15 by Scott Alexander and #16 by Chris Hallquist. Hopefully some of you can read (or at least skim) all of them. For those who don't have the time to read everything, I've added some notes to each article to give you a general sense of the points it's making.

LESS WRONG DISCOVERS THE "EXPERT CONSENSUS" ARGUMENT:

  1. Bryan Caplan, "Trust the Experts: A Reasonable, Defensible Presumption"

http://econlog.econlib.org/archives/2007/05/trust_the_exper.html

Caplan points out that we don't have the time to independently examine the evidence for & against every proposition, so a presumption of trusting the expert consensus on most matters is "defeasible" - i.e. rationally compelling although not deductively valid.

  • For the definition of "defeasible reasoning", see Wikipedia:

https://en.wikipedia.org/wiki/Defeasible_reasoning

  2. Chris Hallquist, "Trusting Expert Consensus"

http://lesswrong.com/lw/iu0/trusting_expert_consensus/

Hallquist suggests we should calibrate our confidence on empirical questions based on the strength of the expert consensus (a toy Bayesian sketch follows the list below):

  • When the data show an overwhelming consensus in favor of one view (say, if the number of dissenters is less than the "Lizardman's Constant" - i.e. the 4% or so of people who give crazy answers in polls: http://slatestarcodex.com/2013/04/12/noisy-poll-results-and-reptilian-muslim-climatologists-from-mars/), this almost always ought to swamp any other evidence a non-expert might think they have regarding the issue.

  • When a strong but not overwhelming majority of experts favor one view, non-experts should take this as strong evidence in favor of that view, but there's a greater chance that evidence could be overcome by other evidence (even from a non-expert's point of view).

  • When there is only barely a majority view among experts, or no agreement at all, this is much less informative than the previous two conditions. It may indicate agnosticism is the appropriate attitude, but in many cases non-experts needn't hesitate before having their own opinion.

  • Expert opinion should be discounted when it could be predicted solely from information not relevant to the truth of the claims. This may be the only reliable, easy heuristic a non-expert can use to figure out that a particular group of experts should not be trusted.

  • This last point means that if a group of experts' opinions all tend to cluster into a standard political ideology, they should be discounted. This recalls Eliezer Yudkowsky's essay, "Policy Debates Should Not Appear One-Sided," which argues that we shouldn't expect to see a convergence of evidence around political policy issues because they usually deal with multi-factorial phenomena - so almost any course of action has both costs/risks & benefits - and these issues combine positive & normative elements. If experts in a field all prefer the same types of tradeoffs and have the same norms/values, this may make policy debates artificially appear one-sided. - http://lesswrong.com/lw/gz/policy_debates_should_not_appear_onesided/
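
To make the first and third bullets above concrete, here is a toy Bayesian sketch (the numbers are mine, not Hallquist's) of how an overwhelming consensus can swamp a layman's weak contrary evidence while a bare majority cannot:

    # Toy sketch (my numbers, not Hallquist's): posterior odds = prior odds x likelihood ratio.
    # Treat the expert consensus as setting the prior for claim X, and a layman's contrary
    # hunch as weak counter-evidence (a Bayes factor of 0.5 against X).

    def posterior_probability(prior_prob, likelihood_ratio):
        """Update a probability with a Bayes factor, using the odds form of Bayes' theorem."""
        prior_odds = prior_prob / (1 - prior_prob)
        posterior_odds = prior_odds * likelihood_ratio
        return posterior_odds / (1 + posterior_odds)

    weak_hunch_against_x = 0.5

    print(posterior_probability(0.96, weak_hunch_against_x))  # ~0.92 - overwhelming consensus swamps the hunch
    print(posterior_probability(0.60, weak_hunch_against_x))  # ~0.43 - a bare majority can be overturned
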
  3. Kaj Sotala, "Fallacies as Weak Bayesian Evidence"

http://lesswrong.com/lw/aq2/fallacies_as_weak_bayesian_evidence/

The above essay doesn't specifically address the informal fallacies of "argument from authority" or "argument from popularity" (which combine to form "argument from expert consensus"), but it addresses the way several other informal fallacies generate Bayesian evidence, so it's relevant.

  • Sotala's post is essentially a summary of a larger philosophical essay by Ulrike Hahn & Mike Oaksford entitled, "The Rationality of Informal Argumentation: A Bayesian Approach to Reasoning Fallacies"

https://www.researchgate.net/profile/Mike_Oaksford/publication/6199092_The_rationality_of_informal_argumentation_a_Bayesian_approach_to_reasoning_fallacies/links/0c96051e9a3d941970000000.pdf
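
As a rough illustration of the Bayesian reading (my own sketch, not Sotala's or Hahn & Oaksford's): "an authority asserts X" is evidence for X exactly to the degree that the authority is more likely to assert X when it's true than when it's false, which is why the same argument form can range from nearly worthless to fairly strong.

    # Sketch (my own): "an authority asserts X" treated as Bayesian evidence for X.
    # Its strength is the likelihood ratio P(asserts X | X true) / P(asserts X | X false).

    def update_on_assertion(prior, p_assert_if_true, p_assert_if_false):
        """Posterior probability of X after hearing the authority assert X."""
        numerator = p_assert_if_true * prior
        return numerator / (numerator + p_assert_if_false * (1 - prior))

    print(update_on_assertion(0.5, 0.9, 0.2))  # careful authority: posterior ~0.82
    print(update_on_assertion(0.5, 0.6, 0.5))  # careless authority: posterior ~0.55 - weak, but not zero, evidence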

  4. Scott Alexander, "Epistemic Learned Helplessness"

http://squid314.livejournal.com/350090.html

Scott invents the term "epistemic learned helplessness" to mean that "If I know that a false argument sounds just as convincing as a true argument, argument convincingness provides no evidence either way, and I should ignore it and stick with my prior [i.e. the expert consensus]."

  • Scott's idea of epistemic learned helplessness is similar to one advanced by Eliezer Yudkowsky in "Expecting Short Inferential Distances" - i.e. the "inferential distance" between experts & the average person is often too great for the average person to be able to understand them. Today, we have "abstract disciplines with vast bodies of carefully gathered evidence generalized into elegant theories transmitted by written books whose conclusions are a hundred inferential steps removed from universally shared background premises."

https://wiki.lesswrong.com/wiki/Inferential_distance

ELIEZER YUDKOWSKY ARGUES FOR CONTRARIANISM:

  5. Eliezer Yudkowsky, "Argument Screens Off Authority"

http://lesswrong.com/lw/lx/argument_screens_off_authority/

Eliezer argues that "a good technical argument is one that eliminates reliance on the personal authority of the speaker... provided we have enough technical ability to process the argument." But he adds a caveat: "In practice you can never completely eliminate reliance on authority. Good authorities are more likely to know about any counterevidence that exists and should be taken into account; a lesser authority is less likely to know this, which makes their arguments less reliable. This is not a factor you can eliminate merely by hearing the evidence they did take into account. It's also very hard to reduce arguments to pure math; and otherwise, judging the strength of an inferential step may rely on intuitions you can't duplicate without the same thirty years of experience.... But this slight strength of authority is only ceteris paribus, and can easily be overwhelmed by stronger arguments. "
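
A minimal sketch of what "screens off" means here (the numbers are my own illustration, not Yudkowsky's): once you condition on the technical argument itself, learning who offered it barely moves the probability.

    # Toy numbers (mine) illustrating "screening off" as approximate conditional independence:
    # P(claim true | argument quality, authority) is roughly P(claim true | argument quality).

    p_true = {
        # (argument_checks_out, high_authority): P(claim is true)
        (True,  True):  0.95,
        (True,  False): 0.93,  # strong argument from a nobody: almost as good
        (False, True):  0.30,  # weak argument from a big name: the name adds little
        (False, False): 0.25,
    }

    for (checks_out, big_name), p in p_true.items():
        print(f"argument checks out={checks_out!s:<5}  high authority={big_name!s:<5}  P(true)={p:.2f}")

    # Within each value of "argument checks out", authority shifts the probability only slightly:
    # the argument has screened off most of what the speaker's identity could have told us.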

  6. Eliezer Yudkowsky, "Hug The Query"

http://lesswrong.com/lw/ly/hug_the_query/

Yudkowsky explains: "In the art of rationality there is a discipline of closeness-to-the-issue—trying to observe evidence that is as near to the original question as possible, so that it screens off as many other arguments as possible. The Wright Brothers say, 'My plane will fly.' If you look at their authority (bicycle mechanics who happen to be excellent amateur physicists) then you will compare their authority to, say, Lord Kelvin [the renowned physicist who infamously claimed in 1895 that heavier-than-air flight was impossible], and you will find that Lord Kelvin is the greater authority. If you demand to see the Wright Brothers' calculations, and you can follow them, and you demand to see Lord Kelvin's calculations (he probably doesn't have any apart from his own incredulity), then authority becomes much less relevant. If you actually watch the plane fly, the calculations themselves become moot for many purposes, and Kelvin's authority not even worth considering."

  7. Eliezer Yudkowsky, "The Majority is Always Wrong"

http://lesswrong.com/lw/hd/the_majority_is_always_wrong/

Yudkowsky argues that "in any case where you've got (1) a popularity effect (it's easier to use something other people are using) and (2) a most dominant alternative, plus a few smaller niche alternatives, then the most dominant alternative will probably be the worst of the lot - or at least strictly superior to none of the others." This ties in with Scott Sumner's point in Essay #12 below, where he points out that a lack of independence of thought renders the "wisdom of the crowd" much less useful because it leads to information cascades & herd behavior.

  • Wikipedia has a useful entry for "information cascade":

https://en.wikipedia.org/wiki/Information_cascade
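
A small simulation of the popularity effect (the model and numbers are my own, not Yudkowsky's): each newcomer weighs the visible adoption counts against one weak private signal, so an early streak of bad luck can lock the whole field onto the inferior option.

    import random

    # Toy information-cascade simulation (my assumptions): option "B" is genuinely better,
    # but agents choose by weighing current popularity against a weak private signal.

    def simulate(n_agents=200, signal_accuracy=0.6, herd_weight=0.5, seed=0):
        random.seed(seed)
        adopters = {"A": 0, "B": 0}
        for _ in range(n_agents):
            # The private signal points to the truly better option "B" with probability signal_accuracy.
            signal = "B" if random.random() < signal_accuracy else "A"
            score = {opt: herd_weight * adopters[opt] + (1 if opt == signal else 0)
                     for opt in ("A", "B")}
            adopters[max(score, key=score.get)] += 1
        return adopters

    for seed in range(5):
        print(seed, simulate(seed=seed))
    # Most runs converge on "B", but some seeds cascade onto the inferior "A" - later agents
    # rationally copy the crowd, and their private signals stop reaching the pool of evidence.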

  8. Eliezer Yudkowsky, "The Contrarian Status Catch-22"

http://lesswrong.com/lw/1k4/the_contrarian_status_catch22/

This essay comes close to making the other half of the central argument in "The Majority Is Always Wrong" - i.e. just as mainstream ideas benefit from being popular aside from their actual merit, contrarian ideas suffer from being unpopular aside from their merit. Yudkowsky theorizes that a Catch-22 situation keeps many plausible contrarian opinions from being taken seriously by the mainstream: When an appealing contrarian idea comes along, its initial proponents are likely to be those with less sensitivity to conformity and more intellectual confidence who will come across as "arrogant". And once that happens, the only people who'll be willing to believe the idea will be those willing to tar themselves by affiliating with a group of arrogant nonconformists, which limits the idea's chances for mainstream adoption.

  9. Robin Hanson, "The Smart, Sincere Contrarian Trap"

http://www.overcomingbias.com/2016/10/smart-sincere-contrarian-trap.html

What Eliezer Yudkowsky calls the "contrarian status catch-22" is related to what the GMU economist Robin Hanson calls the "smart sincere contrarian trap" - smart & sincere yet socially naive people may stumble upon evidence that leads them to adopt contrarian positions without realizing there's a social penalty to be paid for publicly espousing them, and they can't successfully advocate for these contrarian beliefs (especially if they adopt more than one) if they have insufficient social status.

  10. Yvain (Scott Alexander), "Intellectual Hipsters and Meta-Contrarians"

http://lesswrong.com/lw/2pv/intellectual_hipsters_and_metacontrarianism/

As a counter-point to Robin Hanson's essay, Scott points out that more status-conscious smart people may be "intellectual hipsters" who adopt contrarian beliefs as a type of social counter-signalling to distinguish themselves from middlebrow folks who are primarily preoccupied with signalling that they're not stupid lowbrows. He gives several examples, such as highbrow intellectuals adopting libertarian or neoreactionary beliefs to distinguish themselves from middlebrow liberals who take pains to distinguish themselves from redneck conservatives.

BALANCING EXPERT CONSENSUS WITH CONTRARIANISM AND FINDING THE "CORRECT CONTRARIAN CLUSTER":

  11. Bryan Caplan, "Me versus The Economic Consensus"

http://econlog.econlib.org/archives/2007/10/me_versus_the_e.html

Caplan explains that he only has a "presumption of expert consensus" and considers "expert consensus" to provide common ground and a useful starting point for productive discussions between people who disagree about a subject. He doesn't see a problem with his personal economic views (anarcho-capitalism) differing from the economic consensus because he thinks the expert consensus is biased towards nationalism & socialism (although not as badly as the popular misconceptions of economics), and because he's a moral realist whose values differ from those of most of his colleagues.

  12. Scott Sumner, "Why Bryan Caplan Almost Always Wins His Bets"

http://econlog.econlib.org/archives/2016/05/why_bryan_capla_1.html

Sumner points out that for expert consensus to function optimally - analogous to the "efficient market hypothesis" (EMH) - individual experts must make their decisions on issues without regard for what the majority of other experts have to say on the subject. This makes sense in terms of "independence of thought" being one of the four elements necessary to form a "wise crowd" - it averts the information cascades that lead to things like speculative market bubbles and moral panics.

https://en.wikipedia.org/wiki/The_Wisdom_of_Crowds#Four_elements_required_to_form_a_wise_crowd
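
A quick numerical sketch of why that independence matters (the setup is mine, not Sumner's): averaging many independent noisy estimates cancels their errors, but an error everyone copies never averages out.

    import random
    import statistics

    # Toy sketch (my assumptions): 1,000 experts estimate a quantity whose true value is 100.

    random.seed(42)
    true_value = 100.0
    n_experts = 1000

    independent = [true_value + random.gauss(0, 10) for _ in range(n_experts)]

    shared_error = random.gauss(0, 10)  # one mistake everyone copies - an information cascade in miniature
    herded = [true_value + shared_error + random.gauss(0, 3) for _ in range(n_experts)]

    print("independent crowd mean:", round(statistics.mean(independent), 2))  # close to 100
    print("herded crowd mean:     ", round(statistics.mean(herded), 2))       # off by roughly the shared error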

  13. Eliezer Yudkowsky, "The Correct Contrarian Cluster"

http://lesswrong.com/lw/1kh/the_correct_contrarian_cluster/

Yudkowsky speculates that it may be possible to find the "correct contrarian cluster" by finding a few contrarian "slam dunks" you think are correct (he lists atheism, the many-worlds interpretation of quantum mechanics & denial of p-zombies) and then polling experts to see who agrees with them and how they view other issues.

  14. Scott Alexander, "The General Factor of Correctness"

http://slatestarcodex.com/2015/07/23/the-general-factor-of-correctness/

Scott suggests that the most promising place to look for a "general factor of correctness" that might help us find the "correct contrarian cluster" is Phil Tetlock's work on geopolitical forecasting with the Good Judgment Project: "These people aren’t succeeding because they parrot the experts, they’re not succeeding because they have more IQ or education, and they’re not succeeding in some kind of trivial way like rejecting things that will never happen. Although the article doesn’t specify, I think they’re doing something more than just being well-calibrated."

  • Many members of the Less Wrong forum have looked at the work of psychologists like Daniel Kahneman & Keith Stanovich on cognitive biases, and hope that by learning about & eliminating these biases they can increase their rationality. In 2012, a nonprofit called the Center for Applied Rationality (CFAR) began developing workshops to train people in avoiding cognitive biases and in using the types of rationality skills (e.g. Fermi estimation, Bayesian inference, cognitive bias mitigation) that the top forecasters use:

https://en.wikipedia.org/wiki/Center_for_Applied_Rationality

http://rationality.org/faq/
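
As an aside, a Fermi estimate - one of the skills mentioned above - is just a chain of rough, order-of-magnitude factors multiplied together; here's a classic-style example where every number is my own loose assumption:

    # Fermi estimation sketch: roughly how many piano tuners work in a city of ~1.5 million?
    # Every factor below is a rough, order-of-magnitude assumption of mine.

    population             = 1_500_000
    people_per_household   = 2.5
    piano_ownership_rate   = 0.05   # ~1 in 20 households owns a piano
    tunings_per_piano_year = 1      # each piano tuned about once a year
    tunings_per_day        = 4      # one tuner can do ~4 tunings a day
    working_days_per_year  = 250

    pianos = population / people_per_household * piano_ownership_rate
    tunings_needed = pianos * tunings_per_piano_year
    tuner_capacity = tunings_per_day * working_days_per_year

    print(round(tunings_needed / tuner_capacity))  # ~30 tuners - the right order of magnitude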

  15. Scott Alexander, "Contrarians, Crackpots and Consensus"

http://slatestarcodex.com/2015/08/09/contrarians-crackpots-and-consensus/

Scott Alexander suggests that a useful heuristic for distinguishing potentially correct contrarians from crackpots is the way the mainstream experts treat them. Correct contrarians are often working with some anomalous results in their field that are (for now) mostly ignored by the mainstream, whereas crackpots are vehemently opposed & ridiculed by the mainstream because their theories aren't at the cutting edge of research but rather contradict the mainstream's well-established fundamentals. Contrarians also tend to be "people with enough expertise to understand a field who nevertheless acquired that expertise outside of the field itself," whereas crackpots tend to have little or no understanding of the field or the mainstream consensus they're contradicting.

  16. Chris Hallquist, "Self-Congratulatory Rationalism"

http://lesswrong.com/lw/jq7/selfcongratulatory_rationalism/

Chris complains that the Less Wrong community is too prone to adopting contrarian beliefs for the type of social counter-signalling Scott analyzed in his "Intellectual Hipsters" essay. He makes the case that raw intelligence shouldn't be mistaken for subject-matter expertise, and argues for more intellectual humility & less self-assurance. He also advocates against groupthink & tribalism within the Less Wrong community - being less dismissive of those outside your in-group, while not taking the rationality of those inside it for granted.

Philadelphia Political Agnostics
Front Street Cafe
1253 N Front St · Philadelphia, PA