To be clear, sometimes authority bias is good and proper. For instance, valuing the opinion of a climate scientist who has been studying climate chaos for thirty years over that of your aunt, who saw Rush Limbaugh say climate change is a hoax in the 1990s, is normal and rational.
Basically, authority bias as a reasoning flaw stems from misidentifying who is authoritative on a subject.
In a vacuum, appealing to authority is fallacious. An idea must stand up on its own merits.
IRL, things get fuzzy. No one has the expertise and time to derive everything from first principles and redo every experiment ever performed. So, sadly, we have to place some level of trust in other people.
As long as the paper documents the experiment well and it's double-blind, you don't need to appeal to authority.
Counterpoint: the replication crisis
Well, most people will choose a politician or actor over an unknown Nobel Prize winner. That's how we got here.
Not all bias is created equal, or always something negative. Sometimes it's good to be biased towards the opinion of a scientist over the opinion of your aunt.
I guess authority bias is most absurd when one tries to use it as a crutch to validate an argument.
You should believe me simply because ‘x’ researcher said this about the topic
I have to respectfully disagree with your example. Ostensibly the researcher should be an authority. I think the example given in the chart is not quite right either. I think the confusion comes from the three definitions of "authority":
1. the power or right to give orders, make decisions, and enforce obedience. "he had absolute authority over his subordinates"
2. a person or organization having power or control in a particular, typically political or administrative, sphere. "the health authorities"
3. the power to influence others, especially because of one's commanding manner or one's recognized knowledge about something.
In your example the "Authority" is definition 3: someone with specialized knowledge of a topic who should be listened to by those who are lay on the topic.
In the chart I think they were going for definition 1, which is the correct source of authority bias, but they didn't want to step on toes or get political. The actual example is someone who has decision-making authority, like a police officer, a politician, or a boss at a workplace, who says things a listener automatically believes regardless of the speaker's actual specialized knowledge of the topic they are speaking on. A better example would be "Believing a vaccine is dangerous because a politician says it is."
This all feeds into a topic I have been kicking around in my head for a while and have been contemplating writing up as a book: "The Death of Expertise." So many people have been so brainwashed that authorities under definition 3 are met with a frankly asinine amount of incredulity, while authorities under definition 1 are trusted regardless of education or demonstrable specialized knowledge.
I'll also have to respectfully disagree with you on this. If I'm listening to someone speak on a topic who is, by your 3rd definition, an authority on it, that is not a yardstick for them to claim correctness. Yes, I might be better off listening to them than to a layperson, but it still doesn't give them the right to claim correctness, nor does it grant me the right to rehash these claims and say that I should be listened to because I'm regurgitating the words of an expert. All assertions should be backed up by verifiable sources.
I'm interested to hear about that book, though.
Fair enough argument. I do wonder who, in your opinion, is someone who can justifiably have authority on a topic if not a topic expert? Who is reasonable to be educated by?
As for the book, at this point I have not put pen to paper, as it were, but the premise is the observation that there is a concerted effort on the part of some political parties to sow so much doubt in subject experts as to render their knowledge meaningless to the general populace, and how dangerous that becomes when the situation is something with potentially dire consequences. I have seen it happening for a long time, but it really came to a head for me in 2020 when I saw entirely lay politicians and pundits undermining warnings from virologists, epidemiologists, and statisticians and sowing distrust in public health organizations, essentially trading people's lives for political points. Since then I have seen an ever-escalating trend of people in category 1 of authority pushing the populace away from category 3 on topics that really only category 3 should be talking about at all. The rest of us should be shutting up and taking notes, asking questions for clarification, and learning.
Abortion, gender identity, climate change, economics, geopolitics, etc. Essentially every topic that has been politicized into a hot-button issue is really something so complex that we should not be arguing with the people who have dedicated their entire adult lives, sometimes 40+ years, to studying it.
My father has the perfect microcosm anecdote from his working days. He worked for a garage door manufacturer who hired some fresh-faced MBAs into middle management. They were all sitting in a meeting one day and thought they had come up with an amazing idea, so they took it to the veteran engineers who had been designing garage door openers for decades, some of them essentially since the damn things were invented, and told them to build their harebrained idea. The engineers looked over what they were given and explained that they had had the same idea decades earlier, that it did not work, and that materials science and engineering had not progressed to the point where it would be feasible. Did the MBAs, who were trying to make waves and a name for themselves, listen? Nope. They fired all of the veteran engineers, hired in a bunch of fresh-faced engineers who had never actually designed a garage door opener, and told them to build the harebrained idea. The new engineers, knowing only what they had learned in school and from a couple of years in other jobs, got excited by this revolutionary idea and dove into it. Fast forward about two years, and millions in R&D, and we find the fresh-faced engineers, now not so fresh, somberly telling the MBA dickheads exactly what the veteran engineers had told them initially. This, along with a few other boneheaded schemes to make earnings sheets look better for the MBAs, ended up tanking the company, and it was sold about 10 years later.
Subject expertise matters. Respecting subject expertise matters. Being able to recognize when you are sitting atop Mount DK is one of the finest skills we could ever teach our children.
Like I said, if an authority on a subject (an academic or an experienced individual) is only stating a priori or a posteriori facts about a topic, then it's all well and good. However, if they're discussing a topic for which there is no commonly agreed opinion, or for which an answer exists but they are not privy to it, it would be wrong for them to use that authority to claim correctness over another, say an interlocutor.
I have a PhD in an 'x'-related field, so even though I'm not too sure, I must be right.
This does not, however, mean that their opinion isn't worth listening to. We would be better off listening to category 3 authority figures than to anyone else on subject-specific matters.
On the topic of your book, I completely agree with the premise about politicians and their efforts to "dumb down" the populace. Something which, in America at least, will only be exacerbated by a Trump presidency and his likely implementation of the Project 2025 manifesto. I think many of these things are due to the conservative party aiming to transform America into a Christian theocracy, which would practically make it an authoritarian state.
I also think it's worth noting the public's own influence in undermining scientific praxis through the rise of anti-intellectualism in the form of flat earthism, climate change denialism, and Christian theocrats. There are many people being given a platform who do not deserve one, e.g. Terrence Howard and his pseudoscience. The public seemingly has a fascination with engaging with these absurd opinions from category 1 authorities, which contributes to the rise of anti-intellectualism. There's also the demonization of university, especially by Gen Z, and the downplaying of scientific reasoning in favour of "freedom of thought," a.k.a. wokeism. I use the term in the sense it's used today: excessive political correctness, cancel culture, or an overemphasis on perceived victimhood. There are many liberals here who will not be pleased by my use of the term, but I think it's worth condemning not only conservatives but also the ideologies of many radical liberals (my opinion on this is not settled, however, so I am open to change).
There are so many more factors at play here, such as postmodernism (which is thankfully unpopular now), populist anti-elitism, and the pursuit of knowledge only when it has material benefit, but this is already long enough as it is.
You hit the nail on the head regarding the conservative agenda, though that is not as impressive as it once was since they have all started saying the quiet parts out loud. Anyone with enough brain cells in close contact to notice that Jesus was about as anticapitalist as you could possibly get is appalled and concerned for their safety. At least all of the ones I know are.
To your point on the authority of a postdoctoral-level person who assumes in hubris that they are right, I feel like they have kinda earned it. It is also likely that their "wrong" is going to be far closer to right than that of a layperson. I am personally a polymath, so I don't often find topics where I am a layperson, but when I do, I listen to topical experts and respect what they say, while checking the veracity of things that feel off to me using reputable journals and prepublication articles on the likes of arXiv.
If someone is lay and they are either unable or unwilling to do the diligence to verify the person claiming topical authority, then they really need to just take what is said at face value and not enter the more global conversation aside from trying to learn more (e.g., asking questions). I am so tired of my numbskull uncle claiming that Anthony Fauci doesn't know how viruses work, or some distant relation bitching about how student loan forgiveness is theft from taxpayers. My uncle knows nothing about virology and my distant relation couldn't parse economic principles to save his life, but there they sit, acting as counter-authorities to people with doctorates and 40+ years of professional experience. That is the part I want to see stop. If you have a Master's degree, fine, argue with the expert, but if you have never set foot inside a classroom where the topic was being taught, just don't. Your opinion is woefully uninformed and thus not worth the CO2 you expended to voice it.
I do like your take on the societal and philosophical underpinnings for the Death of Expertise. It gels well with some things and gives me some avenues to investigate should I finally get fed up with this world enough to write it. Until that time, I will just keep Farnsworthing it.
How do you validate an argument?
YSK: the Dunning-Kruger effect is controversial because it's part of psychology's replication problem.
Other famous psychology experiments, like the Stanford prison experiment and the Milgram experiment, fail to show what you learned in Psych 101. The prison experiment was so flawed as to be useless, and variations on the Milgram experiment show the opposite effect from the original.
For those familiar with the Milgram experiment: one variation of the study saw the “scientist” running the test replaced with a policeman or a military officer. In these circumstances, almost everybody refused to use high voltage.
Controversial in the sense that it can be easily applied to anyone. There is some substance to the idea that a person can trick themselves into thinking they know more based on limited info. A lot of these biases are like that; they aren't cut and dried, but more of a gray area where people can be fooled in various ways. Critical thinking is hard even when it's taught, and it's not taught well enough, or at all.
And all of that is my opinion and falls into various biases, but oh well. The easiest person to fool is yourself, because our brains are hardwired to want to be right, rewarding us when we find things that confirm it even if the evidence is not valid. I think the best way to avoid the pitfalls is to always back up your claim with something. I've often(!) found myself erasing a response to someone because what I was going to reply didn't have the data behind it that I thought it did, and I couldn't show I was correct after digging a bit to find something.
I almost deleted this for that very reason, but I want to see how it lands. I feel that knowing there are a lot of biases anyone can fall into can help form better reasoning and arguments.
But how will other redditors know how smart I am if I don't regurgitate what I read on Reddit?
What bias would that fall under? One could assume the variation has to do with the average American’s trust of law enforcement vs their trust of a qualified person.
(Assuming the repeat experiments were done in the US that is)
What bias is it if the only entry I’ve read in this table is the one for confirmation bias?
Confirmation bias bias
Probably… Selection bias?
Survivorship bias?
For negativity bias, my wife just told me about a great technique she uses. Come up with a list of people whose opinions matter to you. Any time you question yourself, imagine how each person on that list would react to what you did. Since those are the only people whose opinions matter to you, if the reaction is mostly positive, then you should feel proud of your choice.
Ahh negativity bias, my other middle name.
:P
What's interesting is how, even knowing these biases, one still tends to have and display at least some of them.
(At least, that’s the case for me)
Knowing these helps with self-talk. You trip over a curb and start scolding yourself. Then you can say to yourself “this is just spotlight bias”, and move on with your day, avoiding the impact of negative emotions. Or, you might be more open to a change in restaurant plans because you know of the false consensus effect. There’s subtle but real power in just naming things!
I tripped and fell spectacularly walking in a supermarket. I was annoyed that no one helped me up or checked if I was okay (I didn’t need help but it made me think less of my fellow man) and that my partner was waiting in the car and didn’t witness it, because it was actually really funny.
I left embarrassment in my 20s. Don't have the energy or interest in it now. And I know I'm not the main character; everyone's living their own lives, and the impact you make on strangers is minimal. At worst someone said when they got home from the shops, 'I saw this chick stack it and it was kinda funny.'
Reminding yourself that no one really cares about people they don't know is a helpful way to shut down the negative self-talk.
That’s a good point.
Ever since I became more aware of these, I've found myself doing a similar kind of "disarming" of such fallacies when I notice I'm using them.
My point is that it generally feels like swimming against the current.
You’re absolutely right there. We’re hard wired to think this way and it’s a constant battle.
How to develop the mental discipline to jump to naming the bias in emotional situations like that though??
Repetition.
What do I win once I tick them all off?
a senate seat.
a pair of human pants and a human burrito with meat
I'd love to see a list of names for the writing devices used by trolls/propagandists that generate completely false information of varying types. Forced binary choices when a third way is valid or the choices aren't even related, for example. Most of them are just plain old lies, so I don't think the list would be too long.
Actually the reason I order the last item the server mentioned is because of crippling social anxiety
Same for not standing up in the middle of everyone to go out from watching a bad movie in the cinema.
I’m out here actively going against my biases and selling someone else’s house above market value 😤
I have a strong coherence bias. The less coherent a person, the less believable they seem.
That’s just, like, your opinion, man.
False Consensus Effect and Narcissistic Personality go hand in hand. Can't tell you the number of times my narcissistic coworker has started trash-talking people I like a hell of a lot more than them, assuming I agree.
What’s the cognitive bias for believing that any given chart is the ULTIMATE CHART. Yes yes, YOUR chart is gospel, the exhaustive definitive final chart 🙄
Oh ffs it gets worse with the Don’t Forget To Like And Subscribe whine beg plead for internet fart points at the bottom
Is there a “positivity bias” counterpart to “negativity bias”?
I'd argue that's outcome or survivorship bias, where people focus on the one winner, or the one time a thing worked, and ignore all the other failures. E.g. people think investors all swim in money because they only see the Warren Buffetts of the world, when in reality there are thousands, millions of people who tried almost the exact same thing and lost some or all of their savings in bad investments.
Absolutely! It's called the Pollyanna Principle. In fact, there's a counterpart to all of these biases, and they're immensely helpful in certain types of therapy.