
Every man needs to know how to persuade. Everything from buying a car to asking for a raise, advocating for a client, or convincing your significant other to choose the steakhouse over the French bistro involves the subtle art of persuasion. This eight-part series is dedicated to teaching men how to argue by explaining the most common weaknesses in arguments, and hopefully how to avoid them.

16. The Gambler’s Fallacy

The gambler’s fallacy is best explained through an anecdote. In the summer of 1913, the Monte Carlo Casino witnessed a seemingly rare event: on an otherwise normal day, the ball on one of its roulette wheels fell on black 26 times in a row. During that streak, gamblers collectively lost millions of francs betting against black, convinced that the “imbalance” in the wheel meant red was sure to come up soon, and just as often as black. The faulty reasoning was that since the bet was 50/50 (red or black), a long streak of black had to be balanced by an equally long streak of red. So after black landed 5 times in a row, people bet on red. After 6, they were sure it would be red. After 7 spins, even more sure, and so on.

The thinking here is that if deviations from expected behavior are observed in repeated independent trials of a random process, future deviations in the opposite direction become more likely. This is faulty because each red/black bet is always 50/50, no matter how lopsided the past spins were. This type of thinking is mostly found in casinos and betting arenas, for the obvious reasons illustrated by the Monte Carlo Casino example. To avoid falling into this trap yourself, just remember that the wheel has no memory: the odds on the 26th spin are exactly the same as on the 1st.
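You can check this claim for yourself with a quick simulation. The sketch below (a simplified model that ignores the green zero, so each spin is a fair 50/50 between red and black; the function name and parameters are my own) generates a long run of spins and measures how often red follows a streak of blacks. If the gambler’s fallacy were true, red would follow a 5-black streak far more than half the time.

```python
import random

def red_after_black_streak(k, n_spins=1_000_000, seed=1):
    """Simulate n_spins fair red/black spins (green zero ignored for
    simplicity) and return how often red follows a run of k blacks."""
    rng = random.Random(seed)
    spins = [rng.random() < 0.5 for _ in range(n_spins)]  # True = black
    followers = reds = 0
    run = 0  # length of the current streak of blacks
    for prev, nxt in zip(spins, spins[1:]):
        run = run + 1 if prev else 0
        if run >= k:          # the last k spins were all black...
            followers += 1
            if not nxt:       # ...and the next spin came up red
                reds += 1
    return reds / followers

# After 5 blacks in a row, red still comes up about half the time.
print(red_after_black_streak(5))  # a value near 0.5
```

However long the streak you condition on, the fraction stays near 0.5, because the trials are independent.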

17. The Bandwagon

Andrew Colman said it best when he posited that the probability of any individual adopting a viewpoint increases with the proportion of people who have already done so. This is precisely how the bandwagon effect works. The thinking here is that because lots of people think/do something, you should think/do the same thing as well. After all, the more people believe something, the more correct it must be, right? Wrong. Remember how many people believed in a geocentric universe? Remember when Christians believed in witches and burned them at the stake? Remember when people believed in [insert ridiculous view here]? These were not fringe views; they were widely accepted as true precisely because so many people held them (and perhaps because the price of not believing was death or exile).

The bandwagon appears in many facets of life, namely politics, fashion, and religion. In politics, many people vote for the “popular” candidate because they want to be on the winning team in the end. In fashion, many people dress similarly to everyone else so that they can be “right” in their style opinions. In religion, many people believe in the same god as those around them to avoid exile, stigma, or (in some places) death. The mere fact that many people cling to a viewpoint or belief does not increase its validity; it merely increases its accessibility.

If you want to avoid the bandwagon, make sure that everything you believe or argue for is based on its inherent truthfulness, not on its accessibility or its acceptance by others. This is difficult for things we can’t empirically experience, like the heliocentric model of the solar system. Few of us have been far enough from Earth to watch the planets orbit the sun; we accept this view on the strength of peer-reviewed scientific rigor. This is, to an extent, bandwagon thinking. However, the difference between scientific peer review and ordinary bandwagon thought is that peer review involves independent testing and evaluation rather than deference to the conclusions of others. The bandwagon matters for things we can’t or haven’t experienced personally, but make sure the wagon you jump on has a solid foundation and isn’t premised merely on people’s willingness to belong to a group.

18. Authority Appeal

The authority appeal is the fallacy of assuming that something must be true simply because someone in a position of authority believes it. This kind of deference is often useful (see the heliocentric example above), but it shouldn’t be relied upon wholesale.

For instance, say there’s an expert in astrophysics, Neil deGrasse Tyson. One day he says, “Little green men are propelling the planets around the sun.” If someone responds, “Well, Tyson is an expert in astrophysics, so what he says must be true,” then they have made an improper appeal to authority. Simply being an expert does not make someone infallible; that is the basic premise of scientific exploration and peer review. Authority figures are great for bolstering a claim in an argument, but their word alone should not be trusted simply because they were right in the past. While their contributions to science are noteworthy and useful to us, we should make sure that what they posit is reasonable or plausible, rather than relying blindly on their position of authority.

What examples of these logical fallacies have you experienced? Interested in more? Go read part 5.