What is real can seem pretty arbitrary. It's easy to be fooled by misinformation disguised as news and deepfake videos showing people doing things they never did or said. Inaccurate information - even deliberately wrong information - doesn't just come from snake-oil salesmen, door-to-door hucksters and TV shopping channels anymore.
Even the president of the United States needs constant fact-checking. To date, he has made an average of 15 false or misleading public claims every day of his presidency, according to a tally from the Washington Post.
The study of business history reveals that people everywhere have always had a sweet tooth for the unreal, enthralled by claims that should be recognized as too good to be true.
Cognitive scientists have identified a number of common ways in which people avoid being gullible. But con artists are especially skillful at what social scientists call framing: telling stories in ways that appeal to the biases, beliefs and desires of their targets. They use strategies that exploit well-known human weaknesses.
Often, people who are "emotionally vulnerable" are unwilling to accept an unpleasant reality. Consider Sir Arthur Conan Doyle, the British author who created Sherlock Holmes, the ultimate deductive rationalist - a character who said, "When you have eliminated the impossible, whatever remains, however improbable, must be the truth."
Yet, after experiencing family tragedies and the horror of the deaths in World War I, Doyle publicly announced in 1916 that he subscribed to Spiritualist beliefs, including that the spirits of the dead can communicate with the living.
In 1922, Doyle visited Harry Houdini at his home in New York City, where Houdini showed him a clever magic trick involving automatic writing on a suspended slate. Houdini could not convince the stunned Doyle that it wasn't paranormal activity.
Sometimes people so badly covet what their peers have already achieved that they will overlook the obvious, deceiving themselves and others in an effort to claim better opportunities and a better life.
In 1822, a Scottish con man, Gregor MacGregor, convinced countrymen seeking easy wealth and lives as comfortable as their neighbors' to buy bonds, land and special privileges, fill two ships and sail to an idyllic country: the Land of Poyais.
MacGregor priced land in Poyais to make it affordable to Scottish tradesmen and unskilled workers who had heard of promising South American investments but lacked the means to take advantage of them. Poyais had a distinctive flag, its own currency and a diplomatic office in London. The only problem was that Poyais did not exist. Most of those who sailed died on the Mosquito Coast of Honduras. Some of the few survivors were so taken in that they refused to accept that Poyais did not actually exist and argued that it was MacGregor who had been defrauded.
Greed is blinding
Greed can prevent people from seeing that they have made a decision that defies common sense.
In 1925, the con artist Victor Lustig took advantage of the French government's public complaints that it would cost more to renovate the decaying Eiffel Tower than to demolish it. He gathered scrap iron dealers, convinced them the tower would be taken down and sold it to one of them. Then he sold it again. Lustig gained a reputation as the "man who sold the Eiffel Tower."
Swindlers can find opportunity in their marks' ignorance and unfamiliarity with local customs. The confidence man George C. Parker sold the Brooklyn Bridge four times, usually to recent immigrants who did not understand that the bridge could not be sold. He also sold Grant's Tomb, the Metropolitan Museum of Art and the Statue of Liberty.
Misery generates desperate belief
Desperate people can suspend disbelief; they assume promises have to be true when the alternative is too miserable. John D. Rockefeller's father, William, was a bigamist who rode a circuit through rural towns, selling alleged cures and ineffective patent medicines to ailing people. Bill "Doc" Rockefeller is said to have tutored his son, the builder of the Standard Oil Trust, in business.
People believe stories because they trust those who tell them. They don't know how to, or don't want to bother to, investigate the claims - or see no need to do so.
Starting as early as the mid-1980s, swindler Bernie Madoff sought investors in his Ponzi scheme among wealthy Jewish retirees and their philanthropic organizations in the U.S., and, in Europe, among members of aristocratic families. His victims simply trusted others in the group who vouched for Madoff and his investments.
In 1912, a skull, some bones and other relics were found in Piltdown in East Sussex in the U.K. The remains appeared to be from a creature who could be the long-sought "missing link" between apes and humans. It took over 40 years to confirm that Piltdown Man was a hoax, and over 100 years to identify who forged it. It's hard to disprove untruths - consider the ongoing searches for Bigfoot or the Loch Ness Monster.
People want dreams to be true
Sometimes, despite built-in skepticism, people badly want improbable but wonderful things to be true - to move the world with a dream. For instance, if alien spacecraft had really crashed and were being analyzed in Area 51 in Nevada, it could mean that interstellar travel is possible.
Repetition - the hallmark of social media - creates belief
Hearing a false claim over and over can be enough to generate belief in it. A common advertising and public relations strategy is to be extremely visible by multiplying "impressions," so people see the message everywhere.
Independent matching claims are seen as credible
Repetition alone may not be sufficient. When people try to assess whether something is true, they often look for objective reasons on which to base their belief, such as finding two similar, independent judgments about events. In my research I call this the "Rule of Two."
On social media, users often see a claim repeatedly, posted by different friends or connections. The same information seems to come not only from everywhere but from apparently independent sources. Yet often there is just one source, though easy online sharing makes it appear there are many. That is why so many observers worry about the role social media has assumed in politics - it can lead people to believe that false claims are true.
People believe what others appear to believe
People have a built-in willingness to defer to confident assertions made by an apparently expert or legitimate authority. In Stanley Milgram's famous experiments, ordinary people complied with a scientist's directives to administer what they (falsely) believed were painful electric shocks to other subjects. A passionate and convincing swindler, often masquerading as an expert - for example, an art dealer or a researcher of miracle cures - exploits that deference to get people to believe false claims.
A related mechanism, described by Robert Cialdini, is called "social proof": Seeing someone else do what you are thinking about doing frees you to act, because it serves as evidence that the action is correct. This is why con men often use "shills," accomplices who confirm to the victim that the con man's scheme is legitimate.
Research by Hugo Mercier and others, as well as my research on the theory of testaments and ongoing work with Robert C. Ryan on the "skeptical believer model," argues that human defenses against scams and falsehoods are more robust than the entertaining tales of bridges sold and voyages to nonexistent paradises would suggest. In more ways than one, social interaction can become a "con-test."
Society - including government - cannot function well if every claim requires fact-checking. Yet con artists thrive, year in and year out, in business, politics and everyday experience. Ultimately, however, a world of "alternative facts" is not the world that our dreams want to be true.
Author: Barry M. Mitnick - Professor of Business Administration and of Public and International Affairs, University of Pittsburgh