How Not to Solve Misinformation

Everyone is abuzz about fake news, Russian trolls and how they affected the US election. The proposed solutions, however, are actually more fake than the fake news itself.

Consider the filtering algorithms Google, Facebook and Twitter are putting in place. On the surface it’s a straightforward fix: if fake news never reaches the eyeballs of susceptible readers, voilà, they’ll act or vote based on sounder logic and reason.

This won’t solve misinformation, however, and here’s why:

According to Adaptive Leadership, a leadership approach developed at Harvard, misinformation actually involves two kinds of problems: technical and adaptive. A technical problem can be solved by an expert with a known solution. An adaptive problem, on the other hand, resides within the hearts and guts of the people themselves, and only they, not the experts, can solve it.

Here’s an everyday example that has nothing to do with misinformation. If your doctor says you have heart disease, a technical solution might involve a gym membership to lose weight, nicotine patches to quit smoking, or open-heart surgery. Technical solutions only get you so far, though. The adaptive problem requires dealing with the behavioral and social reasons for smoking: occupying your hands, filling awkward silences, relieving stress, and fitting in with friends for whom smoking is the norm. No matter how caring or gifted the expert, you can’t quit smoking for someone else.

Nor can someone else solve misinformation for us with technical solutions. In fact, putting a technical fix on the adaptive problem of misinformation is a waste of time and resources, and it risks taking us down a dangerous path.

Here are the classic ways, known as work avoidance, in which we dodge the adaptive problem of misinformation:

Displace Responsibility

  • Externalize the enemy (e.g. blame Russia or Russian trolls)
  • Attack people with authority or a platform (e.g. attack Trump, the NY Times, Joe Rogan or Spotify)
  • Kill the messenger (e.g. unsubscribe or 'cancel' a media source that challenges your views)

Distract Attention

  • Define the problem to fit what you know how to do (e.g. block a Facebook friend with different beliefs)
  • Misuse committees and task forces (e.g. Facebook and BBC taskforces — they’re everywhere!)
  • Create ‘proxy’ fights (e.g. accuse the other side of creating misinformation)

The real problem is that we don’t want to deal with factual information that challenges our values, makes us feel uncomfortable, or questions who we are and what we stand for. We give misinformation oxygen because we want to believe the worst about ‘those people’. We ‘like’ it on Facebook because it reaffirms our belief about the best in ourselves and those we call friends.

Ultimately, the adaptive challenge we face is to heal the heart of democracy, as Parker Palmer describes with his characteristically simple eloquence. It’s about engaging with the other, not filtering them out of our lives and news feeds. This is our work to do, not the work of Google’s or Facebook’s algorithms.

[Eric Martin is the author of the book, Your Leadership Moment: Democratizing Leadership in an Age of Authoritarianism. He also serves as Managing Director for Adaptive Change Advisors, whose mission is to democratize leadership — making it possible for anyone anywhere to lead adaptive change.]