A recent article in Wired magazine, “Fear and Liking on Facebook” by Mat Honan, explored what would happen if the author “liked” everything he saw in his Facebook News Feed, no matter what it was. His findings? The simple act of “liking” something is not as socially innocent as it seems, and Facebook has become one of many culprits designed to steer people toward big business, big politics, big advertising, and, as a result, toward an uncompromising belief system that rejects anything outside its bubble.
Facebook's algorithm goes something like this: by monitoring what you like, it can often accurately predict your preferences. It then fills your News Feed with content matching those preferences, which you are then inclined to “like” even more. As the article states, “the more you like, the more you will like, an ever-escalating spiral of satisfaction.”
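For readers curious about the mechanics, the “ever-escalating spiral” can be illustrated with a toy simulation. This is only a minimal sketch of a like-driven feedback loop, assuming (as in Honan's experiment) that every item shown gets liked; the category names are purely illustrative, and this is in no way Facebook's actual ranking system.

```python
import random

def simulate_feed(rounds=200, categories=("friends", "brands", "politics"), seed=42):
    """Toy model: each 'like' for a category raises that category's share
    of what is shown next, producing a self-reinforcing spiral."""
    rng = random.Random(seed)
    weights = {c: 1.0 for c in categories}  # start with a balanced feed
    for _ in range(rounds):
        total = sum(weights.values())
        # Show one item, with probability proportional to accumulated likes
        shown = rng.choices(list(categories),
                            [weights[c] / total for c in categories])[0]
        weights[shown] += 1.0  # the user "likes" everything shown, boosting that category
    total = sum(weights.values())
    return {c: weights[c] / total for c in categories}

shares = simulate_feed()
```

Run long enough, whichever category gets an early run of likes tends to crowd out the others, which is the spiral the article describes in prose.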
This might be all fine and dandy if the algorithm were designed to lead you to healthy, positive social interaction with your friends and family, which was the entire reason Facebook was created in the first place. And in Facebook's defense, oftentimes it does. But, as Honan's experiment discovered, the more you “like,” the more the algorithm steers you toward “brands and messaging rather than humans with messages.” In no time, Honan's News Feed was devoid of human beings and loaded with ads and articles from sites like the Huffington Post and Upworthy.
It got political, too. As soon as Honan began “liking” politically based content, his News Feed became filled with conservative ideology geared toward feeding an individual's already developed prejudices rather than offering other, fresh perspectives. Once those conservative philosophies were set in stone, Facebook occasionally threw in extreme opinions from the left, seemingly only to inflame the individual's deeply ingrained biases.
This dynamic has become a cultural epidemic we see on Facebook every day, and we have Facebook to thank for it. Individuals can no longer check their News Feed without seeing at least one politically charged Red Team squaring off against an equally charged Blue Team, each side's goal being to shout louder than the other and to collect more “likes” from people on the same team (which, as we have discovered, only leads those people into a wildfire of content reinforcing the same one-dimensional views they already hold). The result is a lot of shouting and self-satisfied applause rather than any productive listening.
Related to our area of business, this has also created problems in the legal system. Jurors are supposed to set their biases aside, listen to the evidence and the facts, follow the law, and render a fair and impartial verdict based on those laws and facts. And many times they do. Systems like Facebook, however, have trained people to impose their biases on others without hearing out other points of view. For this reason, juries have become vulnerable to carrying that training into the courtroom, favoring one party over another out of prejudice and ignoring the governing law and facts to render a verdict based on personal ideology.
This is not to suggest everyone cancel their Facebook account or never “like” anything again. Some people have done exactly that, and I don't blame them, but these discoveries don't necessarily require anything so extreme. Instead, they serve as a simple reminder to be well aware of how Facebook is designed, what it is feeding you, and why. It's a nudge of encouragement to carefully and thoughtfully listen to other perspectives on how this world is supposed to work. We have the right to look to sources other than Facebook for information and the right to reject what Facebook is feeding us. We have the right to “like” something that changes our mind.