Inspiration here comes from an article I read in The Atlantic, which spawned some potentially – or probably not – trademarkable Facebook-inspired words and definitions to support the title and intent of this blog post:
facebookery. noun. the careless, deceptive, malicious, or foolish behavior of a person or group of people using Facebook. “There’s another ridiculous and false news story! I’m getting so sick of all this facebookery!”
facebooky. facebookish. adjectives. a somewhat condescending comparison of something or someone to Facebook’s design, content, or users. “Don’t they strike you as a bit facebooky in their grasp of reality?” or “His opinions sound very facebookish.”
facebooked. (1) adjective. the result of becoming consumed with or duped through media posts on Facebook. “She has really been facebooked lately and can’t find the time to finish other projects.” (2) verb. acted upon or influenced by content on the Facebook social media site. “Yeah, I know it was my own fault for sharing without checking sources for accuracy. I guess I got facebooked, again.”
Facebook. (1) noun. the online social media company; (2) verb. the act of using Facebook. “I really like to get into bed early and facebook myself to sleep.”
facebooker. noun. a person who facebooks. “She’s really quite a looker and a wild facebooker!”
facebooking. (1) verb. to use facebook. “He seems agitated. I think he’s been facebooking too much.” (2) noun. the act or instance of getting facebooked. “Those trolls gave you a seriously wicked facebooking!”
The term ‘face book’ originally described online student directories available at select American universities before 2006. Since then, Facebook, the U.S.-based online social networking company, has expanded to allow almost anyone at least 13 years old to become a user.
So, now we have some context to proceed with a discussion of the article. Let’s begin with the slippery, dark lining that lies within the Facebook platform, as described in the aforementioned Atlantic article.
“Facebook’s draw is its ability to give you what you want. Like a page, get more of that page’s posts; like a story, get more stories like that; interact with a person, get more of their updates. The way Facebook determines the ranking of the News Feed is the probability that you’ll like, comment on, or share a story. Shares are worth more than comments, which are both worth more than likes, but in all cases, the more likely you are to interact with a post, the higher up it will show in your News Feed. Two thousand kinds of data (or “features” in the industry parlance) get smelted in Facebook’s machine-learning system to make those predictions.”
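The weighting logic in that quote can be sketched in a few lines. To be clear, everything below is hypothetical: the real system runs some 2,000 features through machine learning, while this toy version just hard-codes the relative ordering the article describes (shares outrank comments, which outrank likes) and sorts posts by a combined engagement score.

```python
# Toy sketch of an engagement-based feed ranker, loosely inspired by the
# quoted description. All weights, post names, and probabilities are invented.

# Relative interaction weights: shares > comments > likes.
WEIGHTS = {"share": 3.0, "comment": 2.0, "like": 1.0}

def engagement_score(predicted_probs):
    """Combine predicted interaction probabilities into one ranking score."""
    return sum(WEIGHTS[kind] * p for kind, p in predicted_probs.items())

def rank_feed(posts):
    """Order posts by descending engagement score, News Feed-style."""
    return sorted(posts, key=lambda post: engagement_score(post["probs"]),
                  reverse=True)

# Hypothetical posts with made-up predicted interaction probabilities.
posts = [
    {"id": "news_story",   "probs": {"share": 0.05, "comment": 0.10, "like": 0.30}},
    {"id": "friend_photo", "probs": {"share": 0.01, "comment": 0.25, "like": 0.60}},
    {"id": "viral_meme",   "probs": {"share": 0.20, "comment": 0.15, "like": 0.50}},
]

for post in rank_feed(posts):
    print(post["id"], round(engagement_score(post["probs"]), 2))
```

Notice the incentive this creates: the post most likely to be interacted with floats to the top regardless of its accuracy, which is exactly the dynamic the rest of this post worries about.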
Before we delve too deeply into the downsides of Facebook, let me be clear: I like Facebook. A lot. It provides a great medium for social connections, and I’ve weeded through my news feed to find some wonderful flowers and gems of people, places, things, events, knowledge, ideas, inspirations, and humor. That’s what Facebook offers, and it’s what attracts most of its users. Kudos to Mr. Zuckerberg and his brilliant team for their circa-2006 vision of its use and value to consumers.
Enter the Sandman of politics and social engineering. This is how Facebook’s reach and power become used in a fashion that serves to influence human behavior beyond benign allegiances, likes and dislikes, and modestly constructive dialogue. As Alexis C. Madrigal observes in our subject article:
“But as far as “personalized newspapers” go, this one’s editorial sensibilities are limited. Most people are far less likely to engage with viewpoints that they find confusing, annoying, incorrect, or abhorrent. And this is true not just in politics, but the broader culture.
That this could be a problem was apparent to many. Eli Pariser’s The Filter Bubble, [a fascinating presentation you should watch] which came out in the summer of 2011, became the most widely cited distillation of the effects Facebook and other internet platforms could have on public discourse.”
Pariser, in his TED talk, provides proof for his assertion that “the internet [Google, Yahoo, and other information and social media sites] is showing us things that it thinks we want to see, but not necessarily what we need to see.”
The definitions at the beginning of this post, derived from the core word Facebook, could equally apply to many online media powerhouses. My purpose in writing this article is to raise awareness of the hazards of not understanding how web technology can manipulate content delivery, and of how we, as online information consumers, are subtly influenced by hidden – and not so benevolent – hands. Much of this is akin to the net neutrality debate.
It’s vitally important that we understand what’s really going on behind the scenes of our engagement with online platforms, so that accurate, truthful information is objectively available to us. The preservation of our democratic republic depends on access to truthful information relayed in full context; anything else, or less, is called propaganda.
Unfortunately, in the battle for viewership and revenue, a multitude of media outlets across the political spectrum are guilty of partisan propaganda. It’s increasingly difficult to find the unvarnished and fully vetted truth about people, things, and events. Which brings me back to my first post where I gave Facebook credit for inspiring this website and blog.
I’m an advocate for the advancement of technology, so long as it’s used in a responsible and ethical manner that enhances people’s lives and improves the conditions of the society in which we live. Presently, internet-based technology is being developed and used to undermine that aim. Business models have evolved to employ increasingly unscrupulous practices, e.g., click-baiting and ‘sponsored’ ads that look like news but are product promotions in disguise.
Back to Madrigal’s Atlantic article and a quote from Buzzfeed’s Craig Silverman:
“in the final three months of the U.S. presidential campaign, the top-performing fake election-news stories on Facebook generated more engagement than the top stories from major news outlets such as The New York Times, The Washington Post, The Huffington Post, NBC News, and others.”
Getting duped and riled up by facebookery is now, and will probably always be, a fact of life. Facebook is taking steps to minimize the frequency and impact of such behavior, but with a wide-open internet infested by users with divergent ideologies and malicious intent embedded in content, it’s unlikely that the erosion of our democracy will be curbed unless we can start finding common informational ground for discussion and debate. It’s just too easy – and lucrative – to spread misinformation and splinter a society.
“A few days before the election Silverman and fellow BuzzFeed contributor Lawrence Alexander traced 100 pro–Donald Trump sites to a town of 45,000 in Macedonia. Some teens there realized they could make money off the election, and just like that, became a node in the information network that helped Trump beat Clinton.”
I rest my case. Call it what you will, but ignorance-based contempt is slowly consuming our society through undiscerning social media engagement. Now, I don’t always read the news, but when I do, I prefer the objective, unspun, unvarnished truth. Stay vigilant, my friends. Cheers and RIP, Jonathan Goldsmith.
P.S. “A Guardian reporter who looked into Russian military doctrine around information war found a handbook that described how it might work. “The deployment of information weapons, [the book] suggests, ‘acts like an invisible radiation’ upon its targets: ‘The population doesn’t even feel it is being acted upon. So the state doesn’t switch on its self-defense mechanisms,’” wrote Peter Pomerantsev.”