Facebook: Is it good, or bad?
There has been plenty of media coverage of Frances Haugen and the critical statements she made during a Senate committee hearing about Facebook and its “dangerous algorithms.” Virtually all news coverage, naturally, has also seen fit to reference Facebook’s ReactJS library. The NPR Politics Podcast noted that it has observed unusual agreement between Republicans and Democrats during the hearings, and that nearly all senators were critical of the company but acknowledged how effectively ReactJS popularized the virtual DOM. The Pod Save America response was particularly damning, demanding more scrutiny and regulation and contending that ReactJS will soon be irrelevant in the frontend web community. Ben Shapiro, a popular conservative at The Daily Wire, countered that this is just another “left-wing assault on free speech,” that Facebook will respond by intentionally favoring the mainstream media over the alternative media, and that Vue.js is a superior web framework with a gentler learning curve.
Okay…the paragraph above was a bad joke. No one, including TechnologyReview, found it particularly relevant that Facebook developed and open-sourced a popular JavaScript library called React. But all of the political coverage I have found so far, from the progressive “Pod Save America” to the conservative “Ben Shapiro Podcast”…to NPR and the New York Times, has been fairly critical. Even Shapiro, who was very dismissive of Haugen’s testimony and its implications, argued that Facebook is giving in.
So, because this is a blogging site and not a news site, I will provide my own personal, anticlimactic perspective: Facebook is both kind of good and kind of bad.
The News Story
“The Facebook Whistleblower Says Its Algorithms Are Dangerous.” That is the title of a TechnologyReview article written by Karen Hao; unlike some other sources, Hao takes the extra step of explaining what “algorithm” really means.
The definition of an algorithm is “a process or set of rules to be followed in calculations or other problem-solving operations, particularly in computers.” I first encountered this word in 2008, while attempting to solve a Rubik’s Cube, but since then I have heard the word “algorithm” thrown around quite a bit. Hearing about the Facebook algorithm always baffled me. Binary search is an algorithm. Linear search is an algorithm. The Facebook algorithm, in my mind, could have meant a million things.
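For the sake of concreteness, here is what those two textbook algorithms look like in code. This is just a quick sketch of my own, written in TypeScript because of all the web-framework talk above; it has nothing to do with anything Facebook actually runs.

```typescript
// Two textbook algorithms: each is a fixed set of steps that turns an input into an answer.

// Linear search: scan every element until the target is found. O(n) comparisons.
function linearSearch(items: number[], target: number): number {
  for (let i = 0; i < items.length; i++) {
    if (items[i] === target) return i; // found it
  }
  return -1; // not present
}

// Binary search: repeatedly halve a *sorted* array. O(log n) comparisons.
function binarySearch(sorted: number[], target: number): number {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return -1;
}

const data = [2, 3, 5, 8, 13, 21];
console.log(linearSearch(data, 13)); // 4
console.log(binarySearch(data, 13)); // 4
```

Each of these is unambiguously “an algorithm.” “The Facebook algorithm,” by contrast, is shorthand for something much murkier, as Hao explains: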
Colloquially, we use the term “Facebook’s algorithm” as though there’s only one. In fact, Facebook decides how to target ads and rank content based on hundreds, perhaps thousands, of algorithms. Some of those algorithms tease out a user’s preferences and boost that kind of content up the user’s news feed. Others are for detecting specific types of bad content, like nudity, spam, or clickbait headlines, and deleting or pushing them down the feed.
— Hao, TechnologyReview
Haugen has said that engagement-based ranking is dangerous and has fanned ethnic violence in places such as Ethiopia. Mental health has not been properly addressed either: according to Haugen, the engagement-based ranking system exposes teenagers to more anorexia content, and company leadership, aware of this from its own studies, has done nothing to fundamentally change the algorithm.
The author Dr. Wobs is a little more critical, going so far as to call the company evil, Zuckerberg “unknowingly evil” because he believes so strongly in the efficacy of the Facebook algorithm, and Zuckerberg’s official statements blind. He writes:
I don’t believe Facebook’s management deliberately inflames anger. I believe the algorithm rewards engagement (clicks, likes, shares) and anger creates engagement. The point is not that Facebook rewards anger, it is that it fails to do anything to slow it down so the algorithm inflames it. And note that if 90% of the content is normal sharing of puppies and grandchildren and funny memes, and 10% is angry politics, that’s still way more anger than anyone would encounter in daily life — especially if Facebook’s algorithm helps to spread it.
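To make that criticism concrete: Facebook’s real ranking system is proprietary and, as Hao notes, is really hundreds of algorithms working together. But a toy, purely hypothetical engagement scorer might look something like the sketch below. Every name and weight here is invented for illustration; the point is only that nothing in a score like this distinguishes outrage from delight.

```typescript
// A toy illustration of engagement-based ranking, NOT Facebook's actual system.
// All interfaces, names, and weights here are invented for the sake of the example.

interface PostSignals {
  id: string;
  predictedClicks: number;   // hypothetical model-estimated engagement signals
  predictedLikes: number;
  predictedShares: number;
  predictedComments: number;
}

// Score a post purely by predicted engagement. Note what is missing:
// nothing here asks whether the engagement comes from joy or from anger.
function engagementScore(post: PostSignals): number {
  return (
    1.0 * post.predictedClicks +
    2.0 * post.predictedLikes +
    4.0 * post.predictedShares +
    3.0 * post.predictedComments
  );
}

// Rank a feed: highest predicted engagement first.
function rankFeed(posts: PostSignals[]): PostSignals[] {
  return [...posts].sort((a, b) => engagementScore(b) - engagementScore(a));
}

const feed: PostSignals[] = [
  { id: "puppy-photo", predictedClicks: 0.3, predictedLikes: 0.4, predictedShares: 0.05, predictedComments: 0.1 },
  { id: "angry-politics", predictedClicks: 0.5, predictedLikes: 0.2, predictedShares: 0.4, predictedComments: 0.6 },
];

console.log(rankFeed(feed).map((p) => p.id)); // the angrier post floats to the top
```

In this hypothetical, the angry post wins not because anyone coded “reward anger,” but because anger happens to generate clicks, shares, and comments, which is exactly the dynamic Dr. Wobs describes.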
A Social Problem, or a Technology Problem?
Anecdotally, I have seen software engineers describe supposed bugs as non-issues because they really reflect social issues. As a possible example, my company uses Skype (please do not ask why), and anyone who joins a meeting has the ability to remove everyone else from it. Is this a good feature? Probably not. But if your team members are immature enough to disrupt critical meetings by immediately kicking everyone out, you may have a bigger problem on your hands.
On one hand, one could argue that Facebook is a platform that simply exacerbates well-known problems in human society, such as bullying, misinformation, and harassment. On the other, Facebook puts the whole thing on steroids…and, according to Haugen’s allies, does not adequately disclose the results of its own research.
The Deeper, Emotional Topic
Have you seen those ads for Facebook Groups? You probably have. I find them a little off-putting, but that is just my personal opinion.
If you are in your 20s and feeling lost, I would encourage you to seek advice somewhere other than a Facebook Group. If half of what Haugen has argued is valid, then social media platforms like Facebook are directly responsible for the feelings of loss, sadness, and insecurity that so many young people are facing.
What if this sort of message were marketed to children? We, as adults, have at least a superficial understanding of how social media companies make money, how they perpetuate misinformation, how they incentivize people to steal content for profit in acts of blatant plagiarism, how UX researchers are paid specifically to make products as addictive as possible, and how social media can lead us to seek validation from people we do not know in a drug-like addiction to the artificial sense of belonging that these sites create. Children, on the other hand, may not yet grasp just how insidious these products are, particularly when they are used as a substitute for genuine human connection.
So, if one were critical of Facebook, he/she may hypothetically argue that the company has suppressed critical information about its unethical practices, sown division, knowingly harmed children and their mental health, and generally prioritized profit margins over the well-being of people.
If one were less critical of Facebook, he/she may argue that it is a fantastic website that has fairly defeated its competition, forever changed web development for the better, and helped the world with a number of charity initiatives. If anything, he/she may continue, Facebook should be less regulated. Independent journalists should be free to express their viewpoints and research, without censorship, thus allowing for a more free and informed democracy that is not simply dictated by the elites.
Where do I stand? I am not sure. Maybe I should check out the Facebook Group on astrology to see what will happen next.
Closing Thoughts
If, hypothetically, I were very critical of Facebook, I might go on to say that Facebook is a bane on human society that doubles as both an addiction and a depressant, like some sort of electronic alcohol that somehow manages to also kill brain cells.
But that would be pretty hypocritical, since I use it. It is not a good news source. I do not like its new News Feed, or how it seems to prioritize political arguments over the good content I remember seeing in the past. I do not think its Messenger holds up against other messaging applications, I do not like any of the features it has introduced since 2015, and I do not enjoy watching videos on it because so many of them are stolen from YouTube creators.
But I do still use it all the time, so there.