Didn’t hear about it? “Everything We Know About Facebook’s Secret Mood Manipulation Experiment” should give you a good start.
If you know anything about the food industry, particularly processed and packaged foods, you know there are allowances for the presence of non-ingredients: dust, fecal matter (usually from vermin or insects), roach parts, rat hairs, and other unseemly bits. The allowances are very small and granted as a matter of practicality: shutting down an entire farm and factory over point-oh-some-odd percent cockroach parts is simply not cost-effective, because the risk of harm from those parts is low.
Let’s say there’s a ubiquitous chocolate candy company that decides to test the limits of these allowances to see how they affect its customers. Starting with a freshly cleaned facility, they produce a batch of their product from a pristine crop, in which nothing except the listed ingredients is present, or at least as near to that as possible. They make another batch from an infested crop and deliberately include the maximum amount of roach parts and rat hairs that will pass inspection. They send each batch out into the world and record the results: which customers, and how many, missed work in the following days, went to the hospital, developed allergies, and so on.
What they did was technically legal.
What Facebook did was technically legal. Shielded by the loopholes of its “Terms of Service” and by the kind of manipulation advertisers already practice, Facebook was within its legal rights as a company to decide what, and how much, of its product consumers could use. The event, in short, was this: Facebook manipulated the feeds of almost 700,000 users, restricting the emotional content they were exposed to, to see whether they later reflected the same emotions in their own posts. Essentially, Facebook performed a psychological experiment without IRB approval (review by the ethics committees that exist to ensure the safety and protection of participants in the first place) and without informed consent, i.e., the “this is what we’re doing, this is what it could do to you, you have the right to decline, we’ll only lie about it if we feel it is absolutely necessary to the integrity of the study, and we’ll tell you if we’ve done so the minute the trials are over” speech. (See the article at the top.)
The ethics of such secret, unsupervised experimentation have already been called into question, so much so that many are calling for a government response. We already know that use of Facebook and similar social media is linked to depression, a mental illness with real economic costs, enough that even the average taxpayer ought to take offense. And because Facebook use alone is connected to depression, direct manipulation that deepens it has horrible implications for vulnerable users who may not otherwise be aware of the medium’s effects. Plausible and disturbingly common as that scenario is, we will likely never know the extent of the company’s damage to the population: Facebook has neither released the raw data nor contacted individual users to tell them that they participated or which manipulation they received. This is unacceptable.
I haven’t been happy with Facebook in a very long time. I find the medium tedious to manage, and it reduces human interaction to popularity contests tallied in “likes” and “shares.” Like many unhappy users, I stayed because it was my easiest and most immediate link to long-distance connections. This latest news, however, has pushed my disgust past the tipping point. I will not be joining the very temporary 99 Days of Freedom campaign. Two weeks ago, I quit. I’m relieved to say I don’t miss it.