
Lab Rats, One And All: That Unsettling Facebook Experiment

Myshkovsky / iStockphoto

I've always been the shrugging type when it comes to lots of things that Facebook does that make people crazy. They change the layout, they mess with the feed — even making you noodle with your privacy settings has always seemed to me like the craven doing of business, and something where I could say yes or I could say no, the same as any business that offered good service sometimes and lousy service other times.

But I found I did not shrug at the news late last week that Facebook had allowed researchers both inside and outside the company to manipulate users' news feeds to hide good news or bad news to see whether it affected the emotions of those users themselves. In other words, if they hid the parts of Facebook where people share joy with you, where they tell you about happy things, where the griping and grousing is balanced with baby pictures and bright sides, could they make you feel worse? If they led you to believe that something had altered the balance of things so that even if you couldn't put your finger on it, it seemed like things were going worse in the world, would it affect you? Could they make you artificially positive about things by hiding bad news from you?

In a paper that you can fortunately at least read, since you may have been one of the many unpaid and unwitting research subjects, researchers said yes, it turned out that when they fiddled with your news feed to see if it would make you feel bad (I admit to a bias toward being more concerned about the "feel bad" part than the "feel good" part), it made you — perhaps not you in particular, but you in the aggregate — feel bad. It didn't just make people more negative, but it made them more withdrawn: People whose feeds were manipulated to seem depressing not only posted things that were more gloomy, but they posted less on the whole. They didn't just talk about being sad to fit in or some silliness like that; they actually talked less. By all appearances, they actually got a little bummed out.

As the researchers note, the effects in this one study were small on the whole, but as they also point out, given the huge number of things that influence mood, it's pretty impressive — depending on your definition of "impressive" — to show that you, by one specific manipulation, demonstrably made randomly selected people feel even a little bit worse. And there's no way to know, really, whether there are people in that sample who were made to feel not a bit worse but a lot worse. If they were, it was an intended and predictable effect to which they had no real opportunity to say yes or no.

There's a lively debate about whether the generic reference to your data being used for research in the Facebook terms of use is sufficient to render this experiment legal and medically ethical. Does allowing them to "use the information [they] receive about you" for "internal operations" including "research" constitute consent to having the data shown to you manipulated by people (including outside researchers) who are trying to see if they can induce changes to your mood without your knowledge?

If it does, would that allow researchers to pepper your news feed with stories about a particular topic to see if they can change your beliefs about it? To hide posts from your conservative friends or your liberal friends to see how your politics are affected? Would it allow them to hide all your updates from a particular friend to see if they can disrupt your connection to that person, or see how long it takes to do so? You can easily imagine a researcher saying, "We're trying to find out whether failure to interact with someone's posts leads to alienation from that person, so we're hiding Person A's posts from Person B, leading Person A to think she's being ignored." Would that be OK, according to the reading that says allowing your data to be used for research gives Facebook or anybody it decides to work with blanket permission to anything they want to do to the information they put in your feed for any reason?

This is not about the right or wrong of Facebook culture — whether you should be influenced by the messages you receive in your feed, or whether you should get depressed when everyone around you is depressed. It's about the fact that the researchers who did this experiment were testing, in their words, "emotional contagion." They were trying to see whether by manipulating people's environments they could alter their emotional states.

Even assuming it's legal, though, and ethical, I speak here as a Facebook user and straight from the heart: It's gross. It's gross. There are people who can't afford to be made to feel very much worse than they already do; there are people at all times who are existing pretty close to the line between OK and not OK, and more who are existing pretty close to the line between somewhat not OK and really not OK. There's every chance that this experiment, predictably and as intended, took a depressed person somewhere darker and made it harder for him or her to get up. There's every chance that somebody went for a quick dose of distraction because of a breakup or a job loss or a death or a simple setback and didn't get it, because it was denied to them on purpose, just to see what would happen.

And by the same token, there's every chance that somebody had a bad day, posted about it, and didn't get the support they might have expected from a friend because that friend was having all that negativity hidden from them and never saw the post. Just to see what would happen.

You can think it's stupid that people turn to social networks at times for a boost — those baby pictures, those snippets of good news, a kind word — but they do it, and Facebook knows more than anybody that they do it. If this kind of experimentation is really OK, if it's really something they believe is within their everyday operations and their existing consent, all they have to do is clarify it. Give people a chance to say yes or no to research that is psychological or sociological in nature that involves not the anonymized use of their data after the fact but the placing of users in control and experimental groups. Just get 'em to say yes or no. If it's really not a big deal, they'll say yes, right? It really seems like a pretty reasonable request.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Linda Holmes is a pop culture correspondent for NPR and the host of Pop Culture Happy Hour. She began her professional life as an attorney. In time, however, her affection for writing, popular culture, and the online universe eclipsed her legal ambitions. She shoved her law degree in the back of the closet, gave its living room space to DVD sets of The Wire, and never looked back.