Like everyone else, I’ve been hearing about Facebook conducting an emotional experiment, and everyone is in an uproar, asking the big question: was the study even legal?
All that anyone can do is shrug their shoulders and say, “Maybe.” Every time I heard it mentioned over the past few days, I would walk away thinking it was entirely uncool of Facebook to use its members in such a way, but they already watch you like a hawk, so what can you do? It took me longer than I’d like to admit to finally find out what was actually being studied and how, and now… I’m a bit miffed.
If this had been a study derived purely from Facebook’s existing data, there really isn’t anything we could do (and the reasoning behind this real-time study was that the conclusions drawn from studying the 20 years of data were deemed controversial). Facebook owns that data; we are all playing in their sandbox, and they have ownership of everything within it. Everything you build within their site (photo galleries, company pages, direct “private” messages, everything) is theirs. They have so much information at their fingertips, you would think it simple enough to determine whether like-minded posts were being aggregated, at an increased rate, through any given grouped network. Kramer et al. were looking for concrete proof of what already occurs naturally on any given day, everywhere on the interweb. They were looking for something along the lines of an “information cascade” amongst internet users, but with enough of a differentiation to call it something cooler: emotional contagion.
Facebook worked with the Center for Tobacco Control Research and Education at the University of California, and the Departments of Communication and Information Science at Cornell University, to conduct a real-time study, willingly affecting the emotions of nearly 700,000 Facebook users to determine whether they could instigate an “emotional contagion” within a controlled setting.
If I read the blurb correctly, the study was conducted to essentially figure out if like = like. If all you saw on your Facebook News Feed were hundreds of happy posts, would it compel you to share your happy thoughts? Or conversely, if you saw a hundred sad and miserable people on your feed, would you be compelled to share your “bad” news as well?
Here’s an excerpt directly from the study:
“Significance
We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”
“if it bleeds, it leads”
There is a lot I don’t understand about this study (I’ve only seen the blurb, and even if I saw the whole paper, I know I still wouldn’t understand); TheHubs was the psych major at university, not me (I hated Psych 101 and kept falling asleep in class, oops), and I haven’t yet had a chance to ask his thoughts on the matter, but…
- First, you ask for participation and get written permission from consenting adults to utilize them in your study. If a study is going to knowingly affect the individual(s) being studied in any shape or form, you need their consent. This wasn’t a study to figure out how to better Facebook’s business; this was a social/psychological study, utilizing a well-known social market as its farm. Heck, I willingly participated in psych studies for a little extra cash while at university so long ago, so I’m familiar with signing away my sanity momentarily; yes, we got paid to be messed with. Facebook could have easily created a database of consenting adults simply by creating a yes/no form for its users to sign voluntarily, and if they were worried about receiving too many rejections, a 10-Facebook-coin incentive would probably have been enough to turn around more “yes” responses. So why didn’t the institutions, which know better, ask for consent?
- Second, was this study conducted with an American-only member database? Or was this a misjudgement of global proportions? The publicly available blurb doesn’t mention demographic statistics, and I would love to know where their findings came from, including location, age, gender, etc. Particularly age: I would hope the data was derived from adults above the age of majority where they live.
- Third, how was this a necessary study? There is already so much information out there pointing, like one big neon sign, to whether or not like = like for emotional health. There is already a coined phrase for this: “if it bleeds, it leads.” Whether you want to call it a cascade or (to give it more shocking credence) a contagion, herding or leading, humans follow each other around like lemmings; we know this. The global outcry from this study alone proves that our emotions are affected by outside influences. I get that there is a need to determine whether social media is a culprit in mental-health issues like prolonged depression; I can understand wanting to determine whether certain media are as harmfully addictive as nicotine; I could get behind a study that would somehow result in an overall benefit to mental health on a global scale. But what I cannot fathom (and this circles back to #1) is why no one got consent for this study.
It’s Facebook Complicated
The very departments that are there to help with mental health may have hindered it in this instance. If 300,000+ users spent even just one moment of their day upset because of this study, the results are not worth it. Nearly 700,000 online members were unwittingly used in an experiment that could have had unknown lasting consequences. There is a disconcerting number of Facebook users who utilize this medium to freely announce life-altering occasions, such as births, deaths, marriages and divorces, and if even just one person was affected by their network not seeing such an announcement because it was intentionally hidden from a member’s News Feed, Facebook et al. should be ashamed for knowingly and willingly blocking that communication.
I have to wonder if this outpouring of global resentment is what the study was looking for, ’cause it looks like the researchers are getting such results in spades.
So here’s my question for you… Should there be ongoing social/psychological research (which there should be), would you be okay with Facebook actively adjusting/manipulating your account? Perhaps we’ll soon see an addendum to our Facebook TOS that includes an opt-in/out function for future studies.
10 Responses
I had not heard about this until reading your article. I am appalled that Facebook would conduct such, I’ll call them “experiments”, with people’s emotions. Study? Those occur with informed consent; this is totally unethical behaviour. I feel violated!
And there’s the rub; the chances of our accounts having been selected are slim. In 2012 Facebook celebrated 1 Billion user accounts, citing nearly 100 million active daily users, so assuming they grabbed their sample from active users alone, we’re looking at roughly 0.7% of those daily users being affected by this. But because the research group and Facebook didn’t bother to actually ask for participation, none of us knows whether we were part of that group, so that’s 1 Billion users potentially feeling violated; as we all should. :/ The other interesting part is that if the researchers had had a consenting control group for this study, this article on my site would have had an entirely different spin, full of excitement over the results found.
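For anyone who wants to check my back-of-the-envelope math: taking N = 689,003 from the published paper, and the rough 2012 user counts mentioned above (which are approximations, not official figures for the study’s sampling frame), the share works out like this:

```python
# Rough share of Facebook users included in the study.
# study_n comes from the published paper; the user counts are the
# approximate 2012 figures cited in the comment above.
study_n = 689_003
daily_active = 100_000_000   # ~100 million daily active users (approx.)
total_accounts = 1_000_000_000  # ~1 billion accounts (approx.)

share_of_daily = study_n / daily_active * 100
share_of_total = study_n / total_accounts * 100

print(f"{share_of_daily:.2f}% of daily active users")   # ~0.69%
print(f"{share_of_total:.3f}% of all accounts")         # ~0.069%
```

So even under the most generous assumption, it is a fraction of one percent of users; the point stands that nobody knows whether their account was in that fraction.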
Facebook should be ashamed of itself for running that type of study. I had never heard of it either, but it is sad when you mess with people’s emotions.
It really is a sad way to treat their members without their consent, and unfortunately the ongoing question is whether they had the right to do so. :/ Perhaps it’s just one more way that we all have to be careful and well aware of our online “surroundings”.
I don’t know how I’m supposed to feel about this; I mean, we did sign up for Facebook and put our personal lives up on the internet.
It’s truly a confusing concept, for sure. While we willingly put our lives on display online (and I firmly believe there is no such thing as privacy anymore, so we should be careful and aware of what we share), the research was based on active emotional manipulation of its users, outside of normal control/natural aggregation, which is the part that really bugs me, lol. If they had just stuck to their passive research (20 years’ worth of data), there wouldn’t be as much of an issue, ’cause Facebook owns that data. Facebook would then just have to prove that this social study was business-related, i.e. that the results would impact Facebook in some way, which they easily could.
facebook is so confusing, it’s sad they have so much control over our daily lives
They really do, and they do it well enough that we’re okay with it, lol! There’s a reason why Facebook has over 1 Billion members. 😉
I heard about this on CBC yesterday and I was pretty confused. I understand why they would want this kind of data but affecting the content people see seems to be really manipulative.
Awh, I missed CBC yesterday… maybe I can find their vid online this afternoon. The lack of consent is the biggest issue for me. Somehow the results from 20 years’ worth of naturally developed data were considered too controversial to be accepted, so the researchers thought it better to manipulate Facebook users without consent to get more conclusive results. Am so confused, lol.