Like everyone else, I’ve been hearing about Facebook conducting an emotional experiment. Everyone is in an uproar, asking the big question – was the study’s data even legally obtained?

All that anyone can do is shrug their shoulders and say, “Maybe.”  Every time I heard mention of it over the past few days, I would walk away thinking that it was entirely uncool of Facebook to use its members in such a way, but they already watch you like a hawk, so what can you do?  It took me longer than I’d like to admit to finally find out what was actually being studied and how, and now… I’m a bit miffed.

If this had been a study derived purely from Facebook’s existing data, there really isn’t anything we could do (and the reasoning behind this real-time study was that the conclusions drawn from studying the 20 years of data were deemed controversial). Facebook owns that data; we are all playing in their sandbox, and everything you build within their site (photo galleries, company pages, direct “private” messages, everything) is theirs. They have so much information at their fingertips, you would think it simple enough to determine whether like-minded posts were being aggregated, at an increased rate, through any given grouped network.  Kramer et al. were looking for concrete proof of what already occurs naturally on any given day, everywhere on the interweb.  They were looking for something along the lines of an “information cascade” amongst internet users, but with enough of a differentiation to call it something cooler: emotional contagion.

Facebook worked with the Center for Tobacco Control Research and Education at the University of California, San Francisco, and the Departments of Communication and Information Science at Cornell University to conduct a real-time study, willingly affecting the emotions of nearly 700,000 Facebook users to determine whether they could instigate an “emotional contagion” within a controlled setting.

If I read the blurb correctly, the study was conducted to essentially figure out if like = like.  If all you saw on your Facebook News Feed were hundreds of happy posts, would it compel you to share your happy thoughts?  Or conversely, if you saw a hundred sad and miserable people on your feed, would you be compelled to share your “bad” news as well?

Here’s an excerpt directly from the study:

“We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”

“if it bleeds, it leads”

There is a lot I don’t understand about this study (I’ve only seen the blurb, and even if I saw the whole paper, I know I still wouldn’t understand); TheHubs was the psych major at university, not me (I hated Psych 101 and kept falling asleep in class, oops), and I haven’t yet had a chance to ask his thoughts on the matter, but…

  • First, you ask for participation and get written permission from consenting adults to utilize them in your study.  If a study is going to knowingly affect the individual(s) being studied in any way, shape, or form, you need their consent.  This wasn’t a study to figure out how to better Facebook’s business; this was a social/psychological study, utilizing a well-known social market as its farm.  Heck, I willingly participated in psych studies for a little extra cash while at university so long ago, so I’m familiar with signing away my sanity momentarily; yes, we got paid to be messed with.  Facebook could have easily created a database of consenting adults, simply by creating a yes/no form for its users to voluntarily sign if they chose, and if they were worried about receiving too many rejections, a 10-Facebook-coin incentive would probably have been enough to turn around more “yes”es for them.  So why didn’t the institutions, which know better, ask for consent?
  • Second, was this study conducted with an American-only member database? Or was this a misjudgement of global proportions? The online public blurb doesn’t mention demographic statistics, and I would love to know where their findings came from, including location, age, gender, etc.  Particularly age – I would hope that the data was derived from adults who were above the age of majority where they live.
  • Third, how was this a necessary study? There is already so much information out there that it forms one big neon sign as to whether or not like = like for emotional health. There is already a coined phrase for this: “if it bleeds, it leads.” Whether you want to call it a cascade or (to give it more shocking credence) a contagion, herding or leading, humans follow each other around like lemmings; we know this.  The global outcry from this study alone proves that our emotions are affected by outside influences. I get that there is a need to determine whether social media is a culprit in mental health issues like prolonged depression; I can understand the need to determine whether certain media are as harmfully addictive as nicotine; I could get behind a study that would somehow result in an overall benefit of improving mental health on a global scale. But what I cannot fathom (and this circles back to #1) is why no one got consent for this study.

It’s Facebook Complicated

The very departments that are there to help with mental health may have hindered it in this instance.  If 300,000+ users spent even just one moment of their day upset because of this study, the results are not worth it. Nearly 700,000 online members were unwittingly used in an experiment that could have had unknown lasting consequences.  There is a disconcerting number of Facebook users who utilize this medium to freely announce life-altering occasions, such as births, deaths, marriages and divorces, and if even just one person was affected by their network not seeing such an announcement because it was intentionally hidden from a member’s News Feed, Facebook et al. should be ashamed for knowingly and willingly blocking that communication.

I have to wonder if this outpouring of global resentment is what the study was looking for, ’cause it looks like the researchers are now getting those results in spades.

So here’s my question for you… given that there should be ongoing social/psychological research (and there should be), would you be okay with Facebook actively adjusting/manipulating your account?  Perhaps we’ll soon see an addendum to our Facebook TOS that includes an opt-in/opt-out function for future studies.