BITlab: Behavior Information Technology

404 Wilson Rd. Room 249
Communication Arts & Sciences
Michigan State University
East Lansing, MI 48824

The effects of the "deep news feed"

By: Emilee Rader

Most of the conversation online about the Kramer et al. Facebook emotional contagion study has focused on informed consent, A/B testing, and effect sizes. Some critics of the paper’s conclusions have argued that it is a big conceptual leap to equate the use of positive and negative words in Facebook posts with experiencing actual positive and negative feelings.

One alternative explanation is that participants in the experiment engaged in a form of conversational imitation when they created posts during the week-long study. In essence, they posted things that were similar to what they saw others posting.

Imitation or mimicry in verbal and nonverbal expression is commonplace. We do this unconsciously in conversation because it improves rapport and makes us more likable, and because coordination is important for successful conversation (Niederhoffer & Pennebaker 2002). Mimicking someone else not only makes the person doing the imitation more likable to his or her conversation partner; it also causes the person being mimicked to be more generous and helpful to others (van Baaren et al. 2004). In other words, it serves an important social function.

Imitation is not just something that occurs in face-to-face communication. We do this when chatting online as well, and it can increase trust in the person we’re communicating with. For example, in an experiment involving a social dilemma game conducted over text chat, participants were less likely to defect when more mimicry was present (Scissors et al. 2008).

So, at a minimum, the Facebook emotional contagion study demonstrates that asynchronous social imitation or mimicry happens when users read posts created by others, and then create posts themselves. This imitation effect (the paper calls it “contagion”) was identified through a manipulation of the News Feeds of participants that algorithmically selected some posts to be “hidden” from participants’ News Feeds, based on criteria created by the research team.
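The mechanics of the manipulation can be sketched in a few lines. This is purely illustrative and assumes a simplified version of the study's design: the word lists, function name, and omission probability below are stand-ins, not Facebook's actual code or the LIWC dictionaries the researchers used. The key property is that a post containing words from the targeted emotion category has some chance of being omitted from a given feed load, not deleted outright:

```python
import random

# Hypothetical word lists standing in for the emotion-word dictionaries
# (the study used LIWC categories; these toy sets are assumptions).
POSITIVE_WORDS = {"happy", "great", "love"}
NEGATIVE_WORDS = {"sad", "awful", "hate"}

def filter_feed(posts, suppress="negative", omit_prob=0.5, rng=random):
    """Return the posts shown on one News Feed load, probabilistically
    omitting posts that contain words from the suppressed category.
    Omitted posts are not deleted; they remain on friends' timelines
    and may surface on a later feed load."""
    target = NEGATIVE_WORDS if suppress == "negative" else POSITIVE_WORDS
    shown = []
    for post in posts:
        words = set(post.lower().split())
        if words & target and rng.random() < omit_prob:
            continue  # hidden from this load only
        shown.append(post)
    return shown

posts = ["I love this", "feeling sad today",
         "what a great day", "the weather is awful"]
# With omit_prob=1.0 every negative post is withheld from this load.
print(filter_feed(posts, suppress="negative", omit_prob=1.0))
```

Because the omission is probabilistic and applied per load, the same post can be absent on one visit and present on the next, which is exactly the distinction Kramer draws in the quote below between "hidden" and "didn't show up on some loads of Feed."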

Author Adam Kramer, in a Facebook post on June 29, emphasized that despite this manipulation, no posts were actually hidden from users, because the posts withheld from the News Feeds of participants were still available to them via another part of the interface:

“Nobody’s posts were ‘hidden,’ they just didn’t show up on some loads of Feed. Those posts were always visible on friends’ timelines, and could have shown up on subsequent News Feed loads.”

That’s a technical way of talking about a social behavior, and it ignores the social reality of interacting with others on Facebook.

Many frequent Facebook users are actually aware that Facebook chooses some posts to display in their News Feeds, and hides others. Rebecca Gray and I recently conducted a study in which we asked Facebook users if they feel like their News Feeds show them everything their Friends post, and analyzed their responses. Most of our respondents (recruited using a panel and Amazon Mechanical Turk) provided answers that demonstrated they understood to some extent that the News Feed does not display a list of all Friends’ posts in chronological order. Many reported having experienced situations where they only learned they had missed a post after someone else mentioned it to them in person or on Facebook, or days later when the post “surfaced” at the top of the News Feed after enough other users had commented on or Liked it.

In our study, we also prompted respondents to visit the Timelines of several of their Facebook Friends, and tell us if they noticed any posts there that they did not remember seeing in their News Feeds. Out of 939 respondents, only 153 (16%) said they didn’t notice any missed posts; the remaining 786 (84%) found at least one post they had missed.

The emotional contagion experiment demonstrated that social imitation happens on Facebook, but more importantly, it also demonstrated that filtering algorithms can have a very real influence, and not just on the posts users read on Facebook. These algorithms also influence the posts users create. In other words, what you say on Facebook is affected by what you see others saying. And what you see others saying depends on what Facebook’s News Feed algorithm (or in this experiment, the positive/negative manipulation algorithm designed by the researchers) decides to show you.

This is interpersonal communication, mediated by algorithms that make some posts easier to access than other posts. It calls to mind the so-called “deep web”: the webpages not indexed by Google, the information you can only access if you already know what it is and exactly where to find it.

What happens to our relationships with our Facebook Friends when an important, unconscious communication process like imitation is “nudged” by an algorithm? This is a question the emotional contagion experiment was not designed to answer.