Facebook's social psychology experiment isn't so bad

I just finished listening to +TWiT from this weekend with +Leo Laporte, +Natali Morris, +Tim Stevens, and Denise Howell.  I don't say this about TWiT very often, but I disagree with the panel's thoughts on the recent revelations about Facebook's experimentation on their users' news feeds.  (http://www.contriving.net/link/fr)

So many people seem to be in such a tizzy over this, but I just don't get it.

The very nature of human interaction is to influence others.  In today's world, we influence other people's emotions via advertisements.  We influence other people's emotions via "addictive" video games like Farmville or The Simpsons: Tapped Out.  We influence other people's emotions via how we choose to dress.  We influence other people's emotions via the type of car we drive.  All of this is intentional.

The more organized examples of influencing other people's emotions are done for profit!  Heck, Zynga employs researchers whose sole purpose is to game your emotions and get you to spend more money.  The same holds true for advertising agencies, corporations, and political campaigns.

On the show +Tim Stevens brings up the point that websites do A/B testing all of the time to see what kind of content or presentation causes more user engagement.  That is exactly the same sort of thing.  Emotions drive user engagement.  *Websites manipulate your emotions to make more money.*  This is normal.  This is part of being human.
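
For anyone who hasn't run one, here's roughly what an A/B test looks like in code.  This is a minimal sketch in Python; the function names, the hashing scheme, and the 50/50 split are my own illustrative assumptions, not any particular site's implementation.

    import hashlib

    def assign_variant(user_id, experiment, variants=("control", "treatment")):
        # Hash (experiment name + user id) so each user lands in the same
        # bucket on every visit, without storing any extra state.
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        return variants[int(digest, 16) % len(variants)]

    # Example: serve one of two headlines, log which variant the user saw,
    # then compare engagement (clicks, shares, time on page) between groups.
    variant = assign_variant("user-12345", "headline-test")
    headline = ("You won't believe what happened next" if variant == "treatment"
                else "Local council approves new budget")

Swap "which headline" for "what ratio of positive to negative posts appears in the feed" and you have, structurally, Facebook's experiment.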

When you talk to the lady at the DMV, you slap a smile on your face to manipulate her emotions in your favor so she'll hurry you through the process.  When you take your date to a fancy restaurant, you're manipulating their emotions toward some desired outcome.

The majority of the time, you're consciously or unconsciously behaving in ways that manipulate the emotions of those around you in your favor.

Facebook's experiment isn't any worse than any of these.  In fact, if you want to make a value judgment, it's probably the least "bad" of them all, as the results of their experiment were published.

Finally, Leo brings up the point that the study was stupid because the results were obvious.  I have several things to say about that.  

First of all, I don't think the results are completely obvious.  Yes, I would have predicted the results they got with a fair amount of confidence.  However, it's not out of the realm of possibility that people just don't care about their news feeds enough to be affected by them, or that the opposite would happen…namely, people posting more positive stories in "retaliation" against negative news feeds.

Secondly, you could extract a lot of nuance from such a study.  Maybe there are groups of people who consistently respond in each of the ways I describe in the previous paragraph.  That would be an interesting and useful result.

Lastly, just because you think the results of an experiment are obvious doesn't mean the experiment is not worth doing.  You don't know the results for sure until you run it, and how wonderful is it when an experiment disconfirms your beliefs?  Basically, using "obvious result" as the determining factor in deciding whether to do an experiment is a bad idea.

I'm no fan of Facebook.  I think it's a time-wasting cesspool from which most people can extract only minimal value, and I only visit it a few times a year.  I just can't get excited about this relatively (if not absolutely) small experiment they ran amidst the vast and turbulent sea of emotional manipulation we live in…particularly when the most pernicious examples are for profit and we accept those with barely a peep.

A lot of people have some sort of intuitive, it-just-feels-wrong response to this story, and I just don't get it.



12 Comments.

  1. Being manipulated into feeling bad/negative emotions is pretty fucked up. Most people will be fine in the end, but what about the people who are on the edge of depression or something worse? The implications of this kind of manipulation could be pretty serious.

  2. Glad someone agrees re the A/B thing, as I certainly wasn't getting much love during the show with it!

  3. The issue is informed consent.

    Facebook claims their TOS covered it, while also openly acknowledging that nobody reads it. How can you give informed consent when you are not informed?

    That's the issue. People were part of a psychological experiment which could possibly lead to negative emotions (and therefore actions) with no informed consent.

  4. +Chad Neu Do people give informed consent for advertisers to use cutting edge psychological science to manipulate them into buying things?

  5. Good point, +Dustin Wyatt. This wasn't advertising, though. This was manipulating people's social life. That's a different thing entirely.

    I'm personally not comfortable with it which is why I don't use Facebook.

  6. There is some understanding that advertising manipulates us. It isn't surprising to hear that FB does this as well, but this was an outright psychological experiment done on people without their consent. There are plenty of reasons why this is important in the psychological world.

  7. +Sophie Melich _There are plenty of reasons why this is important in the psychological world_

    That may or may not be true, but people aren't in a furor about this because of the standards set out in the world of psychology research.

  8. I agree with Dustin on all points and struggle to make any more good points in favour of this study. The problem is that the people who don't like it _don't like it._ As +Chad Neu said, he is not personally comfortable with it. It's an emotional reaction, not a rational one.

    I wonder if Google can massage his feed enough to change his mind… 🙂

  9. I have both personal and major ethical issues with it.

    There is no denying there are major ethics issues here. Informed consent was not obtained, and this is not advertising.

    Would we be having this conversation had this experiment affected a mentally unstable person enough for them to commit suicide? If someone had died as a result of feed manipulation (yeah, far-fetched), would this still be no big deal? Especially since they were never told they were being experimented on?

    Honestly, Facebook needs to do a much better job of informing people about what is in the ToS.

  10. I seem to have skimmed your first post; apologies, +Chad Neu.  I have to agree with you there, despite seeing potential benefit in not being informed.

  11. +Chad Neu
    _There is no denying there are major ethics issues here_
    I deny that there are obviously major ethics issues here.  I'm not sure.

    _This is not advertising_
    Right, I don't think anyone was claiming that it is advertising.  Rather, to prove your point, you have to explain how the emotional manipulation is different from advertising's emotional manipulation.  I see no difference.

    I don't believe the vast majority of the populace has any idea about the research into cognitive biases that organizations use to sway their emotions.  We don't have informed consent when we're not aware of the sheer manipulative power imparted by decades of research into heuristics, biases, and cognition.

    I'm not arguing that it's OK; I'm arguing that Facebook's moves are not as bad as all that, and/or that the response is out of proportion.

    _Would we be having this conversation had this experiment affected a mentally unstable person enough for them to commit suicide?_
    I wouldn't be surprised if this has, in fact, happened to some poor soul exposed to advertising or some other widely-spread emotional manipulation.  I also wouldn't be surprised if behavioral researchers working for some political ideology have pushed some unstable person into extremism.

    _Facebook needs to do a much better job…ToS…_
    This is an industry-wide problem, and I'm in complete agreement with you about it; I just don't think it's particularly more or less important in this case.

  12. Just because something is legal does not mean it is ethical.  The Facebook ToS probably covers the company legally, but I have to believe that the senior managers pondered many of the discussions we are having now, yet still pushed forward with the OK on the study.  Breaking what your customers perceive as a "trust" (even though it may be legal) is a slippery slope.  Taking a righteous hard-line stand when the public learns of the issue also may not be the right approach.  Certainly not the death spiral for FB, but slip #1 down that slope.
