You may have heard how Facebook ran an experiment on over 600,000 unaware users and concluded that yes, it is possible to tweak people’s emotions by selectively filtering their Facebook timeline. In their own words:
We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues. (full article)
That was the news. Over the past couple of days, there’s been a funny reaction: people complaining about Facebook’s “manipulation”.
Now, this is really awkward. Facebook’s timeline is the result of a filtering algorithm applied to our acquaintances’ actions. Hence, our timeline is by definition manipulating our reality or, rather, the part of reality we are made aware of.
The idea that social media are an unfiltered forum is just as ridiculous as the idea that their business is to give away free visibility (hint: it is not).
Let’s get real: not only is visibility a valuable commodity, but the essence of current social media is propaganda, not reality.
Wait, it gets worse. Given the current structure of “social” media, the only difference between a “genuine” and a “manipulated” timeline is in the programmer’s intentions.
If such intentions are known.
And if there is any human programmer aware of them (I would be very surprised if most of Facebook does not run on machine learning already).
The entire purpose of the timeline-filtering algorithm is to maximise some kind of desired behaviour from the user. It could be time online, number of likes/views/comments, clicks on ads, whatever.
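To make the point concrete, here is a minimal sketch of what such an engagement-maximising filter looks like in spirit. Everything here is invented for illustration: the feature names, the hand-set weights, and the scoring function are hypothetical, and real platforms use far more signals with learned rather than hand-tuned weights.

```python
# Hypothetical sketch of an engagement-maximising timeline filter.
# Feature names and weights are invented for illustration only.

def engagement_score(post, weights):
    """Score a post by how strongly it is predicted to trigger
    the desired behaviour (clicks, dwell time, etc.)."""
    return sum(w * post.get(feature, 0) for feature, w in weights.items())

def rank_timeline(posts, weights, k=10):
    """Show the user only the k posts predicted to maximise engagement,
    silently dropping everything else."""
    ranked = sorted(posts, key=lambda p: engagement_score(p, weights),
                    reverse=True)
    return ranked[:k]

# Toy data: three candidate posts with made-up behavioural predictions.
posts = [
    {"id": 1, "predicted_clicks": 0.2, "predicted_dwell_time": 30},
    {"id": 2, "predicted_clicks": 0.9, "predicted_dwell_time": 5},
    {"id": 3, "predicted_clicks": 0.5, "predicted_dwell_time": 60},
]
weights = {"predicted_clicks": 10.0, "predicted_dwell_time": 0.1}

top = rank_timeline(posts, weights, k=2)
print([p["id"] for p in top])  # the two "most engaging" posts win
```

Note that the user never sees the dropped post, nor the weights: change the weights (say, up-weight emotionally charged posts) and you have run the experiment, whether or not anyone calls it one.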
Facebook (and any other social platform) is a huge behaviour-inducing machine, just like any other mass medium.
Facebook is no more manipulative than any visual ad featuring a scantily-clad young woman, only vastly more efficient; hence its success.
If we really care about being manipulated, we should redesign what we mean by “social” platforms into something other than a huge corral for selling users’ eyeballs to advertisers.
But of course, the money lies in a completely different choice.
Current “social” media are of course far from being social. But approaching them naively will not improve them. If we want something better, we have to demand it and build it.
Hands up, anybody?