Facebook’s recent apology for its Year in Review feature, which had displayed to a grieving father images of his dead daughter, highlights again the tricky relationship between the social media behemoth and its users’ data.
The service Facebook offers to its 1.2bn users is free because of the advertising revenue the site generates from the time users spend on it. This model drives a need to keep users on the site as much as possible.
“Sticky” qualities that keep users coming back include the essentially addictive nature of social media sites, which has been compared to gambling and alcohol addiction. Another is the introduction of new features that present Facebook’s vast pool of historical data in new ways. The Year in Review is one such feature: it automatically pulls together a collection of photos from significant moments throughout the year.
But such innovations pose creative challenges, such as how to develop an algorithm that selects Year in Review content you would want to see and share. In most cases this works perfectly well, offering up memories from your Facebook timeline to bring a smile to your face. But in other cases there is the phenomenon writer Eric Meyer has described as “inadvertent algorithmic cruelty”: his Year in Review arrived with a picture of his recently deceased daughter, six-year-old Rebecca, as the headline image.
Meyer’s blog was widely reported and prompted an apology from Facebook.
But what Facebook didn’t apologise for was offering a new feature that thrust content directly into the user’s face. Yes, the algorithm was clumsy, but the notion of forcing content, unasked for, upon the user is almost taken for granted. In business terms, this is sometimes called “supplier push”. It is part of a business philosophy that sees users as crowds, and innovation as a process of “mass customisation”. The danger of appealing to the crowd en masse is that a significant minority will always fall through the gaps.
So a minority find in their Year in Review their dead relatives, dead dogs, their exes, and even past bad behaviour they would rather forget. To be clear, Facebook doesn’t publish the Year in Review directly, but offers a sample for further customisation and publication if the user chooses. Regardless, the sample is still thrust in your face: Eric Meyer was shown an image of his dead daughter whether he wanted to see it or not.
And this is where the relationship dynamics that sit at the heart of Facebook’s “free” social media model come in. By preventing us from deleting our own content, Facebook becomes the equivalent of an ever-growing attic of memories, many of which, given the choice, we would prefer to forget. This content is harvested for information with which to further refine advertising offers.
The existence of this problem has been recognised elsewhere: the Mailing Preference Service provides an opt-out register for direct mail advertising of baby-related products, to prevent unwanted reminders in the event of a baby’s death, for example. Online services have yet to incorporate such measures. And generally speaking, aren’t there often things from our past that we wouldn’t respond well to if they were re-presented to us?
As social media grows in sophistication, algorithms attempt to target you with content that will keep you interested, and so more connected and engaged. Software can currently recognise smiling faces, but not that the smiling face belongs to someone no longer with us. Why? Because the user never tagged the photo to say so.
Tagging is another example of “in your face” social media: it prompts you to look at an image in order to approve someone else’s tag on your photo, or to see that you have been tagged in someone else’s image. Of course, it might not be an image you ever wanted to see again. There will be more of this in the future: if you can’t delete photos of your past without leaving Facebook altogether, do you lose the right to privacy at the moment you feel you most need it? If your Year in Review shows you engaged mostly in dangerous sports, will that affect your next insurance quote?
If you thought you were going to start your new year with a clean sheet, then, as a social media user, think again. Facebook’s new and revised terms and conditions will see it observe your behaviour, location and the sites you visit in even more detail, no doubt in order to create further features to keep you engaged. Inevitably, these will also throw up further issues of badly targeted content and intrusion into our personal lives: a double-edged sword that can bring pleasure or pain.
- Paul Levy is senior researcher in innovation management at the University of Brighton
- This article was originally published on The Conversation