Tag Archives: Facebook

Facebook Algorithms and Alternative Social Media

NB: This is an introductory statement for the Contemporary Social Media Platforms and Creative Practice 2018, an online discussion I’m participating in. Thanks to Judy Malloy, a Visiting Faculty Member at the School of the Art Institute of Chicago, for inviting me to join in!

There’s a saying people use to discuss online services: if it’s free, then you’re the product. As Karl Hodge notes in The Conversation, an exchange between Mark Zuckerberg and Utah Senator Orrin Hatch illuminates this saying quite well. Hatch asked, “How do you sustain a business model in which users don’t pay for your service?” Zuckerberg replied, “Senator, we run ads.”

Like so many of his public statements, Zuckerberg’s response is accurate, if not particularly illuminating. Magazines run ads. Radio stations run ads. TV runs ads. Facebook runs ads, too, but it’s different: its ads are far more targeted, far more invasive. Its ads are based on our own expressions, desires, and ideas, all of which are sold back to us.

The relationship between advertising and our sociality – our connections to our friends, family, and colleagues – is precisely what I became interested in while writing my first book, Reverse Engineering Social Media. As I suggest in that book, Facebook and other social media are almost direct outgrowths of the late 1990s online advertising industry, which pioneered the practice now called surveillance capitalism: monitor what people do online, then sell them things based on their activities.

At the heart of this relationship between advertising and sociality is the sorting, funneling, channeling, and above all modulation of what we see in Facebook. I’m talking, of course, about Facebook’s algorithms. As The New York Times reports,

Facebook’s ad system provides ways to target geographic locations, personal interests, characteristics and behavior, including activity on other internet services and even in physical stores. Advertisers can target people based on their political affiliation; how likely they are to engage with political content; whether they like to jog, hike or hunt; what kind of beer they like; and so on.

If advertisers provide a list of email addresses, Facebook can try to target the people those addresses belong to. It can also do what is called “look-alike matching,” in which Facebook’s algorithms serve ads to people believed to be similar to the people those addresses belong to.
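Facebook’s actual implementation is proprietary, but look-alike matching can be thought of, very roughly, as a similarity search over user feature vectors: start from a seed audience the advertiser supplies, then find other users whose profiles resemble the seed. The following toy Python sketch illustrates that idea; all names, features, and the similarity threshold are invented for illustration.

```python
# Toy sketch of "look-alike matching": given a seed audience an advertiser
# already knows, find other users whose interest profiles are similar.
# All data here is invented; real systems use far richer features.

from math import sqrt


def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def lookalikes(seed_profiles, candidates, threshold=0.8):
    """Return ids of candidate users resembling any seed profile."""
    matched = []
    for user_id, profile in candidates.items():
        score = max(cosine(profile, seed) for seed in seed_profiles)
        if score >= threshold:
            matched.append(user_id)
    return matched


# Feature vectors: [jogging, hiking, craft beer, political content]
seed = [[1, 0, 1, 1], [1, 1, 1, 0]]
candidates = {
    "user_a": [1, 0, 1, 1],  # near-identical to a seed profile
    "user_b": [0, 1, 0, 0],  # little overlap with either seed
}

print(lookalikes(seed, candidates))  # → ['user_a']
```

The point of the sketch is only that "similarity" here is a mechanical computation over behavioral data: anyone whose recorded activity resembles the seed audience closely enough gets swept into the target group, whether or not they ever gave the advertiser their information.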

The goal of such targeted ads is to fit in with the non-advertising content in Facebook. That is, an ad should look like it belongs alongside your grandma’s latest pictures and your colleague’s note about the upcoming Christmas party.

It is not fair to say that Facebook’s algorithms are totally subservient to the needs of advertisers and marketers. More precisely, the two sides are engaged in constant negotiation. As Taina Bucher writes in a New Media and Society article,

There is now a whole industry being built around so-called ‘News Feed Optimization’ akin to the more established variant, search engine optimization. Marketers, media strategists, PR firms all have advice on how to boost a brand’s visibility on Facebook.

That is, while Facebook wants to serve advertisers by selling our attention to them, it also must maintain our perception that it is giving us access to our friends, family, and interests.

Given that Facebook is driven almost entirely by the needs of marketers, what is to be done? Much of my scholarship has explored this question. My answer is: support the alternatives. If you’re worried about Facebook’s desire to know everything about you, consider leaving Facebook for non-profit, open source systems, such as Mastodon, diaspora*, or Twister (check out the Omeka Archive on this site for more). These systems typically differ from Facebook in two key ways: they don’t sell your data to marketers, and they don’t shape the content you see with algorithms. As part of this conversation, I am happy to talk more about the alternatives, as well as their relationship to Facebook and its internal algorithms.

#deletefacebook

There’s furor over the latest revelation that the world’s largest corporate social media site, Facebook, sells personal data to those who want to manipulate its users. The story — this time — is about Cambridge Analytica, a psychographic analysis firm that claims to be able to drive voter behavior. Many people have weighed in, so I won’t say much here. But I want to pick up on a point made by Adrian Chen in The New Yorker:

Just because something isn’t new doesn’t mean that it’s not outrageous. It is unquestionably a bad thing that we carry out much of our online lives within a data-mining apparatus that sells influence to the highest bidder. My initial reaction to the Cambridge Analytica scandal, though, was jaded; the feeling came from having seen how often, in the past, major public outcries about online privacy led nowhere. In most cases, after the calls to delete Facebook die down and the sternly worded congressional letters stop being written, things pretty much go back to normal. Too often, privacy scandals boil down to a superficial fix to some specific breach or leak, without addressing how the entire system undermines the possibility of control. What exciting big-data technique will be revealed, six years from now, as a democracy-shattering mind-control tool?

His point about “the entire system” is precisely why I started the S-MAP several years back. Or more precisely, the “entire system” is why so many alternative social media makers do what they do: build new social media systems that allow for the pleasures of connecting with others while staving off so many of the deleterious practices associated with corporate social media — surveillance, data mining, the sale (or leak) of personal information to third parties, and above all the manipulation of our sociality.

Surveillance capitalism — a system where every move we make through space and thought is tracked, analyzed, and sold — is the system we need to eradicate. There can be no other way. As Chen notes, the short-term answer to Cambridge Analytica/Facebook will be a “superficial fix,” but the real answer needs to be the wholesale dismantling of a system that sees you and me and everyone we love as objects to be cognitively and emotionally dissected.

For now, #deletefacebook will trend on Twitter (sadly, another corporate social media system), but it’s started to trend elsewhere: on Mastodon, the federated microblog. On Twister, the totally decentralized, peer-to-peer microblog. I haven’t looked, but perhaps it’s trending on Dark Web social networking sites.

It is only after we leave corporate social media behind and take on the work of socializing social media — making it our own, owning it, democratically administering it, democratically improving it — that we will even begin to address the system as such.

And then, or better yet at the same time, let’s move to eradicate the fusion of money, media, and power that is our contemporary democratic form of governance.