Earlier this year I had an idea for a media literacy article in which I would ask people whose thinking I admired where they got their news. I was curious which media sources people thought were trustworthy, which homepages were reliable. I ended up putting the post back on the shelf because the answers I was getting were more or less the same: "I don't know, I sort of just end up clicking on things people post on Facebook."
It's not just that I needed to broaden my sample - a study by the Pew Research Center showed that over half of Americans under 35 get their news primarily from Facebook and Twitter.
While Twitter has established itself as a home for the citizen journalist, reporting in real-time from the riot, Facebook's share-baiting architecture is cultivating a new breed of citizen editors. As feeds are algorithmically filtered and manipulated by Facebook, each personal page becomes a homepage and each person becomes an editor, expending labor on curating their feed.
The biased news feed
The passive way of getting your news from Facebook is to stay adrift on the home page scroll, clicking on things that appear in the seemingly all-encompassing News Feed. But feeds are not neutral. The Facebook algorithm is designed with personalization in mind, showing you more of what you already like. "Over time, and through lots of clicks," writes Caitlin Dewey, "you gradually work your way into an online world where all news articles are fiercely liberal, or all recipes contain Brussels sprouts."
"On sites like Google or Facebook, now the primary news source for people under 35, algorithms aren't just keeping you from the next great cake recipe," Dewey goes on to write, "they could be isolating you from opposing views, exacerbating your own biases, or, as more disturbing and recent examples have shown, even perpetuating racial and gender biases of their own." For example, after Freddie Gray's death, my feed was full of support for the protests and signal boosting available resources - riot shamers few and far between. When Rentboy.com was raided, my feed buoyed posts decrying the raid and clamoring for the decriminalization of sex work. I can't say I mind it - the benevolent algorithm sends lots of great reading my way and makes using Facebook much less frustrating than if I were to wake up to a feed of TERFs or Trump supporters. The only trouble is that, unless you are constantly aware of the architecture of the feed, it lulls you into a false optimism, leading users to believe that the world at large is much more in their favor than it actually is.
In a presentation titled "Algorithms as Social Control," given at the Theorizing the Web conference, panelists discussed the bizarre moment when word of protests in #Ferguson broke as the #IceBucketChallenge was trending. In the early days of the Ferguson protests, Twitter exploded with coverage, while Facebook's algorithmic filtering kept it out, instead curating in the more family-friendly meme of people dumping ice water on their heads for charity.
Curating as criticism / curating as editing
Over the last decade, writing about culture has become more curatorial than critical. Wired has dubbed this "the age of curation," comparing editors to baleen whales who drift through the sea of information, sifting out the tasty bits. Editors curate for content and voice, especially at smaller independent publications such as Black Girl Dangerous, Rookie, and, of course, this one. But within publications with ad revenue and editorial budgets, editors also curate to appeal to a multitude of demographics, maintain relationships, and occasionally slip in product placement to keep the advertisers happy. Front page editors still exist, but, especially as traffic becomes driven by user shares, the pressure to keep your content stream perfectly clean is off.
Instead, in the polluted waters of the Facebook feed, each user becomes a whale. People's personal pages have replaced homepages as landing sites. I have some of these power-user friends: people who can be relied on to post relevant content from their areas of expertise. There are the friends who share updates from protests and global worker rebellions, from refugee crises and climate legislation. There are friends who share feminist think pieces about rape culture, friends who will call out the latest misguided trans movie made by a cis person, friends to follow for developments in #BlackLivesMatter tactics, friends who post dystopian cyborg news and worthwhile new music. It doesn't work quite the same as traditional news verticals - often these people are one and the same, sharing intersectionally, because we live intersectional lives.
This is labor
There's a Millennials of New York post that reads, "We live in such a politically contentious and difficult time, but so few people my age are actually willing to do anything about it. People need to WAKE up and smell the ESPRESSO. That's why I always share political articles on Facebook whenever I see them. In fact, I share so many that I don't have time to actually read any of them - that's how hard I'm working to make this world a better place. But some people I know hardly share political articles at all. It's like they don't even care about the pending avocado shortage, or whatever is happening in Africa. I'm doing my part, but honestly I can only share so much."
It's a joke but it's also not a joke.
Friends speak about feeling pressure to share relevant content. Sharing is partially performative - posting socially conscious articles brands you a socially conscious person, just as posting your shows brands you a working artist. There is an impetus to keep up, to curate your shares to balance multiple points of view. If you post one gender article, you should probably post all of the gender articles; otherwise it's only a snapshot of an evolving conversation. If you post a white feminism think piece, you should probably supplement it with a POC feminism think piece and perhaps a trans rights think piece, or your feed starts looking really single-issue. If you friend a cutie in your intersectional activism class and the last thing you shared was a pre-Seattle Bernie Sanders meme, how will they know you're truly woke?
The performative analysis is more than a little cynical. I want to believe sharing is driven more by the impulse to information-share than to appear well read. Plus, there's tangible "feeding the feed" power in intentional information-sharing. Everything happens all the time; editors shift the conversation by deciding when to pay attention to one thing over another. 2015 is not the first year that police have killed unarmed people, but 2015 has seen these stories foregrounded in the media. Queer, trans, and nonbinary people have existed forever, it's just more exciting for media outlets to paint certain identities as new. If you're conscious of your role as an editor, you can hold space for these things in an ongoing way, not only when they're trending in the mainstream.
A decade or so ago, individuals collecting and commenting on news did it predominantly through blogs, but the practice has been on the decline, and Nieman Lab proclaimed the blog dead in 2013. The bigger ones baited some ad revenue and became viable sources of income for their creators, while the smaller ones were subsumed by social media platforms, especially Facebook. When citizen journalists were independent bloggers, they worked for themselves, although the traffic they brought in meant they really worked for GeoCities, or WordPress, or LiveJournal. Now, citizen editors are without a doubt working for Facebook.
That's not to say that this is wholly negative; Facebook definitely gives you access to a bigger audience than your Blogspot ever did. People are more likely to read something if it's recommended by a trusted friend. You never know whose algorithm bubble you're floating at the margins of; sharing challenging perspectives or underrepresented voices can normalize radical ideas and expose people to views they may not have considered otherwise. But it's labor. It requires time, energy, emotional investment, and maintenance. It becomes a project you have little control over, one whose archive you don't get to keep. It generates profit for Facebook. And that's worth keeping in mind, if only on the days when your time and energy, which could be put into personal or material activist projects, are being siphoned into Facebook, only to be potentially filtered out and inevitably washed downstream.
NM went to journalism school, where they were told the internet was a fad.