How the Web Went Bad

(Author’s Note: Excerpts from this essay were published by the Baffler on … oh, at some point in the distant past; I can’t remember when. I’m putting the whole essay here because no one told me not to, and honestly, I think it’s pretty interesting.)

When Andrew Sullivan, the big bearded granddad of political blogging, announced his retirement in 2015, the internet was quick to perform an autopsy, only slightly before its patient had died. Blogging was dead, social media had killed it, and Sullivan’s sign-off had rung the death knell. Sullivan himself wrote a sort of valedictory predicting that blogging, a medium he loved and practically invented, was heading into a long night of obscurity. But like King Arthur, he insisted, like Frederick Barbarossa, like Obi-Wan Kenobi, it would return.

What exactly was blogging, this thing that died? Interpretations differ: it was a publishing platform, it was the art of conversation, it was the cultivation of a loyal online community. Sullivan quoted Nick Denton with approval:

“[Blogging is] the only truly new media in the age of the web … Blogging is the essential act of journalism in an interactive and conversational age.”

The web’s a-changing. The web’s a-growing. Everyone agrees that relentless evolution is the modus, the via, the what and wherefore of online life. But not everyone feels equally copacetic about a particular turn of the wheel. For every phase of internet phylogeny, two classes of commentator have spoken up: the “oh, shit” crowd, and the “aw, shucks” crowd. This time is no different, except that it is.

#

The oh-shit crowd, a loose cohort of professional naysayers, gray-muzzled bloggers, tech trackers, leftists, and congenital curmudgeons, think something has gone badly wrong with online interaction. They cringe at the reach of social media, bemoan the grim prospects of content creators. They think the web has been hijacked by advertising, or fatally infected with bad faith, or swallowed by a few behemoth companies. They believe that clickbait and Twitter wars are driving us into a new age of exploitation.

The aw-shucks crowd says, “Well, that’s the internet!” It’s ever been thus. One platform succeeds another. Technology pursues its own strange course. The old fear the new, the young neglect the old, and each generation idealizes the media of its youth. The only overarching trend is that the web gets ever rowdier, more crowded, more beautifully and chaotically democratic.

That’s how I used to feel. No more. The oh-shit crowd is right. Something really has changed, or perhaps a slow process of change has become too relentless to ignore. If members of the aw-shucks crowd don’t see it, they must be too young, too cynical, or too craven. The internet still has everything, and thank the cybergods for that. But the reigning culture of online life has lost its radical fervor. Mercenary interests and the habits of disengaged spectatorship have encroached on an original climate of community and collaboration. The web isn’t getting more democratic at all. It’s steadily metamorphosing into yet another arm of the old profit-driven, corporate-run, consumer-oriented mass media.

Ironically, it was a shared article on Facebook that brought this home to me. Andy Baio’s paean to the work of archive.org opens something of a window onto the early internet. I’d forgotten that we used to talk about being “netizens” back then—flippantly, to be sure, but with a wistful subtext. We were citizens of a new public space, with its own demands and responsibilities. GeoCities and other hosting services adopted the language of cyberspace, offering their users chunks of online real estate, stakes in a digital frontier. The very term “site” evokes images of a plot, a property, bare land on which to build, and later services like Second Life promised to outfit this metaphor with clunky graphics and soaring avatars. For a surprisingly long time, the territorial conceit, lifted out of the fantasies of science fiction and strongly infused with the American dream, informed how we thought about web development. Digital domains were new land, a fresh green breast of a second America, waiting to be inhabited. And this time we were going to get it right.

Who would do the building of this brave new world? You would, the user, the owner, the developer. You’d learn some HTML, or LSL, or a dash of JavaScript—it wasn’t hard—and get cracking. Which is why every web site back then was cluttered, homely, and perpetually under construction.
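
In case the flavor of it has faded, here is roughly what such a hand-built page looked like—a pastiche from memory, with every name and detail invented for illustration rather than drawn from any actual site:

    <html>
    <head><title>Dave's World of Wood-Carving</title></head>
    <body bgcolor="#ffffcc">
      <!-- an invented example: the details here are pastiche, not a real page -->
      <h1>Welcome to my corner of the web!</h1>
      <p>Pictures of my latest carvings are <a href="carvings.html">here</a>.</p>
      <p><img src="under_construction.gif" alt="Under construction!"></p>
      <!-- coming soon: guestbook, links page, MIDI jukebox -->
    </body>
    </html>

A weekend with a library book on HTML was enough to produce it, which was the whole point.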

And what would you build? Well, that was an open question. Business was big, and public services slowly opened their archives, but the early web was famous for being oriented toward eccentric interests. Toy collectors, role-players, artists, educators, wood-carvers, history buffs, and other oddballs jumped at the opportunity to build public shrines to private passions. The quintessential online experience was a serendipitous discovery. We said content was king, and what we meant was that the individual had finally come into his own. It was individual labor that would build the web, out of a million peculiar little projects. It was individual expertise that would attract visitors, an individual voice that would keep them coming back.

The talk of those early days sounds hokey, now. Of course it does. It arose from ideas that were earnest, sometimes utopian, often poignantly resistant to skepticism. They make for a painful contrast with today’s online culture. People still nod to the founding fantasies, celebrating the web as a place of political engagement, personal empowerment, and amateur creativity. And there are plenty of voices online, more than ever. But their potency, and their potential, have been slowly redefined.

#

I’ve asked many friends to explain to me the charms of social media, which, though I use these services, I’ve never really understood. Is it all idle gossip? A way to keep in touch? A harmless distraction, like BuzzFeed or Weird Al?

The answer I get has nothing to do with distraction at all. You have to use these services, I’m told: to make contacts, curry influence, build awareness of whatever it is you do with your time. You might find Twitter fun or insufferable; either way, it’s an indispensable promotional tool.

This glances back to the old webernet ideals: entrepreneurship, craft, independence. It’s also a rather feeble attempt at hardheadedness. As promotional strategies go, flinging out endless quantities of poorly edited one-liners is hardly an improvement on the efforts of Madison Avenue. The appeal of the technique is its easiness: anyone can sign up, link, like, tweet. But social media as a style of self-advertisement trades sophistication for obsessiveness. Like a dupe at the slots, we keep punching buttons, hoping for a lucky strike.

Does it matter? At its best, social media is certainly no worse than word-of-mouth. For most people, that’s probably all it amounts to. Personal or public, private or promotional: to you, to your dad, to your friend in the band, it hardly amounts to more than chump change. The small screen of the smartphone, and the stingy format of the tweet, conspire to keep users from producing anything of even briefly lasting value. Nor do they encourage careful discrimination between tossed-off remarks and targeted announcements. This enforced sketchiness is precisely the point. The purpose of such tools is simply to crystallize the kind of daily nattering that has always been with us, transcribing it in a way that can be exploited by the real advertisers: the business titans, the opinion trackers, the mavens of market research.

This dubious bargain—scattershot careerism for the little guy, laser-targeted marketing for the rich—has become so well-established, it’s hard to believe that only a few years ago people seriously looked forward to something different. Thinkers like Cory Doctorow, Yochai Benkler, Jonathan Zittrain, Kevin Kelly, and Lawrence Lessig had begun to push for a new sharing economy that would celebrate collaboration, volunteerism, and costless transactions. Crowdsourcing, back then, didn’t mean hitting up your fans for handouts. It meant that people would willingly donate online efforts to something other than shopping, watching, and badinage. Optimists foresaw whole novels being produced this way, games, new industries, content curation, scientific research, even feature-length films. Savvy remixing promised to reconceive the basic nature of media, raising digital collage into an art form in its own right. The point of this burgeoning creative commons wasn’t to celebrate every pilfered video clip as a triumph of dilettantism. It was to build up unmercenary creativity, free exchange, and the human instinct for cooperation into a new cultural order.

This vaunted market of makers, coders, educators, and collaborators had its counterpart in a blogosphere addicted to debate. The essence of blogging in those days was reengagement. You found an ardent soul who disagreed with you, and you revisited the disagreement again and again. Of course, the arguments often descended into sniping and nitpicking, but the sniping and nitpicking came with accountability. A blog was more than a stream of lightly edited thumbnail articles. It was, as the name suggests, a log, an archive, a transcript of evolving opinions, where any sloppy assertion or dubious claim could be brought to bear on a fresh disagreement. It framed its creator as a one-woman magazine, with editorial philosophy and writerly voice combined in one idiosyncratic ego. Most open source projects waned or failed through lack of talent, but writing is in many ways the signal art of democracy, distinct from TV, film, software, and even handicrafts in that if you consume it, you can produce it. Blogs looked like a sharing economy that had already arrived, a ferment of free and unconventional ideas.

There are some who assume social media has improved on this promise, as if bandying snark with frenemies on Twitter is the same as running one’s own magazine. Even sardonic accounts of social-media silliness still dutifully nod to our democratic hopes. But a steady dribble of time-killing gossip hardly justifies lofty talk of “democracy,” “empowerment,” and “cultural participation.” The old web and its rhetoric are still with us, but the obligations that gave them dignity have been stripped away.

#

Even in the dark ages of the mainstream media, it’s worth remembering, audience members had ways to signal their likes and dislikes. They could change the channel, switch the station, respond to polls, cancel their subscriptions, buy or refuse to buy new releases, call in requests, write to editors, organize boycotts, set trends, purchase tickets, join the studio audience. If nothing else, they could get on the phone and gossip with friends, knowing that even their casual remarks might in some small way bubble up into general opinion. The main thing they could not do was produce content themselves. They could be counted, in other words—and they were, obsessively—but without a special entrée, they could not actively participate in the creation of any but the most local art and culture.

That was the glorious promise of the web: the chance to do it all yourself. But cultural participation isn’t only a matter of crowing yea or nay at every diversion that flits across a screen. It takes work, and more: a dedication to the work of others. The word responsibility has a didactic ring, but we’re not talking about a lofty responsibility, or an onerous one. Only the humble, everyday responsibility we assume for the mores of a social space we enter, be it a public forum, a support group, someone’s web site, someone’s home.

That’s not true for members of a mass audience, who have limited options, but a special immunity. They can cheer or boo, stay or go, acclaim or reject what the culture makers offer, and not much else. The compensation for these constraints is that their choices are taken to be unassailable. Precisely because the audience members have submerged themselves in a larger social body, no person in particular will be challenged to justify his or her views. To leave the audience, to step onstage, means giving up that right to inviolability. It means that heckling and clapping, tuning in and tuning out, are no longer adequate options.

That’s the promise, the challenge, of public engagement, one that the evolution of the internet has steadily whittled away. Improving software and corporate cunning bear some of the blame, but the lesson of the web’s history is that most people, most of the time, are happy with the prerogatives of mass consumers. Hence the consistent defining down of the average netizen’s expected contribution, from building one’s own site to creating one’s own content to volunteering one’s occasional labor to posting one’s comments to, finally, liking and sharing and sneering. If the trend continues, it won’t be long before we’re selecting our kneejerk reactions from menus of stereotyped options (Amazing! Stupid! Misogynist! Libtarded!) or simply voting with our viewership habits, as we used to.

Admittedly, online traffic is famously difficult to analyze. Studies routinely describe a growing dominance of streaming media, but that’s mostly because the files are so big. And the essence of the internet is a hopeless jumbling of what counts as public or private. Phone calls, photo albums, texting, TV: what can we call this but life as usual, measured in bps?

Relying on old-fashioned social observation, we can see the emergence of a model somewhat like nineties-era radio or TV, with social media serving as the telephone, the dial, and the cable box. A cluster of big companies, many of them the familiar media titans of yesteryear, produce the most popular content, usually by exploiting the eagerness of interns and other naifs. They churn out countless three-minute diversions—screeds about liberals, screeds about conservatives, trailers, teasers, PSAs, music videos, plain old commercials—packaged to appeal to stereotyped demographics. The old media empires had ways of stoking trite controversies (industry rags, wardrobe malfunctions, risqué interviews), along with techniques for engineering “surprise” successes (savvy advertising, giveaways, puff journalism). And so it is today, except that we call the controversies “outrages” and the success stories “viral hits.”

Social media exploits these non-events to organize the masses into trackable audiences, but the furor never amounts to much more than a highly refined form of consumer feedback, little different in its underlying dynamics from Nielsen ratings, watercooler confabs, phone polls, and letter campaigns. People talk about social media exactly the way they used to talk about popular opinion, and they mean the same thing: “All my friends are gabbing about this topic, plus a few think pieces I happen to have read.”

What are the topics that incite these fleeting paroxysms of opinion? Why, they’re the same old mass entertainments, in slightly different forms: TV shows, country singers, summer blockbusters, video games, magazine articles. Yesterday’s upstart webonauts, with their dreams of creation, debate, and cooperation, are melting back into an anonymous crowd. And I think people like it that way. Certainly Twitter mobs and Facebook sharers seem to assume the inviolability of casual spectators, free to cheer and jeer without broader accountability. Holding a tweeter to the standards of a journalist is considered uncouth.

The web’s mass audience is uniquely visible, notoriously vocal, unusually subject to measurement and manipulation. But its power is blunt and reactive, because its members embrace the essential tradeoff of consumerism: unlimited authority in judging culture, piddling control over making culture. They also regularly manifest a consumer’s latent rage. When the only power you have is to say Want or Don’t Want, you say it as vehemently as possible, because every frustration of your tastes feels like a denial of your right to speak at all.

Sure, counterexamples teem online: forums, hobby clubs, tumblrs, self-published novels. What all these things have in common is that it’s more difficult by the day to interpret them as a preview of a coming vox populi paradise. The old web culture limps along, and occasional breakout hits keep alive the familiar fables of economic and social renovation. But they’re no more heralds of a looming new society than the mail-order phenomena and sleeper hits of an earlier age. As for the communitarian web, its best-known exemplars seem increasingly beleaguered, like the shrinking political blogosphere, the ever-more-cultlike Wikipedia, the scary back alleys of 4chan, or the struggling open-access movement. The defining online action, now, is to click and be counted, in the world’s most sophisticated opinion poll.

That’s not necessarily a disaster. Mass media aren’t all bad. And it’s likely that long-term trends will prove more radically transformative. The galling thing is to hear every crass excess of this ad-driven, bean-counting system—every click-and-cry mini-scandal, every theft of original content, every colicky eruption of disgruntled consumerism—celebrated as a symbol of the democratizing power of technology. If anything, the current ethos is anti-individualistic. Instead of elevating amateur work, it encourages people to grab blockbuster hits on the cheap. Instead of helping creators and volunteers, it empowers advertisers and administrators. Instead of highlighting bold new voices, it lumps everyone into transient mobs defined by gossip, brief obsessions, and free-roaming discontent.

Which is more or less how things were in the eighties. Meet the new media, same as the old media. As one member of its captive mass audience, my response is: boo, hiss.
