Down with Geeks!

(Author note. I published this elsewhere once. Might as well republish it here.)

For several years now, lefty cultural critic and man about the webs Freddie de Boer has been—how should I put this?—lamenting the laments of geek culture. His beef is that geeks are reflexively self-pitying. Anyone can see that nerdy interests are now ascendant in popular media, so why do nerds still insist on seeing themselves as cultural untouchables?

De Boer gave the neatest version of his argument in this 2012 post on Parabasis:

The major genres and media once consigned to the realm of geek or nerd culture, such as science fiction, high fantasy, comic books, and video games now dominate both in terms of commercial success and popular attention. They are simply unavoidable …

Yet despite this dominance, there remains a remarkable sensitivity towards perceived slights among these genres’ most dedicated fans.

By “remarkable sensitivity,” he means things like railing at critics who pan Batman movies. Jocks may be the archetypal enemies of high-school dweebs, but jocks aren’t known for writing snooty film reviews. The true bête noire of geek culture has always been the snob: the mainstream critic who thinks Truffaut is better than Tolkien and praises Lorrie Moore over George R. R. Martin.

And yet the real cultural outcasts, de Boer writes, are now the snobs themselves:

As dissatisfied as fans of comic books and sci-fi may remain at the perceived value of their cultural commitments, surely they can recognize that it’s better than nonexistence. And this is the stark reality for much traditional high art, like ballet, theater, opera, and orchestral music: what is threatened is not just their place in some nebulous hierarchy of tastes but their continued survival.

He’s right, obviously. I’d add three points.

First, geek culture may be king, but geek behavior still isn’t cool and never will be. Being a geek means loving something way, way more than most other people love it, and that kind of devotion always runs counter to the studied disaffection of the cool kids, or even the bland uniformity of the popular kids. So it’s fine nowadays to go see, say, The Avengers, even to love it. But if you start spouting off about inconsistencies with the Hulk comics, or the intricate mythology of the Thor universe, you’ll bore other people and embarrass yourself.

This doesn’t refute de Boer’s thesis, of course, just refines it. The geek who rambles on about Star Wars mythology at a dinner party comes across as a goofy, overgrown kid. But the geek who rambles on about ancient Greek mythology seems like an insufferable showoff. “What is this, a college lecture? Who’re you trying to impress?”

A related point is that the rise of geek culture has been something of a devil’s bargain. In the course of becoming popular, the cultural properties now crowding our TV screens have lost a lot of their deeper geekiness. The lore, linguistics, and poetry of Tolkien’s books have given way to glitzy effects and acrobatic action. The stupefyingly intricate plots of long-running comics have simmered down into simple narrative arcs that get recycled every few years. (Movie 1: the hero rises. Movie 2: the hero falters. Movie 3: the hero returns. Reboot and repeat.) The high concepts of classic science fiction have been hammered flat under sacks of corporate coin. In the video-game world, triple-A cashfests like GTA V and numbing addiction engines like Candy Crush suck up all the mainstream attention, while artsy indie games remain the province of enthusiasts. I could go on to talk about novels, but there’s no point, because no one would care unless those novels had been made into movies. We do sometimes get a flick like Spike Jonze’s Her to remind us that big money can still bed down with big ideas. But the countercultural SF that I grew up on survives at the fringe, albeit with two notable and worthy exceptions: Ursula Le Guin and PKD. (I’m still waiting for Samuel Delany to go mainstream.)

All of which is to say, part of the reason geek fanboys can seem so thuggish is that Hollywood and TV have elevated the most thuggish elements of geek culture. Even SF’s long streak of feminist writing has made it to the screen mostly in the form of leggy women kicking ass.

Which brings me to my last point. I’ve been arguing that certain elements of geek culture—idea-based fiction, experimental writing, indie video games—are worthier than others, and that most of what’s made it big—action blockbusters, paranormal romance, twitch-based gaming—isn’t very worthy at all. In other words, it’s perfectly possible to be a geek and feel that the old Star Trek is better than the new, or that George MacDonald is better than George R. R. Martin, or that novels are more interesting than comic books, or that most video games are garbage. So here we are, solidly within the borders of Nerdland, far from the strains of Wagner and the strokes of Picasso, and the old snob-vs-fan dynamic has popped up again. And in my experience, it provokes the same screeches of incredulous rage.

It seems that whether you’re talking about Shakespeare and Stephenie Meyer, Pollock and Rockwell, Grey’s Anatomy and The Wire, or Rush and Rolling Stone, that argument always plays out along the same lines. “Isn’t it all a matter of taste?” the populist says, and then proceeds to argue, in the most stringently moral terms, that having a certain kind of taste makes you a good person—an enthusiast, a passionate fan, a sharer of joy and delight—while having a different kind of taste makes you a bad person—an elitist, a snob, a spoiler of innocent fun.

The truth is that whether we’re talking about geek culture, high culture, low culture, foreign culture, or any kind of culture, what’s in retreat is any form of criticism on aesthetic terms.

But that’s a topic for another day.



Tim Wu, who writes often about technology, once suggested this thought experiment:

A well-educated time traveller from 1914 enters a room divided in half by a curtain. A scientist tells him that his task is to ascertain the intelligence of whoever is on the other side of the curtain by asking whatever questions he pleases.

The traveler’s queries are answered by a voice with an accent that he does not recognize (twenty-first-century American English). The woman on the other side of the curtain has an extraordinary memory. She can, without much delay, recite any passage from the Bible or Shakespeare. Her arithmetic skills are astonishing—difficult problems are solved in seconds. She is also able to speak many foreign languages, though her pronunciation is odd. Most impressive, perhaps, is her ability to describe almost any part of the Earth in great detail, as though she is viewing it from the sky. She is also proficient at connecting seemingly random concepts, and when the traveler asks her a question like “How can God be both good and omnipotent?” she can provide complex theoretical answers.

Based on this modified Turing test, our time traveler would conclude that, in the past century, the human race achieved a new level of superintelligence.

Superintelligence, eh? Tim Wu is no pie-eyed tech apologist, but in this scenario, even he succumbs to the wild romanticism built up around smartphones. The temptation is always to judge a new technology by what it could achieve, not by how it’s actually used, as with those early commentators who applauded television as a promising educational tool. Through rhetorical sleight of hand, our shiny new devices are presented as having their own wants, their own dreams, their own destiny, and like the immaculate, faceless figures inserted into architectural drawings, human beings barely figure in the picture.

The view on the street is always a bit messier, and if my experience is any guide, Wu’s hypothetical conversation would go something like this.

Time Traveler: Well, I must say, it really is a wonderful opportunity we’ve been presented with here, and I hope we’ll both be able to profit from this rare meeting of minds, so widely separated in time and experience, but not, I should hope, in their essential sympathies.

Modern Person: Uh, yeah.

TT: I hope you won’t mind if I, as the more ignorant party, pose the first question.

MP: Whatever.

TT: I’ve heard that people of your time have remarkable powers in matters of calculation, philology, geography, and indeed, general knowledge. Would you mind giving a short demonstration? I’d like to pose to you four challenges: first, to calculate, to the tenth decimal place, the natural logarithm of 97; second, to give a short description of the terrain of the African interior; third, to offer a brief treatise on the philosophical topic of your choice; and fourth, to give this last in French.


TT: Hello? Perhaps I’ve been too bold in my–

MP: No, I’m sorry, wait, I was just … what were you saying?

TT: Well, I was hoping, if it wouldn’t be too presumptuous, to ask if you might–

MP: No, sure, I mean, it’s OK, I just … hold on a sec.


TT: Is everything all right?


TT: Have I said something to offend you?

MP: No, no, sorry, I–hold on, hold on.


TT: I’m afraid I really must … I hope you won’t think it impolite if I ask what you–

MP: Oh my God!

TT: Pardon?

MP: Oh. My. God.

TT: I’m afraid something has gone badly wrong with this interview.

MP: No, it’s just, this guy posted this video here, and there’s this, like, goat, and this baby, and they’re both eating this big pile of … no, no, I’m sorry, OK. You were saying?

TT: Well, given the extremely rare experience afforded us, here, I made so bold as to dispense, as I thought, with idle chitchat, and to present you with four mental challenges intended to assess–

MP: Uh-huh. You wanted what, some math thing?

TT: My first challenge, yes, was to have you calculate, as quickly as possible, the natural logarithm of 97.

MP: Sure. Logarithm. Can you just spell that for me?

TT: Certainly. L-O-G-A-R-I-

MP: Got it. 1.9867717342.

TT: Astounding! Remarkable! And so quickly! But … no, wait, that can’t be correct. That’s not even close. May I ask what method you used?

MP: That’s, you know, that’s the answer. That’s what it says.

TT: But that can’t possibly be–

MP: Well, I don’t know what you expected, but that’s what it says.

TT: It seems to me you must have confused the natural logarithm with–

MP: Wait, what, natural logarithm? Is that, like, something different?

TT: Of course. The natural logarithm is–

MP: Well, why didn’t you say so?

TT: But I did say so. I don’t believe you were listening to me at all.

MP: Well, whatever. It doesn’t matter. What was the other question?

TT: The second challenge I posed was to give a description of the terrain of central Africa. And I must admit, I’m especially eager to hear your answer in this case, given the fragmentary nature of the reports the people of my day continue to have from that dark and savage–

MP: Yeah, I see, sure, it’s … well, it looks like mostly jungle.

TT: Yes, that’s what the travelers of my own day have reported, but I was hoping you could–

MP: No, that’s it. Jungle. There’s the Sahara up top, and then there’s some desert in the bottom, and in between, there’s all this green stuff. Lots and lots of green stuff. That’s what I see on the map, anyway. Let me zoom in. Yep. Trees. Bushes. Green stuff.

TT: I see. But would it be possible to furnish–

MP: Wait, let me see … OK, OK, here we go. Central Africa, located along the equator, consists primarily of wide plateaus that are smooth in the central areas and more rough along the exterior of the region. The plateaus in the region exhibit a huge range in altitude, reaching up to 16,795 feet at Margherita Peak (the highest point in Central Africa) and descending into the ground in deep and narrow gorges near the Kouilou and Congo.

TT: Well, I’m impressed, that’s really quite informative. I suppose it’s time to move on to our–

MP: Oh, God.

TT: Is something the matter?

MP: No, I just … I was scrolling through this stuff, and I … oh, shit. It’s just that there’s all this stuff about the ivory trade. And it’s like, ugh. It just makes me want to vomit or something, you know? It’s like elephants are practically as smart as people, you know, and then we just go and … oh, God.

TT: Are you well? Should we take a rest?

MP: No, I just, I’m looking at all these pictures, now, and I just, oh, Jesus. Oh, God.

TT: Hello?

MP: It’s just that people are just such complete shit, you know? People are such complete, total shit.

TT: I’m afraid I find myself a bit out of my–

MP: And now there’s all this stuff about gorillas … and Dian Fossey … and the stuff going on in the Congo and … God, they’re actually eating people, there. They are actually hunting and eating human beings. It’s genocide. It’s genocide plus cannibalism. It’s genocide plus rape plus cannibalism. And I mean, it’s like, why? I mean, seriously? Why the fuck does the human race even go on existing? I’m sorry, I’m having kind of a meltdown, here.

TT: I can understand how you might–

MP: And you’re the ones who started it, you know. You fucked everything up, and now it’s still all fucked to shit. It’s like it’s just this endless story of horror, horror, horror, and it’s all your fault, and I … I really don’t even know what to think right now.

TT: I’m sorry, I don’t understand what–

MP: No, I’m sorry, it’s me. I mean, it’s mostly me. I mean, you’re still a racist, evil bigot, but I know that’s just how you were raised. It’s just, I can’t even stand thinking about this stuff anymore. And now, looking at all these pictures, it’s like, I really want to die, right now. I literally want to die.

TT: Well, I’m not sure I follow. And all I can say, with respect to your last assertion, is that I sincerely hope it’s untrue. But perhaps it furnishes a topic for my last challenge. You’ll recall Hamlet’s famous question: To be or not to be. If you feel up to it, I’d be gratified to hear you give a brief account of your era’s views on Hamlet’s dilemma–perhaps, some would say, the most crucial puzzle in all philosophy.


TT: Hello?

MP: I’m sorry, I’m still thinking about those elephants. Now I’m going to be depressed all day. Well, anyway, what were you saying? Hamlet? You want, like, the Cliff’s Notes, or something?

TT: I want to know how a person of your time might respond to Hamlet’s complaint. And then in French, remember, if you’d be so kind.


TT: Hello? Would you like to change the topic? Is this too upsetting or delicate a choice? Or are you still having a–what did you call it? A meltdown?

MP: No, I’m just … I’m skimming through all this stuff, here, and … it looks like there’s a lot of different interpretations.

TT: Yes, certainly, but I’d like to hear the one you favor.

MP: You mean, like, my take? Well, I don’t know. I never really liked Shakespeare. I mean, he wrote all his stuff like a million years ago, and it just makes me feel, I mean it goes on and on, and he’s just this old white guy who wrote some plays, after all, and after a while, I’m like, OK. You know?

TT: I’m afraid I don’t quite follow.

MP: Oh, right, you wanted it in French. Here goes. Je n’ai jamais vraiment aimé Shakespeare. Il a écrit son truc, il ya très longtemps. Et il va sur et sur, et il était juste un gars blanc qui à écrire des pièces de theater. Et après un moment, je commence à me demander, pourquoi devrions-nous prendre soin?

TT: Well, I suppose that could be called French. Of a kind.

MP: But I’m just saying, it’s like, Hamlet, so, OK. He wants to kill himself, sure. But, you know what, everybody feels like that sometimes. What makes Hamlet so special? I feel like that a lot.

TT: Yes, just now, in fact, you were saying that you shared some of Hamlet’s frustrations.

MP: Was I?

TT: Yes, only a moment ago! And I said that we might use the case of Hamlet to frame or clarify your feelings by giving an interpretation of his famous–

MP: Oh, right. I don’t know. I’m looking through this list, here, and you know, it seems like there are a lot of different interpretations. I guess you could just pick whichever one you want.

TT: But that’s precisely my point. I’d like to hear what you in particular would say–

MP: Oh my God!

TT: Dear me.

MP: Oh. My. God.

TT: I’m sure there must have been some mistake. I was told I would be speaking with a college graduate, a person with over sixteen years of education, someone with the intellectual advances of a century to draw on, inconceivable leaps in research, technology, a world of knowledge at hand, even a kind of superhuman intelligence–

MP: No, I’m sorry, you’re right, I’m still here. It’s just, my friend just posted this picture of this incredibly gross chimichanga he’s eating, and I have to–hold on.

TT: Yes, I can see you’re very busy. But before we conclude this interview, I thought you might like to ask me some questions about what it’s like to live in my day, in the past.

MP: Yeah, sure, I just. Hold on.

TT: You do realize that this opportunity will never come again.


TT: Hello?


TT: Hello?


You Won’t Believe My Take on Clickbait

So I was looking at the Slate homepage, and I was thinking, everybody knows, or intuits, or has in some way internalized the basic compositional principles of clickbait headlines. Right?

You’ve got your straightforward tease. “You won’t believe this wacky take on the new Star Wars trailer.”

You’ve got the hot take in embryo. “The New Star Wars Trailer Is a Viral Sensation. Here’s Why It’s Actually Evil.”

You’ve got the headline that tells you how you’re supposed to feel while reading the article, but gives no specifics about what will make you feel that way. “The Latest Thing Donald Trump Said Is Absolutely Terrifying.”

You’ve got the headscratching question that presumably led to the article being written in the first place. “Are Trump Tweets the New Star Wars Trailers, Or Are Star Wars Trailers the New Trump Tweets?”

You’ve got the naked appeal to rabble-rousing emotion, which can take the form of a goad, a rhetorical question, or a hyperventilating announcement. “The Way You Eat Salad Is an Offense to Humanity. An Expert Explains Why.” “Is Bill O’Reilly Just the Worst?” “Republicans Plumb New Depths.”

And a bunch of others, all of which work on the same underlying principle. They make a prediction. They presume to tell you what you’ll feel when you’ve finished reading the article. The typical web headline isn’t a play on words or a topical reference or even a thesis statement. It’s indistinguishable from a salesman’s pitch. “This argument we published will add zest to your coffee break. Try it and see!”

I have the same reaction to this stuff that I have to all advertising—which is to wonder how such shameless nonsense can actually be effective. Presumably it is effective, or editors wouldn’t keep writing such irritating headlines. But it seems to me I read fewer and fewer of the articles, if only because browsing news sites has become so draining. It’s obnoxious to be told how you’re supposed to feel, over and over. And it’s tiresome to realize that the real point of reading any given article is to gauge to what degree it falls short of the promise made by its headline.

I’m not even sure these headlines are effective, whatever the web stats say. The carnival barkers’ techniques they use are so shabby and timeworn that they have to be varied endlessly to have any effect. Often the appeals grow more flagrant with each iteration. “Our haunted mansion is the scariest you’ve ever seen.” “You’re guaranteed to lose your mind when you behold the horrors in our haunted mansion.” “You thought World War II was scary? Wait till you see our haunted mansion!” Replace “haunted mansion” with “new Trump story” and you basically have the Slate home page.


Worldbuilding, Nerdbusting, Etc.

Every moment of a science fiction story must represent the triumph of writing over worldbuilding.

Worldbuilding is dull. Worldbuilding literalises the urge to invent. Worldbuilding gives an unnecessary permission for acts of writing (indeed, for acts of reading). Worldbuilding numbs the reader’s ability to fulfil their part of the bargain, because it believes that it has to do everything around here if anything is going to get done.

Above all, worldbuilding is not technically necessary. It is the great clomping foot of nerdism. It is the attempt to exhaustively survey a place that isn’t there. A good writer would never try to do that, even with a place that is there. It isn’t possible, & if it was the results wouldn’t be readable: they would constitute not a book but the biggest library ever built, a hallowed place of dedication & lifelong study. This gives us a clue to the psychological type of the worldbuilder & the worldbuilder’s victim, & makes us very afraid.

–M. John Harrison

I pulled this quote from Adam Roberts’s blog, and he got it from Warren Ellis’s blog, and the original link leads to … nothing. Dead site.

Which is a shame, because this is the kind of argument that dies without examples.

I used to beat up on worldbuilding authors myself, if only because the term is so annoying. Or, to be specific, worldbuilding sounds like something a novel shouldn’t do. It sounds like a misplaced technique, the province of game designers or film-set constructors. Worldbuilding is what the makers of The Witcher 3 do, mapping digital terrain and filling it with digital rocks and bushes and trees. Worldbuilding is what the makers of D&D do, writing books full of charts and figures and lists of stuff you might buy at a blacksmith shop. Worldbuilding is what I used to do with my toys in the backyard, digging saps in sandboxes and raising revetments in my mother’s garden.

Worldbuilding is what anyone has to do when they’re charting an imaginative space for multiple people to use, be it a laser-tag arena or a board game or an MMORPG.

But novels are more tightly focused, scripted for narrative clarity. So when a novelist starts doing something that gets the label worldbuilding, it sounds like a breach of duty, a lapse into alternate artistic modes, a grasping after excess imaginative territory. Worse, it sounds–these days, at least–like a cynical foray into marketing, an attempt to knock together the kind of fantasy world or “universe” that can be licensed to corporate developers and made into toys and games and blockbuster movies and amusement parks.

No one wants to read a novel that’s trying to be a video game, just as no one wants to read a novel that’s trying to be a stage set.

Or do they?

The more I think about this, the more I struggle to define “worldbuilding” in any satisfying sense. It’s that jeer at “nerdism” that bugs me. This is the kind of knowing rebuke that tends to glide imperceptibly from a critique of literary techniques into a critique of literary interests.

Was Tolkien guilty of worldbuilding when he wrote his elven poetry? Or was he just following his muse wherever she led? Was Herbert committing the offense of excessive worldbuilding when he made up the resource-driven economy of his Dune books? Or was he working through an intriguing extrapolation? Is Kim Stanley Robinson worldbuilding, in the pejorative sense, when he details the steps involved in terraforming Mars? Or is this kind of detail in fact the chief justification for writing a novel about terraforming Mars in the first place?

One author routinely praised for his worldbuilding is William Gibson. But he famously eschews the clunky exposition that Harrison’s quote seems to critique.

I think few people would accuse experimental authors like Delany or Murakami or David Foster Wallace of doing something as clumsy and tedious as worldbuilding. And yet a characteristic quality of their work is its obsessive totality, its all-embracing scope.

So what does worldbuilding as a derogatory term describe? Fiction on fantastic themes that happens to be boring? The term suggests a kind of literal-minded pedantry, a fannish addiction to completism. But that’s more often an aberration of groups and communities–the appetitive demand of an audience for more sequels, more spinoffs, more detours pursued, more details filled in–than it is of individual authors.


Anguish Acquisition

Having children certainly inflects your experience in interesting ways. Before I had my son, I naively believed two things to be true:

  1. That children pick up language much more easily than adults.
  2. That this happens automatically, through some mysterious interplay of human neurology and immersive exposure.

I never thought about it much, but my model of language acquisition went something like this:

Parent holds cocktail party with friends. Young child toddles into room.

Parent (to friend): “It’s true, I disagree with most of his policies, but I don’t know what to make of this business with the new trade agreements.”

Child: Tade agheemunt.

Parent: Did you hear that? Buster said, “Trade agreement”!

Everyone: Wow!

With the assumption, of course, that humans are a naturally imitative species and that all stages of language acquisition proceed through unconscious mimicry and experimentation.

Now that I actually have a child, I can see that the process looks more like this:

Parent and child sit alone at home for many hours. No cocktail parties are in the offing.

Parent: See that? That’s a book.

Child: Buuuuuuh!

Parent: Close. Book.

Child: Baaaah!

Parent: Book.

Child: Boo-ahh-ee?

Parent: Oh, how cute! Now, try this: Book.

Child: (Stares at parent in utter perplexity. Eats book.)

Repeat four million times for several years.

I can’t help but contrast this with my own experience learning French, which went something like this:

French speaker: Oh, you speak French? Ça va?

Me: Bien.

French speaker: What? WHAT? Bee-enh? Bee-ENH? Oh, God, what an atrocious accent. Ai, it hurts just to hear you. Bee-ENH? Hey, did you hear this guy? Yikes.

Me: Comment allez-vous?

French speaker: God, just stop. Don’t even try. It’s too painful. I can’t stand it. I mean, it hurts, physically hurts, just to hear you try to speak.

Me: Je fais de mon mieux.

French speaker: What? I can’t even understand you. This is horrible. What’s the point? Please go away.

Me: I guess we should just talk English, then.

French speaker: That’s the trouble with Americans. They never bother to learn a foreign language.

And we wonder why adults are so slow to pick up a second language! And why they tend to rely on inefficient methods like laborious book study and dull classes, instead of plunging into the hurly-burly of colloquial conversation.

I suppose linguists have the inside word on this stuff. And there are good reasons to believe in neurological boosts to early language acquisition. But man, I wish I could have gone to France as a teenager, wandered around babbling like an idiot, and had people patiently say to me, “Non, c’est un livre. Peux-tu dire livre? Livre? Li-vre. Et encore: leeee … vre? Ah, bien, pas mal! Bon essai!”

In other words, it seems to me the unique social environment of childhood is a fairly non-trivial factor in skills acquisition.

Just sayin.


How the Web Went Bad

(Author’s Note: Excerpts from this essay were published by the Baffler on … oh, at some point in the distant past; I can’t remember when. I’m putting the whole essay here because no one told me not to, and honestly, I think it’s pretty interesting.)

When Andrew Sullivan, the big bearded granddad of political blogging, announced his retirement in 2015, the internet was quick to perform an autopsy, only slightly before its patient had died. Blogging was dead, social media had killed it, and Sullivan’s sign-off had rung the death knell. Sullivan himself wrote a sort of valedictory predicting that blogging, a medium he loved and practically invented, was heading into a long night of obscurity. But like King Arthur, he insisted, like Frederick the Great, like Obi-Wan Kenobi, it would return.

What exactly was blogging, this thing that died? Interpretations differ: it was a publishing platform, it was the art of conversation, it was the cultivation of a loyal online community. Sullivan quoted Nick Denton with approval:

“[Blogging is] the only truly new media in the age of the web … Blogging is the essential act of journalism in an interactive and conversational age.”

The web’s a-changing. The web’s a-growing. Everyone agrees that relentless evolution is the modus, the via, the what and wherefore of online life. But not everyone feels equally copacetic about a particular turn of the wheel. For every phase of internet phylogeny, two classes of commentator have spoken up: the “oh, shit” crowd, and the “aw, shucks” crowd. This time is no different, except that it is.


The oh-shit crowd, a loose cohort of professional naysayers, gray-muzzled bloggers, tech trackers, leftists, and congenital curmudgeons, think something has gone badly wrong with online interaction. They cringe at the reach of social media, bemoan the grim prospects of content creators. They think the web has been hijacked by advertising, or fatally infected with bad faith, or swallowed by a few behemoth companies. They believe that clickbait and Twitter wars are driving us into a new age of exploitation.

The aw-shucks crowd says, “Well, that’s the internet!” It’s ever been thus. One platform succeeds another. Technology pursues its own strange course. The old fear the new, the young neglect the old, and each generation idealizes the media of its youth. The only overarching trend is that the web gets ever rowdier, more crowded, more beautifully and chaotically democratic.

That’s how I used to feel. No more. The oh-shit crowd is right. Something really has changed, or perhaps a slow process of change has become too relentless to ignore. If members of the aw-shucks crowd don’t see it, they must be too young, too cynical, or too craven. The internet still has everything, and thank the cybergods for that. But the reigning culture of online life has lost its radical fervor. Mercenary interests and the habits of disengaged spectatorship have encroached on an original climate of community and collaboration. The web isn’t getting more democratic at all. It’s steadily metamorphosing into yet another arm of the old profit-driven, corporate-run, consumer-oriented mass media.

Ironically, it was a shared article on Facebook that brought this home to me. Andy Baio’s paean to the work of opens something of a window onto the early internet. I’d forgotten that we used to talk about being “netizens” back then—flippantly, to be sure, but with a wistful subtext. We were citizens of a new public space, with its own demands and responsibilities. Geocities and other hosting plans adopted the language of cyberspace, offering their users chunks of online real estate, stakes in a digital frontier. The very term “site” evokes images of a plot, a property, bare land on which to build, and later services like Second Life promised to outfit this metaphor with clunky graphics and soaring avatars. For a surprisingly long time, the territorial conceit, lifted out of the fantasies of science fiction and strongly infused with the American dream, informed how we thought about web development. Digital domains were new land, a fresh green breast of a second America, waiting to be inhabited. And this time we were going to get it right.

Who would do the building of this brave new world? You would, the user, the owner, the developer. You’d learn some HTML, or LSL, or a dash of JavaScript—it wasn’t hard—and get cracking. Which is why every web site back then was cluttered, homely, and perpetually under construction.

And what would you build? Well, that was an open question. Business was big, and public services slowly opened their archives, but the early web was famous for being oriented toward eccentric interests. Toy collectors, role-players, artists, educators, wood-carvers, history buffs and other oddballs jumped at the opportunity to build public shrines to private passions. The quintessential online experience was a serendipitous discovery. We said content was king, and what we meant was that the individual had finally come into his own. It was individual labor that would build the web, out of a million peculiar little projects. It was individual expertise that would attract visitors, an individual voice that would keep them coming back.

The talk of those early days sounds hokey, now. Of course it does. It arose from ideas that were earnest, sometimes utopian, often poignantly resistant to skepticism. They make for a painful contrast with today’s online culture. People still nod to the founding fantasies, celebrating the web as a place of political engagement, personal empowerment, and amateur creativity. And there are plenty of voices online, more than ever. But their potency, and their potential, have been slowly redefined.


I’ve asked many friends to explain to me the charms of social media, which, though I use these services, I’ve never really understood. Is it all idle gossip? A way to keep in touch? A harmless distraction, like Buzzfeed or Weird Al?

The answer I get has nothing to do with distraction at all. You have to use these services, I’m told: to make contacts, curry influence, build awareness of whatever it is you do with your time. You might find Twitter fun or insufferable; either way, it’s an indispensable promotional tool.

This glances back to the old webernet ideals: entrepreneurship, craft, independence. It’s also a rather feeble attempt at hardheadedness. As promotional strategies go, flinging out endless quantities of poorly edited one-liners is hardly an improvement on the efforts of Madison Avenue. The appeal of the technique is its easiness: anyone can sign up, link, like, tweet. But social media as a style of self-advertisement trades sophistication for obsessiveness. Like dupes at the slots, we keep punching buttons, hoping for a lucky strike.

Does it matter? At its best, social media is certainly no worse than word-of-mouth. For most people, that’s probably all it amounts to. Personal or public, private or promotional: to you, to your dad, to your friend in the band, it hardly amounts to more than chump change. The small screen of the smartphone, and the stingy format of the tweet, conspire to keep users from producing anything of even briefly lasting value. Nor do they encourage careful discrimination between tossed-off remarks and targeted announcements. This enforced sketchiness is precisely the point. The purpose of such tools is simply to crystallize the kind of daily nattering that has always been with us, transcribing it in a way that can be exploited by the real advertisers: the business titans, the opinion trackers, the mavens of market research.

This dubious bargain—scattershot careerism for the little guy, laser-targeted marketing for the rich—has become so well-established, it’s hard to believe that only a few years ago people seriously looked forward to something different. Thinkers like Cory Doctorow, Yochai Benkler, Jonathan Zittrain, Kevin Kelly, and Lawrence Lessig had begun to push for a new sharing economy that would celebrate collaboration, volunteerism, and costless transactions. Crowdsourcing, back then, didn’t mean hitting up your fans for handouts. It meant that people would willingly donate online efforts to something other than shopping, watching, and badinage. Optimists foresaw whole novels being produced this way, games, new industries, content curation, scientific research, even feature-length films. Savvy remixing promised to reconceive the basic nature of media, raising digital collage into an art form in its own right. The point of this burgeoning creative commons wasn’t to celebrate every pilfered video clip as a triumph of dilettantism. It was to build up unmercenary creativity, free exchange, and the human instinct for cooperation into a new cultural order.

This vaunted market of makers, coders, educators, and collaborators had its counterpart in a blogosphere addicted to debate. The essence of blogging in those days was reengagement. You found an ardent soul who disagreed with you, and you revisited the disagreement again and again. Of course, the arguments often descended into sniping and nitpicking, but the sniping and nitpicking came with accountability. A blog was more than a stream of lightly edited thumbnail articles. It was, as the name suggests, a log, an archive, a transcript of evolving opinions, where any sloppy assertion or dubious claim could be brought to bear on a fresh disagreement. It framed its creator as a one-woman magazine, with editorial philosophy and writerly voice combined in one idiosyncratic ego. Most open source projects waned or failed through lack of talent, but writing is in many ways the signal art of democracy, distinct from TV, film, software, and even handicrafts in that if you consume it, you can produce it. Blogs looked like a sharing economy that had already arrived, a ferment of free and unconventional ideas.

There are some who assume social media has improved on this promise, as if bandying snark with frenemies on Twitter is the same as running one’s own magazine. Even sardonic accounts of social-media silliness still dutifully nod to our democratic hopes. But a steady dribble of time-killing gossip hardly justifies lofty talk of “democracy,” “empowerment,” and “cultural participation.” The old web and its rhetoric are still with us, but the obligations that gave them dignity have been stripped away.


Even in the dark ages of the mainstream media, it’s worth remembering, audience members had ways to signal their likes and dislikes. They could change the channel, switch the station, respond to polls, cancel their subscriptions, buy or refuse to buy new releases, call in requests, write to editors, organize boycotts, set trends, purchase tickets, join the studio audience. If nothing else, they could get on the phone and gossip with friends, knowing that even their casual remarks might in some small way bubble up into general opinion. The main thing they could not do was produce content themselves. They could be counted, in other words—and they were, obsessively—but without a special entrée, they could not actively participate in the creation of any but the most local art and culture.

That was the glorious promise of the web: the chance to do it all yourself. But cultural participation isn’t only a matter of crowing yea or nay at every diversion that flits across a screen. It takes work, and more: a dedication to the work of others. The word responsibility has a didactic ring, but we’re not talking about a lofty responsibility, or an onerous one. Only the humble, everyday responsibility we assume for the mores of a social space we enter, be it a public forum, a support group, someone’s web site, someone’s home.

That’s not true for members of a mass audience, who have limited options, but a special immunity. They can cheer or boo, stay or go, acclaim or reject what the culture makers offer, and not much else. The compensation for these constraints is that their choices are taken to be unassailable. Precisely because the audience members have submerged themselves in a larger social body, no person in particular will be challenged to justify his or her views. To leave the audience, to step onstage, means giving up that right to inviolability. It means that heckling and clapping, tuning in and tuning out, are no longer adequate options.

That’s the promise, the challenge, of public engagement, one that the evolution of the internet has steadily whittled away. Improving software and corporate cunning bear some of the blame, but the lesson of the web’s history is that most people, most of the time, are happy with the prerogatives of mass consumers. Hence the consistent defining down of the average netizen’s expected contribution, from building one’s own site to creating one’s own content to volunteering one’s occasional labor to posting one’s comments to, finally, liking and sharing and sneering. If the trend continues, it won’t be long before we’re selecting our kneejerk reactions from menus of stereotyped options (Amazing! Stupid! Misogynist! Libtarded!) or simply voting with our viewership habits, as we used to.

Admittedly, online traffic is famously difficult to analyze. Studies routinely describe a growing dominance of streaming media, but that’s mostly because the files are so big. And the essence of the internet is a hopeless jumbling of what counts as public or private. Phone calls, photo-albums, texting, TV: what can we call this but life as usual, measured in bps?

Relying on old-fashioned social observation, we can see the emergence of a model somewhat like nineties-era radio or TV, with social media serving as the telephone, the dial, and the cable box. A cluster of big companies, many of them the familiar media titans of yesteryear, produce the most popular content, usually by exploiting the eagerness of interns and other naifs. They churn out countless three-minute diversions—screeds about liberals, screeds about conservatives, trailers, teasers, PSAs, music videos, plain old commercials—packaged to appeal to stereotyped demographics. The old media empires had ways of stoking trite controversies (industry rags, wardrobe malfunctions, risqué interviews), along with techniques for engineering “surprise” successes (savvy advertising, giveaways, puff journalism). And so it is today, except that we call the controversies “outrages” and the success stories “viral hits.”

Social media exploit these non-events to organize the masses into trackable audiences, but the furor never amounts to much more than a highly refined form of consumer feedback, little different in its underlying dynamics from Nielsen ratings, watercooler confabs, phone polls, and letter campaigns. People talk about social media exactly the way they used to talk about popular opinion, and they mean the same thing: “All my friends are gabbing about this topic, plus a few think pieces I happen to have read.”

What are the topics that incite these fleeting paroxysms of opinion? Why, they’re the same old mass entertainments, in slightly different forms: TV shows, country singers, summer blockbusters, video games, magazine articles. Yesterday’s upstart webonauts, with their dreams of creation, debate, and cooperation, are melting back into an anonymous crowd. And I think people like it that way. Certainly Twitter mobs and Facebook sharers seem to assume the inviolability of casual spectators, free to cheer and jeer without broader accountability. Holding a tweeter to the standards of a journalist is held to be uncouth.

The web’s mass audience is uniquely visible, notoriously vocal, unusually subject to measurement and manipulation. But its power is blunt and reactive, because its members embrace the essential tradeoff of consumerism: unlimited authority in judging culture, piddling control over making culture. They also regularly manifest a consumer’s latent rage. When the only power you have is to say Want or Don’t Want, you say it as vehemently as possible, because every frustration of your tastes feels like a denial of your right to speak at all.

Sure, counterexamples teem online: forums, hobby clubs, tumblrs, self-published novels. What all these things have in common is that it’s more difficult by the day to interpret them as a preview of a coming vox populi paradise. The old web culture limps along, and occasional breakout hits keep alive the familiar fables of economic and social renovation. But they’re no more heralds of a looming new society than the mail-order phenomena and sleeper hits of an earlier age. As for the communitarian web, its best-known exemplars seem increasingly beleaguered, like the shrinking political blogosphere, the ever-more-cultlike Wikipedia, the scary back alleys of 4chan, or the struggling open-access movement. The defining online action, now, is to click and be counted, in the world’s most sophisticated opinion poll.

That’s not necessarily a disaster. Mass media aren’t all bad. And it’s likely that long-term trends will prove more radically transformative. The galling thing is to hear every crass excess of this ad-driven, bean-counting system—every click-and-cry mini-scandal, every theft of original content, every colicky eruption of disgruntled consumerism—celebrated as a symbol of the democratizing power of technology. If anything, the current ethos is anti-individualistic. Instead of elevating amateur work, it encourages people to grab blockbuster hits on the cheap. Instead of helping creators and volunteers, it empowers advertisers and administrators. Instead of highlighting bold new voices, it lumps everyone into transient mobs defined by gossip, brief obsessions, and free-roaming discontent.

Which is more or less how things were in the eighties. Meet the new media, same as the old media. As one member of its captive mass audience, my response is: boo, hiss.


The Sorrows of Young Weev

Here’s Ethan Mills sending an open letter to his fellow “white dudes”:

In the last 10-15 years the internet has weaponized fragile white masculinity in anonymous toxic discussion boards and comment sections as well as directed hate campaigns such as the one against Leslie Jones in the summer of 2016.

In my late teens and early 20’s I was as angsty and as prone as any young white dude in 2017 to rail against the hypocrisies of my society.  But the difference is that I’m just barely old enough to have grown up without the internet …

His piece is rather ostentatiously written in the voice of a stereotypical white dude, which I found annoying. And most of it rehashes tiresome internet tropes, like John Scalzi’s suggestion that being a white man in America is like playing a video game on the lowest difficulty setting. This cutesy analogy ought to have died a quiet memetic death long ago.*

My biggest quibble, though, is that I think Mills’s essay commits the sin it’s most anxious to defend against:

I’m haunted by the thought that the real blame for this mess is on older white dudes like me.  Maybe we let down our younger brothers by creating the conditions that made them who they are; perhaps we failed to be role models of what responsible white dudeness looks like.

At 40, I’m probably among the last group of white dudes who grew up with our privilege largely unquestioned.  It must be confusing to have come of age in the last 15 years or so with the vestiges of invisible privilege while having that privilege made visible and explicitly challenged.

I’m the same age as Mills. I’m also a science-fiction-loving white dude. And I think men of our generation need to look a lot harder at our own contribution to the kind of pseudo-Nietzschean antisocialism he complains about. That goes double for our elders in the Boomer cohort. The weird brew of trollishness and insecurity we see among white men on the internet goes back, I think, to the youth rebellions of the fifties, when white men began to look for new sources of identity distinct from the imperial hubris and reactionary hatreds of their forebears.

An origin point for the trend might be Norman Mailer’s encomium to the hipster–aka, the “White Negro”–published in 1957:

… the man who knows that if our collective condition is to live … with a slow death by conformity with every creative and rebellious instinct stifled … why then the only life-giving answer is to accept the terms of death, to live with death as immediate danger, to divorce oneself from society, to exist without roots, to set out on that uncharted journey into the rebellious imperatives of the self. In short, whether the life is criminal or not, the decision is to encourage the psychopath in oneself …

“The unstated essence of Hip,” for Mailer, “was its psychopathic brilliance,” a horror of conformity and therefore of society. But what’s striking about Mailer’s essay is his candor about the inspiration for this antisocial stance. In Mailer’s thinking, the white man’s civilization had already proved itself, by his time, to be evil, deranged, and hostile to life. Atomic war, Nazism, and all the horrors of the early twentieth century had made the young white man of the fifties a prisoner, one might say, in his father’s house, an unwilling initiate in a white supremacist cult of death. The only way for a young white man to thrive–to be fully alive, to embrace life–was to become an outsider in his own society. And the obvious way to do that was to take someone already alienated from society as his role model.

Hence the “White Negro,” a young white man who adopted, by choice, the alienation imposed on black men.

Thus began a trend that has continued ever since, from the hipsters to the Beats to the hippies to the stoners to the punks to the slackers to a new generation of hipsters, and now to the curdled contrarianism of the web’s message boards. The romantic traditions of the West–which had always emphasized the needs of the self above those of society–found renewal in a strain of chi-chi sociopathy. Forms of youthful rebellion that had entranced earlier generations of young men–the upstart aestheticism of the pre-Raphaelites, the sentimentality and lyricism of nineteenth century poetasters, the preening effeteness of the flaneur, the dapper exhibitionism of the dandy, the Jazz Age hedonists cavorting at their Prohibition parties–gave way to assertions of bitter antisocialism. Those earlier rebellions had celebrated escape–into pleasure, art, nature, the life of the mind. Now mere escape was deemed insufficient, too feeble a tonic to soothe the pains of a developing white male mind. Only violent rejection could allow the male ego to save itself from the influences of a sick society.

Ever since, successive phases of white male radicalism have sat uneasily alongside movements for minority rights. The contradictions have always been obvious. What began for blacks and gays and others as an enforced, and keenly felt, lack of privilege has been adopted by white men as an assertion of privilege. When a black man shouts, “The system’s keeping me down!” there will always be a dozen white men on hand to answer, “Me, too, brother! Me, too!”

The critical difference is that for minority groups, anti-establishment agitation situates individuals within larger communities that experience, together, some form of shared oppression. For the young white male hipster, or the young white male slacker, or the young white male psychopath, rebellion is a lonely road. The characteristic white antihero–Travis Bickle, Rambo, William Foster–fights against everyone, on behalf of no one; he represents only his own insatiable need for psychological ease. Mostly, he fights with other people like himself, other self-involved, errant, undisciplined sociopaths. In the revenge fantasies of late-century cinema, maverick cops and maverick criminals hunt one another through lawless hellzones, stubbornly dismissive of rules, conventions, and authority. The reckless cops in these stories, like the murderous criminals, are fierce, predatory, solitary, and dangerous. Good guys and bad guys are distinguished only by narrative cliches and genre conventions. Everyone, in exploitation cinema, behaves like a lone wolf.

Instead of rebelling against Western society, then, the young white male radical comes to emblemize its most odious features: arrogance, narcissism, nihilism, and a poisonous obsession with personal independence. Imitating the anger of the dispossessed, he becomes, himself, an agent of dispossession, valorizing his own self-assertion over every social bond and scorning the type of conformism that sustains nurturing communities.

What to do? Young white men today face three challenges.

They have to explore their roots in Western culture while distancing themselves from that culture’s most noxious beliefs.

They have to support minority movements for civil rights while understanding that they’ll never belong to those movements in the deeper sense implied by shared suffering.

And they have to do all this while doing the work of growing up, differentiating themselves from their parents and their peers.

These challenges are compounded in a society where the boundaries of privilege are increasingly sketchy, where gender roles are perennially in flux, where minorities are gaining power (however slowly) while white men are losing it. Today’s fashionable pop feminism adds yet another twist, giving white men another set of radical postures to imitate. In the message boards of the alt-right, we see a new species of white male radical emerging, one who retains the classic antihero’s penchant for violence while eschewing his characteristic toughness. This new white radical is emotional, mercurial, sensitive, but has no interest in the nineteenth century’s romantic sentimentality. His chief preoccupation is his own vulnerability. Moody, petulant, easily wounded, he’s determined to persuade you that in today’s society, he is the true and only victim.

Mailer’s White Negro, that is to say, has been joined by the Male Female: a young man who imitates the argumentative style of a Third Wave feminist–snarky, easily hurt, sexually put-upon–without experiencing the same disadvantages.

As Mills says, he had things easier growing up in the nineties, when the erosion of white male privilege wasn’t quite so noticeable. I grew up at the same time, and even then, things were pretty confusing. People like Mills and me have a special responsibility to young white men, since we’ve been through some of what they’re going through, albeit in a muted form.

Are we living up to that responsibility? I look around at my fellow white male Gen-Xers and see us ducking responsibility in every way possible. We retreat into the pop-culture wonderlands of childhood, wallowing in the warmed-over vestiges of an eighties culture rife with nerdy self-congratulation and muscle-bound machismo. We scant the history of Western culture, even though as white men we’re well positioned to champion its better features. We become reactionaries, or worse, turn into trolls ourselves.

Or, more often, we use angry young white males as convenient bogeymen, scoring cheap points off them to curry favor with the more fashionable wings of contemporary liberalism.

The challenge for people like Mills and me, I think, is that we have to repudiate or ignore all the lessons of our upbringing. We have to give up the questionable consolations of antiestablishmentarianism. We have to support movements for minority rights without imitating or intruding on those movements. We have to be stewards of the arts and ideas of our own heritage without forcing those arts and ideas on others. We have to try and curtail the misogyny and bitterness of younger white men without indulging in cheap forms of self-congratulation. And we have to do all this in a way that champions the undersung virtues of maturity: responsibility, diligence, nurture, and wisdom.

In short, we have to learn to become something that no modern American ever wants to be. We have to become squares.

*I mean, what counts as a hard setting? Getting raped by Bill Cosby? Getting shot by gung-ho cops? Dying of opiate addiction? Growing up in Somalia? This metaphor says more about privilege than Scalzi intends.


Trending Tendencies: An Empirical Parable

You read one day in the paper that more women are now enrolling in college than men. Not only that, but the gap is widening.

What, you wonder, could be driving this trend?

You consider a few possibilities.

It could be because admissions committees discriminate against men, giving preference to female candidates.

It could be because male candidates are less qualified overall, by whatever standards colleges use.

It could be because men are less likely to apply in the first place, leading to lower numbers of men in the applicant pool.

Of course, all three factors might contribute to the trend. Or some combination of factors. It might be true, for instance, that admissions departments are biased in favor of men, but not enough to make up for a small pool of qualified male applicants.

To make matters worse, the different factors interact.

Suppose men are, on average, less qualified for college.

Over time, you reason, admissions officers might pick up on this. They might fall into the habit of moving male applications to the bottom of the pile. They might rush through male applications, giving them less attention because, after all, the admissions officers simply know that men are less scholarly than women.

In time, male applicants might sense that there’s a bias against them. They might give up applying, thinking to themselves, “Why bother–we probably won’t get in anyway.” They might internalize the stereotype, coming to believe that they’re innately less diligent and less studious, that college simply isn’t for them.

Or suppose men are much less likely to apply to college in the first place. Because of this, admissions departments might develop a bias towards men. They might admit male students instead of qualified women. As time goes by, female students might pick up on the bias, realizing they need to work even harder to compete with their privileged male peers. Colleges might find themselves inundated with overqualified women and underqualified men. To maintain gender parity in their student populations, admissions officers might become even more biased towards men, driving female applicants to work even harder, and so on.

You realize these questions will be extremely difficult to analyze, much less answer. Bias can drive low applicant rates. Low qualifications can lead to bias. Low application rates can drive down qualifications. Each factor affects the others. They interact in real time. It can be all but impossible to untangle the causal relations.

But you go ahead and crunch the numbers, and you tentatively conclude that, for whatever reason, men are less likely to apply to college in the first place.

This might explain why men are less likely to be in college. If they don’t bother to apply, they’re hardly likely to be accepted.

But you still have to ask, why are men less likely to apply?

Well, you reason, it could be because society as a whole is biased against men, and men pick up on the bias and get discouraged.

Or it could be because college-age men are less qualified for college, know they’re less qualified, and don’t bother to waste time applying.

Or it could be because men–even when qualified, and even in the absence of bias–choose not to apply because they’d rather be doing something else.

As before, these factors interact. Lack of interest can drive down qualifications. Low qualifications can lead to lack of interest. Both factors can lead to bias against male students–because people come to think of men as less studious–or bias towards male students–because people become desperate to boost male achievement.

And bias, whether towards or against men, has unpredictable effects. Men might work harder to overcome bias. Or they might give up and succumb to it.

But you persevere. You do a survey of college-age men, and the men tell you that they’re simply less interested in enrolling in college, even the ones who are qualified.

Now you have to ask, why are men less interested in going to college?

Motivations are complex. Maybe men think college is for sissies. Maybe men think that other people think college is for sissies. Maybe men have problems with authority; they’re tired of being bossed around by teachers. Maybe men hate studying. Maybe men would rather get a job right now than go into debt for the chance of a better job later. Maybe something about the culture of college turns men off. Maybe it’s all of the above.

These are subtle and highly subjective questions. But you do a follow-up survey, and you conclude that men are less likely to apply to college because they hate doing schoolwork. Now you have to ask, why do men hate doing schoolwork?

Once again, you run through several hypotheses. It could be that men have less self-control. It could be that men are less tolerant of sitting still. It could be that men are less conscientious. And so on.

All these terms are rather fuzzy. It’s hard to nail down rigorous definitions. Not only that, but even if you distinguish among several discrete traits–discriminating, say, between restlessness and laziness–you still face the challenge of linking those traits to behavior. Men’s relative lack of studiousness might be due to a mix of traits: restlessness plus impatience plus irresponsibility. Or it could be due to one critical trait–short attention span, maybe–that explains the whole difference.

So you do another study, and you tentatively conclude that men hate schoolwork because they’d rather be physically active. They don’t want to sit around hitting the books all day.

Now you’re faced with another question. Why do men have this psychological tendency?

Well, you reason, it could be because men are trained to think and feel in certain ways, shaped and guided by the dominant culture.

Or it could be because genetic differences–changes on the Y chromosome–predispose men to have certain traits.

But now you have a whole new set of problems, because biology shapes culture and culture shapes biology. If men are genetically predisposed to be rambunctious, society might pick up on that tendency and reinforce it. If society teaches men to be rambunctious, this cultural reinforcement will have an influence on men’s physical development. And this nature-nurture interaction is ongoing, starting with the ascription of gender to a fetus or infant and continuing through a boy’s childhood. How do you sort out the interactions?

You decide to focus on culture. Soon, more questions crop up. Will you look at parental influences? Peers? Media? Schools and institutions? Medical professionals? Like everything else, these cultural factors interact; media influences peers, institutions influence parents, parents influence access to media and medicine.

In despair, you switch your focus to biological factors. But the same problem comes up again. The body is a complex system, swarming with countless hidden interactions. Even the genome interacts with itself over time, some genes affecting the expression of others.

You decide to focus on one particular gene, or rather, one sequence of base pairs. Through careful statistical work, you establish that this sequence of base pairs might be correlated with an increased incidence of certain kinds of restless behavior. You publish your work, indicating that it’s a tentative finding that will have to be subject to further study.

Women Biologically Destined for Drudgery, the headlines read, Men Genetically Hardwired for Failure. There’s a massive public outcry, and you get fired for being sexist.


Men Are Alt-Right MRA Misogynist Dudebro Pigs, Women are Oppressive Totalitarian PC Ideological Feminazis

A Google employee recently ended up in hot water for arguing that women and men tend to seek out different activities and careers because they have, on average, different values and interests.

This is an idea that–I exaggerate only slightly–every single person who has ever lived has probably entertained at some time.

So why did this guy’s memo provoke a firestorm of angry commentary?

We can talk about swelling resentment toward the tech industry, anger at the claims of evolutionary psychology, longstanding anxiety around sex and feminism. But having read the memo, I think the writer made three critical decisions.

First, he wrapped his argument about sex differences in a drawn-out attack on PC thought-policing, presenting himself as a victim of liberal groupthink. He picked a fight, and he got one.

Second, he used the trigger word “biology,” which always sets off rhetorical microwars between scientistic contrarians and postmodernist ideologues.

Third, he set up his discussion of sex and personality by asking what makes women different from men. No matter how you answer that question, liberals get angry, because it implies that men are the standard sex from which women represent a deviation. The memo writer might have had better luck flipping his argument. What, he should have asked, is different about men?

Consider what the reaction might have been like if he’d written something along these lines.

Google is admirably committed to hiring and advancing more women. Currently, the company pursues this goal through forms of diversity training meant to reduce discrimination. The hope is to make the workplace more welcoming by changing employee attitudes and behavior.

But what if that’s not enough? Are there other methods that might help our company–and the tech industry in general–recruit and retain more talented women?

That’s a question worth asking. Mandatory diversity training has documented drawbacks. It can result in superficial compliance while provoking a backlash that results in more discrimination behind the scenes. It puts the onus for change on individual employees, without addressing institutional factors that might be driving women out of the field. It draws inspiration from scientific research that has recently come under fire (see, for example, the literature on stereotype threat and implicit bias). And it focuses only on one half of the problem. Instead of asking, “What drives women away from computer science?” we at Google might want to ask, “What can help to draw them in?”

Here, too, current research can offer guidance. Tech is famously a male-dominated field. And the psychological literature reports that men are, on average, more likely to exhibit certain traits, preferences, and handicaps.

Many of these will sound depressingly familiar. Men have been found to be less cooperative. Men are more aggressive, and, as a result, perhaps, more willing to put up with stress and pain. Studies have found that men are less agreeable, less gregarious, and less empathetic.

Significantly, men have also been found, on average, to show less interest in other people. Sometimes this is described as a preference for things over feelings, sometimes as a taste for systems over relationships. Whatever the terms, the trend is the same. Overall, men put a lower value on their interactions with others.

None of this is true for all men. The average differences are often small, and individuals show wide variation. And it’s not entirely clear where these traits come from. Some have been linked to high levels of prenatal testosterone. Evolutionary psychology suggests that, in an unkind phrase, men are simply more “disposable” from a genetic perspective, therefore more likely to engage in risky and solitary behaviors. It goes without saying that culture plays a role; boys are often encouraged to pursue status at all costs and punished for exhibiting stereotypically feminine traits.

What can be said with certainty is this. At the age when young people are contemplating a choice of profession, men are more likely to consider jobs that are stressful, combative, abstract, and lonely.

That sounds a lot like the tech industry.

So what can be done?

We should all be cognizant of the baleful effect that discrimination has on workplace culture. But we should also entertain the possibility that structural and systemic factors help to make tech a male-centric field.

Instead of putting the blame for our lopsided labor force entirely on employee attitudes, here are steps Google might take to attract a more diverse pool of candidates.

  1. Make software engineering more people-oriented with pair programming and collaboration. Find more outlets and uses for employees who value cooperation. The tech industry in general should make a greater effort to attract talented candidates who thrive on social connections.
  2. Offer more stress reduction courses and similar benefits, and work to make leadership positions less stressful. The tech industry is famous for its punishing demands; this drives away sensitive but capable recruits.
  3. Offer more flexible hours and more opportunities for part-time work, making accommodations for employees who want a reasonable work/life balance.
  4. Focus critical attention on male gender roles, challenging cultural influences that drive men to seek status at the expense of human connection.

In pursuing these goals, we should be mindful of the company’s larger mission. It makes sense for competitive, hardworking employees to get ahead at Google, whether they’re women or men. As always, we have to balance the benefits of greater diversity with the costs of changing the way we do business.

But isn’t change what we do? In striving for greater diversity, we can’t allow ourselves to lapse into a dogmatic insistence on what’s always been done. We should experiment with new approaches, new views, new sources of information. We should let science guide and inspire us. Above all, we should entertain a diversity of views in our pursuit of a diverse workplace–and, as always, keep an open mind.

If he’d written that memo, people might have objected to its gender essentialism. Critics would have quibbled with its facts, assailed its logic. Men’s rights advocates might have griped about its depiction of loutish undersocialized men. But would furious debates have blazed across the internet? I doubt it. And yet the substance of the argument is largely the same.

There’s a lesson here about the ways in which different groups compete for victim status. But I’m more interested in the tech angle. As I’ve written many times, the web is something of a giant gossip machine, and nothing illustrates this better than our endless online tiffs and spats over some person’s rhetorical choices. There are probably millions of extant documents saying that men tend to be less empathetic, or that women are more likely to put a high value on intimacy, or that women are prone to beat themselves up over trivial matters, or whatever. God knows everyone talks about this in private life. But it doesn’t light up the circuits of cyberspace.

Then some day, somehow, someone pushes all the right buttons, and the gossip machine goes into effect, sputtering, churning, shutting down servers, doxing names, spitting out takes and takedowns and downvotes, elevating some reputations, ruining others, doing its work as loudly and efficiently as a woodchipper. Pop, fizzle, like, tweet, ding. At the end of the process, a few more ads have been clicked, a few gigabytes of data have been harvested, a few new ulcers have formed, and a few more citizens have been politically radicalized.

Does it have to be this way? Does a worldwide network have to devote some share of its bandwidth to running a virtual gossip machine? Or is this a function of the way we built out the software layer of the internet, going back to Google’s original use of links as an index of popularity, popularity as an index of relevance, and relevance as an index of informativeness? Almost everything about the current design of the web–everything about the way it organizes information–is based on a few crude assumptions:

  1. Something popular is preferable to something unpopular.
  2. Something current is preferable to something old.
  3. Something fast is preferable to something slow.
  4. Something familiar is preferable to something unfamiliar.

These masquerade as rules for sorting information well. But they’re really just shortcuts for sorting lots of information quickly.
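As a purely hypothetical illustration (not any real search engine's algorithm), the four shortcuts above can be caricatured as a toy scoring function. Every name here is invented; the point is what the score leaves out:

```python
# Hypothetical sketch: rank items by the four crude shortcuts described
# above -- popularity, recency, speed of consumption, and familiarity.
# Notice what is absent: any measure of accuracy, depth, or usefulness.
from dataclasses import dataclass
import time


@dataclass
class Item:
    title: str
    clicks: int          # proxy for popularity
    posted_at: float     # Unix timestamp; newer scores higher
    read_seconds: int    # shorter/faster reads score higher
    topic: str


def crude_score(item: Item, user_topics: set, now: float = None) -> float:
    """Score an item by popularity, recency, brevity, and familiarity."""
    now = now if now is not None else time.time()
    popularity = float(item.clicks)
    recency = 1.0 / (1.0 + (now - item.posted_at) / 3600.0)   # decays hourly
    speed = 1.0 / (1.0 + item.read_seconds / 60.0)            # favors quick reads
    familiarity = 1.0 if item.topic in user_topics else 0.1   # favors the known
    return popularity * recency * speed * familiarity
```

Under this kind of scoring, a fresh, familiar, heavily clicked quick read will beat a month-old long-form analysis every time, regardless of which one is actually worth reading.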

I don’t deny that as shortcuts, they do a good job of approximating human decision making. If I’m presented with an overwhelming range of choices–a mountain of unsorted books, a bevy of similar newspaper stories–I’m more likely to choose something popular, new, short, and familiar. But why should I want my web tools to emulate me at my most mercurial, scatterbrained, hasty, and impatient? I want tools that help me make better decisions, not tools that always play to my worst tendencies.

It’s bad enough that the web works this way with content sorting, product recommendations, site rankings, and news. But when it comes to social connections–reputation, relationships, group affiliations, any kind of conversation–this quick-and-dirty approach is a disaster. The gossip machine takes in the signal of human intelligence and spits out the noise of human stupidity. It brute-sorts people by the roughest metric available, asking of every user: what drives you crazy? It’s like a piano that only makes music when you bang the keys with a sledgehammer. It’s like a bathroom scale that thinks you weigh more when you’re shouting. It’s like a superpowered calculator that only calculates common denominators.

Worse, it’s a machine that treats its own design as input. The system’s built to favor whatever’s recent, popular, and familiar. That’s what users see, so that’s what they respond to. Whatever’s popular gets more popular, whatever’s familiar becomes overfamiliar, and the timeframe for what qualifies as recent narrows to a razor-thin margin. The Google bosses like to imagine that the web will one day evolve into a vast AI. But it’s already an AI, thinking one idiot thought over and over: “Wow, people sure love trivial shit!”

I really believe we’re living in a dismal age for information technology. Not a dark age, exactly–it’s not like we’ve forgotten how to compute. A white-hot age, perhaps, an age illuminated by the light of ten billion digital suns, until our eyes are fried and the atmosphere ignites. We’re over the early flirtations, past the engagement, through with the honeymoon, recovering from the warm glow of infatuation. Now the house is full of unwashed laundry, there are dirty diapers under the bed, something’s burning in the oven, and the sink just broke. Someone you used to love is yelling that it’s all your fault, and the air smells like despair and charred shit.

It’ll get worse, I think, before it gets better. Like that part of the agricultural revolution when Egyptian peasants and slaves were dying by the thousands to build temples and tombs for crazy despots who happened to have a monopoly on literacy. Like that part of the industrial revolution when troops of workers were marching into the mouths of Satanic mills and children’s corpses were piling up in Irish workhouses.

We now have proofs of concept for the dream devices of science fiction–a worldwide network, virtual worlds, portable computers in every pocket–but the tools themselves are worse than bad, because they all rely on cheap fixes for major challenges. We’ve sorted the world’s information, but we used the crudest possible methods. We have computers in our pockets, but the interface is so constrained and clunky that all you can do is poke at them like an impatient toddler. We have virtual worlds that look increasingly lovely, but we get around them using clunky overlays that constrict attention and stupefy the senses. We traded our privacy for a giant pile of data, and we have no idea what to do with it.

Almost everything about the way we use computers today comes down to a simple trick: making computers better by making people worse. We haven’t mastered the nuances of human behavior, so we reached for the easy money. We staked everything we had on addiction, anxiety, distractibility, and anger. We can’t even come close to emulating knowledge, but we’ve invented ten gazillion workarounds.

The crazy thing is that I think engineers know this. They know they’ve built a bloated, teetering cybereconomy on a clumsy hack of the human pleasure center. They know they’ve created an enormous complex of flashy but inflexible devices that can only function if users are trained to repeat a few compulsive and restricted actions. They know that those two features will only work if they work together, that the world of popular software is mostly a pile of Skinner boxes, training users to keep pounding at a few simple levers because the system would fall apart if they tried to do anything else.

They know, above all, that they’ve tapped into a tiny, tiny part of what computers or humans can do.

They know this is their world, and it drives them crazy, which is why they’re so frantic about taking the money they’ve made off their gossip machine and pouring it into moonshot projects, AI research, robotics, and space exploration. When I look at the Google bosses or Mark Zuckerberg, I see a bunch of aging pop stars using the money they made licensing bubblegum hits for car commercials to self-fund an experimental jazz career. “Look,” they’re telling the world. “Tech doesn’t have to be top-forty all the time. Computer science can be so much more.”

But the public isn’t listening. The public is stroking its beloved smartphone, a miracle device that lets you do almost anything as long as it involves poking clumsily at large icons. The public is writing rhapsodies about the latest video game, which looks absolutely amazing as virtual wallpaper but mostly consists of following simple instructions to earn stupid upgrades. The public is on Facebook or Snapchat or Twitter, where everyone and his second cousin just posted a link to a poorly worded argument that you absolutely have to respond to. The public is cheerfully feeding the gossip machine one kind of information, over and over and over–here’s the easiest, cheapest, quickest, simplest way to get our attention–and the engineers have no choice but to respond. Because they built this goddamn thing. And now they’re stuck inside it, with the rest of us.


Osita Nwanevu Almost Gets It

I’ve been thinking about David Brooks’s sandwich gaffe.

A few weeks ago, Brooks wrote a column arguing that subtle cultural codes–what novelists call “manners”–were a bigger drag on social mobility than so-called structural barriers. As an illustration now made notorious by an orgy of viral snark, Brooks offered a visit to a bourgie sandwich shop:

Recently I took a friend with only a high school degree to lunch. Insensitively, I led her into a gourmet sandwich shop. Suddenly I saw her face freeze up as she was confronted with sandwiches named “Padrino” and “Pomodoro” and ingredients like soppressata, capicollo and a striata baguette. I quickly asked her if she wanted to go somewhere else and she anxiously nodded yes and we ate Mexican.

Liberals had a field day. What a joke! How condescending! What a trite and tone-deaf anecdote!

Osita Nwanevu, at Slate, had a smarter take. He pointed out that this is the point intersectionalists have been making for years about race and gender:

The concept of intersectionality, which Brooks dismisses in his column as an empty cultural signifier no more meaningful than a membership at a barre studio, is partially rooted in the idea that class can erect invisible barriers to mobility and respect similar to—and in fact linked to—the barriers imposed by race, gender, sexual orientation, ability, and other components of individual identity … Social justice warriors thus have no difficulty incorporating the discomfort working-class people feel in unfamiliar situations into their broader analyses of how society leaves them behind.

Hmmm … In my experience, social justice warriors usually bring up the subject of class to explain how poor white men can still have privilege. As in: “Hey white doodz, just because you lack class privilege doesn’t mean you don’t have race and gender privilege, okay? *drops mic*.” That’s a valid argument, but not exactly what Nwanevu has in mind here.

Still, the larger point stands. If Brooks cares so much about class-based microaggressions, why not other microaggressions? Indeed, it’s easy to imagine a Tumblr-style version of Brooks’s anecdote:

Listen up, white people!

I know you think it’s totally cool and OK and not at all problematic to take friends out to eat wherever you feel like, because you absolutely need your soppressata or pomodoro or whatever high-class shit you want for lunch, and you just can’t imagine why people wouldn’t be delighted to bask in your hip white foodie taste. But let me tell you, this way you get about your food, this is straight-up white supremacist bullshit, and it is Not Fine. And pretending like it is–like you just want to go out and get an innocent bite to eat–that is some deeply ignorant liberal bullshit there, and I’m calling you on it.

I’m not even talking about cultural appropriation. Because I know how you get when you hear those words. I’m talking about stuff you think is totally mainstream (gotta love that word, “mainstream”), when you go bringing friends to some Eurocentric sandwich shop where the whole menu’s in Italian (read: white-centric) and the breads are some kind of organic hand-peeled-oat stuff (read: white-privilege) and anything not from Europe is advertised as “exotic” or some colonial bullshit like that, and even the water bottles are done up like little signifiers of capitalist status anxiety, and the whole place is basically a giant exercise in code-switching for anyone who didn’t grow up in the radiant gated epicenter of postcolonial race-segregated America …

I’m not saying I don’t have my own shit to work on. I’m not saying I’m immune to this stuff. In a racist society No One is immune. But you really need to hear this. Pulling a move like that, taking people to a place like that, it is NOT innocent. It is NOT harmless. And I don’t want to hear some kind of white-fragility whining about universality and intentionality and all that kind of liberal-tears shit. If you think ignorance is an excuse, if you need to come at the people fighting this fight and ask them to explain why that kind of colonial bougie foodie-culture is wrong and evil in eight thousand ways, if you’re going to lay that burden on marginalized people along with everything else they’re carrying, well, all I can say is you really need to unpack why you feel that way. Because when you pretend like a sandwich shop is just a sandwich shop–when you can’t even see the codes written there, codes that for a lot of people are powerful triggers of personal and historical trauma–that’s the pathology of privilege, right there. That’s aggression, dominance, power, dehumanization. That’s an act of implicit violence, and you need to deal with it.

If David Brooks had written something like that … well, all I can say is, it will be a delightful day when David Brooks writes that column. But is there any doubt that if this had popped up on Medium under a pseudonym, a conservative who read it would start foaming at the mouth with alarmist rants about totalitarian snowflakes? And that a centrist liberal who saw it would either stay conspicuously silent or pen a hesitant and fretful defense of Western humanism? And that an intersectional leftist responding to the piece would applaud its bravery and relevance, or make criticisms not of the argument itself but of the writer’s presumed identity? And that an alt-leftist reading the piece would denounce it for putting trendy virtue signaling ahead of collective class interests?

And yet this little pastiche of mine makes essentially the same argument as the Brooks column, with only a slight shift in subject matter and a major shift in style. So what makes David Brooks a moral exemplar to conservatives and a convenient punching bag for leftists?

In his column, Nwanevu looks for the answer in Brooks’s past writings. He makes a good case. I’d point to three aspects of the sandwich anecdote itself:

  1. Topic. Brooks wrote about class instead of race, which these days, thanks to Trump, reads as “putting the desires of poor, bigoted whites ahead of the needs of minorities.” Not good.
  2. Voice. Brooks wrote his piece in a confessional mode: I exercised my privilege in a clueless fashion, I later came to regret it, now I’m reflecting on the experience, etc. etc. … In lefty culture, this is a big no-no, because it comes across as a form of humblebragging. Let me show you how much privilege I have by fretting about my treatment of those with less privilege. Yerk. The accepted protocol–even for upper-class whites–is to call out other privileged people and then tag on a reminder that you’re working on your own biases, too.
  3. Style. Brooks eschews the left’s academic jargon, even blithely dismisses it as an empty class marker. But what really stands out is the high-minded earnestness of his style, as if he’s stroking his chin or adjusting his spectacles after every line. The favored rhetorical style on the social-justice left, by contrast (in online circles, anyway), is a kind of impassioned tongue-lashing, equal parts anger and scathing contempt–a mashup of spoken-word feminism, Black Power sermons, and the drawling, acerbic wit of gay TV characters, with perhaps an occasional dash of the voluble dudespeak associated with David Foster Wallace. The original aim of these various rhetorical modes, I think, was to embody alternatives to the self-important detachment of WASP culture–which arrogantly assumes that personal opinions can be legitimately recast as dispassionate social analysis–and so Brooks’s adoption of the dry culture-critic style makes him an irresistible target.

I don’t think the importance of that last point can be overstated. It’s hard not to see this clash itself as a triumph of cultural style over structural substance. Everyone agrees that Americans are trussed and trammeled and tangled in shifting webs of norms and fashions. But efforts to escape or resist those webs inevitably become norms and fashions of their own.

Brooks’s studied impersonation of a sociologist makes him the perfect foil for people who see the whole post-Enlightenment intellectual tradition as an elaborate sham. Leftists are justifiably leery of the stuffy pseudo-objectivity of conservative scolds. But in response, they’ve settled on a slangy, hyperpersonal style that often makes them sound like snotty teenagers giving hell to their parents. That style has itself become a powerful shibboleth in certain circles, and frankly, I don’t think it can ever be more. The rhetorical modes of the twenty-first century Left are well-suited for dressing down pompous authority figures. But what happens when you’re the one in authority?
