Scrolling through Instagram to see the pictures from the March for Science, I marveled at the protest’s display of teasing American wit. (“Remember polio? No? Thanks, science!”) And then I thought of Neil Postman, the professor and the critic and the man who, via his 1985 book Amusing Ourselves to Death, argued preemptively against all this change-via-chuckle. Postman wasn’t, as his book’s title might suggest, a humorless scold in the classic way—Amusing Ourselves to Death is, as polemics go, darkly funny—but he was deeply suspicious of jokes themselves, especially when they come with an agenda.
Postman died in 2003; were he still with us, though, he would likely be both horrified and unsurprised to see protesters fighting for the fate of the planet with the help of a punnified Labrador—or, for that matter, to see the case for women’s inalienable rights being made by people dressed as plush vulvas. He might whisper that, in politics, the line between engagement and apathy is thinner than we want to believe. He might suggest that fun is fun, definitely, but, given its amorality, a pretty awkward ethic. He might warn, with a Cassandric sigh, that there is something delightful and also not very delightful at all about a trio of Tyrannosauri who, in the name of saving the world, try their hardest to go viral on Facebook.
Maybe that’s because the tablet can’t carry food, or handle cash, or convey to the kitchen that I might like my linguine di mare with sauce on the side and meatballs instead of shrimp. It definitely can’t compliment my date’s haircut, or make a joke about the traffic, or answer my questions about whether or not it likes working at a place with tablets on every table. But it can take a normal order, let me pay with a card at the precise moment I want to leave, and then prompt me to fill out a little survey about my meal. If I happen to have a kid with me, or realize with sudden revulsion that I can no longer stand even to look across the table at my companion, I can pay an extra $1.99 for access to the tablet’s library of games, and then crush some trivia while I wait for my bottomless breadsticks to be replenished.
In the fancier precincts of the food-service world, where watching a barista spend four minutes prepping a pour-over coffee is a customer’s idea of a good time, robots might not seem like the future of food culture. But spend some time at the restaurants where the majority of Americans eat every day, and you’ll catch a distinct whiff of automation in the air.
I’ve been thinking about my failures, especially the ways I’ve failed other people. A year before my novel Binary Star came out, I began interviewing people for a nonfiction book about eating disorders. The protagonist of Binary Star is an anorexic college student, and I had drawn heavily on my own history with anorexia to write her. I felt that in my writing I was finally able to translate into language what I had been carrying around as a shapeless trauma. But once I was finished, once I’d satisfied myself with a psychological portrayal of the disease, I began to crave a more scholarly understanding.
In retrospect, what I truly wanted was some authority outside of myself to validate what had happened to me. Having relived the trauma of anorexia in my writing, I wanted to verifiably attribute it to some cause other than an inborn deficiency—point to a reason that was larger than me. Give my pain context and meaning.
Over the last few decades, the ideal of the rational individual has been attacked from all sides. Postcolonial and feminist thinkers have challenged it as a chauvinistic Western fantasy, glorifying the autonomy and power of white men. Behavioral economists and evolutionary psychologists have demonstrated that most human decisions are based on emotional reactions and heuristic shortcuts rather than rational analysis, and that while our emotions and heuristics were perhaps suitable for dealing with the African savanna in the Stone Age, they are woefully inadequate for dealing with the urban jungle of the silicon age.
Sloman and Fernbach take this argument further, positing that not just rationality but the very idea of individual thinking is a myth. Humans rarely think for themselves. Rather, we think in groups. Just as it takes a tribe to raise a child, it also takes a tribe to invent a tool, solve a conflict or cure a disease. No individual knows everything it takes to build a cathedral, an atom bomb or an aircraft. What gave Homo sapiens an edge over all other animals and turned us into the masters of the planet was not our individual rationality, but our unparalleled ability to think together in large groups.
“Borne,” Jeff VanderMeer’s lyrical and harrowing new novel, may be the most beautifully written, and believable, post-apocalyptic tale in recent memory: a considerable achievement, considering “Borne” features not just a near-future, nameless city, but an enormous, sentient, cataclysmically destructive bioengineered bear, and the endearing, intelligent cephalopod who gives the book its title.
But no one told Mollie Maggia about the match girls before she went to work for the United States Radium Corporation in New Jersey, just after World War I. She painted glowing numbers on dials with the company's radium paint, licking her paintbrush for accuracy as she'd been taught, and it killed her.
Maggia was the first of the girls at USRC to die in agony from radium poisoning, but far from the last. And the horror at the heart of Kate Moore's Radium Girls lies in the way doctors, the company, and the law failed these women as they sought justice for the lives they were losing.
First let me try and explain: it’s like falling into deep, deep water. A sudden plunge that knocks your breath away, and once you go under, you forget which way is up. One minute I’m in line at the bank, or crossing the street, or pushing my cart through the Sav-Mart. Then something trips me and my memory opens up and I tumble in. Maybe I see a barrette in someone’s hair and suddenly I’m six years old, at the Gimbels perfume counter. Eight greasy fingerprints on the plate glass front. Eleven atomizers on a tray, piano music tinkling through the store stereo. A poppy seed stuck in the saleslady’s front teeth. She turns her head towards Leather Goods and two wisps fly loose from her tortoise-shell clip and my mother slips a bottle of Chanel No. 5 into her pocket and a snail of sweat creeps down my back and she pulls me away by the hand. I live it again, every little thing, and when I come back to the present the teller is shouting Miss? Miss? through the hole in the plexiglass, cars are honking, a quart of ice cream is melting to soup in my hands. On my back the same wet snail-trail. In my nostrils, Chanel No. 5.
You’d think this memory would’ve made me a straight-A student, a Jeopardy! champion. You might call it a gift. I wouldn’t. In school, when I opened my locker or sharpened my pencil or sat down with a quiz, I’d suddenly fall into some other day, some other moment. Ten minutes later I’d still be standing there, lock in hand as the late bell rang, or my pencil would be ground down to a nub, or the teacher, gently and sadly, would say, “Brianna, time’s up.” Even now, behind the wheel, I get lost in a memory and find myself parked at the Dairy Queen by the train station, or the hospital where Caitlin was born, or on Route 6 halfway to Chatham, and Caitlin, if she’s in the car, says, “Mom, you have the worst sense of direction.” But I can’t help it. Once I heard a story on the radio about a woman like me. She had scientists baffled. “Hyperthymesia,” they called it: highly superior autobiographical memory. They thought there might be forty or fifty people in the world like us, people whose pasts keep opening up and swallowing us down. I went to a doctor once, myself. I thought he would look at me and just know. But he took my blood pressure and told me to take a vitamin and said I was just fine, and I never told him. I never told anyone.