Hanōkizawa-san tells me to stop the car, and from the backseat points at an anonymous granite cliffside ten meters away. “There,” he says. “That’s where it came from.” We are driving south along a paved road built against the cliffs that fall into the Pacific outside the Japanese village of Yoshihama. He wants to show Yu Wada-Dimmer, our interpreter, and me the origin of the tsunami ishi, or “tsunami stone,” that appeared on Yoshihama’s beach when the high waters of the 1933 tsunami receded. The stone, once used as a warning to villagers living on low ground, was then buried by man in the sixties, only to be unburied when the ocean surged inland once more on the afternoon of Friday, March 11, 2011.
I can just barely discern the scar of a large boulder ripped clean from the crag, but it could be the former home of any rock that has since tumbled to a saline grave. Eighty-five years have passed since 1933. Hanōkizawa is now 89, which means he was a child, four or five, when it happened. “How do you know this is where it came from?” I ask. “Because my father told me,” he replies.
Yuval Noah Harari may be the first global public intellectual to be native to the 21st century. Where other authors are carpetbaggers, hauling their 20th-century thinking into the new millennium, Mr Harari is its local boy done good. He comes with all the accoutrements of the modern pop thinker: a posh education (Oxford, followed by a teaching gig at the Hebrew University of Jerusalem), two bestsellers and the obligatory TED talk. He even meditates for two hours a day.
And he is armed with a big idea: that human beings will change more in the next hundred years than they have in all of their previous existence. The combination of biotechnology and artificial intelligence (AI) may enable some people to be digitally enhanced, transforming what being human means. As this happens, concepts of life, consciousness, society, laws and morality will need to be revised.
Ceridwen Dovey’s third book, “In the Garden of the Fugitives,” is an elegant — at times, deceptive — narrative that sifts through the selected memories of two characters. Royce is an elderly wealthy man nearing the end of his life in Boston. Vita, a middle-aged South African woman, once a recipient of his generous fellowship, now lives in a small outback town in Australia. The narrative structure reflects an epistolary exchange (well, email), but the stories Royce and Vita tell each other move beyond the perceived boundaries of the digital missives, almost as if the characters are writing past each other as they reconstruct their somewhat broken personal histories.
But as the story reaches its climax, the tension between action and withholding becomes increasingly problematic. It’s not just that Lacroix is reticent, and that so much action happens off stage or invisibly. It’s also that when Lacroix does finally confess the full story of what happened in Spain, he reveals a paralysis in himself that we are never entirely convinced has been cured. Is this plausible psychology? Possibly. Does it show a conflict in Miller himself, between his appetite for writing a historical yarn and something quieter, more subtle and more inward? This seems just as likely.
That the phrase “the fall,” in the context of espionage fiction, usually refers to the crumbling of the Berlin Wall doesn’t entirely rinse it of its theological connotations. The Cold War has long been a backdrop against which to explore human frailties; that’s one of the reasons so many spy novels are backward-looking. But, like any careful spook, the best of such novels look both ways at once. In what we have to assume is the twilight of his career, John le Carré opted to interrogate his own past in the recent “A Legacy of Spies,” but took the opportunity to examine the present at the same time. If neither era showed human nature at its best, it was hard not to discern between the lines a certain amount of nostalgia for the world before the fall.
Around the world, everyday medical procedures, from treatments for common infections to chemotherapy, rely on antibiotics. In the past decade, however, the drugs we rely on to keep us safe from everything from E. coli to severe acute respiratory syndrome (SARS) have been failing to keep up with the rapid evolution of such bacteria and viruses. As these antibiotics continue to lose their efficacy, we lose our ability to treat even the most basic of illnesses. The situation is so severe that the World Health Organization regards antibiotic resistance as “one of the biggest threats to global health, food security and development today”.
Nor is this a new crisis. In 2014, then Prime Minister David Cameron appointed the economist Jim O’Neill to investigate the economic fallout of antibiotic resistance. The resulting report, the Review on Antimicrobial Resistance, put the global annual death toll due to drug-resistant superbugs at 700,000, with an estimated annual toll of ten million by 2050. O’Neill also predicted that, should the crisis continue without a satisfactory response, the reduction in worldwide population would diminish global economic output by up to 3.5 per cent, at a cost of $100 trillion (£63tn) – roughly 35 times the GDP of the UK. Four years later, we’re no closer to solving the problem.
Some years ago, I had a colleague who would frequently complain that he didn’t have enough to do. He’d mention to our team how much free time he had, ask our boss for more tasks, and bring it up at after-work drinks. He was right, of course, about the situation: Although we were hardly idle, even the most productive among us couldn’t claim to be toiling for eight (or even five, sometimes three) full hours a day. My colleague, who’d come out of a difficult bout of unemployment, simply could not believe that this justified his salary. It took him a long time to start playing along: checking Twitter, posting on Facebook, reading the paper, and texting friends while fulfilling his professional obligations to the fullest of his abilities.
The idea of being paid to do nothing is difficult to adjust to in a society that places a high value on work. Yet this idea has lately gained serious attention amid projections that the progress of globalization and technology will lead to a “jobless” future. The underlying worry goes something like this: If machines do the work for us, wage labor will disappear, so workers won’t have money to buy things. If people can’t or don’t buy things, no one will be able to sell things, either, which means less commerce, a withering private sector, and even fewer jobs. Our value system based on the sanctity of toil will be exposed as hollow; we won’t be able to speak about workers as a class at all, let alone discuss “the labor market” as we now know it. This will require not just economic adjustments but moral and political ones, too.
The beloved neurologist and author Oliver Sacks was a man of many enthusiasms — for ferns, cephalopods, motorbikes, minerals, swimming, smoked salmon and Bach, to name a few — but none more so than for words.
When I say he loved words, I don’t simply mean within the context of being a writer of numerous classic books — “Awakenings,” “The Man Who Mistook His Wife for a Hat,” “Musicophilia.” Even if he had never written a single one, I am sure Oliver would still have been that funny fellow who took giant dictionaries to bed for light reading (aided by a magnifying glass). He delighted in etymology, synonyms and antonyms, slang, swear words, palindromes, anatomical terms, neologisms (but objected, in principle, to contractions). He could joyfully parse the difference between homonyms and homophones, not to mention homographs, in dinner table conversation. (He also relished saying those three words — that breathy “H” alliteration — in his distinctive British accent.)
Residency programs suggest that the best thing to do is to shrink and compartmentalize the whole process—to arrange for some brief, glorious period of time in which you get to live in the bathroom of a McDonald’s, or rattle around the back of a U-Haul, or, if you’re particularly lucky, decamp to some breathtaking cabin in the countryside, and make your stuff. Perhaps one no longer lives as an artist but merely vacations as one.
“The key element of any Dead Girl story is the investigator’s haunted, semi-sexual obsession with the Dead Girl, or rather, the absence she has left,” Alice Bolin writes in her deliciously dry, moody essay collection, “Dead Girls.” Bolin’s own obsession is nowhere near as lurid but just as fascinating. Once she spots the necrophiliac thread running through our culture — from “Twin Peaks” to “True Detective,” and including every procedural ever made — she can’t stop seeing it, can’t stop thinking about what it would be like, as a girl who’s alive and kicking, to occupy so much central, privileged space. The ubiquity of the popular narrative she comes to call the “Dead Girl Show” makes her think that the dead girl might be something she “could hitch my wagon to.” This might seem an oddly static choice if it didn’t prove so generative.
As expected, the book contains chapters devoted to charting the history of important toys, such as wood blocks and Lego. But “Design of Childhood” casts a wider, more ambitious net, looking at the ways in which attention to children and their needs has helped shape design at large — including public space (playgrounds), architecture (schools and the home) and urbanism (safe street design).
And it looks at the way design has tried to shape children in turn. Those open-plan schools of the ’70s? They were an attempt to foster free thinking and collaboration by removing the walls between classrooms. Unfortunately, they also fostered poor acoustics and quickly went out of vogue. (Though the buzzwords remain with us still.)
Both novels are about women trying to imagine and work their way out of a narrative that has already been decided for them. Both novels are inspiring, not in spite of Tambu’s hopeless situation but because, through it all, she never loses sight of herself while never underestimating the brutal reality of her predicament. In this regard, “This Mournable Body” is a story of triumph, not despair.
Like the misfit musicians who bought the Velvet Underground’s first album, the young kids of the seventies who pored over the first set of D. & D. manuals all went out to make their own games. The ideas that bubbled up in the following decade flowed in a similar direction. Wiz-War had wizards hurling fireballs at each other in a maze. Titan had players commanding an army of mythological creatures like centaurs and griffons to attack the other players’ titans. Garfield himself made a game called Five Magics. He granted elemental characteristics to five distinct colors of energy that arose, as in many other fantasy games of the period, from different geographies. The colors shifted around, but, eventually, red aggression came from mountains, black ambition from swamps, blue rumination from islands, white orderliness from plains, and green growth from forests. Garfield never considered Five Magics publishable and, because of constant tinkering, he never played it the same way twice. Sometimes it was a board game. Sometimes it was a card game. “There were some versions where you collected victory points to win,” he recalled, “and some versions where you wiped out your opponents.”
It wasn’t until 1991, when a friend put him in touch with a gaming entrepreneur named Peter Adkison, that Garfield, by then finishing a Ph.D. in combinatorial mathematics at Penn, thought he could make something of his creation. Adkison lived in Seattle and worked a day job as a systems analyst for Boeing; he moonlit as the founder and C.E.O. of a gaming company called Wizards of the Coast. He met up with Garfield in Oregon, where the young mathematician was visiting his parents. The lives of these two men and what followed have been chronicled, especially in books like “Jonny Magic & the Card Shark Kids” by David Kushner and “Generation Decks” by Titus Chalk, with particular attention to what happened at this fortuitous moment: Adkison suggested that Garfield create something portable that people could pull out during their downtime at the conventions where nerds flocked to find rare comic books, purchase esoteric collectibles, and discover new games.
Longevity is the new reality and I am in the vanguard of an emerging demographic trend. Life expectancy soared from seventy-two for men and seventy-nine for women in 1982—the year my mother died—to eighty for men and eighty-four for women in 2015. I’m not afraid of dying. The legalization of medically assisted dying means that death is losing its sting, at least for the terminally ill. What terrifies me is old age. What’s the point of longevity if I run out of money or become socially isolated because I am deaf and immobile and have outlived all my friends? Or, far worse, if I am plagued with myriad conditions that rob me of cognition and autonomy and force me to linger like last week’s leftovers because I will no longer be competent enough to request an assisted death? That boomers like me aren’t leaving this mortal coil any time soon also has costly implications for younger generations, which will have to support us in our dotage either as caregivers or with their taxes, or both.
Three years ago, we hit a tipping point: for the first time, there were more Canadians over the age of sixty-five than under fifteen—16 percent of the population, or almost 6 million people. And that massive older segment was growing four times faster than the population at large. If that trend continues, demographers predict that by 2031, the year the earliest boomers turn eighty-five, nearly one-quarter of Canadians will be sixty-five or older. Will we as a society have the medical, social, and financial resources to cope?
In some ways, generic styles are nothing new. Hotel brands promise a consistent, yet sterile, customer experience no matter where you are. Walk into a Hilton or a Holiday Inn and you know what you’re going to get. The clerk at the check-in counter will speak English and there will be a rotating waffle maker at the continental breakfast. Airports and shopping malls are similarly predictable.
The issue is this: hotel visitors know they’re getting something generic. That’s the point. Mid-tier hotels advertise safety and reliability. They sell risk minimization, not experience maximization. Algorithms, though, advertise authenticity while selling commodities. Algorithms trick us into thinking we’re getting real and authentic experiences, when in fact, we’re getting the opposite.
What Mr. Chang does with his chefs is something we may not have a term for. We understand chefs who are solo artists, and we understand cooks who uphold the standards of a major chef’s global chain, like the ones at L’Atelier de Joël Robuchon or Nobu.
What’s hard to grasp is the middle ground where Mr. Chang has put most of his chefs. They have to fit into the family, but they also have to be individualistic enough to deserve a place in the family.
By and by, the magic dwindled — as I had feared it would. I grew up. Along the way, I experienced all the regular things that siphon away a person’s sense of wonder: the world’s cruelty and its indifference alike; failure; trauma and pain. By the time my twenties came around, I had begun to sink into a deep darkness. Five whole years slipped by while I remained trapped in the quicksand of a bottomless sadness, completely inert and unchanging, which is its own horrible magic.
I fell into the habit of perseverating on thoughts of my mistakes, turning them over and over in my mind, like worrying a sore tooth with the tongue for the sharp thrill of pain. I played endlessly looping mental reels about going back in time and doing everything over again, from the small stuff (not popping zits) to the bigger stuff (working harder in school, finding direction sooner). I wanted so badly to go backwards that I had cut off all momentum to move forward. I was mired in regret.
What began as a vague apprehension — unease over the amount of time we spend on our devices, a sense that our children are growing up distracted — has, since the presidential election of 2016, transformed into something like outright panic. Pundits and politicians debate the perils of social media; technology is vilified as an instigator of our social ills, rather than a symptom. Something about our digital life seems to inspire extremes: all that early enthusiasm, the utopian fervor over the internet, now collapsed into fear and recriminations.
“Bitwise: A Life in Code,” David Auerbach’s thoughtful meditation on technology and its place in society, is a welcome effort to reclaim the middle ground. Auerbach, a former professional programmer, now a journalist and writer, is “cautiously positive toward technology.” He recognizes the very real damage it is causing to our political, cultural and emotional lives. But he also loves computers and data, and is adept at conveying the awe that technology can summon, the bracing sense of discovery that Arthur C. Clarke memorably compared to touching magic. “Much joy and satisfaction can be found in chasing after the secrets and puzzles of the world,” Auerbach writes. “I felt that joy first with computers.”
The voice of this book, the voice of Cercas, with its beautiful grain and restlessness, its swerves from pity to fury, from calm to hysteria, owes much to Wynne’s almost musical modulations.
I know that writing, for the most part essays and literary fiction, published mostly in online journals, alongside running feminist blogs, is no cash cow. I know that people who appear to be making a lot of money from their art alone for long periods of time are often people who received an inheritance, or married into wealth, or are quietly running side hustles. Or worse. It seems that The Institution of Literature, or rather, The Institution of Publishing, still runs on some archaic machine built by old white men, and we Third World Others are still puzzling out how to retrofit ourselves and our stories to fit this model.
My part, the part I’m most ashamed of, is perpetuating the myth that we are all each other’s competition. I am constantly fighting the urge to isolate, to hide and work on my own, to hoard information and access to editors or journals. This secrecy leaves other writers to lean on rumor, because so many people are elusive or obscure the truth. It has left our community of dreamers, particularly women and writers of color, ripe for exploitation.
What do we mean, then, when we say we care about privacy? We might mean that we want the right to be left alone, beyond the gaze of other people—to have our private lives protected from public encroachment. Maybe we want to protect a more internal realm of intimacy or secrecy, where we share only what we choose with only those we have chosen. In politics, the value of privacy is associated with the liberal tradition, from Locke to John Stuart Mill, whose ideas are invoked to defend the separation of the public sphere from the private, in which one is free to make one’s own choices without interference. What matters about the private, in this view, is that the choices we make in that realm are our own—they belong to us. If the state fails to protect our capacity to make those choices—if its surveillance, policing, and monitoring of citizens goes too far—it ceases to be a liberal state and becomes something more sinister.
Many of these ideas about privacy are drawn from a canon that was invented largely by liberal educators and historians of ideas during the Cold War as part of efforts to undercut the expanding security state. Conventional histories of privacy tend to associate it with this tradition, and they present a paradox. Some tell the story of privacy’s triumph. Over the course of the twentieth century, citizens wrested their privacy from powerful, bureaucratic states. In the United States, it became a constitutional right in 1965. Thanks to the reproductive and civil rights struggles, it came to be seen as a human right, owed to all. Others tell the story of privacy as one of decline. Privacy is here a thing of the past, caught between the rise of the national security state and Facebook’s data-gathering regime, which monetizes our personal information and turns our intimate conversations into targeted advertising. These two contradictory histories point to our puzzling situation. Most of us worry, some of the time, about privacy. But we also willingly give it up. What does it mean to worry so much about something of which we seem to want so little?
This is the house that Kathryn Maris built: it has “only an attic and a basement”. What does it signify to have a bodiless house? The title is typical of this crisp, funny, lightly disturbing collection. Maris is a mistress of fragile structures. A wit informs her sometimes painful, mannered poems – their affectation a coping strategy. What Women Want is formed by layered futility: the woman’s superstitious initiative rendered null by the husband’s incurious loftiness. It plays with the pointlessness of its subject until the poem becomes the point. The charm of the book is that it is the poems themselves that offer stability. It is they that bridge – where a bridge is possible – the gap between the sexes (“The man in the basement wrote stories about heroin/ the woman in the attic read stories with heroines”). This is the gap that keeps threatening to become a void.
The Seas, Samantha Hunt’s first novel, is as disturbing as it is beautiful. It is a literary equivalent of the Rubin vase, the ambiguous image of multistable perception that shocks us back and forth between two possible realities of a story, all dependent upon the gaze of the reader at particular moments. Our narrator is either a real mermaid or a schizoaffective depressive circling down the drain of a heavy mental breakdown. We think we have to choose between these sides of perception, but we don’t. The richest understanding of The Seas comes as we see that these two interpretations are not mutually exclusive. We can witness the vase and the face.
This busy, squirming, roomy novel has a tidy ending, one that too neatly dispenses prizes and gives Barry a stab at redemption. We come to see that he is not so different from his autistic son. We come to see, in fact, that in some ways the reality of being on the autism spectrum, with all its challenges and rewards, is this novel’s great clandestine subject. It can be hard to get a handle on people with autism. It’s even harder to get a handle on Barry. He’s hard; he’s soft. He’s gregarious; he’s a loner. He’s a wolf of Wall Street; he’s a high plains drifter. He wants to fleece people; he wants to save them. His mind is like a bed that gets made, and then remade, on every other page. Is Barry hollow or is he holy? Shteyngart’s prose holds you in a way that Barry himself never does.
There's a lot going on in Ohio — a sprawling cast of main and supporting characters, and a series of interconnected events that doesn't come together until the book's shocking conclusion. But Markley handles it beautifully; the novel is intricately constructed, with gorgeous, fiery writing that pulls the reader in and never lets go. It's obvious that Markley cares deeply about his characters, even the unsympathetic ones — he treats them with respect, never writing condescendingly about these people whose lives have been battered and bruised by circumstances they don't quite understand.
Tradition still has a reckoning in store for those who turn away from it, and Pamuk’s novel is masterful in drawing out the inherent tension of a society in the midst of an identity crisis related to its own history and values.
We need more thinkers as wise as Appiah and Fukuyama digging their fingers into the soil of our predicament. And we need more readers reading what they harvest.
Jazz was in a sense always a late style, a timekeeper’s music out of time. In the 1920s, while jazz musicians were playing early show tunes and improvising with rudimentary harmony, the Second Viennese School was pushing ahead into total chromaticism and atonality, and Stravinsky, Milhaud, Prokofiev, and Ravel were experimenting with jazz’s musical signature—its fixed pulse, syncopated rhythm, and emphasis on flattened thirds and sevenths. Jazz was modern long before Modern Jazz was named in the 1940s, for the harmonic modernity of bebop was the chromaticism of Liszt, Chopin, and Wagner. In the wider chronology of Western music, jazz’s harmonic development is a long game of catch-up, finished too late—around 1972, when Miles Davis heard Karlheinz Stockhausen for the first time. Davis had already reached the same conclusions as the joyless German but without losing the funk.
No jazz musician incarnates the legend of late style more than the saxophonist John Coltrane. His early style is undistinguished; he is a bluesy sideman whose grasp of the instrument falls short of the reach of his ear. His middle style, stertorous and ambitious, began in his mid-1950s stint with Miles Davis’s quintet. Coltrane in this period is still less melodious than Hank Mobley and less witty than Sonny Rollins, but his chops are catching up with his ear. Only Johnny Griffin has fleeter fingers and only Rollins can beat him for persistence. Coltrane thinks aloud and never stops thinking; he is the perfect foil for Davis, who is also ironic and intellectual, also latent with eroticism and violence, but who never shows his working, only the finished idea. Coltrane’s sound waves are square and heavy, metallic and dark like lead. He is both implacable and lazy, like a bull elephant: You never know where the charge will take him, only that—as he himself admitted to Davis—once he gets going, he doesn’t know how to stop.
More and more, biologists are discovering that organisms thought to be different species are, in fact, one and the same. A recent example: the two formerly accepted species of giant North American mammoth, the Columbian mammoth and the woolly mammoth, proved to be genetically the same, their differing phenotypes determined by environment.
There are many conflicting uses of the term epigenetics, and this, as much as anything, has led to great dissension among scientists, as well as between scientists and science journalists. This is not an isolated case: there are many instances in science where a specific term is used in quite different contexts by different scientists, the same word taking on disparate meanings; as a consequence, confusion can arise. In the past decade alone, an increasing number of books, popular articles, and scientific reviews have dealt with epigenetics, and in them the word has been used with a diversity of meanings. (And, according to many critics, overused.)
With parkways and expressways offering far quicker passage, no practical person would opt to drive the length of Route 25 with its changing speed limits and hundreds of traffic lights.
But a recent daylong jaunt along the entirety of the route — following the 25 East signs and stopping frequently to sample life along the way — showed how a single road could tie together vastly different worlds, extending from the ethnic pockets of Queens to suburban sections of Nassau and Suffolk Counties to bucolic stretches of the North Fork, farthest east.
A reasonable person might ask, “Why would you travel 6,000 miles to China and eat pizza?” It is only with a small amount of shame that I admit that for the longest time my favorite food to eat in China was pizza. I was always a picky eater, perhaps as a result of some weird genetic tic, or because I was coddled so much by my parents, or — as I half-heartedly theorize — possibly something to do with my weird neuroses and mental hang-ups about growing up as a second-generation Chinese American.
At 94, Lewis Smith is the last living sibling of culinary icon Edna Lewis, arguably the most important figure in American regional cooking. In 1976, after nearly three decades of work as a chef and caterer in New York City, Lewis brought the traditions of refined, farm-to-table Southern cooking, and the black foundation of American food, to national attention with her second, best-known cookbook, “The Taste of Country Cooking.”
After editing a book examining Lewis’s life, work and legacy this year, I was invited by Lewis Smith to attend the family’s reunion in Galesburg, where Lewis Smith now lives with her daughter and son-in-law, Mattie and Jerry Scott.
When I arrived, the household was already in high gear.
One evening not long ago, my fifteen-year-old son, Noah, told me that literature was dead. We were at the dinner table, discussing The Great Gatsby, which he was reading for a ninth-grade humanities class. Part of the class structure involved annotation, which Noah detested; it kept pulling him out of the story to stop every few lines and make a note, mark a citation, to demonstrate that he’d been paying attention to what he read. “It would be so much easier if they’d let me read it,” he lamented, and listening to him, I couldn’t help but recall my own classroom experiences, the endless scansion of poetry, the sentence diagramming, the excavation of metaphor and form. I remembered reading, in junior high school, Lord of the Flies—a novel Noah had read (and loved) at summer camp, writing to me in a Facebook message that it was “seriously messed up”—and thinking, as my teacher detailed the symbolic structure, finding hidden nuance in literally every sentence, that what she was saying was impossible. How, I wondered, could William Golding have seeded his narrative so consciously and still have managed to write? How could he have kept track of it all? Even then, I knew I wanted to be a writer, had begun to read with an eye toward how a book or story was built, and if this was what it took, this overriding sense of consciousness, then I would never be smart enough.
Now, I recognize this as one of the fallacies of teaching literature in the classroom, the need to seek a reckoning with everything, to imagine a framework, a rubric, in which each little piece makes sense. Literature—at least the literature to which I respond—doesn’t work that way; it is conscious, yes, but with room for serendipity, a delicate balance between craft and art. This is why it’s often difficult for writers to talk about their process, because the connections, the flow of storytelling, remain mysterious even to them. “I have to say that, for me, it evolved spontaneously. I didn’t have any plan,” Philip Roth once said of a scene in his 2006 novel Everyman, and if such a revelation can be frustrating to those who want to see the trick, the magic behind the magic, it is the only answer for a writer, who works for reasons that are, at their essence, the opposite of schematic: emotional, murky, not wholly identifiable—at least, if the writing’s any good. That kind of writing, though, is difficult to teach, leaving us with scansion, annotation, all that sound and fury, a buzz of explication that obscures the elusive heartbeat of a book.
Gloucester Crescent in north London has been celebrated ever since the 1970s: Mark Boxer’s String-Along cartoons satirising the affluent liberal intelligentsia were followed by books and films such as Alan Bennett’s The Lady in the Van and Nina Stibbe’s Love, Nina. From the 1960s to the 1980s, it was home to a constellation of intellectual celebrities who, like the Bloomsbury group, combined talent, idealism and luck with adultery, rivalry and a sense of entitlement. All that has been missing from the mix is a worm’s (or child’s) eye view. This memoir, written by Jonathan Miller’s son William, provides this in spades.
Two decades ago, I arrived at Harvard as an undergraduate excited to soak up the brilliance of professors who had won Nobels and Pulitzers. But by the end of the first month of my freshman year, it was clear that these world-class experts were my worst teachers. My distinguished art history professor raved about Michelangelo’s pietra serena molding but didn’t articulate why it was significant. My renowned astrophysics professor taught us how the universe seemed to be expanding, but never bothered to explain what it was expanding into (still waiting for someone to demystify that one).
It wasn’t that they didn’t care about teaching. It was that they knew too much about their subject, and had mastered it too long ago, to relate to my ignorance about it. Social scientists call it the curse of knowledge. As the psychologist Sian Beilock, now the president of Barnard College, writes, “As you get better and better at what you do, your ability to communicate your understanding or to help others learn that skill often gets worse and worse.”
I’ve come to believe that if you want to learn something new, there are three factors that you should keep in mind when choosing a teacher — whether it’s a professor or mentor or soccer coach.
China spent $180 million to create the telescope, which officials have repeatedly said will make the country the global leader in radio astronomy. But the local government also spent several times that on this nearby Astronomy Town—hotels, housing, a vineyard, a museum, a playground, classy restaurants, all those themed light fixtures. The government hopes that promoting their scope in this way will encourage tourists and new residents to gravitate to the historically poor Guizhou province.
It is, in some sense, an experiment in whether this type of science and economic development can coexist. Which is strange, because normally they are deliberately kept apart.
This is a beautiful novel with a deep and satisfying intelligence at its heart. It’s admirably frank, both emotionally and sexually (Marianne’s masochistic streak takes her down some dark paths), but also kind and wise, witty and warm. In the end, a little like Rooney’s first book, it’s a sympathetic yet pithy examination of the myriad ways in which men and women try – and all too often fail – to understand each other.
But Conscience is a curious book. Every time I wanted to object, Mattison pulled me back in, some of which, I think, is connected to the book’s pacing, which is wonderfully slow and lush. Fiction tends to move at a fast clip these days — it’s full of fragments and ellipses, abrupt shifts that reflect our accelerated, decentered lives. But Mattison refuses to give up the rich, mundane details of domestic life — people talking, cooking, washing the dishes. It’s where her stories live.
Patient and dedicated readers will find, among the references to other books and the many footnotes and appendices, a poignant sense of completion and finality to the life’s pursuit of a father and son. Deep delvers of Middle-earth lore will be rewarded with a thorough understanding of one of modern fantasy’s seminal works.
People often think that in the thousands of years following the rise of agriculture, human societies were static. They were not. Empires rose—some flourished, then perished, while others persisted. Most people remained subsistence farmers who kept themselves, or themselves and the ruling elites, alive. Foraging as a way of life was pushed to agriculturally marginal lands. Populations grew rapidly, with estimates ranging from between 1 and 10 million people at the beginnings of agriculture to between 425 and 540 million in the year 1500, around 10,000 years later.
In the 16th century everything began to change, and change with increasing speed. Agricultural development, from simpler farming communities to city-states to empires (and often back again), slowly began to be replaced by a new mode of living. Revolutions in what people ate, how they communicated, what they thought, and their relationship with the land that nourished them emerged. Somehow, those living on the western edge of the continent of Europe changed the trajectory of the development of human society, and changed the trajectory of the development of the Earth system, creating the modern world we live in today. Nothing would be the same again.
For every misfire, there are a dozen triumphs, large and small. The characters walk and talk like real, messed-up people; the author cares about them, and so does the reader. The structure (a prologue, four sections, and a coda) works because Markley took the time to connect everything in a masterful set of flashbacks and flash-forwards that parcel out enough information to make the conclusion both shocking and inevitable. “Ohio” is a big novel about what happened after 9/11, the initial euphoria and the long depression that grips us still.
It won’t surprise anyone who reads this remarkable Dutch novella, set among the bloody churn of partisans, Russians and retreating German forces towards the end of the second world war, that it has long been regarded as a classic in the Netherlands. In a sharp new translation, the first standalone English-language edition arrives more than half a century after the book first appeared in Dutch.
But be glad that it has finally emerged. It remains a shocking read, even if you have to imagine the impact it must have had when it was published in its home country in 1951, exploding the prevailing postwar discourse of brave resistance to the Nazi occupation with a story of selfish opportunism and amoral nihilism.
The information age has ushered in an era of fear about children’s well-being, shifting norms heavily towards constant oversight and nearly impossible standards of safety. One casualty of that trend has been the playground, which has become mind-numbingly standard-issue—with the same type of plastic swing sets and slides—designed to minimize harm, rather than maximize enjoyment.
Over the last few years, however, pushback against the overly sanitized playground has grown considerably, with new research supporting the importance of play—especially unstructured play—for early childhood development. Critics also argue that concerns about actual harm are overstated. These findings have raised questions about playground design. Is the current playground model fostering creativity, independence, and problem-solving? What does risk really mean—and when is it OK? What can alternatives to current play spaces look like? And how can their benefits extend to all children in a city? Architects, researchers, childhood development specialists, and parents are weighing in on these questions around the world, and outlining a new vision for the future of play.
Writing a novel is a scary prospect. Novels are so long and winding that they can seem never-ending. The main obstacle might seem to be starting – the terror of the blank page – but the real stumbling block lies elsewhere. There is no reason in the world why you can’t write a novel; the only thing stopping you is yourself. The task seems insurmountable and, in any case, you might ask yourself: why would anyone be interested in what I have to say? Who am I to have a voice? It is this lack of self-belief that is the main hindrance. It is the first thing any aspiring author has to get to grips with every time they sit down to write.
Writing is about claiming ownership of yourself in order to become the person you know you can be. It’s about acknowledging to yourself that writing is not just a hobby but a profound force in your life, one that will help you to achieve a deep sense of self-expression. A novel is a way of making your mark on the world. It is your cri de coeur. But bridging the gap between the writer you are and the writer you know you can be will be a struggle. You will have to push yourself far outside your comfort zone. And you will have to be completely honest with yourself about why and what you want to write. The first question to ask yourself is: “What do I want to say?”
Max Weinreich, a linguist, made famous the wry remark that “a language is a dialect with an army and a navy.” The usual criterion for what is a separate language, and not a mere dialect, is that speakers of two languages should find it difficult or impossible to understand each other. But factors that have nothing to do with language often supersede the linguistic ones.
Neel Patel’s debut story collection is a study of doomed attachments. In “If You See Me, Don’t Say Hi,” no one is spared: Friendships fester, marriages combust and families fall into civilized distemper. Where did it all go wrong? Patel’s characters are trying to piece it together. In elaborate feats of retrospection, his 11 narrators re-enact conversations with lovers and friends, scrambling their memories for clues and causes. Hailing from backgrounds wealthy and working class, closeted and out, coastal and country, they seem to share a talent for unrequited love.
Translation, after all, entails its own set of artistic demands. Defined by both fidelity and freedom, it must offer transparency while remaining a touch inaccessible and foreign. “The task of the translator,” as it were, is rooted in creative limitation-as-inclination, in a need not only to communicate what escapes language but also to communicate it artfully.
But what if a translator judges the original text to be artistically inadequate? What if he argues that its narrative trails are begging for stronger connections, that its story exhibits glaring archetypal deficiencies, that its author is much too pretentious?
The book opens with a clear-eyed look at the early anti-GMO movement. But Lynas begins to ask questions and finds that the movement’s slogans often don’t reflect the scientific consensus.
Morally speaking, Mrs. Thatcher and Ronald Reagan should have been right. As long as I am better off, why should I begrudge your doing better still? Yet something was amiss with this consensus — something that goes far to explain why Reagan-Thatcher conservatism has caved in under pressure from the populisms of President Trump on the right and Senator Bernie Sanders of Vermont on the left.
In America (and also in other countries), an impressive postwar rise in material well-being has had zero effect on personal well-being. The divergence between economic growth and subjective satisfaction began decades ago. Real per capita income has more than tripled since the late 1950s, but the percentage of people saying they are very happy has, if anything, slightly declined.
In an increasingly digital age, Kroupa believes that the physical book has meaning, that holding a historic or unique book is different from viewing it on a screen. She’s concerned that our culture is moving from things to pictures of things. It’s like art, she says; you can spend years studying and looking at reproductions of paintings, and then you visit a museum and see the real thing. “You realized that what you’ve been seeing is just the skeleton, some kind of amorphous thing. It’s not the art — you have to have the art there. And I think books are exactly the same way.”
There’s nothing like political and economic upheaval to make boredom look good. An era like the 1950s, which used to be lampooned for its stifling conformity — all those organization men in their gray flannel suits — has since been revered for its stability. To the gig-economy worker who has no idea how many hours she’ll be putting in next week (much less whether she’ll make enough to pay her rent or her health insurance), the prospect of donning a fedora, taking the commuter train into the city, sitting at a desk from 9 to 5 while her ample pension benefits accrue — well, it sounds like a fantasy now.
Then again, it would have been a fantasy for her back then too. As Louis Hyman shows in his illuminating and often surprising new book, the midcentury idyll of steady employment and a regular paycheck wasn’t designed to include women and people of color. For them, today’s economic precariousness wouldn’t look entirely unfamiliar. The New Deal’s fair labor standards applied only to industrial jobs in factories and offices; agricultural and domestic work were deliberately excluded.
Bookstores have become cultural Rorschach tests. Over the past decade or so, you’ve either been traumatized by watching your favorite store go dark, or you’re fine with the coffee and craft cocktails now served alongside exquisitely curated books.
This fall begins a new era, or maybe a retro one, marked by the reemergence of national bookstore chains and two prototype stores opening next month. In New York, Shakespeare & Co. is growing to three locations, laying the groundwork for its national expansion, while Indigo, Canada’s largest bookstore chain, is opening its first U.S. store in New Jersey, staking its claim before growing west. Both believe there’s big potential in general bookstore chains despite wildly different ideas about how we buy books.
Data. I know there’s a lot of it around. But do I really need it? Especially if I’m a literary scholar of the old-fashioned, ruminative type? Readers like me cling to artful, poetic texts as a refuge: an antidote to information overload, technological distraction, and the hegemony of instrumental reason. Can a database help me understand a book?
Daniel Shore’s new book says it can. On his (largely persuasive) account, even a traditional humanist critic can use new search tools and datasets to become better at interpreting literary forms and their cultural history. When used prosthetically and heuristically — as aids to discovery — these tools help deepen our appreciation of verbal artifacts. So data can serve our purposes even when statistics, bar charts, and scatterplots leave us cold, and when the spirit of our readings remains defiantly antique and analog.
About the resolution of the big question at the heart of Jessie Greengrass’s unusual and absorbing first novel—whether or not the unnamed narrator should become a parent—there is no suspense: we know from the first sentence that she is pregnant and from the second that she is already the mother of a little girl. Written mostly in the form of a memoir, “Sight” recounts the history of the narrator’s struggle to arrive at her decision, a process that goes beyond mere difficulty to something more like torture. She has a partner, Johannes, who is patient with her vacillations, though he does not share them. At one point, he suggests that to help make up her mind she go away from London, where they live, and seclude herself in a cottage in Wales. The narrator recalls of the stay, “I sat the week out, unhappy, and went home to tell him with defiance that I wouldn’t have a child; but two days later I cried and said that after all I might, because still I could feel nothing but how much I wanted to.”
It was October 2016. Hurricane Matthew had just rolled out to sea, Samsung phones were catching fire, Hillary Clinton was up by double digits in the national polls and the unthinkable was still unthinkable. Shailagh Murray had spent two terms in the White House helping to lead the administration’s communications strategy, and it appeared to have taken its toll. With Obama just a few months away from leaving office, journalists wanted exit interviews; they wanted to be first, biggest, loudest. She was sick of the egos, the same old questions.
The letters, she said, served as a respite from all that, and she offered to show some to me. She chose a navy blue binder, pulled it off the shelf, and opened it, fanning through page after page of letters, some handwritten in cursive on personal letterheads, others block printed on notebook paper and decorated with stickers; there were business letters, emails, faxes and random photographs of families, soldiers and pets. “You know, it’s this dialogue he’s been having with the country that people aren’t even aware of,” she said, referring to Obama’s eight-year habit of corresponding with the American public. “Collectively, you get this kind of American tableau.”
Obama had committed to reading 10 letters a day when he first took office, becoming the first president to put such a deliberate focus on constituent correspondence. Late each afternoon, around five o’clock, a selection would be sent up from the post room to the Oval Office. The “10 LADs”, as they came to be known – for “10 letters a day” – would circulate among senior staff and the stack would be added to the back of the briefing book the president took with him to the residence each night. He answered some by hand and wrote notes on others for the writing team to answer, and on some he scribbled “save”.
I’m not arguing that neurotypical writers should never create autistic characters (that would lead to even greater invisibility than we have at the moment). I’m suggesting that it’s time those characters reflected reality, based on careful research, and contact with real, autistic people. In the course of that research, writers might come across the occasional Don or Christopher, but they’ll also find far more diversity than they ever imagined; people brimming with creativity, empathy, wisdom and good humour; and people facing physical, sensory and intellectual challenges far greater than fiction has portrayed. The psychiatric literature is playing catch-up here, to the extent that it’s not a useful source of reference.
Readers, too, need to be more critical when faced with stereotyped autistic characters – as they now are with a range of other minority representations. When encountering characters from vulnerable or minority groups, we must all learn to ask who is doing the writing, why, and on what authority. It is time to grow tired of ‘folklore autism’ being used to jazz up a tired book, and to seek out the freshness and life of authentic accounts, written by autistics.
“Adorkable.” “Manspreading.” “Frenemies.” Coining new words to fit modern needs is a practice that goes back to the beginning of language; Shakespeare, for example, is said to have introduced somewhere between 1,700 and 3,200 new words. Peter Hill may not be Shakespeare, but he has cataloged around 3,000 new words in the indigenous Lakota language. Hill, a Philadelphian who married into Lakota fluency, runs a language immersion school at the Pine Ridge Indian Reservation in South Dakota. Over the past six years, Hill and other Lakota speakers have hashed out original phrases to encompass modern English-language concepts such as “smartphone,” “methamphetamines” and “same-sex marriage.”
For Hill, the effort to craft neologisms is key to revitalizing a marginalized language — a tongue the federal government took pains to suppress. Today, the words developed by Hill and other native speakers provide a look into how languages evolve and shape themselves. At Hill’s immersion school, everyone — from teachers to students — tries to speak Lakota 100 percent of the time. Children ages 1 to 5 run through classrooms and play in areas filled with Lakota picture books. Hill opened the school in 2012 via online fundraising with the mission of reviving the Lakota language, which had only about 2,000 speakers left as of 2016, according to the nonprofit Lakota Language Consortium.
Relationships between writers never work out. We know this. They fall apart, dramatically, tragically—Plath and Hughes, for example—but even if they don’t, they’re indelibly marked with misery: jealousy, obsession, resentment. Somebody usually writes a memoir and their partner never forgives them. Somebody wins all the prizes. Of Joan Didion and John Gregory Dunne, a literary couple who really did appear to love each other, the biographer Tracy Daugherty wrote, “Though neither could imagine not being married to a writer, though they counted on each other for editorial and professional support, an edginess grew between them—not competition so much as sadness that things could not always be equal.” Things cannot always be equal. It is not surprising that literary relationships so frequently fail.
Just over 10 pages from the start, in a second beginning, Wash tells us he was a “freeman” by the age of 18, and it is clear that Edugyan is coming at her subject sideways, not with gritty realism but with fabular edges, and as much concerned with the nature of freedom as with slavery, for her white characters and her black ones alike.
This is, in fact, less a book about the effects of slavery and more about the burden, responsibility and guilt of personal freedom in a time of slavery.
Critics, like exterminators and exorcists, are in the business of bringing what is hidden into the light. To make the implicit explicit, as Samuel Johnson had it, to root out elusive associations, half-invisible effects — to identify how a text works, and why.
On good days, that is. “We That Are Young,” a hectic new novel by Preti Taneja, a retelling of King Lear set in present-day India, was published last year in Britain to much acclaim. It’s a doorstop, full of sound and fury, more nihilistic than Shakespeare’s original, with all the blunt and dismal machinations of a soap opera.
Like the best speculative fiction, Severance also aims for more than chills and thrills: without being preachy, Ling Ma's story reflects on the nature of human identity and how much the repetitive tasks we perform come to define who we are. That's why the images of the fevered in this novel are not only terrifying, but poignant: the fevered mother who keeps setting dinner dishes down amidst rotting food; the fevered taxicab driver who'll keep on driving until the gas runs out; and even un-fevered Candace herself, who has such trouble breaking away from the daily round of a job she doesn't even like.
Ever since my mom died, I cry in H Mart. For those of you who don’t know, H Mart is a supermarket chain that specializes in Asian food. The “H” stands for han ah reum, a Korean phrase that roughly translates to “one arm full of groceries.” H Mart is where parachute kids go to get the exact brand of instant noodles that reminds them of home. It’s where Korean families buy rice cakes to make tteokguk, a beef soup that brings in the New Year. It’s the only place where you can find a giant vat of peeled garlic, because it’s the only place that truly understands how much garlic you’ll need for the kind of food your people eat. H Mart is freedom from the single-aisle “ethnic” section in regular grocery stores. They don’t prop Goya beans next to bottles of sriracha here. Instead, you’ll likely find me crying by the banchan refrigerators, remembering the taste of my mom’s soy-sauce eggs and cold radish soup. Or in the freezer section, holding a stack of dumpling skins, thinking of all the hours that Mom and I spent at the kitchen table folding minced pork and chives into the thin dough. Sobbing near the dry goods, asking myself, “Am I even Korean anymore if there’s no one left in my life to call and ask which brand of seaweed we used to buy?”
Whether hair pulling, skin picking or cheek biting, body-focused repetitive behaviors blight many people's lives. How can science help us understand and treat these distressing conditions better?
The color grey is no one’s color. It is the color of cubicles and winter camouflage, of sullage, of inscrutable complexity, of compromise. It is the perfect intermediate, an emissary for both black and white. It lingers, incognito, in this saturated world.
It is the color of soldiers and battleships, despite its dullness. It is the color of the death of trees. The death of all life, when consumed by fire. The color of industry and uniformity. It is both artless and unsettling, heralding both blandness and doom. It brings bad weather, augurs bleakness. It is the color other colors fade to, once drained of themselves. It is the color of old age.
Because I have no style, I defer to grey. I find it easier to dress in greyscale than to think. I buy in bulk, on sale, in black and white and shades between—some dishwater desolate, some pleasing winter mist. I own at least five cardigans in grandpa grey.
Every morning in Tokyo, as the tile roofs of the neighborhood houses come into view, I put the kettle on for Darjeeling tea. When the water reaches a rolling boil, I pour it over the dark, crinkly leaves of the Camellia sinensis var. sinensis tea plant. Like the Japanese paper flowers Proust writes of, the ones that bloom when put in water, a world unfolds as the leaves steep and the musky, floral fragrance rises.
The tea estates, which I first saw as a small girl when my mother brought her American husband and children to her hometown of Darjeeling, lie at 6,700 feet in the Himalayas near the India-Tibet border. The long, even rows of emerald tea bushes undulate with the hills, dirt paths cutting through them like veins. The estate names read like a roster of champion racehorses: Margaret’s Hope, Makaibari, Happy Valley, Rangaroon, Liza Hill. The teas include crisp and ethereal First Flush, harvested in spring; rough-edged Rain Tea, produced during the summer monsoon; and fruity, coppery Autumn Flush.
Bringing water to a boil, waiting for the leaves to brew, pouring the tea into a cup and milk into the tea (only a drop, so the taste isn’t diluted), I’m doing what my Tibetan family has done for over a century. The earthy notes of the amber liquid conjure the wool-and-camphor smell of our Darjeeling house, the odor of butter lamps and incense in the altar room. They make me feel connected to the land itself: 28,000-foot Mount Kanchenjunga, soaring over the town; sacred Observatory Hill, where our family feasted at Losar New Year; the dusky waters of the Teesta River, where my grandparents’ ashes were scattered.
Now two more books have arrived with cases that hover between cautious optimism and measured despair: Cambridge political theorist David Runciman’s How Democracy Ends and conservative pundit Jonah Goldberg’s Suicide of the West. Goldberg’s book has been taken up in the beleaguered ranks of the intellectual right as one of the best explanations the movement has for the rise of Trump. Runciman, on the other hand, is too idiosyncratic a thinker to belong to any tribe except the professoriate. Both authors came of age in the 1980s—Runciman was born in 1967, Goldberg in 1969—and made careers in the long 1990s, that period between the fall of the Berlin Wall in 1989 and the financial crisis of 2008. Dire warnings about democratic crisis belonged to their childhood, and so did radical challenges to the political system. Intellectual maturity required putting away juvenile delusions—until, suddenly, maturity itself seemed like the delusion.
Sight delves into a lot in under 200 pages: Mothers and daughters, birth and death, loss and grief, finding one's balance, the ardor and arduousness of scientific discovery. But it is also a book about the limits of knowledge – learning to accept them, yet continuously pushing to expand their boundaries. Readers willing to give themselves over to Greengrass' penetrating vision will surely expand theirs.
When the hip-hop artist Kendrick Lamar won the 2018 Pulitzer Prize for Music, in April, reactions in the classical-music world ranged from panic to glee. Composers in the classical tradition have effectively monopolized the prize since its inception, in 1943. Not until 1997 did a nominal outsider—the jazz trumpeter and composer Wynton Marsalis—receive a nod. Lamar’s victory, for his moodily propulsive album “damn.,” elicited some reactionary fuming—one irate commenter said that his tracks were “neurologically divergent from music”—as well as enthusiastic assent from younger generations. The thirty-one-year-old composer Michael Gilbertson, who was a finalist this year, told Slate, “I never thought my string quartet and an album by Kendrick Lamar would be in the same category. This is no longer a narrow honor.”
Lamar’s win made me think about the changing nature of “distinguished musical composition,” to use the Pulitzer’s crusty term. Circa 1950, this was understood to mean writing a score for others to perform, whether in the guise of the dissonant hymns of Charles Ives or the spacious Americana of Aaron Copland. But that definition was always suspect: it excluded jazz composers, whose tradition combines notation and improvisation. In 1965, a jury tried to give a Pulitzer to Duke Ellington, but the board refused. Within classical composition, meanwhile, activity on the outer edges had further blurred the job description. By the early fifties, Pierre Schaeffer and Pierre Henry were creating collages that incorporated recordings of train engines and other urban sounds; Karlheinz Stockhausen was assisting in the invention of synthesized sound; John Cage was convening ensembles of radios. By century’s end, a composer could be a performance artist, a sound artist, a laptop conceptualist, or an avant-garde d.j. Du Yun, Kate Soper, and Ashley Fure, the Pulitzer finalists in 2017—I served on the jury—make use, variously, of punk-rock vocals, instrumentally embroidered philosophical lectures, and architectural soundscapes. Such artists may lack the popular currency of Lamar, but they are not cloistered souls.
Yet feminism has not done a good enough job articulating what alternate strategies of reproduction may be. In part this is a problem of thought, in part a problem of genre. From Firestone to Haraway to Laboria Cuboniks (an anti-naturalist, gender-abolitionist collective of “daughters of Haraway”), the manifestos issued by feminists often call for universal access to reproductive technologies, biotechnical interventions, hormones, and “endocrinological knowhow” (including about gender hacking). What necessarily gets lost in these manifestos’ universalizing are the differences in how particular technologies calibrate particular peoples’ experiences of reproduction and care; how they bring to light vast structural inequalities of time, money, kinship, health care, legal protections, and bodily integrity; and how, when these inequalities become palpable enough, the desire to reproduce naturally can undercut a progressive politics of reproduction.
To appreciate all this—and to figure out what to do about it—we need narrative.
The summer after my senior year of high school, I’d been accepted into seven colleges, but I didn’t know if I could go to any of them. My financial aid wasn’t settled, and I was poor. I couldn’t go if I didn’t get some grants or loans (this was in the 1990s, when school lending was more conservative), so while I waited, I got a job as a waitress, working the graveyard shift at the local diner.
It was a discouraging time. I desperately wanted to leave the rural town I grew up in, and college seemed the way out. So, A Tree Grows in Brooklyn was on my mind as I memorized the restaurant menu and counted my tips. I kept thinking about the tin can Francie’s mother “nailed to the floor in the darkest corner of the closet,” where they put “half of any money they got from anywhere.” One crisis after another forces them to empty the can. I knew a similar frustration.
The Only Girl perhaps couldn’t be viewed as a definitive book on the burgeoning rock’n’roll era, or even on Rolling Stone, as it has an eccentric, wilful, albeit charming, tendency to weave, back and forth, through time zones, with Green mulling and remulling (and even re-re-mulling) on people, events and thoughts, seemingly as the mood takes her. Not that it matters – there are already books on Rolling Stone, and on the era, the majority of which are written by men. This one is about a woman navigating the uncharted territory of her crazy expanding new world, not only armed with the requisite “groovy” access-all-areas pass, but also the self-awareness, humour, and resilience that an “only girl” needs.
I tried to be the good immigrant by assimilating as swiftly as I could when I arrived in the United States as a young girl. I tried to be a grateful immigrant by learning to talk, dress, cook, eat, drink, dance, and even think like an American. Following the logic of meritocracy, I believed that my success was earned by merit. And my merit was my virtue. I was entrepreneurial. I fashioned myself to increase my chances of finding success. I wore whiteface. And just when my colleagues and friends simply “forgot” I was Not White — an unexpected tide of anger welled up inside me. Just when I thought I had succeeded in following the rules of my own DIY whiteface manual, I found myself angry and overwhelmed by sadness.
As Stephen Olshansky hiked south through alpine Colorado in the crackling beauty of autumn 2015, he knew he was playing chicken with the arrival of winter. He was almost past the highest peaks along the Continental Divide Trail as fall storms laid down the first sheets of snow—not enough to stop him in his tracks, but plenty to slow him. “I was postholing a lot, shin to knee deep above 9,000 [feet]” in southern Colorado’s San Juan Range, he wrote on his blog.
Olshansky, a veteran thru-hiker who went by the trail name Otter, knew snow. He’d often been a southbounder—a sobo, as hikers say—starting at the northern end of a long-distance trail in the spring, before the snow had completely melted, and covering the entire distance of the trail, rather than picking it off section by section. “He was a master, a top expert,” says Art Rohr, one of Otter’s thru-hiking friends. But what Otter had encountered in previous years was spring snow—compacted enough for him to cruise along on top of it.
Blame it on the religieuse pastry, two stacked, chocolate cream-filled puffs that sent me to patisserie nirvana the first week of my long-ago junior year in Paris. When you’re used to Twinkies, that kind of experience is, indeed, a revelation. After marrying a lemon-tart-loving Frenchman and producing a daughter (vanilla macaron) and son (coffee eclair) who share my passion, I thought I had pretty much covered the gamut of French pastries.
Until this past April, that is, when on a Sunday afternoon stroll with an old friend down the Rue de Rivoli, near the Louvre, I realized my guilty pleasure had emerged from the shadows and, seemingly, been embraced by le tout Paris.
This is still an engagingly unusual saga that stakes out a place for its author as a sharp chronicler of an urban demimonde that few will ever experience, for good or ill.
The Happily Ever Esther Farm Sanctuary, about sixty-five kilometres southwest of Toronto, is home to around sixty-five rescued farm animals, including pigs named Hercules, April, and Len, a goat known as Diablo, and a cow whose moniker, Pouty Face, perfectly matches her cuddly demeanour. These animals have come from a variety of places: some are from petting zoos, some have literally fallen off of trucks, and one was abandoned at the sanctuary’s front gate, presumably because its owners could no longer care for it. Other animals hadn’t been cared for at all—two of the sanctuary’s eight sheep were found, still squirming, on a farm’s so-called dead pile, where cast-off cadavers are heaped. Many of the animals at the sanctuary are factory-farm refugees, raised expressly to be consumed. At Happily Ever Esther, they will instead live out their natural lives in comfort and safety.
The animals, or “residents,” as the owners of the sanctuary prefer to call them, inhabit the twenty-hectare farm: the pigs stay in the main barn plus one fenced-in hectare of roamable forest and one of pasture; the chickens overnight in a sun-dappled enclosure also inhabited by two Muscovy ducks and a couple of garrulous peacocks; a few cows, a horse, and a donkey occupy a four-hectare paddock; and a colony of rabbits lives in a condo-like complex known as Bunny Town. But Happily Ever Esther’s eponymous resident lives in neither barn nor paddock. Rather, Esther, a 650-pound, six-year-old pig, shares a farmhouse with the sanctuary’s proprietors, Steve Jenkins and Derek Walter. She has her own bedroom, just off the entrance to the house, though the room has become a bit dingy—the broadloom is stained, the cupcake-patterned wallpaper peeling—as a renovation, which will open her room to the backyard, is imminent. But Esther usually prefers the sunroom, where she can snooze on a tattered, queen-size mattress. The first floor of the house could, in fact, be called Esther Town—paintings and photographs of the pig cover every wall, and porcine sculptures and other tchotchkes occupy most corners. On a shelf above Esther’s mattress are copies of the books that Jenkins and Walter have written about her, as well as a throw pillow emblazoned with “CHANGE the WORLD.”
I don’t want to downplay the underrepresentation of Asians in pop culture—it is stark and it is depressing. But harping on The Joy Luck Club as giving unreasonable prominence to one woman’s experience and imagination obscures the fact that Asian American men can and do tell their own stories. It’s time for Asian Americans to finally forgive The Joy Luck Club for the sin of being the first and only and instead start to think of it as what it has been all along: a brave and beautiful film in a canon long overdue for more.
Whenever I encountered a story about religious quests, I went through the same arc of raised hopes and crushing disappointment. In Bums, Ray Smith encounters the Buddhist poet and adventurer Japhy Ryder, a close simulacrum of the real-life Zen scholar and poet Gary Snyder. Ray, an alcoholic, semi-homeless wanderer, finds purpose in Ryder’s exhilarating, whirlwind leap through centuries of Zen mystical tradition. They climb the Matterhorn, get drunk, compose poetry, throw wild San Franciscan parties, and eventually part regretfully; Ray Smith still wants to be a lost boy of America, riding the rails, while Ryder is moving to Japan to study his religion in a more serious and authentic way. At seventeen, it was the first time I had read American literature that used Zen as its metaphorical backbone instead of Judeo-Christian principles. Instead of Christ metaphors and prodigal sons, there was the non-duality and emptiness of the Zen poetry and sutras I’d already been reading on the side. Zen is an exhilarating rejection of domesticity and obligation, a renunciant tradition that demands its followers cast aside illusions and all self-regard. Monks and nuns are called “home-leavers,” and the Beats fit that perfectly. In the fifties, on the jittery fringes of a monolithic culture, America was ripe for a spiritual awakening. Zen could serve as the engine for sexual, political, and social liberation, and its exploration by figures like Snyder helped to usher in the free love and radical countercultures of the sixties.
The English novelist Rose Macaulay once sagely described croquet as “a very good game for people who are annoyed with one another, giving many opportunities for venting rancour”. Smack goes the mallet against the ball and off it flies, powered by your politely simmering rage. It’s an underlying sense of this absurdity, perhaps, that makes croquet such an effective device in Thomas Jones’s debut novel. The book begins and ends with the game played by a group of thirtysomethings, uneasy in their friendships and not entirely comfortable in their skins. Jones notices seething excitations beneath the niceties, small acts of sabotage and determinations of desire that ripple through each roquet.
Violent stories, one argument goes, inevitably endorse violence. No matter how much pacifist moralizing a writer brings to a war story, it will always play up the thrill and clamor of battle to some degree. And a murder tale will always stoke our voyeuristic urge to witness violence. That’s not so much because we love violence as because we love story — the rush of bloodshed is wired to the comforts of a familiar narrative arc.
So what if you bent that arc, or even turned it into a pretzel? What effect does a violent novel have then? Those questions are at the core of “The Arid Sky,” the stellar English-language debut of Mexican-born author Emiliano Monge. By leaping forward and backward in time across most of the 20th century while following one man’s violent life in a dusty mesa town, the novel strips away anything that might be construed as heroic. Instead, it evokes a sense of terrible acts constantly repeating in one place, history grimly folding back on itself. It’s a traditional western cut up and turned into an M.C. Escher print.
Sometimes animals end up in cities because they have nowhere else to go. Other times they happily move in, finding readily available food or other advantages over life in the wild. Chicago’s coyotes, for instance, escape year-round hunting and trapping by staying within the city’s borders. “The city actually serves as a huge refuge for them,” says Stan Gehrt, a wildlife ecologist at The Ohio State University who has been studying the canines for almost two decades. “There are a lot of nooks and crannies in the landscape, places that people don’t use, that coyotes are really good at exploiting.”
One of the great mysteries of urban adaptation is what, if anything, living in cities does to animal minds. Research on urban wildlife has already shown that cities can have jaw-dropping effects on animals’ behavior. Gehrt’s coyotes have not only learned where it’s safest to cross roads, but have also learned to avoid traffic based on its speed and volume. Do behavioral shifts like this reflect deeper changes in how urban animals think? In what urban animals are?
You may not think you know classic service standards, but you probably do. Even if you’ve never worked in a restaurant, spend enough time in upscale establishments and you know the deal: Women are served first, going clockwise around the table, then men are served clockwise. That goes for every step of the service, from how the water is poured to the order in which orders are taken to how plates arrive to (and are set down on) the table. The same goes for wine, though the host (the diner who receives the “taste” pour from the bottle) is served last, regardless of gender. That’s according to the Court of Master Sommeliers, founded in 1969, whose training most beverage professionals undergo to learn the social graces of good service. (Incidentally, that’s also the era in which women increasingly pushed back on restaurants’ discriminatory practices — in some American cities, women weren’t allowed to enter restaurants, or specific sections of restaurants, unaccompanied by a man until the 1970s.)
But at Chicago’s Tied House, which opened in February 2018, general manager Meredith Rush says there’s a way to provide thoughtful service without relying on those measures of old-school etiquette: Essentially, the restaurant has omitted the idea of “ladies first.” The staff has eliminated language like “ladies and gentlemen” from its vocabulary, and no longer serves guests in order of gender performance. “We’ll do our service as elegantly as we would if we were adhering to the classic standards,” Rush says.
Pop quiz: What’s a word you use a hundred times a day — that doesn’t show up in the dictionary?
Give up? Mhmm.
You got it! Mhmm is a small word that’s often used unconsciously. But it can actually tell us a lot about language, bias and the transatlantic slave trade.
Toward the end, Franz goes with a Japanese friend to see an exhibition of calligraphy. “What do you think?” the friend asks, as they gaze at the scrolls.
“I don’t understand anything. But I also can’t look away.”
“Oh. Then you understand,” her friend says.
Franz’s book — a love story, a recovery narrative, a knowingly futile attempt to penetrate “a nation that takes great pride in its impenetrability” — is the same kind of thing. It demands attention, and defies understanding.
In fact, the closer you look, the more it seems that the human brain is hardwired to enjoy the mazes and labyrinths that its own structure resembles. In Stanley Kubrick’s 1980 film The Shining, Wendy and Danny explore the hedge maze outside their hotel, wandering in and out of dead ends. Then, as Jack broods over an architectural model of the maze, the scene cuts to a bird’s eye view of mother and son as they arrive at the centre. “I didn’t think it was going to be this big, did you?” asks Wendy. She could just as easily be talking about the amazing history these books dig up and put on display.
The Icelandic literary maverick and Oscar-nominated songwriter Sjón writes with a poet’s ear and a musician’s natural sense of rhythm. This extraordinary performance, consisting of three books in one – the first originally published in Iceland in 1994, the second in 2001, and the third in 2016 – sets out to entertain, but also to prod the reader towards a stark realisation of human mortality and the games fate plays.
Just 33 days after the appearance of Blast, war was declared. This did not initially seem a threat to vorticism and its fellow movements, but rather an opportunity. As Rupert Brooke, a very different kind of sensibility, put it, many artists welcomed the onset of war “as swimmers into cleanness leaping”. A short and sharply brutal conflict was just what art needed finally to euthanise the past and slough off the fusty clutter of landscapes, nudes and the strictures of the academy. A new and modern art lay just on the other side.
Of course, these brave new movements turned out to be just another casualty of the trenches. Dissolving and faceting the human form was all very well on canvas but it didn’t look so clever in the light of the evisceration and vaporising caused by bullets and high explosives. The machine age promised a utopia in abstract but in practice the machine gun rendered humans merely a bloody and deliquescing smear. Blast was an early war victim: the second and final issue was published in 1915.
At some point in my youth, I learned the art of cruising a library. I hungered to satisfy my gnawing curiosities. I sought something that would make my queerness less unsettled and unsettling. Books, for years, stood in for men. I wandered silently through the stacks, a catalog number in hand, scribbled hastily on a sheet of paper, and delighted in reading books like Andrew Holleran’s Dancer from the Dance in the back corner of the library’s second floor. A borrowing card, tucked neatly at the back of the book, stamped with dates from decades ago, made me wonder how many others had wandered these stacks like me, like Ariadne in the Minotaur’s labyrinth. Literature that reflected the queer experience seemed to me a shared resource, and a public one for those who knew how to look for it. What’s more, the books I came upon made my queerness a cultural, historical phenomenon rather than simply a freakish feeling to be hidden away.
But the knowledge of my own sense of difference should have attuned me to the ways in which reading could illuminate the differences of others. Books could act as more than a mirror—weren’t they also a window?
If literature, as William Giraldi writes in “American Audacity,” is “the one religion worth having,” then Giraldi is our most tenacious revivalist preacher, his sermons galvanized by a righteous exhortative energy, a mastery of the sacred texts and — unique in contemporary literary criticism — an enthusiasm for moralizing in defense of high standards. “Do I really expect Americans to sit down with ‘Adam Bede’ or ‘Clarissa’ after all the professional and domestic hurly-burly of their day?” he asks in an essay bemoaning “Fifty Shades of Grey.” “Pardon me, but yes, I do.” The only insincerity there is the request for pardon: Giraldi is defiantly, lavishly unforgiving.
“American Audacity” is the rare example of a collection that coheres into a manifesto. Its essays were published during the last seven years, many in The New Republic and The Daily Beast, on topics as various as the art of hate mail, Herman Melville’s life and the Boston Marathon bombing (Giraldi, the author of two novels and a memoir, teaches at Boston University and is fiction editor of the literary journal AGNI). But every piece possesses the same moral urgency, which is to say that each advances the same critical argument. A clue to Giraldi’s sensibility can be found in the chapter headings. Literature’s actuarial tables dictate that a younger critic will tend to review, on balance, more elders than youngers. Giraldi is 44, still rosy-cheeked in critic years, but all of his subjects are older than he is, or dead.
The Incendiaries is an extraordinary novel in so many ways: the finely hewn beauty of its language, the layering of its themes, the ways that it reveals truth through narrative unreliability, and the remarkable way it makes one uncomfortable with one’s sympathies. I marveled at its efficiency, that it could do all of this in just over 200 pages, and I very much look forward to Kwon’s next book.
The Politics of Parody is a valuable study that shows how these satires were read, and may still be read, and also demonstrates their importance as cultural signifiers.
Alexiou, the author of books on Jane Jacobs and the Flatiron Building, fills her story with colorful sketches of Bowery luminaries, from the Calvinist martinet Stuyvesant, with his disdain for other faiths, to Thomas “Daddy” Rice, the white song-and-dance man whose “Jump Jim Crow” act became a synonym for brutal racism, to the punk diva Debbie Harry, a mainstay of the lamented nightclub CBGB, a Bowery fixture from 1973 to 2006.
They told me I wouldn’t be able to read anymore. That the pleasure of the text, like a lover in a non-law degree, would slowly grow opaque to me — if pleasure were something I even had time to consider. In exchange, I’d learn how to do other things with words: plow through pages of bad legal prose and extract the principle like an animal’s delicate skeleton. Hold up the skull to the dim courtroom light and proclaim its equivalence to the fossils of a different era, a strange phrenology. Memorize the divots in the bones of critters past. Legal education calls this “learning to think like a lawyer.”
After a few weeks of living that story, my body and I revolted at cross-purposes. The stresses of the program congealed into physical illness, which offended me; ordinarily, panic meant productivity. Rather than resting, I hauled myself to a campus book sale I can only recall in feverish splashes — an indiscriminate hunger to grab and possess; the close press of bodies in airless rooms; violent shivers that kept sending my stack of books askew — and somehow came home with a shelf’s width of volumes: Stendhal and Dickens and DeLillo and Mann; Maugham and Poe and Davies and Irving; Gallant and Munro and Atwood and Moore. Mostly men, all of them white, and completely in violation of my network of rules for used-book condition. More striking still was that nothing in the stack seemed to call to me, which was likely strategic. Even fever-drunk — a state in which, apparently, I backslid into canonical reverence — I sensed that it would lessen my feelings of loss if the books I kept around me were not ones I burned to read. Loading up my shelves was more gestural than practical; a finger to the mythos of the law school and a memorial to a version of myself that I refused to let disappear entirely.
Although I myself am uncertain about the extent to which we ourselves are aware of how literature is changing with regard to nature, when you begin to see the ugliness, the ambivalence—the “contamination”—of nature in one place, you begin to see it everywhere.
Shaggy maximalism is the ethic and aesthetic of “Flights.” It is thronged with plots and subplots, flotillas of travelers who barge in to make thunderous confessions and vanish, never to be seen again. You might wish that novels, like elevators and taxis, had a strict maximum carrying capacity; it feels impossible to connect to characters no sooner conjured than whisked away and replaced. Monotony settles in; we read at a remove, which feels cruel given that Tokarczuk’s aim is so clearly to train the eye to see more deeply, more attentively, and to root out hidden consonances and meanings.
The obvious summary would be that Flights is not a novel, but a series of loosely interconnected stories about the human body and movement. But even that seems inadequate; several of the stories are mere stubs, and their connections are loose and liquid. Flights functions more like a cabinet of curiosities, the kind which served as a precursor to natural history museums. Within drawers of different sizes, curious onlookers might find animal and botanical specimens, minerals, and man-made objects.
The sense of some deeply melancholic encounter haunts the pages of Australian writer Shaun Prescott’s winningly glum debut novel, aided by elegiac musings on belonging and estrangement, growth and decay, places and voids, portals and dead-ends.
This “separate form of life” would become known as the archaea, reflecting the impression that these organisms were primitive, primordial, especially old. They were single-celled creatures, simple in structure, with no cell nucleus. Through a microscope, they looked like bacteria, and they had been mistaken for bacteria by all earlier microbiologists. They lived in extreme environments, at least some of them — hot springs, salty lakes, sewage — and some had unusual metabolic habits, such as metabolizing without oxygen and, as the Times account said, producing methane.
But these archaea, these whatevers, were drastically unlike bacteria if you looked at their DNA, which is what (indirectly) Woese had done. They lacked certain bits that characterized all bacteria, and they contained other bits that shouldn’t have been present. They constituted a “third kingdom” of living creatures because they fit within neither of the existing two, the bacterial kingdom (bacteria) and the kingdom of everything else (eukarya), including animals and plants, amoebas and fungi, you and me.
When my mom cooked sambal from scratch, she moved with controlled haste. Her eyebrows would furrow as she used her index finger to mix belacan, a pungent shrimp paste, with water. “Open all the windows!” she would suddenly yell, her warning to my brother, father and me that fiery chiles would be hitting her oiled wok in a few minutes.
Even with windows opened wide, the fumes from sizzling capsaicin provoked coughing fits and heavy breathing.
Making food from scratch often seems to be a luxury. How paradoxical, given that the roots of so much cooking lie in thrift. One-time methods of food preservation – fermenting, pickling, salting, curing – have become rustic trends requiring time, money and space, all for results that, though lovely, are hardly essential to feeding a family. Baking bread at home has gone from being a quotidian chore – quite literally, “daily bread” – to a rare thrill, a novelty even, mostly the preserve of a leisure class to whom the skill can be sold back at £150 a class.
Which is why my library is what I call a “sentimental library.” A sentimental library is characterized by memory and association. It’s the halfway point between alphabetical and aesthetic. And, in my case, each book’s placement corresponds not just to when I read it and how I felt, but to whatever activity takes place beneath it now. They are thus animated in a way they might not be otherwise. Like it or not, I am in constant, real-time conversation with their contents.
“To love truth means to endure the void,” wrote Simone Weil in her notebooks, suggesting perhaps that we should resist the temptation to fill the empty spaces of ourselves with our own stories, and instead welcome grace into the absence. In remaining largely unknown, Faye allows Cusk to explore universality, which is a sort of truth, perhaps a sort of grace. “She has a task and she applies herself to it soberly: the trapping, if only in a mirrored surface, of some fragment of reality that might yield a truth about the whole,” wrote Hilary Mantel in 2009. “I had found out more […] by listening,” Faye tells us in the conclusion to the second book, “than I had ever thought possible.” More about the whole, to be sure, but also more about herself.
For myself, every page was a miracle and for Robert a portal into a world he was clandestinely drawn to and would eventually immortalize through the image. Artists pillage. A piece of writing, a musical phrase, a statue first regarded with pleasure until the moment, when seized, as Proust accounts, by a powerful joy, he casts off all pretenses of adoration and executes a work of his own. His poetry drew me to write, his imagery drew Robert to the camera.
So if you were to succeed in working with some clever AI system – as Kasparov can today, as the biological half of a human-AI chess ‘centaur’ – you couldn’t celebrate that success together. You couldn’t even share the minor satisfactions, excitements and disappointments along the way. You’d be drinking your Champagne alone. You’d have a job – but you’d miss out on job satisfaction.
Moreover, it makes no sense to imagine that future AI might have needs. AI systems don’t need sociality or respect in order to work well. A program either works, or it doesn’t. For needs are intrinsic to, and their satisfaction is necessary for, autonomously existing systems – that is, living organisms. They can’t sensibly be ascribed to artefacts.
I write this in the dead of summer, always a bittersweet season — why is it we got summers off from school for all those years but don’t get summers off from work? — but doubly depressing these days, when I find myself suffering from picnic panic. The hot, languid weather brings with it a series of outdoor family events for which, as a tribal elder, I’m charged with providing provisions. Lately, though, I’ve had my feet cut out from under me. For years — nay, decades — my contributions to the Hingston clan’s Memorial Day and Fourth of July and Labor Day gatherings were no-brainers: I made what my mother once made. She was such a good cook that when she died prematurely, my husband and I typed up and photocopied (quaint, I know) a booklet of her recipes, tried-and-true favorites on which she built her formidable culinary reputation. When the holidays rolled around, I simply re-created one of her delicious dishes and toted it along.
Along about a decade ago, though, I began to notice I was toting home as much of my offerings as I’d concocted. My contributions were being overlooked — or shunned. Why should this be? Mom’s extraordinary potato salad — fragrant with dill, spiced by celery seed — went untouched on the picnic table. So did her macaroni salad, and her chicken salad, and her deviled eggs. … When I carted home a good three pounds of painstakingly prepared Waldorf salad — all that peeling and coring and slicing! — I was forced to face facts: The family’s tastes had changed. Or, rather, our family had changed. Oldsters were dying off, and the young ’uns taking our places in the paper-plate line were different somehow.
One day last November, I dropped my dad’s fountain pen on the floor. Actually it’s been my fountain pen since my dad died half a century ago, but I still think of it as my dad’s pen. Right away I could see that the nib had gone a bit wonky. No good could come of messing with a pen I loved and that was at least seventy-five years old. So the next morning I wrapped it up like a baby and took the bus to my favourite notebook-and-pen store and asked about fixing an ancient fountain pen. It was a busy morning, but a young woman at the counter, who perhaps recognized me as a profligate shopper in the store, went off to fetch Rose, the one who knew about repairs, while I lifted my dad’s pen from its swaddling clothes. When Rose came over, she was smiling but already shaking her head: “I’m sorry, I’m not really doing repairs any more, so . . . oh my gosh, is that,” she said, “that’s a Parker 51!” She drew it from its nest with reverence, noted the wonky nib, thought for a moment and said, “I’ll take it out back and see what I can do.” On her way she showed the pen to another colleague, who ooh’ed and aah’ed and touched the brushed-silver cap—“Sterling! And in such good shape.” By the time Rose emerged from out back, word had got around and a couple of customers were waiting to get a look at my dad’s pen.
Rose had coaxed the nib a bit closer to where it belonged. She said she could nudge it a little more, but it might snap. Should we take the chance? On a bit of test paper I wrote “Dad’s pen with wonky nib” and drew some curlicues, which worked well enough that I decided she should stop there. Off she went to do a bit of cleanup on the pen, but not before showing it to one more worker—a young man, who had never before seen a Parker 51 “in person,” and who I’m pretty sure had tears in his eyes.
Wilson tells this story with meticulous attention to detail and an almost omniscient command of her sources. In doing so she offers a number of small but necessary corrections to the sometimes self-serving inaccuracies of Graves’s own account of the same period, and persuasively argues that Graves’s father made a more significant contribution to his son’s poetic success than Graves was prepared to allow. The real strength of this biography, however, lies in the care and vigour with which it animates the conflicting strands of Graves’s personality. To encounter him in these pages is to feel something of the relentlessly explosive energy with which he lived the first half of his life. Wilson lands him like a Zeppelin bomb.
Quammen offers a readable and largely reliable Baedeker to a fast-moving and complex field of science that is as tangled as the tree of his title. He ultimately concludes that Darwin was not wrong, but that his tree of life was too simplistic. Yet, though Quammen shapes a truly fascinating tale, it’s clear that this story is not yet finished.
A beguilingly light tone masks but never mars Thomson’s impressive scholarship.
Qin Shihuang, First Emperor of China, survived assassination attempts, constantly feared conspiracies, and insisted on secrecy in his movements to the extent of building walls and corridors to disguise them from public view—and to render them invisible to malign spirits. After years of military conquests and bloody massacres he had good reason to fear revenge from victims whose spirits would also continue to live after death and might lie in wait for him. His vision of a lasting dynasty was founded on personal immortality, so death was unthinkable; as a scholar of Chinese religious practices expressed it, writing of the emperor’s Han successors, “Holiness essentially meant the art of not dying.” In fact we know from the biography by Sima Qian that Qin Shihuang hated even hearing conversations about death, to the point that his officials were afraid of mentioning the very word. This obsession was something of a family tradition, for traces of it appear in all the chronicles and histories from the time of King Huiwen onward. Indeed from around 400 BC, a couple of generations before Huiwen, it was believed that some men had managed to liberate themselves from death and had achieved perpetual life. Such beliefs were obviously attractive to kings, and later an emperor, who wished to prolong their reigns.
We’d just emerged from a long and rather liquid dinner on a barge along the Taedong River, in the heart of North Korea’s showpiece capital of Pyongyang. Two waitresses had finished joining our English tour guide, Nick, in some more than boisterous karaoke numbers. Now, in the bus back to the hotel, one young local guide broke into a heartfelt rendition of “Danny Boy.” His charming and elegant colleague, Miss Peng — North Korea is no neophyte when it comes to trying to impress visitors — was talking about the pressures she faced as an unmarried woman of 26, white Chanel clip glinting in her hair. Another of our minders — there were four or five for the 14 of us, with a camera trained on our every move for what we were assured was a “souvenir video” — kept saying, “You think I’m a government spy. Don’t you?”
But I was back in North Korea because nowhere I’d seen raised such searching questions about what being human truly involves. Nowhere so unsettled my easy assumptions about what “reality” really is. The people around me clearly wept and bled and raged as I did; but what did it do to your human instincts to be told that you could be sentenced, perhaps to death, if you displayed a picture of your mother — or your granddaughter — in your home, instead of a photo of the Father of the Nation? Did being human really include not being permitted to leave your hometown, and not being allowed to say what you think?
I teach in a scruffy old building at a small state university in the middle of nowhere. Intro to Poetry takes place in B143, deep in the back end of the basement. The walls are cheaply paneled. No windows. The carpet is thin and stained. A tangle of broken chairs lies in a heap by the door. Fluorescent lights buzz. It’s the first day of class. Ten or twelve students sit silently texting. The clock says 2:20. It’s 12:25.
I hate to break it to parents who just sent their college-admission-minded progeny to the Tibetan Plateau to churn yak butter, but the smartest summer I ever spent was in secretarial school. This was back when I was 17, and it wasn’t grist for an essay about a transformative communion with people outside my clique. I wasn’t ripping the blinders from my eyes. I was typing — hour upon hour, day after day, with my shoulders back and my spine straight and my hands just so.
Once upon a time, a television show was delivered through the cathode-ray tube to American living rooms in orderly half-hour or hour-long bursts. Seasons began like clockwork in the fall and closed up shop in spring; in between was the deathly dull valley of reruns. Predictability ruled the production side as well. Writers spent long hours in airless rooms drumming up jokes or churning out cliff-hangers, but they knew exactly how long they would be there and what they were delivering. In our current age of peak TV, there are no certainties or standard formats anymore. A series can consist of 4 episodes or 24; it might broadcast weekly or stream online all at once in a giant, binge-ready block. Networks launch shows at any time of year. As the series in production swell in number, taking on a dizzying array of shapes and sizes, and as cable and streaming channels compete with the major networks for viewers’ attention, there’s also a new fluidity to the way shows are created. Traditional television practices—such as producing elaborate, pricey pilot episodes as the basis on which network executives decide what shows to put on the air—are being reconsidered.
“It’s the Wild West,” says screenwriter Evan Dickson, who’s developing a TV series alongside horror director David Bruckner. “I feel like the landscape changes from month to month.” The partners are currently running a three-week-long mini–writers’ room funded by a studio to “pressure test” eight episodes’ worth of story ideas and “see how they hold water.”
For the last couple of years I've been part of a group of researchers who are interested in where logic comes from. While formal, boolean logic is a human discovery*, all human languages appear to have methods for making logical statements. We can negate a statement ("No, I didn't eat your dessert while you were away"), quantify ("I ate all of the cookies"), and express conditionals ("if you finish early, you can join me outside.").** While boolean logic doesn't offer a good description of these connectives, natural language still has some logical properties. How does this come about? Because I study word learning, I like to think about logic and logical language as a word learning problem. What is the initial meaning that "no" gets mapped to? What about "and", "or", or "if"?
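The post stops at the question, but the word-learning framing invites a concrete picture. Below is a minimal sketch (my own illustration, not the researchers' model): treat each candidate meaning for a word like "or" as a truth function, and let observed uses of the word eliminate the candidates that don't fit. The candidate set and the observations here are hypothetical.

```python
# A toy hypothesis-elimination view of learning the word "or".
# Candidate meanings are truth functions over two propositions.
CANDIDATES = {
    "inclusive-or": lambda p, q: p or q,
    "exclusive-or": lambda p, q: p != q,
    "and":          lambda p, q: p and q,
}

# Hypothetical observations: (p holds, q holds, was the "or"-sentence judged true?)
observations = [
    (True,  False, True),   # accepted when only p holds
    (False, True,  True),   # accepted when only q holds
    (False, False, False),  # rejected when neither holds
]

def consistent(meaning, data):
    """A candidate survives only if it matches every observed judgment."""
    return all(meaning(p, q) == judged for p, q, judged in data)

surviving = {name for name, fn in CANDIDATES.items()
             if consistent(fn, observations)}
print(surviving)  # {'inclusive-or', 'exclusive-or'}
```

Note what the learner is left with: "and" is ruled out, but inclusive and exclusive "or" both survive, since only a situation in which p and q are both true would tell them apart. That kind of underdetermination is part of what makes the mapping problem for "no", "and", "or" and "if" interesting.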
Late on my fourth day hiking the 102-mile Arctic Circle Trail in western Greenland, I encountered smoke rising from the ground. White tendrils, sometimes columns, rose in all directions from charred soil and wisped out from an 800-foot-tall hummocky, granitic hillside to my left. To my right was the 14-mile-long, string-bean-shaped Lake Amitsorsuaq, the biggest of the dozens of lakes we had hiked past since starting the trail. The smoldering ground extended to the lake’s shore and made the supersaturated blues of the water pop even more. While it was plausible that we had wandered into an area dense with steaming thermal features, we hadn’t. My boyfriend Derek, our friend Larry and I had finally reached one of the wildfires that made international news when they started a few weeks before our 2017 trip.
Translation is a curious craft. You must capture the voice of an author writing in one language and bear it into another, yet leave faint trace that the transfer ever took place. (The translator extraordinaire Charlotte Mandell calls this transformation “Something Else but Still the Same.”) Though spared the anguish of writer’s block, the translator nonetheless has to confront the white page and fill it. The fear: being so immersed in the source text, adhering so closely to the source language, that the resulting prose is affected and awkward—or worse, unreadable. Yet immersion is inevitable. In fact, it’s required.
Like the ghostwriter, the translator must slip on a second skin. Sometimes this transition is gentle, unobtrusive, without violence. But sometimes the settling in is abrupt, loud, and even disagreeable. For me, “plunge deep” tactics that go beyond the mechanics of translation help: coaxing out references to reconstruct the author’s cultural touchstones (books, film, music); reading passages aloud, first in the original and then in translation, until hoarseness sets in; animating the author’s story through my senses, using my nose, my ears, my eyes, and my fingers; devouring every clue to imprint the range of the author’s voice (humor, anger, grief, detachment) on my translation.
“It was possible that femininity, as I had been taught it, had come to an end,” Levy writes, tired of serene femininity and of corporate femininity. “There were not that many women I knew who wanted to put the phantom of femininity together again ... it is a role (sacrifice, endurance, cheerful suffering) that has made some women go mad.”
The task is both to create a new life and to redefine what being a woman means. Albertine returns to singing and buys a new haphazard home for herself and her daughter. Levy discards the marital home and installs her daughters in a flat, where she mends the plumbing in her nightie and transports her groceries on a liberating electric bike. One female friend teaches her to “live with colour” and another provides a writing shed.
At long last, someone has gotten it right. In “Chesapeake Requiem,” author Earl Swift masterfully reveals Tangier as it is — a proud but struggling community of fewer than 500 people trying to hold on to what they can amid unending hardship and isolation.
“I realized this morning,” said my friend Leah, “that this is who I am, here in Tokyo. I am a person who waits.” We were, at that moment, 23rd and 24th in line at Fuunji Ramen, surrounded front and back by locals and tourists, part of a neat queue that snaked out the restaurant’s entrance to the curb, where it broke for the tarmac only to pick up again in the grassy park across the street. Every few minutes, the noren curtain hanging in front of the door would twitch, discharging bodies into the Tokyo dusk, and we would steadily shuffle forward. To pass the time in this line, Leah was telling me about another: her wait the previous morning at Sushi Dai, the legendary morning omakase restaurant and sushi bar at the Tsukiji fish market, where even showing up at 3 a.m. may not be enough lead time to guarantee a first-round seat when the restaurant opens for breakfast at 5:00.
A dreamy otherworldliness haunts these pages, and will, I wager, haunt you, as it did me, long after you finish this slim and masterful mood piece. I dare you to make it to the final, piercing line—which I won’t spoil here—and not feel as if the world you live in has been irrevocably changed.
In the fiction of Schreber’s madness, every person is, as he puts it, a “plaything of the Lower God.” In the reality that Schreber lived, the mentally ill were playthings of the “well,” children were playthings of adults, and minorities were playthings of the state. It is this economy of cruelty — not repressed homosexuality, as Freud suggested in an essay on Schreber’s memoir — that is the seed of Schreber’s suffering. Pheby illustrates this point with compassion and subtlety in “Playthings”; the book’s hybrid position between the historical and the fictional makes it all the more potent.
The last 50 pages of the book read like a hasty after-action report, and Andrei should be pretty miffed with his author for imposing on him a denouement, and diminution, not only rushed but, in part, difficult to believe. Yet even here the novel manages to offer hard-won insight into an impossible place. I don’t know if “A Terrible Country” is good fiction, but you won’t read a more observant book about the country that has now been America’s bedeviling foil for almost a century.
But if you ignore the fluff, here’s a clear and frequently interesting survey of Aristotle’s thought on everything from virtue, work and friendship to the natural world, God and the good death, together with biographical snippets and personal reflections, from an author who has clearly read Aristotle well and thoughtfully and many subsequent philosophers to boot. So, as Aristotle would certainly have asked had he been writing this review, why should I be a kuminopristes about it?
I had no hearing aids until I came to America. The Odessa I know is a silent city, where the language is invisibly linked to my father’s lips moving as I watch his mouth repeat stories again and again. He turns away. The story stops. He looks at me again, but the story has already moved on.
Decades later, when I come back to this city, I don’t feel I have quite returned until I turn my hearing aids off.
Click — and people’s lips move again, but no sound.
No footsteps of grandmothers running after their grandchildren. No announcements by tram-conductors as the tram stops at a station and, finally, I jump off.
A cab whooshes by me and abruptly parks at the curb. I do not hear the screech of its brakes.
This is the Odessa of my childhood: my father’s lips open, in Proviantskaya Street. I see a story. He bends to pick up a coin. The story stops. Then, as he straightens up and smiles at me, it is a story again.
It holds that everything in nature is made up of tiny, immutable parts. What we perceive as change and flux are just cogs turning in the machine of the Universe – a huge but ultimately comprehensible mechanism that is governed by universal laws and composed of smaller units. Trying to identify these units has been the focus of science and technology for centuries. Lab experiments pick out the constituents of systems and processes; factories assemble goods from parts composed of even smaller parts; and the Standard Model tells us about the fundamental entities of modern physics.
But when phenomena don’t conform to this compositional model, we find them hard to understand. Take something as simple as a smiling baby: it is difficult, perhaps impossible, to explain a baby’s beaming smile by looking at the behaviour of the constituent atoms of the child in question, let alone in terms of its subatomic particles such as gluons, neutrinos and electrons. It would be better to resort to developmental psychology, or even a narrative account (‘The father smiled at the baby, and the baby smiled back’). Perhaps a kind of fundamental transformation has occurred, producing some new feature or object that can’t be reduced to its parts.
Nominally a book that covers the rough century between the invention of the telegraph in the 1840s and that of computing in the 1950s, The Chinese Typewriter is secretly a history of translation and empire, written language and modernity, misguided struggle and brutal intellectual defeat. The Chinese typewriter is ‘one of the most important and illustrative domains of Chinese techno-linguistic innovation in the 19th and 20th centuries … one of the most significant and misunderstood inventions in the history of modern information technology’, and ‘a historical lens of remarkable clarity through which to examine the social construction of technology, the technological construction of the social, and the fraught relationship between Chinese writing and global modernity’. It was where empires met.
Don’t Stop the Presses! tells the story of a mandate. Morrison explains how newspapers are interwoven into the fabric of American culture, society, and politics. Freedom of the press is, of course, consecrated in the First Amendment of the US Constitution. Our founding fathers were also publishers: Benjamin Franklin, Alexander Hamilton. Small frontier towns set up newspapers as quickly as they incorporated, bringing in and operating prized presses in ingenious ways. “Any new settlement of a few hundred people didn’t consider itself a real town until someone had lugged a printing press over mountains or across deserts to print a newspaper,” she notes. Cheyenne, Wyoming, had four papers and 1,000 people in 1867. In Quilcene, Washington, a water wheel powered the Megaphone.
Ever since the Renaissance, the sciences have dealt human beings a steady stream of humiliations. The Copernican revolution dismantled the idea that humanity stood at the center of the universe. A cascade of discoveries from the late-18th to the early-20th centuries showed that humanity was a lot less significant than some of us had imagined. The revelation of the geological timescale stacked millions and billions of years atop our little cultural narratives, crumbling all of human history to dust. The revelation that we enjoy an evolutionary kinship to fish, bugs, and filth eroded the in-God’s-image stuff. The disclosure of the size of the galaxy—and our position on a randomly located infinitesimal dot in it—was another hit to human specialness. Then came relativity and quantum mechanics, and the realization that the way we see and hear the world bears no relation to the bizarre swarming of its intrinsic nature.
Literature began to taste and probe these discoveries. By the 19th century, some writers had already hit upon the theme—meaninglessness—that would come to dominate the 20th century in a thousand scintillating variations, from H.P. Lovecraft’s Cthulhu stories to Samuel Beckett’s plays. But by the turn of the new millennium, it had become clear that this sense of meaninglessness was no longer up-to-date.
And then this year, winter never came. I watered the trees in our yard in early February. On April Fool’s Day, I hiked to 11,000 feet without snowshoes. A friend and her husband, who were planning a spring trip to Montana, said they wanted to scope it out as a place to live. “We can’t have all our money tied up in property in a place that’s going to run out of water!” she told me.
I began to worry, too, that after a long and frequently distant romance, I’d married us to a town without reckoning with the particulars of its future. How likely is this place to become barren? How soon? Will we have the tools to endure it? We’d eloped.
Now, in this rapaciously dry year, a quiet question grew louder: What are we doing here? I felt a sudden need to understand what Colin and I stood to lose as the heat intensified and the world dried out. And I wondered if we should leave.
Devotees of fanfiction will sometimes tell you that it’s one of the oldest writing forms in the world. Seen with this generous eye, the art of writing stories using other people’s creations hails from long before our awareness of Twilight-fanfic-turned-BDSM romance Fifty Shades of Grey: perhaps Virgil, when he picked up where Homer left off with the story of Aeneas, or Shakespeare’s retelling of Arthur Brooke’s 1562 The Tragical History of Romeus and Juliet. What most of us would recognise as fanfiction began in the 1960s, when Star Trek fans started creating zines about Spock and Captain Kirk’s adventures. Thirty years later, the internet arrived, which made sharing stories set in other people’s worlds – be they Harry Potter, Spider-Man, or anything and everything in between – easier. Fanfiction has always been out there, if you knew where to look. Now, it’s almost impossible to miss.
To solve these little mysteries, Glazer recently assembled a team of sleuths from across the branches: Chatham Square, in Chinatown; the Jefferson Market, in Greenwich Village; the Andrew Heiskell Braille and Talking Book Library, near the Flatiron Building; and the Mulberry Street branch, in Nolita. At lunchtime on a recent Wednesday, they were gathered in that computer lab in the library’s offices—across the street from the soaring, spectacular Stephen A. Schwarzman Building (the Main Branch)—to nibble on homemade lemon rosemary cookies and apple, carrot, zucchini bread while they clattered away on their keyboards. Other members of the team participated remotely. The “Title Quest” hackathon was underway.
Throughout this outstanding collection, there is the sense of an elsewhere, at once tantalisingly close and unreachable.
Quammen doesn’t just give us stories of solitary toil and triumph. Every discovery is couched in a life with its particular constraints and spurs — not least the power (or catastrophe) of personality. For all Woese’s brilliance, it can be argued that he stood in the way of his own success; he was a disinterested lecturer, a collector of petty grievances. We see how women scientists in the field were mocked and marginalized — even those with some standing, like Lynn Margulis, whose groundbreaking theory revealed how eukaryotic cells (which include most cells in the human body) developed symbiotically with bacteria. She was blunt about the professional and personal burdens on women: “It’s not humanly possible to be a good wife, a good mother and a first-class scientist.” She eventually opted for the latter two.
When rivers flood now in the United States, the first towns to get hit are the unprotected ones right by the river. The last to go, if they flood at all, are the privileged few behind strong levees. While levees are mostly associated with large, low-lying cities such as New Orleans, a majority of the nation’s Corps-managed levees protect much smaller communities: rural farm towns and suburbs such as Valley Park.
But why Valley Park? It wasn’t the biggest city or largest employer along the Meramec. Its neighboring towns all had homes and industry in harm’s way, too. But after almost a century of planning to protect all these communities, the federal government built a single 3-mile levee, shielding the low-lying area of just one town.
When people die after suffering from schizophrenia, bipolar disorder, depression, opioid abuse or some other mental disorder, Lipska’s team works with local medical examiners to collect their brains. There is a sense of reverence when one comes in. Each brain is a clue in an effort to understand mental illness, which is the subject Lipska has spent her life studying — including, in a roundabout and unexpected way, when her own mind went dramatically wrong three years ago.
It was January 2015 when Lipska reached out to turn on her work computer and something peculiar happened: Her right hand disappeared into a kind of black hole. When she moved her hand to the left, it reappeared within her field of vision. She immediately feared something might be awry in her brain.
The environmental effects of plastic buildup and the declining popularity of plastics have helped to spur chemists on a quest to make new materials with two conflicting requirements: They must be durable, but degradable on command. In short, scientists are in search of polymers or plastics with a built-in self-destruct mechanism.
As cultures of consumption change and people become more environmentally conscious, homes must change to reflect this. Designing homes around “entertaining” that happens only a handful of times a year is a wasteful yet still mind-bogglingly popular practice. When people come to visit, they are there to see you, not your open concept.
It may not be as glamorous, but the closed kitchen is actually more efficient for cooking than the sprawling, open “chef’s kitchens” that are so popular. It enables whoever does the cooking to take fewer steps to perform tasks. The chef’s kitchen reverses the efficient logic of the 1920s: Instead of moving the sink closer to the stove, builders install a pot filler or a second sink in a center island. Instead of closing in the main kitchen to isolate the disorder of food preparation, developers are building “mess kitchens” for this purpose.
People are always saying, “I have an idea for a story.” But if a story starts in an idea it might as well give up and be a novel. I think ransacking your mind for story ideas builds up an immunity to the mysterious form itself. At some point you have to bow to the story’s elusiveness and refusal of paraphrase, that is, of expression as an idea. As Lucia Berlin said, “Thank God I don’t write with my brain.”
For a long time, I thought that the job of a writer was akin to that of an ethnographer. I needed to collect the best stories and write them down, with a few technical twists. Besides, I had so many at my disposal, with wild plots and characters, set in unique landscapes. It would be a shame not to put them to use, to write anything other than the riches I’d been given wholesale. And yet, there was also the nagging suspicion that these stories didn’t belong to me—they were too big, too loud—and that I didn’t belong to that lineage of charismatic storytellers.
Yet there is genuine appeal in watching this indomitable woman continue to chase the next draft of herself. After a while, the pages turn themselves. Tomalin has a biographer’s gift for carefully husbanding her resources, for consistently playing out just enough string. When she needs to, she pulls that string tight.
Meet Me at the Museum is a touching, hopeful story about figuring out what matters and mustering the courage to make necessary changes. At one point, Anders writes encouragingly to his despairing penpal, "Please do not be angry with the circumstances of your life ... nothing is so fixed it cannot be altered." Both the substance and very existence of this impressive late-life debut bring to mind a nugget of advice imparted to a friend by his wise therapist: "Life's open-ended if you can get there."
Early in the morning of 16th October 2012, seven valuable artworks were stolen from the Kunsthal in Rotterdam. The theft was world news. But what first seemed like a sophisticated burglary by professionals turned out to be the work of a few small-time Romanian criminals who had no idea what they were getting themselves into. They knew about house burglaries, not art, and they certainly didn’t know about selling art.
This is the story of the Kunsthal robbery, based on the case files and conversations with those involved.
My mother Nadia and I were going back to a place we’d never been. Perhaps it was sorrow or the fact that we had purchased last-minute plane tickets and had to consult these maps in haste, but the eyes played tricks. Street names had changed after independence. Our family had left Tunisia under occupation and had become accustomed to that state. Liberation confused me. In reality, the map movements I observed had taken decades and cost many lives. Prisons rang with gunshots and our anthem. But thumbing through paper and digital maps—old maps, re-creations from memory, Google Maps—the avenues appeared to crack and shift in seconds.
I’m not certain there had been municipal maps of the same nature before. Maps are Arab, but I am not sure that’s true of order, in the constrictive Western sense, down to the minutiae of city grids, precisely marked streets and plots of land. Maybe these were a prison we’d come to accept. It’s hard to know what came before, since before only exists theoretically. The past had been thoroughly decimated. When my grandmother Daida died, she took with her an entire civilization, containing time immemorial, and all I could do now was to match maps to know where we were, and that we were, once.
I’ve always been wary of oversharing with my husband. I often tell my friends my news before I tell him. I base this on something I read a long time ago that stuck in me, something about keeping boundaries with the people you love so as not to overrun their love with too much knowing, to keep the relationship exciting—some Esther Perel shit that I can’t actually source in her book. (I have that book at my bedside, but let’s be real; I never actually read it. My husband did and I asked him to summarize.) Anyway, I tweaked it in some weird way in my mind to mean never tell your partner about your life.
When my doctor tells me about the endometriosis, I don’t tell my husband for a few days. A weekend passes and at the end of it I tell him on the couch on Sunday night, and he leaps up—“WHY DIDN’T YOU TELL ME?”—alarmed and gripping my hands. I stand awkwardly with him and say, “I’m telling you now.” But it’s the wrong time, I see, and I’ve been somehow disloyal by not doing it in time.
“Who else knows?” he asks; then, shaking his head before I can answer: “You never tell me anything.”
Japan’s sense of the supernatural seemed to be shifting: from the living working to pacify the spirits of the dead, to the dead being called upon to pacify the spirits of the living, rescuing them from the uncertainties – and misplaced certainties – of modern life, and recalling them to older, more natural and fulfilling ways of perceiving and living in the world.
Fast-forward a century, and in the shadow of the disasters of 2011 – earthquake and tsunami, followed by a nuclear meltdown – the ghosts of Japan seemed once again to be up to something new.
As one of the greatest challenges facing the planet, climate change deserves serious treatment by a great writer who combines deep reporting with a compelling literary style — someone who can explain the overwhelming scientific evidence of warming and its human causes, and of the need for action.
William T. Vollmann would seem to be just the writer for that challenging project. Superhumanly prolific and willing to take on the toughest topics, he packs research and voice into his impassioned works. “Rising Up and Rising Down,” his exploration of violence, spans seven volumes. He is also a celebrated novelist, winning the National Book Award in 2005 for “Europe Central.”
So is this the book on climate change we’ve all been waiting for?
For anyone who has ever kept a diary, Sarah Manguso’s Ongoingness (first published in the US in 2015) will give pause for thought. The American writer kept a diary over 25 years and it was 800,000 words long. She elects not to publish a word of it in Ongoingness. It turns out she does not wish to look back at what she wrote. This absorbing book – brief as a breath – examines the need to record. It seems, even if she never spells it out, that writing the diary was a compulsive rebuffing of mortality. Like all diarists, she was trying to commandeer time. A diary gives the writer the illusion of stopping time in its tracks. And time – making her peace with its ongoingness – is Manguso’s obsession. Her book hints at diary-keeping as neurosis, a hoarding that is almost a syndrome, a malfunction, a grief at having no way to halt loss.
But at least the archeologists are happy. “It’s a bit like kids in a candy shop,” Robert Bewley, an aerial archeologist at the University of Oxford, told me, a few days ago. The freak conditions have made this summer one of the best in living memory for what archeologists call “parch marks”—ghostly, pale outlines of vanished castles, settlements, and burial sites that materialize on the land when it dries out and grass and crops die off. In recent weeks, archeologists in light aircraft, hobbyists with drones, and even people walking through their local parks have discovered Iron Age farms in South Wales, a Roman road passing near Basingstoke, burial mounds in Ireland, and the outline of Second World War bomb shelters on the lawns of Cambridge. Seen from above, the parch marks have a magical quality, as if a giant had doodled them from memory, but they are also disconcertingly real. They are only there because something else was.
Everyone wants to ask Dick Cavett the same question, and it is a question that he never wants to answer: Of all today’s talk-show hosts, who is the “next Dick Cavett”?
“Well, that’s an awkward subject matter for me, because I know all of them,” Mr. Cavett, 81, said on a recent sunny Thursday afternoon at his sprawling country house in Connecticut. “I’m not addicted to talk shows. God knows, I’ve spent enough time on them.”
As in Mr. Cavett’s 1960s and ’70s heyday, the country is in a period of turbulence, with racial tensions flaring, protests in the streets, and a fundamental ideological fissure. The hosts who have emphasized substance, who have “gone political,” have been praised and nominated for Emmys.
But “the next Cavett”? Is such a thing possible?
If only.
In the early 20th century, Americans were hungry for a quick bite. Yet long hours and late nights made going home to eat difficult. In that gap, entrepreneurs saw an opportunity. It might come as a surprise that all aspiring restaurateurs had to do to fill this demand was order a pre-made diner, modular and modern, often looking rather like a train car. It would even likely arrive by train.
The ballpoint pen’s creation story reminds you that it was once considered an exciting piece of technology.
Unlike the smartphone, which still shows glimmers of innovation despite being everywhere, the ballpoint pen has fallen into the tableau of everyday life, never really standing out unless you spend time thinking about it.
But as an analog device, it solved a lot of problems, and it took a few tries to get right. The first man to patent the object, John J. Loud, did so about 60 years before the average person ever saw one, in part because the design was imperfect and needed more work before it was ready for prime time.
Do you remember when you learned to read? Like most of us, I don’t. Still, many people can take comfort in knowing that this event, beyond memory, involved our parents. The parents who took us to school, who read books to us at home, who could speak to us in a shared language. But in my case, one of the things I lost as a refugee, without even knowing it at the time, was a childhood where my parents would have read to me.
I never thought I could hold a baby for an hour — my head a few inches from hers, hanging on every sigh, waiting intently for the next scrunch of her lips or arch of her barely visible eyebrows — perfectly happy, an idiot entranced by a magic trick. But there I was on my granddaughter Avery’s first day of life, so happy I didn’t recognize myself.
I have raised children. Five of them. I have held my own babies in their first minutes of life; I have felt that shock of recognition — this is a version of me. I have kvelled (a Yiddish word meaning a giddy mixture of pride and joy) at the things my babies did that all babies do. But I have never felt this thing that stopped my brain, that put all plans on hold, that rendered me dumb.
O.K., I’ve had glimpses of this thing. But this was my first uninterrupted hour of it.
These Penguin Minis from Penguin Young Readers are not only smaller than you’re used to, they’re also horizontal. You read these little books by flipping the pages up rather than turning them across. It’s meant to be a one-handed maneuver, like swiping a screen.
The Third Hotel, Laura van den Berg’s gorgeously eerie second novel, begins with a question, one that protagonist Clare returns to again and again: “What [i]s she doing in Havana?” Dense and uncompromisingly intelligent, The Third Hotel is uninterested in leading the reader to a simple answer. Buoyed by van den Berg’s sinuous, marvelous sentences, the novel is instead a deep dive into memory, love, and loss as filtered through film theory, metaphysics, and the humid, sunstroked cityscape of Havana. A lesser writer might have lost themself in this byzantine world of maybe-doppelgangers and maybe-zombies and maybe-madness, but Laura van den Berg is one of our most accomplished storytellers—it is no surprise that she has elevated the uncannily horrifying into something achingly human.
This kind of sensitivity to language is unusual in a book intended for a popular audience. Whether they are drawn from legendary ancient historians or unsung modern eyewitnesses, moments like this one are what put Kneale one step ahead of most other Roman chroniclers.
Purple is a paradox, a contradiction of a colour. Associated since antiquity with regality, luxuriance, and the loftiness of intellectual and spiritual ideals, purple was, for many millennia, chiefly distilled from a dehydrated mucous gland of molluscs that lies just behind the rectum: the bottom of the bottom-feeders. That insalubrious process, undertaken since at least the 16th Century BC (and perhaps first in Phoenicia, a name that means, literally, ‘purple land’), was notoriously malodorous and required an impervious sniffer and a strong stomach. Though purple may have symbolised a higher order, it reeked of a lower ordure.
It took tens of thousands of desiccated hypobranchial glands, wrenched from the calcified coils of spiny murex sea snails before being dried and boiled, to colour even a single small swatch of fabric, whose fibres, long after staining, retained the stench of the invertebrate’s marine excretions. Unlike other textile colours, whose lustre faded rapidly, Tyrian purple (so-called after the Phoenician city that honed its harvesting) only intensified with weathering and wear – a miraculous quality that commanded an exorbitant price, exceeding the pigment’s weight in precious metals.
What is freedom? We who came of age half a century ago found ourselves well placed by unearned good fortune to test its limits. Our parents, having suffered the privations of the Great Depression and the anxieties of World War II, had subsequently harnessed themselves to the task of rebuilding. From their discipline emerged a world of prosperous plenty sicklied o’er with the pale cast of gray-flannel conformity and lonely crowds. We wanted more. Throughout 1968, our inchoate desire bubbled over into the public sphere.
The nature of that desire—perhaps I should call it a yearning, because it was vaguer than desire, limitless and without object—was vividly evoked by a surprising witness to what happened in Paris that spring: Yves de Gaulle, the grandson of Charles de Gaulle, who was president of France at the time of the May ‘68 student uprising turned general strike, and whose grip on power was loosened by what the French to this day simply refer to as “the events.”
In many ways, we still think like Aristotle. Most everyone strives for happiness. Today, the reigning dogma is that purposelessness and disorder are nihilistic. Whether you’re mulling a major life change or healing from trauma, being told that there’s no purpose in life might be particularly devastating. Chances are you’re looking for an ultimate explanation. Or you could simply be searching for that something or someone meant for you — God, a soul mate or a calling of sorts.
I’m certainly no Aristotelian. Not because I reject happiness. Rather, as a materialist, I think there’s nothing intrinsic about the goals and purposes we seek to achieve it. Modern science explicitly jettisons this sort of teleological thinking from our knowledge of the universe. From particle physics to cosmology, we see that the universe operates well without purpose.
Before I started work on this manuscript, I was sufficiently familiar with these subjects that nothing I’ve read, so far, has truly shocked me. I keep thinking I’ve become habituated; that, like a vaccination containing a measure of attenuated bacteria, the hundreds of thousands of words have immunized me against the horror of it over time.
But it has turned out to be more like a dose of something that lies dormant before metastasizing. Neither of my previous books involved reading of this magnitude. Never before have I shrouded myself in material of this nature for so long, and so intensely. I want to make something that feels real, to capture the emotional temperature of the era and places I’m writing about. Research is critical for verisimilitude. But there is something demented in making yourself read this stuff. And I’ve always been hyper-cognizant of becoming a trauma tourist.
I want to write about insidious, cumulative weight. I’m trying to write about writing about trauma, and the ways it changes your brain.
With so much life waiting in my reading list, I'm ready to leave my other ghosts behind. But next time you put down a book, remember this: It's not you. It's not the book, either. (OK, maybe it's the book.) It's the timing. A year down the road, maybe more, that book might be just the thing you need. Maybe you need to grow into it; maybe it needs to grow into you. But you're not going to discover that connection if you pretend it never happened. Anything can drive you away from reading—but only a book will bring you back.
Here Naomi Novik has gathered countless old tales and turned them into something all kinds of new. The theft of summer, a burning demon who lives inside a prince, a witch’s hut in the woods, the secret power of names, the frozen winter road that winds its way through the depths of the forest — they’re all here.
But she also borrows our everyday truths: the way a family can disintegrate into violence, the way a ghetto can be disappeared, how the everyday persecution of Jews can erupt into mass violence, the magic of young children becoming people, the creation of food and clothing and blankets and shelter from plants and animals.
By focusing on less exalted characters, often of a literary bent, Ms Baker produces a highly readable and intimate view of an unusual time and place. At times her fluent writing beguiles: it is easy to forget this is non-fiction and wonder how a novelist might have invented a more satisfying plot for her well-sketched characters.
Sometimes the best way to understand how a “normal” brain works is to explore those that are unique. Such is the insight in reporter Helen Thomson’s enjoyable new book, “Unthinkable: An Extraordinary Journey Through the World’s Strangest Brains.” Unsatisfied with the “cold and impersonal” accounts that make up the bulk of modern case studies, she reaches out to the humans they feature to get a fuller picture of their lives. She goes one step further than her idol Oliver Sacks: Instead of interviewing them in a clinical setting, she meets them on their own turf — in their homes, favorite restaurants and other haunts of regular life. What is it like, she constantly asks, to live with a brain that is so incredibly different?
Drawing on interviews with his unhappy correspondents, he defines a (expletive) job as employment “that the worker considers to be pointless, unnecessary, or pernicious.” Not all bad jobs are unnecessary. Some workers — in municipal services and the restaurant and transportation industries, for example — may do menial, unpleasant work, but if they all quit, the rest of us would notice. This may not be the case, Graeber writes, with some higher-paid workers such as “private equity CEOs, lobbyists, PR researchers, actuaries, telemarketers, bailiffs, or legal consultants.” But don’t take Graeber’s word for it. If the polls are correct, a substantial percentage of these workers would agree.
Complementing Graeber’s sharp analysis of white-collar ennui, the journalist Sarah Kessler reports on the burgeoning gig economy in “Gigged: The End of the Job and the Future of Work.” She follows freelancers as they try, and mostly fail, to find a better way to make a living. Taken together, the professor and the journalist offer a deep look at what Graeber calls our “civilization based on work” — and what’s so often unsatisfying about living in it.
The inaugural chapter of the climate-change saga is over. In that chapter — call it Apprehension — we identified the threat and its consequences. We spoke, with increasing urgency and self-delusion, of the prospect of triumphing against long odds. But we did not seriously consider the prospect of failure. We understood what failure would mean for global temperatures, coastlines, agricultural yield, immigration patterns, the world economy. But we have not allowed ourselves to comprehend what failure might mean for us. How will it change the way we see ourselves, how we remember the past, how we imagine the future? Why did we do this to ourselves? These questions will be the subject of climate change’s second chapter — call it The Reckoning. There can be no understanding of our current and future predicament without understanding why we failed to solve this problem when we had the chance.
That we came so close, as a civilization, to breaking our suicide pact with fossil fuels can be credited to the efforts of a handful of people, among them a hyperkinetic lobbyist and a guileless atmospheric physicist who, at great personal cost, tried to warn humanity of what was coming. They risked their careers in a painful, escalating campaign to solve the problem, first in scientific reports, later through conventional avenues of political persuasion and finally with a strategy of public shaming. Their efforts were shrewd, passionate, robust. And they failed. What follows is their story, and ours.
Rich’s piece makes for vivid history. It alludes to plenty of real changes: the fracturing of a unified political elite, the breakdown of the alliance between high science and the national-security state. But it’s tempting, when revisiting the past, to assume that everything was better then. The history of just about every major American political issue is contested; it’s not surprising that climate change should be the same. But any sensible narrative of climate politics has to start and finish with the idea that opposition to climate policy grew in parallel with the scientific case for action. Telling the wrong story makes the case for action look easier than it is.
Je suis la jeune fille: though I’ve never formally studied French, I’ve had that phrase stuck deep in my linguistic consciousness since childhood. So, surely, have most Americans of my generation, hearing it as we all did over and over again for years in the same television commercial. Frequently aired and never once updated, it advertised a series of language-instruction cartoons on videotape. Even more memorable than the French words spoken by that young girl were the English ones spoken by the product’s both grandmotherly and severe pitchwoman: “Yes, that’s French they’re speaking, and no, these children aren’t French, they’re American. And they’ve acquired their amazing new French skills from Muzzy.”
In those same years, an early episode of The Simpsons saw Bart sent off to France, an ostensible student exchange meant to punish him for his constant pranks. He spends two months in the French countryside mistreated by a couple of crooked vintners who, in a plot point ripped from the headlines of the era, spike their product with antifreeze. When a shoeless and disheveled Bart finally spots a passing gendarme, he can’t make himself understood in English. Only when he reaches the brink of emotional breakdown does he realize that, unconsciously and effortlessly, he has internalized the French language: “Here, I’ve listened to nothing but French for the past deux mois, et je ne sais pas un mot. Attendez! Mais, je parle Français maintenant! Incroyable!”
All this convinced me, on some subconscious level, that to learn a foreign language meant almost by default to learn French. Sufficient exposure to the sounds of French, I also gathered, might lead to fluency by osmosis. More than a quarter-century later, French President Emmanuel Macron has set about spending hundreds of millions of euros on an international campaign to reintroduce versions of those now unpopular notions: that his country’s language is easily acquirable, and that it’s worth acquiring in the first place. Macron believes, as he told a group of students in Burkina Faso last year, that French (which in number of speakers currently occupies sixth place behind Mandarin Chinese, English, Hindi, Spanish, and Arabic) can potentially become “the number one language in Africa and maybe even the world.”
Still, Kwon delivers a poignant and powerful look into the millennial mindset. It can be rocky, but it can also rock.
Two new books are here for those who resolutely do not want to be told that everything is O.K., or even that everything might become O.K.
In The Incendiaries Kwon has created a singular version of the campus novel; it turns out to be a story about spiritual uncertainty and about the fierce and undisciplined desire of her young characters to find something luminous to light their way through their lives.
The danger is that, with overreliance on automated decision-making systems, there will be no need for thinking (and perpetually rethinking) justice either in its general outlines or in its application to particular cases. Computerized calculation will take the place of deliberation and will be responsible for further concealing the unjust intentions programmed into the “algorithms of justice.” And few things are more harmful to our vital ability to seek the meaning of the world and of our place in it than that.
Then I started using a wheelchair. Suddenly, stairs became a barrier that prevented me from getting from here to there. One step was often enough to stop me in my tracks. It turns out that when you start using a wheelchair, you quickly realize that there are a lot of staircases and steps in our world—and a lot of broken (or nonexistent) elevators and ramps.
Once you start realizing how many stairs there are stopping you in real life, it becomes impossible not to notice them existing in the sci-fi you adore. Turns out they’re everywhere, in all of our sci-fi. Whether it’s decades-old or shiny and brand-new, our sci-fi imitates a real-world reliance on steps and stairs in our architecture.
As a narrative about the restoration of the status quo, the detective story parallels the situation engendered by the War. Like the crime that sets the detective story going, the War was an intrusion into what must—in juxtaposition with all the bloodshed, loss, and devastation—have seemed like an idyllic past. One wants to reclaim it, to set the upheaval of the War aside and go back to those peaceful times. One can never really go back, of course—the world after the War was different from the world before—but it was peace again, at least . . . Or was it? There was still that lingering desire to see the destructive intrusion defeated and the world made right again.
The annals of literature are packed with writers who also practiced medicine: Anton Chekhov, Arthur Conan Doyle, William Carlos Williams, John Keats, William Somerset Maugham, and on and on. As doctors, they saw patients at their most vulnerable, and their medical training gave them a keen eye for observing people and what makes them tick.
But if studying medicine is good training for literature, could studying literature also be good training for medicine? A new paper in Literature and Medicine, “Showing That Medical Ethics Cases Can Miss the Point,” argues yes. In particular, it proposes that certain literary exercises, like rewriting short stories that involve ethical dilemmas, can expand doctors’ worldviews and make them more attuned to the dilemmas real patients face.
In introductory studio art classes students are often assigned a negative-space drawing—that is, they are asked to draw everything surrounding a figure, filling up the page, until the blank shape of the figure emerges. This has been Rachel Cusk’s technique in her last trio of novels—a trio referred to appropriately as the Outline trilogy—and it is a little puzzling that more people haven’t thought to write novels in this manner before. Perhaps we go to fiction for the solitary inner life of one character and her actions against the confining tenets and structures of her society (though Cusk’s trilogy manages this as well) rather than for everything surrounding her—in this case, linked and paraphrased soliloquies of secondary, even tertiary, characters upstaging and downstaging the ostensible protagonist. This is not a cinematic way of writing a novel but it is theatrical, with sudden arias and contiguous monologues reminiscent of the plays of Brian Friel.
You could read The Third Hotel as an ode to watching. You could read it as a fever dream, a horror movie, a love letter to film theory or Cuba or women who keep secrets. The Third Hotel is a novel that operates in symbols and layers, which means you can read it however you like. There's no one ending, no right answer, and as a result, it will take away your internal compass. It will unmoor you, send you wobbling around your house in a haze. It will slide some eels under your skin. My recommendation? Let it. We can all stand to learn some new truths.
But it is the food of the Levant — particularly Lebanon and Syria — for which the book reserves the greatest affection, with references to the grilled Syrian kibbe that Helou ate during childhood visits to her aunt’s house in the town of Majd al-Helou, and a sumac version she ate with friends in Aleppo’s Old City.