Noah Hawley really knows how to keep a reader turning the pages, but there’s more to the novel than suspense. On one hand, “Before the Fall” is a complex, compulsively readable thrill ride of a novel. On the other, it is an exploration of the human condition, a meditation on the vagaries of human nature, the dark side of celebrity, the nature of art, the power of hope and the danger of an unchecked media. The combination is a potent, gritty thriller that exposes the high cost of news as entertainment and the randomness of fate.
There is also a focus on color. The individual ingredients’ shades are emphasized – avocados paired with carrots and leafy greens highlight the dynamic nature of these ingredients (for those who think avocados aren’t exciting) and show that color can go beyond the beige nature of typical vegetarian foods like tofu. Cheddar biscuits, sloppy sandwiches, more burgers, and even items like granola, which usually hold little interest for me, had me bookmarking pages.
And better yet, there was no shock moving from image to ingredient list and recipes. Everything seemed feasible. Typically, I’ll see a recipe and hold my breath until I see the labor and ingredient list. In the past my hopes have been dashed when I’d see 30 ingredients, 10 different Cuisinart attachments necessary, and 46 steps.
It all has the potential for some sappy feel-good melodrama just in time for Father’s Day; but somehow, like Bucky Dent himself, Duchovny hits an unexpected home run.
These videos are the pinnacle of media optimized for the Internet: short, seductive, shareable. I may never make a vampire taco. But I’ll happily watch someone else make them.
Have we devised any greater waste of time and energy than the running of the marathon? I’m asking for a friend.
All across Oberlin—a school whose norms may run a little to the left of Bernie Sanders—there was instead talk about “allyship”: a more contemporary answer to the challenges of pluralism. If you are a white male student, the thought goes, you cannot know what it means to be, say, a Latina; the social and the institutional worlds respond differently to her, and a hundred aggressions, large and small, are baked into the system. You can make yourself her ally, though—deferring to her experience, learning from her accounts, and supporting her struggles. You can reach for unity in difference.
On February 25th, TheTower.org published an article that included screenshots from the Facebook feed of Joy Karega, an assistant professor of rhetoric and composition at Oberlin. The posts suggested, among other things, that Zionists had been involved in the 9/11 plot, that ISIS was a puppet of Mossad and the C.I.A., and that the Rothschild family owned “your news, the media, your oil, and your government.” The posts did not sit well with everyone at Oberlin, where, weeks earlier, a group of alumni and students had written the president with worries about anti-Semitism on campus; the board of trustees denounced Karega’s Facebook activities. As a teacher, however, she’d been beloved by many students and considered an important faculty advocate for the school’s black undergraduates. The need for allyship became acute. And so, with spring approaching, students and faculty at one of America’s most progressive colleges felt pressured to make an awkward judgment: whether to ally themselves with the black community or whether to ally themselves with the offended Jews.
I had been ignoring the avalanche of calls and texts from friends and family asking where I was and if I was OK. But that night I caved, turned on my phone and decided to look. Scrolling down the list of messages, I saw one from a friend that read: “Just Google yourself.” I typed my name into the search bar and a huge list of news reports with photos of my face stared back at me. Shocked, all I could think was, “Oh my God, the police are looking for me.”
I was living two lives at once, and it was so surreal.
The collection’s title is not ironical. There is magnitude and sublimity in this latest chronicle of a long, hard pilgrimage to inner freedom.
A man watches Vertigo more than 50 times. A man with a fear of heights watches another man, Jimmy Stewart as Scottie Ferguson, develop a fear of heights in Vertigo. A man watches Vertigo, a film about obsession and identity, more than 50 times and writes a book about Vertigo (and obsession and identity).
Critics from left and right blame the two-term presidency of this evidently intelligent and decent man for everything from the failure to close Guantánamo Bay (he’s still trying) to a continuing economic malaise that has fuelled what is shaping up to be the most extremist presidential election since 1860. Yet Souza’s photographs tell a different story – and the one that matters. Obama accomplished the impossible and made the White House an African American home for eight years.
“Similarity serves as a basis for the classification of objects,” wrote the noted psychologist Amos Tversky, “but it is also influenced by the adopted classification.” The flip side holds: Things we might have viewed as more similar become, when placed into two distinct categories, more different.
Genre is also the way we most commonly make sense of music. But here, too, a form of categorical perception reigns.
There are some countries so vast and diverse that any attempt to summarise them feels insulting: such is Indonesia. With a population of 258 million, it is the world’s fourth most populous nation and the largest formed by an archipelago. When it was guest of honour at the Frankfurt book fair last year, it appeared under the banner “17,000 islands of imagination”, a phrase describing its geography but also encapsulating the complexities of representation. Indonesia is home to hundreds of different ethnicities speaking as many languages, and, along with Hindus, Christians and Buddhists, has a majority Muslim population that is the largest in the world. But, as yet, little of its literature has been translated into English.
That ambition explains why my heart leapt when, among the shabby titles in the Cornell store’s alcove, I noticed 29 leather-bound volumes of what turned out to be the Eleventh Edition of the Encyclopaedia Britannica. The set was marked $25 and, after a quick count to be sure it was complete, I raced to the cashier to hand over my money, fearful that the price might suddenly be raised or that someone else would swoop down and carry off my new-found treasure. I then borrowed a grocery store cart to wheel the oversized volumes back to my room. I still own those books, even though their spines have slowly crumbled away, such deterioration — called red rot — being sadly typical of the Eleventh’s aging leather.
Denis Boyles doesn’t mention red rot or, for that matter, the minuscule type of the smaller-sized cloth-bound edition of the Eleventh, but “Everything Explained That Is Explainable” doesn’t overlook much else. Boyles’s account of how this classic reference work came to be published in 1910-1911 makes for enthralling business history.
We can wish and wish and wish for someone to change. We can think that by using this word, and not that... they can make things better or easier for themselves — and by extension, us. But all that wishing won't matter if the rest of the world refuses to bend.
The original researchers and the replicators both have a stake in cooperation. Even if a replication attempt fails, the field will find the failure far more informative because both parties agreed on the process in the first place. Then they can set their sights on understanding why the replication results differed from the original study. The lesson here is not that context is too hard to study, but rather that context is too important to ignore.
Dozens of Salon alumni have, over the past several months, posted their favorite stories from and memories of the once-beloved liberal news site described as a “left-coast, interactive version of The New Yorker,” a progressive powerhouse that over the years has covered politics with a refreshing aggressiveness, in a context that left plenty of room for provocative personal essays and award-winning literary criticism.
“We were inmates who took over the journalistic asylum,” David Talbot, who founded the site in 1995, wrote on the Facebook page. “And we let it rip — we helped create online journalism, making it up as we went along. And we let nobody — investors, advertisers, the jealous media establishment, mad bombers, etc — get in our way.”
They are mourning a publication they barely recognize today.
With a surprising new proof, two young mathematicians have found a bridge across the finite-infinite divide, helping at the same time to map this strange boundary.
The boundary does not pass between some huge finite number and the next, infinitely large one. Rather, it separates two kinds of mathematical statements: “finitistic” ones, which can be proved without invoking the concept of infinity, and “infinitistic” ones, which rest on the assumption—not evident in nature—that infinite objects exist.
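To make that boundary concrete, here is an illustrative pair of statements, set in LaTeX; the examples are mine, chosen as conventional textbook cases rather than taken from the piece.

```latex
% Illustrative examples only (not drawn from the article).
% A "finitistic" statement: every instance is a finite check, and nothing
% in it requires a completed infinite object.
\[
  \forall n \in \mathbb{N} \quad n + 1 > n
\]
% An "infinitistic" statement (the Bolzano--Weierstrass theorem): it
% quantifies over infinite sequences of real numbers, objects that must be
% assumed to exist as completed wholes.
\[
  \text{Every bounded sequence } (x_n)_{n \in \mathbb{N}} \text{ in } \mathbb{R}
  \text{ has a convergent subsequence.}
\]
```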
Recently I was browsing in an esoteric bookshop in a small medieval town in England when I found, among witches’ almanacs, books on dog reincarnation and boxes of runes, a paperback called “The Secret History of the World.” This was the kind of book that had apparently inspired “The Da Vinci Code.” Secret societies, strange connections, gargoyles. I love all that stuff. Who doesn’t? So I almost bought it.
But I didn’t, because I had no need for it. At home I already had a 600-page book, two-thirds read, that had so far promised me insights into numerology, card counting, the chaos of war, the secrets of the city, alchemy, fake books, art history, the rule of three, the manufacture of mirrors and the workings of 16th-century magi. This was the only book I needed; the book I was raving about to my friends before I’d even finished it. The novel whose bookmark was simply a pencil, because of the volume of notes I was making in its margins.
Brunch in Los Angeles can be considered a microcosm of the city itself — a social experiment fueled by Champagne, eggs Benedict, Snapchat filters, sunshine and chefs and patrons, each with increasing levels of celebrity. It would be too easy to write it off as a nonchalant midday meal made popular by a community of freelancers and those with disposable incomes. But within the last eight months, we’ve seen brunch go from diner food to omelets and egg sandwiches made by James Beard-caliber, white tablecloth chefs.
This is the comedian known as the Merchant of Venom, the Insult King from Queens, Mr. Warmth (as in precisely the opposite). But not once in a convivial afternoon will he lob an insult our way, the sort he hurls at friends and fans alike, the latter paying handsomely for a dose of Rickles ridicule. At one point, he takes our left hand, bows his bullet head and, in the custom of an Old World courtier, bestows a kiss.
It’s one of Hollywood’s worst-kept secrets that Don Rickles is a mensch.
I feel very strongly that this is true about the Golden Gate Bridge. Today, I heard that people are trying once more to build a kind of suicide-prevention railing along its side, which would keep us from seeing the bay and the beautiful view of the city. I haven’t read much about suicide lately, but I believe that almost 98 percent of such deaths leave more evil than good after them. Even my husband Dillwyn’s death, which I feel was justified, left many of us with some bad things. And when my brother died, about a year after Timmy did, my mother asked me very seriously if I felt that Timmy’s death had influenced David to commit his own suicide, which to me remains a selfish one, compared to the first. I said, “Of course, yes! I do think so, Mother.” And I did think then that Timmy’s doing away with himself helped my young brother David to kill himself, a year later. But there was really no connection; we don’t know what the limit of tolerance is in any human being.
Warren and Tyagi demonstrated that buying common luxury items wasn’t the issue for most Americans. The problem was the fixed costs, the things that are difficult to cut back on. Housing, health care, and education cost the average family 75 percent of their discretionary income in the 2000s. The comparable figure in 1973: 50 percent. Indeed, studies demonstrate that the quickest way to land in bankruptcy court was not by buying the latest Apple computer but through medical expenses, job loss, foreclosure, and divorce.
In the song “Oxford Comma,” the band Vampire Weekend asks, “Who gives a fuck about an Oxford comma?” And the answer is: a perhaps weirdly high number of people. Enthusiasts make and buy comma-sporting T-shirts, start Twitter accounts, and circulate memes. More than one person has told me—unprompted and apropos of nothing—“I’m so glad The Walrus uses the Oxford comma.”
Sometime in the 20th century, shit—having already long been a verb and then a noun—also became an adjective, as in He was a shit teacher or That restaurant has shit service. Exactly when this happened is a bit tricky to pin down, precisely because of the word’s versatility. In many contexts, the shit you think is an adjective might actually be a noun.
Try to picture the perfect elevator system. What makes that system so great? Does it serve the person who's been waiting the longest? Or always go to the closest call? Where does it make the compromise between speedy service and keeping energy usage down?
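Purely as a thought experiment, the trade-off can be written down as a toy dispatch rule; the scoring and weights below are invented for illustration and have nothing to do with how real elevator controllers work.

```python
# Hypothetical single-car dispatcher: score each waiting call by how long it
# has waited (pushes the score up) and how far the car must travel to reach
# it (pushes it down). The weights are arbitrary assumptions.

def pick_next_call(car_floor, calls, wait_weight=1.0, distance_weight=0.5):
    """calls: list of (floor, seconds_waited); return the floor to serve next."""
    def urgency(call):
        floor, seconds_waited = call
        return wait_weight * seconds_waited - distance_weight * abs(floor - car_floor)
    return max(calls, key=urgency)[0]

# A nearby call that just arrived vs. a distant call that has waited 90 seconds:
print(pick_next_call(3, [(4, 5), (12, 90)]))  # -> 12: the long wait wins out
```

Raising distance_weight makes the car favor nearby calls and save energy; raising wait_weight makes it fairer to whoever has waited longest.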
Meanwhile, the stream restoration and containment pond project was proceeding on county time, which, on the scale of eternity, might not be as slow as, say, island time, but is pretty close. It was spring again last year, and the untouched bamboo was on the march when the neighbors on the other side of us, whose yard had not yet been invaded, came over with an unusual request. Could they harvest some bamboo shoots? To eat.
Not long ago I found myself sitting in a neon-lit bus shelter at 3:30 a.m. The sidewalks of Paris were black and shiny after a hard rain on a winter's night. I passed the time waiting for the bus, half reading a magazine and eavesdropping on a pretty African woman with intricate black-cherry-soda-colored braids who was chatting in a beautiful, lilting creole with a friend in faraway Mayotte, a French island near Madagascar.
After she ended her phone call, I could feel the lady staring at me. When I looked up, she smiled and asked, “Why are you here?”
“My job,” I answered brightly. “I'm going to work as a baker.”
This is over the top, and not something she’d do for you, certainly not something you’d do for yourself. What you are doing here is overachieving the mom visit: You have to front-load the effort so that when she gets here and you start to wither, you’ve got the towels to fall back on. Write the Wi-Fi password on a small piece of paper and place it on top of the towels. Do this the morning she’s set to arrive so that whenever you get stressed out, you can pace the house and feel better when you catch a glimpse of the towels and the Wi-Fi password. Look at you being a thoughtful host and not a teenage girl who erupts at the first sign of criticism! You’ve never used a washcloth in your life, but maybe your mom will want to. See, you’re a good person. The perfect host. You’ve transcended your upbringing. Everything will be fine.
Throughout human history, people have struggled with two competing impulses: the desire to make a mark for future generations, and a deep confusion about what, exactly, that mark should be.
When I was readying my first novel for publication, it struck me that writers have far more control over what’s in their books than what’s on them—the cover art, blurbs, jacket copy, but especially the title, where the author’s concerns overlap with marketing ones. Deciding on a name for your life’s work is hard enough; the prospect of changing it at the eleventh hour is like naming your newborn, then hearing the obstetrician say, But wouldn’t Sandra look amazing on the certificate? It took a nine-month war of attrition to secure the original title of my book, Private Citizens.
With so many possible explanations for what went wrong, the real one had better reach a high bar. Does it? I had doubts. But this much is clear: Mr. Hawley has made it very, very easy to race through his book in a state of breathless suspense. Get to that endpoint. Then you can decide.
At one since-disappeared location in Flower Mound, Texas, a picnic table is covered by a roof in the shape of longhorns. A curved aluminum shelter offers shade amidst the arctic-like glow of White Sands National Monument, New Mexico. In Blackwell, Oklahoma, a bench surrounded by a minimalist arrangement of wooden poles suggests a teepee.
Although the extinction of the dinosaurs tends to get most of the attention (I’m not bitter, honestly), it is worth considering that plant fossils are excellent indicators of environmental change, and that if we are interested in understanding precisely what happened to life on land at the end of the Cretaceous, then the plant fossil record is a pretty good place to look.
His words at sentencing came to define my life, and I’ve thought many times over the years about contacting him. But I didn’t hear Curtin’s voice again for nearly half a century, until he granted my request for a telephone interview four weeks ago.
I recognized his voice immediately, though it is now whispery, befitting a man of 94 years. He suffered a heart attack ten years ago but did not retire until a week or so before our conversation, ending 48 often-controversial years on the bench.
Horgan’s career reflects the increasingly porous nature of these national styles. “Pulling” is the epitome of the grim British comedy. Two attempts to adapt it for American television failed. “Catastrophe” is a series about two likable characters who do not quite seem so on paper. Based only on a script, it is possible to imagine an interpretation of “Catastrophe” that veers dangerously close to “Who’s Afraid of Virginia Woolf?” In the final episode of the new season, Sharon enthusiastically lectures Rob, “Not everyone has to like you. You’re not a puppy. You’re an adult man with a wife. Honest people who tell people how they feel when they feel it have people not like them. O.K.? That’s what I do. I have earned the right to have people dislike me. I am very happy to have people not like me!” (“No shit,” Rob replies.)
What was it about handles—door-handles, axe-handles, the handles of pitchers and vases—that transfixed thinkers in Vienna and Berlin during the early decades of the twentieth century, echoing earlier considerations of handles in America and ancient Greece?
Ludwig Wittgenstein, as everyone knows, abandoned philosophy after publishing his celebrated Tractatus Logico-Philosophicus in 1921. He took up gardening instead, in a monastic community on the outskirts of Vienna, where he camped out for a few months in a toolshed. It was in part to draw him back into “the world” that his sister Margarete (Gretl) invited him to join the architect Paul Engelmann in designing her new house, a rigorous Modernist structure that, much changed, now houses the Bulgarian Embassy.
Ever since Anthony Bourdain, our tribal king, published his peerless “Kitchen Confidential” in 2000, we, the demimonde of Professional Restaurant, have glutted the bookstores with more accountings of ourselves and our work than anyone could possibly wish to read. The taco truck chef, the French chef, the drug-addicted chef, the Korean-American chef, the reluctant chef (ahem), the female vegetarian chef, the bad-boy chef, the cancer survivor chef, not to mention the wine importer, the farmer, the restaurant critic, the host of a cooking competition show, the butcher, the magazine editor turned line cook, the fisherman, the baker, the beekeeper, the forager, even the sous-chef — there have been so many books from our people that you could be forgiven if at shift drink one night, loosened by a couple of shots, you rolled your eyes and groaned to your co-workers, “It’s only a matter of time before we have the celebrity dishwasher memoir.”
Well, I was close enough. Now the busboy — my apologies, that’s back waiter — has written a book too. And she has done an outstanding job of it.
Many of the reviews I’ve read of DeLillo’s latest book, Zero K, talk about “late period” DeLillo, suggesting that since Underworld, he’s been prone to writing similar books – marked by slender plotting, elusive meanings and dense, elliptical prose. What most reviewers don’t say is that another characteristic of these late novels is that they generally need to be read more than once to be understood, and that they sometimes take years to mature in the reader’s mind.
Pho is so elemental to Vietnamese culture that people talk about it in terms of romantic relationships. Rice is the dutiful wife you can rely on, we say. Pho is the flirty mistress you slip away to visit.
I once asked my parents about this comparison. My dad shook his hips to illustrate the mistress. My mom laughed and quipped, “Pho is fun, but you can’t have it every day. You would get bored. All things in moderation.”
Here are some tools you can use. You do not need to buy them. They are already in your toolbox. They are in everybody’s toolbox. They are facts that are designed specifically to alleviate the pain of existence. There are not many — only two — but they are incredibly true and incredibly important.
Al Gore got stuck on a scissor lift. Studio execs fell asleep at a screening. And everybody hated the title. The amazing true story of the most improbable — and important — film of our time.
Ultimately, I am more of a tourist than a time-traveler. After all, no digital collection can fully reveal what the past was really like. There will always be mysteries left unexplained.
Thomas Thwaites first considered becoming an animal on a spring day in 2013. He was walking Noggin, his nieces’ Irish terrier, along the Thames when he found himself taking stock of his life. Thwaites was then thirty-three. A few years earlier, he’d launched his career as an artist and designer with a clever project: constructing a toaster from scratch, mining the iron and making the plastic himself. Along the way, he catalogued the environmental devastation caused by humanity’s determination to toast en masse—a vast crime against nature committed in the name of breakfast. Thwaites’s toaster was acquired by the Victoria & Albert Museum for its permanent collection. Although the toaster never actually made toast—a few crucial components proved too difficult to build—it was, in all other respects, a success.
As a second generation immigrant, the way I pronounce my name is different to how my parents say it. But it’s my choice, and I refuse to feel guilty about it.
As you probably know, Jeremy Corbyn, the embattled leader of the Labour Party, is a vegetarian: or if you read the Daily Mail, a “teetotal vegetarian”. It’s taken for granted that Corbyn’s dietary choices are indicative of his politics and beliefs just as his beard is an outward and visible sign of his inward and invisible weirdness. In Ludwig Feuerbach’s phrasing, Der Mensch ist, was er isst: man is what he eats.
Are we creating a problem that future generations will not be able to solve? Could the early decades of the 21st century even come to seem, in the words of the internet pioneer Vint Cerf, like a “digital Dark Age”? Whether or not such fears are realised, it is becoming increasingly clear that the migration of knowledge to formats permitting rapid and low-cost copying and dissemination, but in which the base information cannot survive without complex and expensive intervention, requires that we choose, more actively than ever before, what to remember and what to forget.
Free speech is complicated and comes at a high price. We pay for it in terms of other things we also need to care about: public order and security, children’s needs, private reputations, civic courtesy, cultural worth, the social dignity of vulnerable minorities. As Timothy Garton Ash makes admirably clear in his wise, up-to-the-minute and wide-ranging new survey, “Free Speech: Ten Principles for a Connected World,” most of the difficult arguments about free speech bear on its price in terms of other things that also ought to matter to us.
Smoke is a mess of a book. It's long and untidy, the dialogue is implausible, the action bottlenecks about halfway through, but it works. It works because it feels psychologically true: Imagine the relief of being in the right, of all your microdecisions and weaknesses and passing thoughts being judged and found pure, of not having to bear the guilt of success at someone else's expense. Of deserving it.
If the same cannot be said today, if dictators are no longer seen to hold the power of life or death over their subjects, if the archcriminals of Cambodia, Sudan and Rwanda are indicted and sometimes even punished, in short, if the idea of international justice has gradually gained a semblance of meaning, we owe it to two ideas, or more precisely two concepts — as well as to the two men who brought them to life: Hersch Lauterpacht for the concept of the crime against humanity and Raphael Lemkin for that of genocide. Philippe Sands, a professor of law at University College London, recounts the life and work of both men in “East West Street: On the Origins of ‘Genocide’ and ‘Crimes Against Humanity.’ ”
What a difference a century makes. One hundred years ago, a generous-hearted, independent-minded, Hong Kong-born Briton found a baby girl abandoned on his doorstep in Changsha, in China's Hunan province. He took her in and made her part of his family but, in doing so, drew the disapproval of his compatriots.
While widening income inequality is a divisive issue in Hong Kong today, in 19th- and early 20th-century "Britain in China", race and class divided the world of treaty ports and foreign enclaves.
Chance events play a much larger role in life than many people once imagined.
Most of us have no difficulty recognizing luck when it’s on conspicuous display, as when someone wins the lottery. But randomness often plays out in subtle ways, and it’s easy to construct narratives that portray success as having been inevitable. Those stories are almost invariably misleading, however, a simple fact that has surprising implications for public policy.
It’s a sad truth: No matter how much progress women have made in the workplace — and it’s still pretty limited — the message about our romantic prospects remains stubbornly mired in the past.
This is the first adult novel (and with its scenes of sex and violence, it's very, very adult) from young adult author Wasserman, and much of its power depends on the suspense that she carefully constructs. That's not to say this is a run-of-the-mill thriller; it's a perfectly constructed literary novel, but one that dares its readers to put it down.
And it's nearly impossible to put down. Much of that is because Wasserman's characters are so flawlessly realized — Hannah is an appealing everygirl, Lacey is compelling and terrifying, and Nikki is surprisingly complex, a sadistic manipulator who may or may not actually have a good heart.
Many scientists say that the American physiologist Benjamin Libet demonstrated in the 1980s that we have no free will. It was already known that electrical activity builds up in a person’s brain before she, for example, moves her hand; Libet showed that this buildup occurs before the person consciously makes a decision to move. The conscious experience of deciding to act, which we usually associate with free will, appears to be an add-on, a post hoc reconstruction of events that occurs after the brain has already set the act in motion.
The 20th-century nature-nurture debate prepared us to think of ourselves as shaped by influences beyond our control. But it left some room, at least in the popular imagination, for the possibility that we could overcome our circumstances or our genes to become the author of our own destiny. The challenge posed by neuroscience is more radical: It describes the brain as a physical system like any other, and suggests that we no more will it to operate in a particular way than we will our heart to beat. The contemporary scientific image of human behavior is one of neurons firing, causing other neurons to fire, causing our thoughts and deeds, in an unbroken chain that stretches back to our birth and beyond. In principle, we are therefore completely predictable. If we could understand any individual’s brain architecture and chemistry well enough, we could, in theory, predict that individual’s response to any given stimulus with 100 percent accuracy.
Bad writers often believe they have very little left to learn, and that it is the literary world’s fault that they have not yet been recognised, published, lauded and laurelled. It is a very destructive thing to believe that you are very close to being a good writer, and that all you need to do is keep going as you are rather than completely reinvent what you are doing. Bad writers think: “I want to write this.” Good writers think: “This is being written.”
To go from being a competent writer to being a great writer, I think you have to risk being – or risk being seen as – a bad writer. Competence is deadly because it prevents the writer risking the humiliation that they will need to risk before they pass beyond competence. To write competently is to do a few magic tricks for friends and family; to write well is to run away to join the circus.
If you asked a book reviewer or looked at any of the “Best Book” lists compiled by critics, the answer would be War and Peace. But what if you asked everyday readers on the Internet?
In fairy tales, magical seeds grow into beanstalks, climbing through clouds into giants’ houses full of treasures and gold, lifting the hero away from the quotidian concerns of taking care of cows and avoiding starvation. The characters of Scarlett Thomas’s new novel, “The Seed Collectors,” start in those elevated regions, though, comfortably beyond any real worry about money or material needs. Their magic seeds have to go to greater lengths: providing an escape from existence and self, which by the end of the novel does indeed come to seem like a reasonable choice.
The inmates learn how to cook and serve the paying public, who have applied in advance and received security clearance. Each diner must bring a passport or driving licence and be fingerprinted. Tipping and mobile phones are banned, which makes for a pleasant dining room.
Every now and then, even Charles Darwin was dumbfounded by the mysteries of the natural world. On those occasions, he reached out for enlightenment to a repertory cast of scientific correspondents, one of whom was Francis Trevelyan Buckland, a raffish, tousle-haired star of the natural-history craze that befell Britain in the mid-nineteenth century. The two made for unlikely pen pals: if Darwin was the dour, sincere prophet who transformed humanity’s appreciation of its place in the universe, Buckland was a professional eccentric, as much showman as scientist. Although he did groundbreaking work in pisciculture (the breeding of fish), Buckland was perhaps best known as a lecturer, beguiling huge audiences with his left-field takes on botany, zoology, and human anatomy. As a general rule, the weirder the subject, the more likely Buckland was to have something to say about it: the fighting behavior of newts, the cannibalistic propensities of rats, the best method for killing a boa constrictor, gigantism, walking fish, flea circuses, conjoined twins (he was a good friend of Chang and Eng Bunker, the original Siamese twins), the uses of human hair as manure, and pagan burial rites. Tellingly, it was Buckland to whom Darwin turned to verify a claim that a dog and a lion had successfully bred in rural Russia.
Senses, reflexes and learning mechanisms – this is what we start with, and it is quite a lot, when you think about it. If we lacked any of these capabilities at birth, we would probably have trouble surviving.
But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.
As good as comic-book movies have been for making it rain money for the studios — mainly Disney, Warner Bros. and Fox — and as much fun as I’ve had at some of them, the genre has perverted what it means to watch certain actors in movies. When the character is more famous than the actor playing it, how does anybody develop the trademarks of a star? The prerogatives of the comic book are warping the properties of movie stardom. One feels at permanent odds with the other.
Given the option, Gallagher wouldn’t freeze time even if he could. That might seem strange for a man whose life is devoted to scrubbing away rust stains and gluing broken marble crosses back together. But just like Sampras needed Agassi to play his best tennis, Gallagher needs decay to do his best work. “We have no preconceived notion that we’re saving these things,” he tells me. “We’re just giving them a little more time.”
But there have been moments lately when I feared we were speaking and writing in a new adverbial age. I took the appearance of Daniel Handler’s 2006 novel Adverbs, a love story in which every chapter is named with a word like “immediately,” and Jonathan Safran Foer’s 2005 9/11 novel Extremely Loud and Incredibly Close, whose title signals the child narrator’s voice, as signs that this century would be friendlier to that part of speech than the one ruled by Hemingway, whether I liked it or not. I thought it might have something to do with the death of the typewriter and the rise of the internet, a zone with an excess of feeling and an amateur taste for the rhetorical flourish. Was the adverb winning?
After all, the essay, in its American incarnation, is a direct outgrowth of the sermon: argumentative, insistent, not infrequently irritating. Americans, in my observation—and despite our fetish for the beauties of individuality and personal freedom—are always, however smilingly, trying to convince somebody, somewhere, of something, and our essayistic tradition bears this out.
At 13, I used to try my best to maximize the experience: eating until I was too full to move, schlepping my way to the toilet, making myself sick, and going back for more like a hedonistic Roman nobleman. It wasn't the best con, really, and it just made me feel like shit. So I stopped going to all you can eats. The love affair ended.
Then a couple of years ago, I moved to London. Broke and in the city of greed, I got obsessed again. How can I maximize their value? How do I beat the system? The house can't always win. After many sleepless nights, jotting into my notepad, by God, I had it. Four different cons crafted carefully with one purpose: to take down the man and to finally get our fair fill of an all you can eat buffet.
Behind the scenes as one of America’s most innovative opera companies adapts a Stephen King (and Stanley Kubrick) classic.
From the rise of the casual camper to the boutique fitness boom, it can feel like there have never been more people in the market for sports apparel. As of 2015, sporting goods stores in the US were bringing in as much as $48 billion in annual revenue, according to IBISWorld, up from $39.8 billion in 2012. Sports participation is up, too. According to Euromonitor, participation in high school sports has increased from 25 percent to 35 percent over the last 35 years, with nearly double the number of female students playing sports as compared to the 1980s.
But there's a stark gap between an increasing customer base and many sports retailers — a gap that only continues to widen, no matter how many times companies see new ownership or rethink their businesses. As Hermina puts it, "My needs evolved, but in many ways, Sports Authority hasn't."
Most importantly, Allington et al. argue, DH has “tended to be anti-interpretive.” In place of the interpretation of literary texts, digital humanists “archive materials, produce data and develop software.” These are the entirely legitimate tasks of historians, computer scientists and social scientists, but they make DH an odd fit for a scholarly endeavor which, almost all teachers and students would agree, is focused on interpretation. Why should humanists be asked to do what can be done better by others? And why should they be asked to stop doing what they do so well? How, in short, did digital humanities come to be seen as the humanities at all?
But Merriam-Webster also admits that the word, which saw its first recorded use in English in 1946, “is so overused that it’s begun to lose its meaning”, a word that a columnist for Toronto’s Globe and Mail argued is “tossed around with cavalier imprecision, applied to everything from an annoying encounter with a petty bureaucrat to the genocidal horrors of the Third Reich”.
Two years ago, a lawyer in Indiana sent me a check for seventy-eight thousand dollars. The money was from my uncle Walt, who had died six months earlier. I hadn’t been expecting any money from Walt, still less counting on it. So I thought I should earmark my inheritance for something special, to honor Walt’s memory.
It happened that my longtime girlfriend, a native Californian, had promised to join me on a big vacation. She’d been feeling grateful to me for understanding why she had to return full time to Santa Cruz and look after her mother, who was ninety-four and losing her short-term memory. She’d said to me, impulsively, “I will take a trip with you anywhere in the world you’ve always wanted to go.” To this I’d replied, for reasons I’m at a loss to reconstruct, “Antarctica?” Her eyes widened in a way that I should have paid closer attention to. But a promise was a promise.
Women in many times and places have felt pressure to bear children. But the idea of the biological clock is a recent invention. It first appeared in the late 1970s. “The Clock Is Ticking for the Career Woman,” the Washington Post declared, on the front page of its Metro Section, on 16 March 1978. The author, Richard Cohen, could not have realised just how inescapable his theme would become.
Steve Jobs claimed that dropping acid was one of the most important things he had ever done in his life. “LSD shows you that there’s another side to the coin,” he said, “and you can’t remember it when it wears off, but you know it.” Jobs’s openness to psychedelic experiences is an aspect of his formative years that’s often invoked to help shade in his genius, a way of decoding the inputs and stimuli that allowed him to—as the billboards used to say—“think different.” Last year, one of Jobs’s comrades from those shaggier days, Daniel Kottke, described their acid trips as fairly typical: they were “monk-wannabes” who would go hiking and listen to music, talk about consciousness, attempt to read books. And then, like many of their generation, they grew up. By the time both of them were involved with Apple, in the late seventies, Jobs had rerouted his creativity toward something less ephemeral. “Once Apple started,” Kottke, who would be one of the company’s first employees, said, “Steve was really focused with all of his energy on making Apple successful. And he didn’t need psychedelics for that.”
“That which is the immodesty of other women has been my virtue — my willingness that the world should gaze upon my figure unadorned,” Audrey Munson, the favorite nude model of the Beaux Arts movement in the United States, once proclaimed. And that openness to posing in often freezing artist studios completely naked, in uncomfortable poses with swayed legs and hair held back with bended arms, such as for Adolph Alexander Weinman’s “Descending Night” (1914), or practically on tiptoe for Alexander Stirling Calder’s “Star Maiden,” endlessly duplicated at the Panama-Pacific International Exposition in San Francisco, earned her incredible acclaim.
A sheet of paper can be a work of art, its surface rich with life and visual interest. Timothy Barrett, the MacArthur fellow and master paper maker, moved to Japan to learn how to make washi: a translucent paper so delicate it hardly seems material. In more recent years, he has studied the solid white paper, made from cloth rags, that Europeans used for books from the 14th century on. These papers, he says, “had a kind of crackle and made you want to touch them.” Now he makes them as well, from the proper ingredients, raw flax and hemp.
Everything is different on an island: language, weather, food, tradition. There are phrases in the Exuma Islands in the Bahamas that seem to exist only there: “day clean” (dawn), “sip-sip” (gossip), “first fowl crow” (rooster call). The molasses-colored rum in Barbados is special to that rocky gem, and the distinctive sweet coffee can be found in the tiled corner bars in Cuba where habaneros sip it morning and night. Bali’s vibrant batik sarongs are art you can wear, and Maldivian dhon riha tastes like seafood curry concocted in the depths of the ocean.
Two-and-a-half score and a million records ago, two very different American songwriters released two very different albums that would go on to shape the future of popular music.
Citizens of the United States are quite taken with the vocabulary of liberal democracy, with words such as ‘freedom’ and ‘democracy’, which conjure key democratic values and distance the nation from the Old World taint of oligarchy and aristocracy. It is much less clear, however, that Americans are guided by democratic ideals. Or that ideology and propaganda play a crucial role in concealing the large gap between rhetoric and reality.
Strip away the delicious Seussian linguistic fillips, and you have a little boy on a vague journey with vague obstacles and a vaguely happy ending. (“Kid, you’ll move mountains!”) As the tepid 1990 review of the book in the New York Times asked, “Seriously now, who's got the punch line?” It’s harmless, of course, but one hates to think that “you” are beating Horton or Bartholomew Cubbins in the sales race. Frankly, you just don’t deserve it.
Perhaps the most important of Erdrich’s achievements is her mastery of complex forms. Her novels are multivocal, and she uses this multiplicity to build a nest, capacious, sturdy and resplendent, for her tales of Indians, living and dead, of the burden and power of their heritage, the challenge and comedy of the present’s harsh demands. Woven into the specificity of these narratives is Erdrich’s determination to speak of the most pressing human questions. In the case of her latest novel, “LaRose,” that question is deceptively simple: Can a person “do the worst thing possible and still be loved”?
In the long months of deep winter, the mood freezes, and, like the sky, turns forbidding. People hurry home to cocoon themselves indoors. To live year-round in a place like this you must be very good at being with others and then very good at being alone.
I find this odd because we know exactly what consciousness is — where by “consciousness” I mean what most people mean in this debate: experience of any kind whatever. It’s the most familiar thing there is, whether it’s experience of emotion, pain, understanding what someone is saying, seeing, hearing, touching, tasting or feeling. It is in fact the only thing in the universe whose ultimate intrinsic nature we can claim to know. It is utterly unmysterious.
The nature of physical stuff, by contrast, is deeply mysterious, and physics grows stranger by the hour.
Musing on what we will perhaps never comprehend is a discombobulating experience that sends most of us scampering off to the familiar, intelligible corners of our daily lives. Not so Marcus du Sautoy. With What We Cannot Know, the prominent mathematician, writer and broadcaster boldly squares up to what he calls the seven “edges” of human knowledge, topics that range from the nature of time to the mysteries of human consciousness.
But “White Sands” isn’t just a catalog of travel mishaps, with Mr. Dyer cast as an English-speaking Monsieur Hulot. It is also a rumination on the meanings we assign the strange destinations of our pilgrimages — “the power that some places exert and why we go to them.”
Weiss is the cheerfully cursing, relentlessly curious experimenter who dreamed up and helped will into fruition the massive antennae that captured the “chirp heard round the universe” — the first detection of gravitational waves rippling toward Earth from a cataclysm in a distant galaxy, two black holes that collided a billion light-years away.
That discovery, made secretly last fall and revealed in February, would have made a splash if it had simply been the first recording of gravitational waves, something Einstein initially conceived a century before.
But it also marked the first detection of black holes in pairs — orbiting each other before colliding to form a more massive black hole — a cataclysmic event more common than theorists ever dreamed.
I had another baby in January 2015, bringing my total to four under the age of 8. I published a book in June, and make a good deal of my living traveling to give talks. My husband also travels frequently for work. While we were doing pretty well with three kids and two jobs, adding a fourth, even with help from a nanny and from family, felt like courting chaos. I worried about my ability to be the ringmaster of this circus of deadlines, school projects and sippy cups. By getting some perspective on my life, I hoped I could figure out ways to make it better.
So I logged on a spreadsheet, in half-hour blocks, every one of the 8,784 hours that make up a leap year. I didn’t discover a way to add an extra hour to every day, but I did learn that the stories I told myself about where my time went weren’t always true. The hour-by-hour rhythm of my life was not quite as hectic as I’d thought.
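For what it's worth, the bookkeeping behind such a log is simple; this sketch, with invented categories and entries, shows how a leap year's 17,568 half-hour blocks roll up into hours.

```python
# Toy half-hour time log. A leap year has 366 * 24 = 8,784 hours, i.e.
# 17,568 half-hour blocks. The categories and entries below are made up.
from collections import Counter

HOURS_IN_LEAP_YEAR = 366 * 24              # 8,784
HALF_HOUR_BLOCKS = HOURS_IN_LEAP_YEAR * 2  # 17,568

log = ["sleep", "sleep", "work", "work", "kids", "travel", "work"]  # one label per block
totals = Counter(log)

for activity, blocks in totals.most_common():
    print(f"{activity}: {blocks * 0.5:.1f} hours")
```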
Fred Harvey was said to have “civilized the West” by bringing middle-class values to hardscrabble frontier towns, but his real accomplishments were more impressive: Harvey’s business model established the modern chain restaurant, created a major tourist market for Native American art, and gave opportunity to scores of young women escaping the confines of their Midwestern upbringings. In the late 19th century, Harvey Houses put many small towns on the map, providing sophisticated accommodations and magnificent public architecture that often became the locus of these communities, both culturally and economically. Through its promotion of the region’s landscape, architecture, and Native American cultures, the company also built a lasting fantasy of the American Southwest.
Fast-forward to 2011. Smartphones were now everywhere, and I noticed that the entirety of Proust’s novel—every volume but one in the original Moncrieff translation—was available to download for free on my cellphone, an HTC Incredible, thanks to Project Gutenberg Australia. I quickly downloaded all seven volumes. Finally, in the fall, shortly before my father turned 95, I began where I left off, in Sodom and Gomorrah, reading Proust on my cellphone at night when everyone else in the house had gone to sleep.
When I tell people this, they look at me like I have drowned a kitten. And when I tell them that not only did I finally finish all of Sodom and Gomorrah on my cellphone, but the rest of Proust’s opus, too, and in time to tell my father, they back away from me very slowly.
There is a certain perverse pleasure in imagining the world going down in flames. But what if it was more than a metaphorical conflagration? What if your neighbors, friends, colleagues, or family could ignite without warning, starting a chain reaction that could send entire cities up in smoke? What if this was happening simultaneously all over the world? And what if nobody knew how to stop it? In his new novel, “The Fireman,” Joe Hill drops us in just as everything is starting to go to hell, where nothing and no one is safe.
It’s a page-turner — or perhaps page-burner is more appropriate — full of edge-of-your-seat tension and moral quandaries that simmer.
The Mirror Thief is as difficult to explain as it is completely original. It's one of the most intricately plotted novels in recent years, and to call it imaginative seems like a massive understatement. The three stories are as different from each other as can be, and the fact that Seay weaves them together so skillfully is almost miraculous.
You’ve got your non-fat milk, full-fat milk, soy milk, and coconut milk; espresso shots; all the different flavored syrups, some of which are sugar-free; whipped cream; iced, hot, or “extra hot” if you’ve got a Kevlar tongue; different sizes; different roasts of coffee; and on and on and on.
Surely some of those combinations are gross—Venti green tea latte with peppermint and whipped cream, anyone?—but you can have them if you want them. And as Sophie Egan, a program director at the Culinary Institute of America, writes in her new book Devoured, that speaks to “a most American element of the American food psyche”: customization.
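The arithmetic of that customization is easy to sketch; the option counts below are rough placeholders invented for illustration, not actual menu data.

```python
# How independent options multiply into an enormous number of possible drinks.
# Every count here is an assumption made for the sake of the arithmetic.
options = {
    "milk": 4,         # non-fat, full-fat, soy, coconut
    "size": 3,
    "roast": 3,
    "syrup": 10,       # placeholder count, some sugar-free
    "temperature": 3,  # iced, hot, "extra hot"
    "whipped_cream": 2,
}

combinations = 1
for count in options.values():
    combinations *= count

print(combinations)  # 4 * 3 * 3 * 10 * 3 * 2 = 2,160 variations of one drink
```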
Social mobility is preceded by literal mobility: people who can walk or ride from one place to another. Economic and technological changes (starting with the building of roads) enabled that movement, then accelerated in order to accommodate it; this in turn has made further such movement more attractive, more inevitable. Supplemental technologies of writing, record-keeping, and administrative organization (including regular naming practices and travel documentation) have also arisen in order to keep track of all the movement and to prevent descent into social chaos. The result is the world we live in, a world in which we all must ask — in a tone and for a purpose quite alien to those of the person who coined this phrase — “Who is my neighbor?”
The other day I saw my father, who is ninety-two years old, and in very poor health. Physically, he’s a wreck, and mentally he’s not much better. At his peak, he was a capable and intelligent man, by nature rational to the point of coldness. But the other day he was full of childlike fear of the darkness that lay ahead. He’s religious, in an austere way. So I knew what he meant. “Don’t be afraid,” I said. “You’re a good man, and you lived a good life.” In fact, neither thing was true. But what else could I say? I’m sure he said the same to his own father, for the same reasons, and with the same reservations. Don’t we all?
For a word that literally means definition, the aphorism is a rather indefinite genre. It bears a family resemblance to the fragment, the proverb, the maxim, the hypomnema, the epigram, the mantra, the parable, and the prose poem. Coined sometime between the fifth and third centuries BC as the title for one of the books of the Corpus Hippocraticum, the Aphorismi were originally a compendium of the latest medical knowledge. The penultimate aphorism, “In chronic disease an excessive flux from the bowels is bad,” is more representative of the collection’s contents than the first—“Life is short, art is long”—for which it is best known.
But in those six words lies a clue to the particular space aphorisms were supposed to define. Thanks to a semantic slippage between the Greek word techne and its English translation (via the Latin ars), the saying is often taken to mean that the works of human beings outlast their days. But in its original context, Hippocrates or his editors probably intended something more pragmatic: the craft of medicine takes a long time to learn, and physicians have a short time in which to learn it. Although what aphorisms have in common with the forms listed above is their brevity, what is delimited by the aphorism is not the number of words in which ideas are expressed but the scope of their inquiry. Unlike Hebrew proverbs, in which the beginning of wisdom is the fear of God, the classical aphorism is a secular genre concerned with the short span of time we are allotted on earth. Books of aphorisms are also therapeutic in nature, collections of practical wisdom through which we can rid ourselves of unnecessary suffering and achieve what Hippocrates’ contemporary Socrates called eudaimonia, the good life.
“Just talk to any Chinese who lived through that time,” a middle-aged man whose father spent nearly 20 years in a labor camp for “practicing capitalism” tells the radio reporter Rob Schmitz, in “Street of Eternal Happiness,” his new book about some of the ordinary people he encounters in his Shanghai neighborhood. “We all have the same stories.”
For not only does the restaurant review address taste, as in how does it taste, but it addresses and influences trends in taste, and fashion in food. This puts undeniable agency in the hands of the critic, the reviewer; and with agency comes responsibility.
Andrew Levy’s parents knew that the rare and deadly cancer in his blood could not be beaten, so they began to prepare for the worst. Then something mysterious happened.
As society becomes more wedded to technology, it's important to consider the formulas that govern our data.
In his new book, In Praise of Forgetting: Historical Memory and Its Ironies, journalist David Rieff questions the idea that remembering the past is an inherently virtuous practice that will help us solve present-day problems. It’s a philosophical argument that he pursues across the globe, invoking examples drawn from the histories of the United States, Argentina, Spain, Germany, Bosnia, Israel, and Ireland, among others. “What if,” Rieff asks, “a decent measure of communal forgetting is actually the sine qua non of a peaceful and decent society, while remembering is the politically, socially, and morally risky pursuit?”
A few weeks ago, an old friend who was traveling to New Orleans for the first time emailed to ask me for restaurant and bar recommendations. I sent him my usual list—some personal favorites from the time I lived in the city, pre-Katrina, from 1998 to 2003: Willie Mae's Scotch House, Restaurant August, Molly's at the Market, and Dante's Kitchen—as well as newer places that have opened since I left for New York, like Cochon, La Petite Grocery, and MoPho. I told him to go to Shaya and Domenica, because everyone tells everyone to go to Shaya and Domenica these days, though I haven't been to either. I strongly advised him to grab a Grasshopper at Tujague's, and a Sazerac at The Roosevelt, then I reluctantly hit send.
The reason I say "reluctantly" is because I really didn't want to send him any recommendations at all. Instead, I wanted to send an email back that read something like this: "Go anywhere that looks good to you. Then let me know what you find." In other words, discover your own places to eat. Eat without a map.
Dining with a friend online.
Carrying a hand-held barometer and mapping elevation shifts in the terrain with his smartphone, he had arrived on a scouting mission for a quixotic project. He wanted to redefine the limits of human endurance by training a man to run a marathon in less than two hours without the use of performance-enhancing drugs.
The Sub2 Project, as it is called, is an attempt at the extraordinary — to reduce by nearly three minutes the world record of 2 hours 2 minutes 57 seconds, set at the 2014 Berlin Marathon by Dennis Kimetto of Kenya. A marathoner breaking the two-hour barrier would finish more than six-tenths of a mile ahead of Kimetto, a veritable eternity in distance running.
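That six-tenths-of-a-mile figure checks out with back-of-the-envelope arithmetic; the sketch below assumes both runners hold a perfectly even pace over the standard 26.219-mile distance.

```python
# Distance between a 2:00:00 finisher and a 2:02:57 runner at the moment the
# faster one crosses the line, assuming perfectly even pacing for both.
MARATHON_MILES = 26.219

record_minutes = 2 * 60 + 2 + 57 / 60   # 2:02:57 -> 122.95 minutes
barrier_minutes = 2 * 60                # 2:00:00 -> 120 minutes

record_pace = MARATHON_MILES / record_minutes         # miles per minute
gap = MARATHON_MILES - record_pace * barrier_minutes  # miles still to run
print(f"{gap:.2f} miles")  # ~0.63, i.e. more than six-tenths of a mile
```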
It wasn’t long before it struck me that chess seemed to be a game for the young. When my daughter began doing scholastic tournaments, I would chat up other parents and ask whether they played—usually the reply was an apologetic shrug and a smile. I would explain that I too was learning to play, and the resulting tone was cheerily patronizing: Good luck with that! Reading about an international tournament, I was struck by a suggestion that a grandmaster had passed his peak. He was in his 30s. We are used to athletes being talked about in this way. But a mind game like chess?
But what to do when work ended? He loved Jane Jacobs’s evocations of Greenwich Village, with its friendly shop owners and its “ballet” of the city streets. But, he said, “I’d end up going to a bar and just sitting there, talking to a bartender and staring at Twitter.” A thought surfaced: I’m surrounded by people and things to do, and yet I’m so fucking bored and lonely.
All of this seemed very far away on a Sunday night this winter, in the basement of a renovated four-story brownstone in the Crown Heights neighborhood of Brooklyn. The building, Kennedy’s new home, is run by the co-living startup Common, which offers what it calls “flexible, community-driven housing.” Co-living has also been billed as “dorms for grown-ups,” a description that Common resists. But the company has set out to restore a certain subset of young, urban professionals to the paradise they lost when they left college campuses—a furnished place to live, unlimited coffee and toilet paper, a sense of belonging.
In “Little Labors,” a highly original book of essays and observations, Rivka Galchen writes, “The world seemed ludicrously, suspiciously, adverbially sodden with meaning.” The birth of her daughter, she observes, “made me again more like a writer . . . precisely as she was making me into someone who was, enduringly, not writing.”
This brilliantly described state is familiar to me, as is her experience of maternal sleeplessness and dim memory of Russian formalism. Galchen writes, “Another problem with being the mother of a baby is loneliness.” What friends we might have been, she and I, pushing our strollers around together. I fantasized that we might have breast-fed while watching the sexy antihero Louis C.K., recited the poetry of Sei Shōnagon, talked about the new Jenny Offill novel that we couldn’t put down, shared our well-worn copies of Jane Bowles. But probably we are both too guarded with our time; we would have seen each other at the park, and looked away.
I suspect that in the UK Baudelaire is more nodded to respectfully than actually read. This is a pity, because he could be said to have been the first modern poet: TS Eliot thought so, saying he was “the greatest exemplar in modern poetry in any language”.
Andrew Michael Hurley likes to know which part of The Loney unsettles you the most. It’s different for everyone: is it the setting – a decrepit house on a “wild and useless length of English coastline”? Or the occasional touch of gothic – a girl’s face glimpsed in a window, an effigy of Jesus found hanging in a wet wood, a crown of thorns topping a sheep skull? Or simply the relentless tension – who will go the furthest to cure a mute boy: his fanatical mother or the presence she believes is God, but the reader knows, deep down, is something else entirely?
When did individual writers begin to use word processors? As I began work on a literary history of word processing, I found it difficult to establish a time line. Sometimes writers kept a sales record—a word processor or computer would have represented a significant investment, especially back in the day. Other times, as with Stanley Elkin or Isaac Asimov, the arrival of the computer was of such seismic importance as to justify its own literary retellings. But most of the time there were no real records documenting exactly when a writer had gotten his or her first computer, and so I had to rely on anecdote, detective work, and circumstantial evidence.
Last year, after nearly a decade of long sojourns in Berlin, I signed the lease on an apartment in a pre-World War I, or altbau, building on a tree-shaded block just off Güntzelstrasse, in a quiet neighborhood southwest of the city center.
Although I was vaguely aware that the city’s Jewish community had once been centered here, I found it unsettling to discover that Nazi terror had unfolded just outside my front door. Beginning in 1942, the Gestapo arrested dozens of Jews on my street, Jenaer Strasse, and shipped them to Theresienstadt and Auschwitz, where almost all were killed.
The first salvo was a missile launch by the Chinese in 2007 that blew up a dead satellite and littered space with thousands of pieces of debris. But it was another Chinese launch three years ago that made the Pentagon really snap to attention, opening up the possibility that outer space would become a new front in modern warfare.
This time, the rocket reached close to a far more distant orbit — one that’s more than 22,000 miles away and just happens to be where the United States parks its most sensitive national security satellites, used for tasks such as guiding precision bombs and spying on adversaries.
The flyby served as a wake-up call and prompted the Defense Department and intelligence agencies to begin spending billions of dollars to protect what Air Force Gen. John Hyten in an interview called the “most valuable real estate in space.”
I now see Austen as a very dark writer and Mansfield Park as her darkest work, a book full of sexual repression and unconscious conflict, with no forgiveness or redemption for anyone who dares struggle against the social code. The world of taffeta and lace exists only on the surface; underneath it, these well-bred young women are trapped like rats. This fact is made most vivid in the scene where Fanny joins a party of friends and family on their visit to Sotherton Court, the home of her cousin Maria Bertram’s wealthy but deathly boring fiancé, James Rushworth, whose extensive grounds include a bowling green, lawns bounded by high walls, pheasants, a wilderness, a terrace walk, iron palisades, and a small wood.
From the written letter to online commentary, the fine art of literary hate mail endures.
Like many rock acts, she’d had years to dream up her world-shut-your-mouth arrival, then suddenly had no time at all to patch together a follow-up.
But here’s the thing: I like moist. And not just because of good associations with the groundbreaking Moistworks blog, either. I think moist just needs better PR.
In a haze of cigarette smoke and laughter, a full spectrum of my Parisian neighbors filled the crowded tables: a vivacious fashionista in huge red eyeglasses chatting with a Franco-African woman in a motorcycle jacket; a male couple in Ray-Bans cuddling at the next table. Behind me, two older women clinked wineglasses and leaned forward to hear each other amid the funk-music groove.
Twilight was falling. A Parisian night was about to flare into fullness.
On a bitter, soul-shivering, damp, biting gray February day in Cleveland—that is to say, on a February day in Cleveland—a handless man is handling a nonexistent ball. Igor Spetic lost his right hand when his forearm was pulped in an industrial accident six years ago and had to be amputated. In an operation four years ago, a team of surgeons implanted a set of small translucent “interfaces” into the neural circuits of his upper arm. This afternoon, in a basement lab at a Veterans Administration hospital, the wires are hooked up directly to a prosthetic hand—plastic, flesh-colored, five-fingered, and articulated—that is affixed to what remains of his arm. The hand has more than a dozen pressure sensors within it, and their signals can be transformed by a computer into electric waves like those natural to the nervous system. The sensors in the prosthetic hand feed information from the world into the wires in Spetic’s arm. Since, from the brain’s point of view, his hand is still there, it needs only to be recalled to life.
“To rebel is justified,” the Great Helmsman intoned. He named his teenage followers Red Guards, and it was they who packed Tiananmen Square, waving copies of the Little Red Book filled with his sayings as they stood in their millions for a brief sight of him. Like their western contemporaries who encountered the Beatles, they told each other that their lives were changed. But Mao Zedong’s Cultural Revolution also had a darker side. It was necessary to destroy the bourgeois past, and this involved the wholesale looting of shrines, the destruction of books and parchment, the smashing of ornaments and the pillaging of homes belonging to the wealthy.
You’ve probably heard that “breakfast is the most important meal of the day.”
What you may not know is the origin of this ode to breakfast: a 1944 marketing campaign launched by Grape Nuts manufacturer General Foods to sell more cereal.
“I think I started every interview with: Tell me how you met Bill Cosby.” Noreen Malone, a senior editor at New York magazine, didn’t plan the question ahead of time. As she set out to interview the 35 women accusing Cosby of sexual assault for New York’s July 2015 cover story, Malone had other questions on her mind, like would the alleged victims speak to her at all? Could she get them to open up? But once she began interviewing the women, one by one, Malone realized that this question—neutral yet probing, simple yet cutting straight to the core of the narrative—was the perfect place to begin a painful discussion. “I let them choose the starting point for the story,” she says. “It just put it on their terms. And it just went from there.”
Malone is a magazine writer, not an oral historian. But her working method for the Cosby story could have been pulled straight from the oral historian’s handbook. Ask open-ended questions. Get people talking, and keep them talking. The women, to Malone’s surprise, did just that. And the more they talked, filling 232 pages of transcripts, the more Malone realized her voice, the writer’s voice, would only get in the way. “The flow of a feature didn’t feel quite right for it,” she says. “To me, what was so effective was hearing from the women themselves and having that be as undiluted as possible.”
It’s hard to imagine a more unlikely novelist than Samuel Richardson. The son of a carpenter, he attended school only intermittently until he was seventeen, when his formal education ended and he was apprenticed to a printer. He didn’t publish his first novel until after he turned fifty. The undertaking was almost accidental. He had become the proprietor of a printing press when, in 1739, two London booksellers asked him to put together a “letter-writer,” an etiquette manual consisting of letters that “country readers” might use as models for their own correspondence.
Louise Erdrich’s new novel, “LaRose,” begins with the elemental gravitas of an ancient story: One day while hunting, a man accidentally kills his neighbor’s 5-year-old son.
Such a canyon of grief triggers the kind of emotional vertigo that would make anyone recoil. But you can lean on Erdrich, who has been bringing her healing insight to devastating tragedies for more than 30 years. Where other writers might have jumped from this boy’s death into a black hole of despair — or, worse, slathered on a salve of sentimentality — Erdrich proposes a breathtaking response.
She is not, you see, just a great food writer. She is a great writer, full stop. For her, food cannot be separated from the rest of life. A hunger for mayonnaise comes from the same place as a hunger for love. It’s just a good deal easier to satisfy.
But something troubling has emerged on the American scene: Political activity has become a hobby. Voting, petitioning, partisan cheering, donating, watching infotainment news: The chief purpose for participating in politics seems to be self-gratification.
On the face of it, Madonna In A Fur Coat is just a largely unrequited love story set in the crowded streets and seedy cabarets of 1920s Berlin. Protagonist Raif, who is no Heathcliff - he's often described as being more of a girl than a man - has been bewitched by the feisty feminist artist Maria, alias the Madonna in a Fur Coat, and they embark upon an intense, platonic love affair.
It doesn't sound very 21st Century - yet for the past three years the book has topped the bestseller lists. And its readers are Turkey's youth. When Filiz goes into schools and talks about the book to teenagers she sees the boys, as well as the girls, cry.
This is not a book filled with sweet, traditional depictions of motherhood; like Ferrante’s narrator, many of the female artists reckon with motherhood as a role that could physically and mentally deter them from creating art.
Perception vs. reality is a venerable versus. From prisoners in Plato’s cave mistaking shadows for reality, to studies about the unreliability of eyewitnesses, the difference between what is and what we perceive has been a problem for thousands of years.
When it comes to eating and drinking, most of us generally assume that what we taste and smell is what’s there in the food. In fact it’s not. Fortunately, the categorical accuracy of what is or isn’t there is less important at the table than in the witness box. What actually matters at the table is perception. Perception is king when we’re eating and enjoying. It is its own reality.
The internet started out as the Information Highway, the Great Emancipator of knowledge, and as an assured tool for generating a well-informed citizenry. But, over the past 15 years, that optimism has given way to cynicism and fear — we have taught our children that the net is a swamp of lies spun by idiots and true believers, and, worse still, polluted by commercial entities whose sole aim is to have us click to the next ad-riddled page.
Perhaps our attitude to the net has changed because we now see how bad it is for knowledge. Or perhaps the net has so utterly transformed knowledge that we don’t recognize knowledge when we see it.
For philosopher Michael P. Lynch, our fears are warranted — the internet is a wrong turn in the history of knowledge. “Information technology,” Professor Lynch argues in his new book, The Internet of Us, “while expanding our ability to know in one way, is actually impeding our ability to know in other, more complex ways.” He pursues his argument with commendable seriousness, clarity, and attunement to historical context — and yet he misses where knowledge actually lives on the net, focusing instead on just one aspect of the phenomenon of knowledge. He is far from alone in this.
With a trial about to begin, lurid and alarming details of the billionaire’s condition and the scheming around him continue to emerge. Many questions will arise in the courtroom—and control of CBS and Viacom could ultimately hang in the balance.
Anyone who's lost a family member knows the feeling of unreality that follows. Psychologists call it "denial," but it's something more than that — it's a sense that you're not really there, that you're living in an alternate world, that the pain you're feeling can't possibly be real. Grief is a powerful thing, and it can temporarily turn people into walking ghosts.
Or, as Dana Cann writes in his debut novel, Ghosts of Bergen County: "This was life: you're here. And this was death: you're not. And then you're here again, haunting some stranger. And none of it matters."
How, for instance, does one disentangle ego from moral action? When does kindness shade into selfishness, and does it matter if it does? At what point does thinking you know what to do become arrogance, or hubris?
The battle lines among American feminists over selling sex were drawn in the 1970s. On one side were radical feminists like the writer Andrea Dworkin and the lawyer and legal scholar Catharine MacKinnon. They were the early abolitionists, condemning prostitution, along with pornography and sexual violence, as the most virulent and powerful sources of women’s oppression. “I’ve tried to voice the protest against a power that is dead weight on you, fist and penis organized to keep you quiet,” wrote Dworkin, who sold sex briefly around the age of 19, when she ran out of money on a visit to Europe.
Other feminists, who called themselves “sex positive,” saw sex workers as subverters of patriarchy, not as victims. On Mother’s Day 1973, a 35-year-old former call girl named Margo St. James founded a group in San Francisco called Coyote, for “Call Off Your Old Tired Ethics.” Its goal was to decriminalize prostitution, as a feminist act. In its heyday, Coyote threw annual Hooker’s Balls, where drag queens and celebrities mixed with politicians and police. It was a party: In 1978, a crowd of 20,000 filled the city’s Cow Palace, and St. James entered riding an elephant.
At the center of this story there is a terrible secret, a kernel of cyanide, and the secret is that the story doesn’t matter, doesn’t make any difference, doesn’t figure. The snow still falls in the Sierra. The Pacific still trembles in its bowl. The great tectonic plates strain against each other while we sleep and wake. Rattlers in the dry grass. Sharks beneath the Golden Gate. In the South they are convinced that they have bloodied their place with history. In the West we do not believe that anything we do can bloody the land, or change it, or touch it.
The biggest surprise in this first collection of short stories by the bestselling novelist Mark Haddon is not that he can write compellingly in short form too. The shock of The Pier Falls is in the darkness of subject matter Haddon chooses to bring to centre stage.
Babies in art mostly look nothing like babies in life. This is especially true of the baby Jesus, but also of babies more broadly, and this is true even, and maybe most noticeably, in paintings and sculptures that are, apart from the oddly depicted babies, realistic.
The literary history of the early years of word processing—the late 1960s through the mid-’80s—forms the subject of Matthew G. Kirschenbaum’s new book, Track Changes. The year 1984 was a key moment for writers deciding whether to upgrade their writing tools. That year, the novelist Amy Tan founded a support group for Kaypro users called Bad Sector, named after her first computer—itself named for the error message it spat up so often; and Gore Vidal grumped that word processing was “erasing” literature. He grumped in vain. By 1984, Eve Kosofsky Sedgwick, Michael Chabon, Ralph Ellison, Arthur C. Clarke, and Anne Rice all used WordStar, a first-generation commercial piece of software that ran on a pre-DOS operating system called CP/M. (One notable author still using WordStar is George R.R. Martin.)
I heard the phrase “go off on a tangent” before I learned the geometrical meaning of tangent, defined by Wikipedia as a line “that ‘just touches’” a curve at one point. (I’m told that on math, Wikipedia is very good, which is to say dependably useful and accurate, without obvious bias. Yet I am seduced by the subjectivity of those scare quotes; they conjure up a phantom voice.) It’s an elegant metaphor, if hyperbolic, to borrow another term from geometry, for digression — the line that connects at a single point only and then extends off into infinity.
That idiom “go off on” makes a tangent sound undesirable, like an unprovoked rant. But in writing or in conversation, the tangent may be the most interesting part — the unexpected, insuppressible part. It is not a non sequitur — literally, “does not follow” — with its attendant loss of logical continuity. The digression does follow, it just doesn’t necessarily come around and connect back up.
When Smith slows down and lingers over the details of the scene at hand, the prose sings, and we are once again convinced that the strange story of Eliot and Cross could only be shaped by Smith’s deft hand.
We don’t binge on television because we like it; we like television—more than movies—because we can binge on it.
Can you really teach people to write? I might pose a parallel question: do we really teach people to be philosophers or mathematicians? Don’t we, instead, hand over to our budding philosophers or mathematicians a few basic tools that permit them to self-evolve?
Perhaps what we really mean when we say we can’t teach writing is that we can’t teach someone to be Virginia Woolf. On the other hand, a number of our most accomplished writers were, early in their lives, enrolled in writing courses: Eugene O’Neill, Tennessee Williams, Arthur Miller, Wallace Stegner, and Flannery O’Connor, to name a few.
Once upon a time, there was a city where people looked at each other or their surroundings instead of their cellphones. Where they spoke to one another, rather than making some clever online comment. Back then, in the 1970s, Manhattan was still rough, and not just on the edges, but through and through, it seemed. And so it was a time when street photographers could roam the streets and find inspiration all around, engaging with their subjects to capture moments of subtlety, beauty and serendipity.
This was the New York that Carrie Boretz discovered when she started working in 1975 as a photo intern at The Village Voice. There was enough to be found on the street that she did not have to make a living doing still lifes or fashion. She wanted real life.
Greg Milner’s Pinpoint tells the story of how we were navigated into this situation: conventional methods of dating place us in the 2016th year of the Common Era, but for him the clock really began ticking (if oscillating caesium atoms do, in fact, tick) at midnight on 5 January 1980, when, synchronised to UTC (Coordinated Universal Time, as determined by an averaging of more than 200 atomic clocks worldwide), the GPS system went live.
“If the structure of the universe is a result of a pattern of vibration, what causes the vibration?” Stephon Alexander asks in his new book, “The Jazz of Physics.” And does that vibration mean that the universe is “behaving like an instrument?” In the most engaging chapters of this book — part memoir, part history of science, part physics popularization and part jazz lesson — Dr. Alexander ventures far out onto the cutting edge of modern cosmology, presenting a compelling case for vibration and resonance being at the heart of the physical structure we find around us, from the smallest particle of matter to the largest clusters of galaxies.
When theories get too complex, scientists reach for Ockham’s Razor, the principle of parsimony, to do the trimming. This principle says that a theory that postulates fewer entities, processes or causes is better than a theory that postulates more, so long as the simpler theory is compatible with what we observe. But what does ‘better’ mean? It is obvious that simple theories can be beautiful and easy to understand, remember and test. The hard problem is to explain why the fact that one theory is simpler than another tells you anything about the way the world is.
Jimmy MacDonald, a trained hydrologist and guide who had delivered a talk to the passengers on the science of sea-ice formation the day before, aimed his shotgun at the water off the stern and unloaded five shots in quick, rhythmic succession: BANG, snick-snick, BANG, snick-snick, BANG, snick-snick, BANG, snick-snick, BANG. The ocean swallowed each slug within seconds, the expanding ripples erased by the waves.
If live-fire target practice isn’t quite what you’d expect on board a cruise ship under sail, that’s fitting — Canada’s Arctic waters are not your usual tourist destination. But a growing number of small cruise ships are heading to the region each year, and the Northwest Passage — the fabled waterway that for centuries claimed hundreds of explorers’ lives — is an especially powerful draw.
Julian Barnes makes this traumatic event central to his fictionalized portrait of Shostakovich in his ambitious but claustrophobic new novel, “The Noise of Time.” It’s a book that attempts to turn the composer’s complex relationship with the Soviet authorities into an Orwellian allegory about the plight of artists in totalitarian societies — and a Kafkaesque parable about a fearful man’s efforts to wrestle with a surreal reality, even as he questions his complicity with the system.
This rainbow-flag polity, Plato argues, is, for many people, the fairest of regimes. The freedom in that democracy has to be experienced to be believed — with shame and privilege in particular emerging over time as anathema. But it is inherently unstable. As the authority of elites fades, as Establishment values cede to popular ones, views and identities can become so magnificently diverse as to be mutually uncomprehending. And when all the barriers to equality, formal and informal, have been removed; when everyone is equal; when elites are despised and full license is established to do “whatever one wants,” you arrive at what might be called late-stage democracy. There is no kowtowing to authority here, let alone to political experience or expertise.
The very rich come under attack, as inequality becomes increasingly intolerable. Patriarchy is also dismantled: “We almost forgot to mention the extent of the law of equality and of freedom in the relations of women with men and men with women.” Family hierarchies are inverted: “A father habituates himself to be like his child and fear his sons, and a son habituates himself to be like his father and to have no shame before or fear of his parents.” In classrooms, “as the teacher ... is frightened of the pupils and fawns on them, so the students make light of their teachers.” Animals are regarded as equal to humans; the rich mingle freely with the poor in the streets and try to blend in. The foreigner is equal to the citizen.
And it is when a democracy has ripened as fully as this, Plato argues, that a would-be tyrant will often seize his moment.
From Mexico City's Zócalo to Rome's Piazza Navona, public squares have always been a vibrant part of urban life. After visiting Italy a few years back, editor Catie Marron began thinking about the different roles these public spaces have played. She asked some well-known writers to share their thoughts about famous squares around the world, and the resulting essays are gathered in a new book called City Squares.
But our theory of the universe, called biocentrism, in which life and consciousness create the reality around them, has no space for death at all. To fully understand this, we need to go back to Albert Einstein’s theory of relativity, one of the pillars of modern physics. An important consequence of his work is that the past, present and future are not absolutes, demolishing the idea of time as inviolable.
Still, I backslide. This year I felt like I should do some special piece, something clever, something with real ambition—a monument to Jen and the art that inspired both of us, or something. The closer I got to the day the more I resisted getting started. A couple of weeks ago, I realized I was doing it again—trying to shape the narrative of my life, and give numbers and calendar pages power over me, instead of just living my life and letting the world speak to me and through me.
Again, as is so often the case, as soon as I let go, things started to happen, seemingly at random—and they all helped me process the loss and feel the weight of the passage of time. The signs were there, the symbols were there. They didn’t “mean” anything in any conventional Fiction 101 sense. They were just there, like shapes in clouds.
My local flea market in Florence is a rich affair, from dusty glass chandeliers to the everyday detritus of living - entire houses cleared, including private correspondence. Here were someone's emotions going cheap. The paper was thin, crisp with age, densely written on both sides. The date: November 1918. The challenge was two-fold. First, the letters were in Italian - of course. While I can negotiate markets and even hold my own in a one-to-one about history, letters written in dialect 100 years ago would be tough.
But bigger than that was the hurdle of the handwriting. It was tiny with hardly any room between the lines, as if a conscientious ant had climbed out of the inkpot then wound its way across every millimetre of the page. Not even any crossing out. If this was love, he (and I could make out enough to know it was a he) was very sure about it.
With my left hand sunk in a bowl of ice water, I chant a mantra to deliver me from suffering: [expletive deleted]. [expletive deleted]. [expletive deleted]. The words are emotionless and delivered at speaking-voice volume, a counterpart to a moment ago when I repeated “spoon” instead.
I’m trying my own homespun replica of an experiment in which a British psychologist studied the narcotic power of expletives: Uttering curse words led study participants to report less pain and endure the frigid water for about 40 seconds longer than when they spoke neutral words.
Dreams of fast commercial airline travel recur every so often. Over the past year, we’ve seen talk from NASA about a new generation of X-planes to demonstrate technologies needed for a supersonic Concorde replacement. And Lockheed Martin is discussing a commercial transport derivative from its hypersonic weapons and spy plane programs. The world is more than ready to hear such talk; for many, it’s frustrating, even baffling, that airline travel (Concorde excepted) has been stuck at speeds just over Mach 0.8 since the dawn of the jet age.
Alas, the latest dreams are no more real than previous ones. The obstacles to fast jet travel remain very high, and if anything they’re getting higher.