The ancient Egyptians did it, and so do we. Here's how a leap day—which occurs February 29—helps keep our calendars and societies in sync.
A novel starring a novelist can often seem a little pleased with itself, as if the author is looking over her shoulder, eager to make a great drama from a greatly uneventful thing. So it was with some trepidation that I picked up “Ways to Disappear,” the first novel by the poet and translator Idra Novey. The protagonist, Emma Neufeld, is a Portuguese-to-English translator devoted to the work of a cult-classic Brazilian writer. Novey herself translates from Portuguese to English, most recently the work of Clarice Lispector, the cult-classic Brazilian writer.
But Novey has wholly eluded the hazards of writing about writers. Instead, this lush and tightly woven novel manages to be a meditation on all forms of translation while still charging forward with the momentum of a bullet.
“Why do most of us feel that we are something more than molecules?” asks Mark Haddon, author of The Curious Incident of the Dog in the Night-Time, in his engaging introduction to this compelling collection drawn from literature, science, philosophy and art, ranging back 500 years and tackling the thorny question of what consciousness actually is. “We are made of the same raw materials as bacteria, as earth, as rock, as the great dark nebulae of dust that swim between the stars, as the stars themselves”, writes Haddon, introducing extracts that explore how the sense of being made of something immaterial, too, has long haunted humans.
Gone With The Mind is Leyner's new novel. A novel that is, by turns, autobiographical, fictional, touching and just flat-out insane. It takes the form of a writer named Mark Leyner giving a reading in a mall food court — one to which no one has shown up except for two fast-food employees on their break, and Mark's mom, who arranged the reading and drove him there. It's Mark's mom who begins the book with a long, rambling, deliberately saccharine introduction, transcribed verbatim (as is everything else that happens — there is no narrator, just some passing, italicized notes that read almost like stage directions) and running on forever. So long that you get that it's a joke, then get annoyed by the joke, then come to a place of grudging respect for the author for his commitment to the gag, then get annoyed all over again. And then it ends. Then Mark's opening remarks begin.
Trust us, this blog was scheduled long before the unpleasantness, but given the amount of confusion that a certain supermarket’s decision to straighten a morning pastry has caused, it is timely (indeed, of great relief to the nation) that How to Eat will now definitively settle what constitutes the perfect croissant.
Please do not get too twisted up below the line. Ensure you can prove your point. Flaky, pain-ful arguments will not butter-up your fellow contributors. They will make you look like a cronut.
I’ve spent the last few months talking to more than 40 researchers, development practitioners, foundation employees and other Silicon Valley philanthropists, asking them about the difficult business of giving money away. They told me about their own Newarks: Promising ideas scaled into oblivion, donations that disappeared into corrupt governments, groupthink disguised as insight. But they also told me about projects that worked, that scaled, that matched the ambitions of the new philanthropy while avoiding its blind spots. And it turns out that some of the best ideas are the ones Zuckerberg is the least likely to hear in Silicon Valley.
Calculus and higher math have a place, of course, but it’s not in most people’s everyday lives. What citizens do need is to be comfortable reading graphs and charts and adept at calculating simple figures in their heads. Ours has become a quantitative century, and we must master its language. Decimals and ratios are now as crucial as nouns and verbs.
Good on Paper is a multilayered, cleverly structured novel. The balance between an emotionally engaging tale of family on the one hand and an intellectual exploration of translation on the other is not always perfect, but, despite this, Cantor creates a playful and rewarding read.
One of the problems with globalisation — from the earliest times to its most recent avatar — is the obsessive pressure toward a linear homogeneity of structures and narratives. At the same time, cultures in conflict often invoke their essential heterogeneity in a carnivalesque pageant offering resistance and seeking identities through difference. In our own time, with its post-industrial traces, Europe is processing itself from the homogeneity of several post-enlightenment nation-states to a larger heterogeneous political and cultural entity called the European Union; and India, characterised by a historically established heterogeneity, is permanently struggling against homogenising tendencies that seek to unsettle its constitutionally established secular diversity. This struggle is easily borne out by the cultural contours of language in everyday use as well as in its literary contexts. Perhaps it is a truism to state that there is an unseen link between colonialism and monolingualism. Multiculturalism and multilingualism, in the eyes of imperial powers, are nothing but unchaste and impure.
Indeed, scientists who study the mechanics of curiosity are finding that it is, at its core, a kind of probability algorithm—our brain’s continuous calculation of which path or action is likely to gain us the most knowledge in the least amount of time. Like the links on a Wikipedia page, curiosity builds upon itself, every question leading to the next. And as with a journey down the Wikipedia wormhole, where you start dictates where you might end up. That’s the funny thing about curiosity: It’s less about what you don’t know than about what you already do.
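The "probability algorithm" the passage describes is concrete enough to sketch in code. Below is a minimal toy version, mine rather than the researchers': score each candidate next step by its expected information gain (in bits) divided by its time cost, and greedily take the best. Every action name, probability, and time cost here is invented for illustration.

```python
# A toy version of curiosity as a "probability algorithm": greedily
# pick the next action with the highest expected information gain
# per unit time. All numbers below are made up.
import math

def entropy(dist):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def expected_info_gain(prior, posteriors, outcome_probs):
    """Expected drop in uncertainty about a question after an action."""
    expected_after = sum(w * entropy(post)
                         for w, post in zip(outcome_probs, posteriors))
    return entropy(prior) - expected_after

# A yes/no question we're 50/50 on, and two candidate next steps,
# each with its possible resulting beliefs and a time cost.
prior = [0.5, 0.5]
actions = {
    "skim_wiki_page": {"posteriors": [[0.9, 0.1], [0.1, 0.9]],
                       "outcome_probs": [0.5, 0.5], "time_cost": 1.0},
    "read_whole_book": {"posteriors": [[0.99, 0.01], [0.01, 0.99]],
                        "outcome_probs": [0.5, 0.5], "time_cost": 20.0},
}

# Greedy rule: the most knowledge in the least amount of time.
best = max(actions, key=lambda a: expected_info_gain(
    prior, actions[a]["posteriors"], actions[a]["outcome_probs"]
) / actions[a]["time_cost"])
print(best)  # -> skim_wiki_page: less total information, but far cheaper
```

On these toy numbers the quick skim wins even though the book teaches more in total, which is the Wikipedia-wormhole dynamic in miniature: cheap, promising links keep winning the per-minute comparison, so one question keeps leading to the next.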
“The imagination of a novelist has everything to do with what happens to his material,” he said. As the speech neared its end, however, it became clear that the two novels Baldwin had already written, and the ones he had yet to write, were part of this hypothetical oeuvre. “Go Tell It on the Mountain” was only his first attempt.
At first, I was taken aback by the lack of incident in Mona Awad’s otherwise absorbing new novel, 13 Ways of Looking at a Fat Girl. The protagonist, a woman named Elizabeth living in southern Ontario, simply grows up, gains weight, loses it, gets married, gets divorced. That’s it.
Few novelists are comfortable with a plot this quiet. To sustain it you have to construct either a narrator of unusual reflective capabilities, or one with an undeniably interesting characteristic, something any reader wants to know more about. Awad opts for the latter. To put it bluntly, Elizabeth is mostly interesting because she is – as the title says – fat.
With the advent of social media, forgetting can seem like an obsolete danger: Facebook timelines and chat logs have externalized the burden of remembering. But a different and more pernicious kind of forgetfulness looms in Petina Gappah’s first novel, “The Book of Memory,” whose narrator grapples with the threat of erasure.
Before social media, lonely people would just scream into their toilets. But, now, the toilets talk back.
So I’ve been hunched over my phone morning, noon, and night. Tonight started with a text: dinner plans had changed. I thought about canceling the plans, which is one of the few pleasures left in life. But I had spent one too many weekends peering at my laptop screen. It was time to practice being human. The detour led me down a tree-lined street in Brooklyn.
I'm planning a move from a spacious apartment to a compact 500-square-foot space. It's exciting — at least that's what I'm telling myself. The truth is that since my mother passed away in November, after living with me for the last decade, the house feels too empty. I don't need a space big enough to accommodate a family. Small is beautiful, right? And the chance to design a compact "great" room is a welcome excuse to spend too much time on Pinterest and Houzz.com. I spend all my time in the kitchen and living room anyway — so I'm creating a space that's both, and has the benefit of opening to the garden instead of being on the second floor.
But what do I do with the books?
When I was twenty-three, I was hired by the CIA. I was working at a Catholic school at the time, coaching squash and teaching seventh-grade social studies—which was funny, since I had never seen a squash game before and was not even so much as a lapsed Catholic. I lived behind the school in a former convent where the only consistently functioning lights were a pair of glowing red exit signs. My prevailing feeling that year was one of intense personal absurdity, and it was in this spirit that I applied to the CIA (I liked international relations, and who knew they had an online application?) and the Iowa Writers’ Workshop (I liked writing stories, and what the hell?). These things certainly didn’t make any less sense than coaching squash and living in a convent—though they weren’t really ambitions as much as gestures: reflections of my general hope that I would, someday, do something else. Each was something in between a dice roll and a delusion, a promissory note and a private joke to no one but myself.
Later, it turned out that this was a lot like what writing a novel would feel like.
For thousands of years, human beings have relied on stone tablets, scrolls, books or Post-it notes to remember things that their minds cannot retain, but there is something profoundly different about the way we remember and forget in the internet age. It is not only our memory of facts that is changing. Our episodic memory, the mind’s ability to relive past experiences – the surprising sting of an old humiliation revisited, the thrill and discomfort of a first kiss, those seemingly endless childhood summers – is affected, too.
I don’t know if there is a statute of limitations on confessing one’s sins, but it has been six years since I did the deed and I’m now coming clean.
Six years ago I submitted a paper for a panel, “On the Absence of Absences,” that was to be part of an academic conference later that year—in August 2010. Then, as now, I had no idea what the phrase “absence of absences” meant. The description provided by the panel organizers, printed below, did not help. The summary, or abstract, of the proposed paper was pure gibberish, as you can see below. I tried, as best I could within the limits of my own vocabulary, to write something that had many big words but which made no sense whatsoever. I not only wanted to see if I could fool the panel organizers and get my paper accepted, I also wanted to pull back the curtain on the absurd pretensions of some segments of academic life. To my astonishment, the two panel organizers—both American sociologists—accepted my proposal and invited me to join them at the annual international conference of the Society for Social Studies of Science to be held that year in Tokyo.
The Moravian Book Shop in Bethlehem, Pennsylvania, is still thriving, some 270 years after its founding.
It takes more than just policies to make a workplace truly flexible. The whole office culture has to change.
But who’s to say machines don’t already have minds? What if they take unexpected forms, such as networks that have achieved a group-level consciousness? What if artificial intelligence is so unfamiliar that we have a hard time recognising it? Could our machines have become self-aware without our even knowing it? The huge obstacle to addressing such questions is that no one is really sure what consciousness is, let alone whether we’d know it if we saw it.
Three days before the wedding, when I was riding my bike to pick up my wedding dress, a car door opened in front of me. The thing I most recall about the accident is not the moment of impact but just after. How I imagine my brain — coiled and white — pitching forward in fluid. How I sit up and hold my head as if to stop my brain from hitting the cranium. How when I break my eyes open, there is a man in glasses next to me pulling me to my feet. He is the same man who flung his car door in front of me causing me to crash, so I brush his hands away and say, I am fine, I am going to ride away now. I straddle my seat and push my feet on the pedals but the wheels do not move. The man lurches up to me. He avoids my eyes. He pins the front wheel between his knees. He yanks the crooked handlebar back into place.
Next, I am a reveler, walking along the city sidewalk, trailing my bike behind me as people in business suits and winter hats brush past me inflamed with purpose. Then I stand still at the corner, mesmerized by the street signs — Madison, Halsted — because not only do I not recognize the names, it suddenly dawns on me: I have no idea where I came from or where I was going, what city I am in, what my name is, and I do not even know the year.
Sometimes, the best approach to a book about deadly pathogens is to read it in a slightly dissociated state — it allows you to marvel at the cunning adaptability of microbial life, rather than contemplate the far creepier possibility of your own doom. In her introduction to “Pandemic: Tracking Contagions, From Cholera to Ebola and Beyond,” Sonia Shah, a science writer whose previous books include “The Fever” and “The Body Hunters,” does not seek shelter in euphemisms or shy away from scary numbers. Instead, she cites a study in which 90 percent of epidemiologists say they believe a global pandemic will sicken one billion and kill up to 165 million within the next two generations.
Imagining Stephen King as a doctor of dental surgery may seem strange at first, but it’s an image that is easy to conjure — King is, after all, the Master of Horror, and few things provoke more fear and anxiety than a trip to the dentist. The idea of King donning scrubs and latex gloves while fiendishly grinning at a helpless patient is perhaps a metaphorical stretch with respect to how his writing is received by his Constant Reader, but King’s latest collection of short stories, The Bazaar of Bad Dreams, gives gruesome life to this illustration.
Whereas most biographers tend to focus on Doyle’s creation of Holmes, treating his other characters as a crowd of literary extras, Ashley sets out to describe the shape of his whole career. From the early manuscripts that Doyle called “paper boomerangs”, because they were returned to him so quickly, to his later fame and riches, the writer who emerges from this account is a restless figure who both enjoyed and slightly resented his most famous creation. Like many writers who hit a public nerve, he seems to have been surprised by his success, before spending the rest of his life trying to prove that it wasn’t a fluke.
"Petaloso" is now well on its way to becoming an official Italian word thanks to an eight-year-old's imagination - and the power of social media.
(*If you're not a straight white man.)
I went back and forth a few times about David Lehman’s new book. Does the world really need another biography of Frank Sinatra? There are several long ones, and a couple of excellent short ones, too, by John Lahr and Pete Hamill. So what does Lehman offer to make Sinatra’s Century worthwhile?
In the end, I come down strongly in its favor. There is much to love here, even though, let’s face it, there’s nothing about Sinatra’s life that can’t be found in most of the other books: the rise from modest means in Hoboken, the Major Bowes success (actually, Lehman doesn’t mention Major Bowes, although there is a photo of him, unidentified, with the rest of the Hoboken Four), the bobby-soxers, the pride he generated among Italian-Americans, the mob connections, the Ava debacle, the Rat Pack, the Mia debacle, the generosity, the thuggery. But although Lehman’s reportage may be derivative, it turns out to be the good kind of derivative; this is a compact-yet-complete portrait of a complicated guy who lived a long and active life; a guy whom Lehman calls “the most interesting man in the world.”
Lehman is a poet, and in structure and occasionally in style his book resembles an epic poem. In an afterword, he reveals that the strategy he chose, 100 discrete sections in honor of Sinatra’s 100th birthday (December 12, 2015), provided the answer to his own insecurities regarding purpose. “How do you write about someone who has provoked so many other writers, journalists, and novelists to erupt into prose?” Well, why not erupt into poetry, instead?
Charred, crispy textures and ingredients -- no longer limited to the world of barbecue -- are blackening everything from bread to vegetables. Not even dessert is safe. Yeah, it may look, feel and taste a little gritty, even a little ashy, but that's kind of the point, chefs say.
Cooking dinner can be stressful whatever your hours, and it is hard to do as often as you’d like.
Try cooking breakfast instead.
It is a radical upheaval, a national reckoning with massive social and political implications. Across classes, and races, we are seeing a wholesale revision of what female life might entail. We are living through the invention of independent female adulthood as a norm, not an aberration, and the creation of an entirely new population: adult women who are no longer economically, socially, sexually, or reproductively dependent on or defined by the men they marry.
This reorganization of our citizenry, unlike the social movements that preceded it and made it possible — from abolition and suffrage and labor fights of the 19th and early-20th centuries to the civil-rights, women’s, and gay-rights movements of the mid-20th century — is not a self-consciously politicized event. Today’s women are, for the most part, not abstaining from or delaying marriage to prove a point about equality. They are doing it because they have internalized assumptions that just a half-century ago would have seemed radical: that it’s okay for them not to be married; that they are whole people able to live full professional, economic, social, sexual, and parental lives on their own if they don’t happen to meet a person to whom they want to legally bind themselves. The most radical of feminist ideas—the disestablishment of marriage — has been so widely embraced as to have become habit, drained of its political intent but ever-more potent insofar as it has refashioned the course of average female life.
On a recent morning, after checking out a pile of books at the Mid-Manhattan, I headed over to the Schwarzman to sit at an elegant wooden desk in the Allen Room, my work spread out under a pretty green-shaded desk lamp. Before long, someone’s phone pinged and the room grew tense, and I found myself missing the grubby comfort a block and a world away.
What makes swear words so offensive? It’s not their meaning or even their sound. Is language itself a red herring here?
If you’re looking at contemporary France from the outside, by means of the distorting prism of media, you could be forgiven for understanding Paris as a city structured entirely around two conflicting poles. On the one hand, you have the Fox News view of the City of Light turned lawless City of Darkness, a warzone overrun by fundamentalists and dotted with “no-go” areas of hard-core Islamist control, run by Daesh-inspired imams. On the other, you have the vision of the city propagated by films like Woody Allen’s Midnight in Paris, where the city and its heritage exist primarily as a plaything for white, Western elites who spend their never-ending vacations squandering trust funds on lavish meals and illicit affairs while pursuing kitsch quests for the Lost Generation.
The real Paris, as experienced by the vast majority of the French capital’s 2.2 million inhabitants, is considerably more mundane than these polarities suggest. Paris can be a place where extremes collide, as they did, shatteringly, at both the start and the end of 2015, but it is not a city of extremists. For the most part, Paris is a place where radical differences coexist. Watchwords generally include tolerance, cooperation, and “just getting along.” These efforts might not always take place harmoniously, not even necessarily completely respectfully (as the worrying support for the political extreme right in France demonstrates), but most people are, as they are in London, Tel Aviv, and Seoul, just trying to get by, and live their humdrum lives as modestly as possible in the context of the machinations of late capitalism. Belleville, a district in the northeast of the city, is a perfect case in point — it has a large Chinatown, but is also home to significant Jewish, Arab, and Berber populations, living cheek by jowl as they have done for decades, all contributing to the unique, vibrant texture of the local community.
Ramen bloggers aren’t just passive observers of the noodle soup phenomenon: to be a ramen writer of Kamimura’s stature, you need to live in a ramen town, and there is unquestionably no town in Japan more dedicated to ramen than Fukuoka. This city of 1.5 million along the northern coast of Kyushu, the southernmost of Japan’s four main islands, is home to 2,000 ramen shops, representing Japan’s densest concentration of noodle-soup emporiums. While bowls of ramen are like snowflakes in Japan, Fukuoka is known as the cradle of tonkotsu, a pork-bone broth made milky white by the deposits of fat and collagen extracted during days of aggressive boiling. It is not simply a speciality of the city; it is the city, a distillation of all its qualities and calluses.
Indeed, tell any Japanese that you’ve been to Fukuoka and invariably the first question will be: “How was the tonkotsu?”
From the vantage point of a 19th-century lighthouse, a small, slow ship would appear every few months on the horizon. A woman, her husband and their children might look out at the glistening sea in anticipation from their tower: the shipment was finally here. They’d haul supplies from the boat: cleaning rags, paint, milk, and, possibly the most awaited item, a thick wooden carrying case with brass hinges, filled with books.
Igor Pasternak started thinking about airships when he was twelve. Back then, in the nineteen-seventies, he loved rockets. One night, he was curled up in the soft green chair that doubled as his bed, in the two-room apartment where he lived with his parents, his little sister, and his grandmother, in the city of Lviv, in western Ukraine. He was reading a magazine aimed at young inventors, and he came across an article about blimps. He saw old photographs of imposing wartime zeppelins and read about another kind of airship, which had never made it off the drawing board: an airship that carried not passengers but cargo. It would be able to haul hundreds of tons of mining equipment to remote regions in Siberia in one go, the article said—no roads, runways, or infrastructure needed. Just lift, soar, and drop.
Igor wondered what the holdup was. He read the article again and again. He spent the summer in the library, studying the history and the aerodynamic principles of blimps. One day, on the way there, he looked into the sky, and the emptiness seized him.
Where are all the airships? he asked himself. The world needs airships.
Few inventions have so profoundly shaped consumer habits. With the exception of the automobile, the shopping cart is the most commonly used “vehicle” in the world: some 25 million grace grocery stores across the U.S. alone. It has played a major role in enriching the forces of capitalism, increasing our buying output, and transforming the nature of the supermarket — and for its role, it has been dubbed the “greatest development in the history of merchandising.”
Rarely comes the time when we sit back and consider the history of the shopping cart. But gather, friends: that time has come.
For this reader, at least, a novel is a success if it causes time to warp, to bend and deform, if it breaks time apart and puts it back together again in an interesting way. John Wray does all of the above, with wide-ranging intelligence and boundless verbal energy. Any experiment that Wray conducts is likely to be worth a reader’s time, and “The Lost Time Accidents” is certainly no exception.
How precisely does one become more creative? This is a perpetual anxiety in the C Suite, where executives lunge at advice that promises to open up their Steve Jobsian third eye. Kennedy’s book and Adam Grant’s latest, “Originals: How Non-Conformists Move the World,” try to demystify creativity, via the genre’s now traditional counterpoint of inspirational stories and counterintuitive social science. If clumsily constructed, this type of book can become self-parodic, a PowerPoint slog through the Five Things You Need to Do to Become More Dynamic and Creative.
Contempt for the nouveau riche is hardly limited to China, but the Chinese version is distinctive. Thanks to the legacy of Communism, almost all wealth is new wealth. There are no old aristocracies to emulate, no templates for how to spend. I asked some of the women on “Ultra Rich Asian Girls” about being the objects of both envy and censure. “In Web forums about the show, people are always, like, Why do they have to show off like that?” Weymi said with a shrug. “I don’t think I’m showing off. I’m just living my life.”
But in an important sense, the term is apt. On one level, the “Dark” in Dark Age refers not to artistic regression but to a shift toward darker themes, graphic violence, sexual explicitness, and a generally cynical tone, an approach commonly summed up by professionals and fans with two words: grim and gritty. This metonymical expression has inspired portmanteau neologisms like “grimdark” as well as related phrases like “darker and edgier” or its counterpart in superhero film, “grounded and realistic.” If Moore and Miller are the creators most responsible for this grim and gritty turn, both are ambivalent about its legacy.
On Mothering Sunday, 1924, with one war not long past and a second waiting over the horizon, young Jane Fairchild – foundling, maid to the Niven household in the green home counties, and the narrator and protagonist of Graham Swift’s enchanted novella – has no mother to go to. Instead she has “her simple liberty”, along with a book and half a crown in her pocket bestowed by a kindly employer who, his sons dead in France and his domestic staff reduced, is inclined to be indulgent to her youth.
Part of this is a mirror trick – I suddenly see myself peering back – but the rest of it is down to the effective use of space that follows when you fill bookshops with – well, books, rather than books plus stationery plus lattes. Oddly, given the disparity in size and grandeur, it has a touch of Lello, the bookshop in Porto that always appears high on lists of the world’s most beautiful places to buy books.
Peering back across Harper Lee's life, it can seem impossible to distinguish the novelist from her masterpiece, To Kill a Mockingbird. Lee died at the age of 89 in her hometown of Monroeville, Ala., on Friday morning — yet it's clear that her legacy will live on much longer than that, through her characters and the readers who have embraced them for decades.
The chief complaint about today’s psychiatric medications is the same one cited by those frustrated by the lack of progress on Alzheimer’s: They don’t treat the disease, just the symptoms, and they don’t even do that very well.
Rather than targeting brain chemistry to reduce symptoms, people such as Insel want to focus on brain circuitry. Their efforts have been bolstered by advances in technology and imaging that now allow scientists not only to see deeper into the brain, but also to study single brain cells to determine which circuits and neurons underlie specific mental and emotional states. Many of these advances come from fields as disparate as physics and electrical engineering — as well as the new field of optogenetics, which uses light to manipulate neurons.
“I was captivated,” John Allen Paulos says of the second time he met his wife. To attract an inattentive waiter while out on a date, he dared her to tip her bottle of Coke on to the floor. “She hesitated only for a brief moment,” he writes. The bottle duly dropped and shattered. The waiter came rushing over. “I was two-thirds of the way to being smitten,” he writes.
A possibly throwaway remark, but then again, I note that this is a book in which Paulos wields a mathematical lens in unexpected ways. So no throwaway here. I mean, most of us (three-fourths of us?) might have said, “I was well on the way to being smitten”, or perhaps we might have settled for “halfway to being smitten”. But Paulos has thought this through and thinks a more precise measure fits better; thus “two-thirds”.
Current neuroscientific thinking suggests that all three of these theories are at work. With further study, we may come to confirm traditional lessons on how to harness creative potential—by releasing our inhibitions, not overthinking, and engaging in free association.
Of course, countless science fiction works have portrayed imagined machine beings, such as HAL in Stanley Kubrick’s 2001. The classic film Colossus: The Forbin Project (1970) portrays an AI running amok and ruling humanity. The conceit has obviously become a popular generator of fictional plots. But Musk and the others are talking about the real world, our world. The pressing question becomes: Should we panic?
Or should we just accept defeat and hope our machine overlords won’t be too brutal? Or, in a more hopeful mood, look to a golden age mediated by kindly superintelligences? Or, in a more indifferent one, file all these comments under “techno overhype” and go about our business?
I’m gratified that Hercules has become such an object of public interest. He’s held my interest and abiding respect since I first learned about him five years ago. I’ve spent hundreds of hours and pored over thousands of pages to understand Hercules and his world.
But so much of what people are saying about him now does not ring true to me, based on that research. I can’t know exactly what he thought or how he felt, of course. But in writing the book my aim was always to represent him as he saw himself: dignified, commanding and proud.
The notion that there is a “normal” height or a “normal” salary is a relatively new one, and it's had a profound effect on how people think about each other and themselves.
I remember my grandmother telling me that if I were ever to marry, I should make sure he was kind. But she might just as well have said: "Find yourself a man who's nice to waiters." The way people treat restaurant staff is, I think, a kind of poker tell, revealing a person's character in as long as it takes to say: "I'll have the sea bass."
There is no reason why good writing should exclude arousal, but any sex scene that reaches for stock phrases – or even stock words – can forget about its literary credentials. The stakes are higher than usual; “quivering member” is not just a cliché, it’s a cliché that tips you straight into another genre. Nor is it possible to play it safe.
There’s a distinct possibility that I would never have been able to finish reading “Moby-Dick,” in my early twenties, had it not been for the Guns N’ Roses song “November Rain.”
Years ago, I came across this title by way of one single recipe—an anecdotal, two-paragraph wonder by Mrs. F. Scott Fitzgerald, described by Stratton as “wife of author of ‘The Beautiful and Damned,’ ‘The Jazz Age,’ etc.” It was an entry called “Breakfast.”
See if there is any bacon, and if there is, ask the cook which pan to fry it in. Then ask if there are any eggs, and if so, try to persuade the cook to poach two of them. It is better not to attempt toast as it burns very easily.
Increasingly, shows feature big, bold, spectacular works that translate into showy Instagram pictures or Snap stories, allowing art to wow people who might otherwise rarely set foot inside museums. But the trend toward accessibility has its critics, who wonder whether the sensationalist works being exhibited are worthy of all the attention, not to mention whether the smartphone photography is getting in the way of people looking and thinking about the art in front of them.
It’s not just your imagination. Horror films are much scarier than they were in the past. Here’s how they do it.
The man leading the search for Malaysia Airlines Flight 370 is showing the strain after almost two years of fruitless toil.
Martin Dolan, head of the Australian Transport Safety Bureau, said he struggles to sleep at times, gnawed by thoughts that wreckage from the Boeing Co. 777 may have slipped through the sonar net scanning 120,000 square kilometers (46,330 square miles) of the southern Indian Ocean.
MH370 is weeks away from becoming aviation’s biggest unsolved mystery since Amelia Earhart disappeared in 1937. Of the 3 million components in the jet, only one has turned up -- a barnacle-encrusted wing flap -- on Reunion Island, thousands of miles from the search. There have been no traces of the 239 people on board, their luggage or even the life jackets that were supposed to float.
A former editor of the Edmonton Journal offers an inside look at the damage being done to Canada’s newspapers.
Growing up, I took E.T.’s fortuitous, last-minute revival for granted: E.T. just did that sort of thing because he was awesome. Having read sci-fi authors like Vernor Vinge and others in more recent decades, I can now set aside such childish notions and offer a better theory.
The promise that neurogastronomy holds is that once we understand how the mind combines the disparate biological and evocative forces that create flavor, we will be able to circumvent the learned and innate preferences of our taste buds. And with that capacity—truly an example of mind over matter—instead of stimulating appetite via the conventional and unhealthy trifecta of salt, sugar, and fat, we can employ the neural pathways through which flavor is constructed in the brain to divert attention to different, more nutritious foods. Control flavor and you control what we eat—and perhaps, given time and more research, begin fighting the global nutrition problems that are a direct result of the industrialized production of food.
Spiotta returned to the idea of ‘‘attending’’ in our talks and email correspondence this winter. What she called ‘‘codas, afterthoughts, parentheticals, digressions, qualifications’’ were often attempts to get at saying something the exact right way.
‘‘ ‘Attend’ comes from ‘attendere,’ which means ‘to stretch,’ ’’ she emailed one morning. ‘‘That is so interesting, as if attending means you have to stretch your mind toward another.’’
A battle over triplets raises difficult questions about the ethics of the surrogacy industry and the meaning of parenthood.
“Overripe,” “overwritten” — these terms have judgment baked in. A term like baroque or minimalist reserves judgment; you can like minimalist work or dislike minimalist work without needing to choose a different descriptor. But it would be weird to say “I like overwritten poetry.” “Overwritten” implies a shared benchmark, an agreed-upon, appropriate level of writtenness.
I do, though, kind of like overwritten poetry, when it’s purposely overwritten. Take Lucie Brock-Broido, who may be the queen of garish, costumey excess. No one can tell me she isn’t trying to be funny, a little bit — see the first two lines of “Basic Poem in a Basic Tongue” from Trouble in Mind.
It all sounded pretty crazy, as NASA’s top administrator observed, but, as it turned out, it was “the right kind of crazy.” For more than three and a half years now, the little rover has been working diligently, trundling across the surface of Mars, looking for evidence that the planet could have once supported life, and occasionally tweeting.
“The Right Kind of Crazy” is the title of this engaging book, written by Adam Steltzner, an engineer leading the team at NASA’s Jet Propulsion Laboratory (J.P.L.) that was charged with landing the Curiosity. Written with William Patrick, the book is an inside account of the intense decade of teamwork that went into Curiosity, and it’s also the story of Mr. Steltzner’s own unlikely journey — from an aspiring musician, who barely graduated from high school, to a California Institute of Technology recruit to a team leader at the J.P.L. in Pasadena, Calif.
In Finding Them Gone, the translator Red Pine, a.k.a. travel writer Bill Porter, calls on more than 40 ancient Chinese poets in 30 days. With three small porcelain cups and a flask of expensive bourbon, he crosses the country in search of places associated with the authors of his most beloved poems: usually their graves, but also former homes, memorial pavilions, and famous landmarks. Once located, regardless of the poet’s station in the literary afterlife, Porter pours his libations into the ground and then sips some himself.
What he had, he later determined, were molecules that lit up only when crowded together — in solid form, for example. Dr. Tang’s study of that chemical and its unusual behavior has led to an emerging class of small, nonmetal compounds with applications in unusually diverse arenas, from vastly improving optoelectronic devices like organic light-emitting diode (OLED) televisions to advancing the use of fluorescent technology in the human body.
A slip of a woman with an orderly poof of snow-white hair, swooping cat-eye glasses and dramatic eyebrows, the 52-year-old West promptly organized a boycott of five supermarket chains on behalf of her group, Housewives for Lower Food Prices. Within a week the boycott had swept across the nation and beyond, crossing the border to Canada. Food prices were rising everywhere, and people—especially the women who did the bulk of shopping for their households—were sick of it. West (who said she found picketing to be “unladylike”) said, “I’m not an organizer, I’m simply a housewife disgusted with food prices.”
Just in time for Valentine’s Day, it appears that oil and stocks have developed an unhealthy, codependent relationship. They’re way too deep into each other. Where one market goes, the other follows. If they were people, a counselor would be urging a trial separation. “This is highly unusual,” Torsten Slok, chief international economist at Deutsche Bank, wrote to clients in late January. “Call it the oil correlation conundrum.”
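For readers who want to see what "where one market goes, the other follows" looks like as a number, here is a minimal sketch of the statistic underneath such claims, a rolling correlation of daily returns. It is my illustration, not Deutsche Bank's methodology, and it runs on synthetic data; in practice you would feed it, say, crude-oil and stock-index return series.

```python
# A minimal sketch of the "oil correlation conundrum" statistic:
# the rolling correlation between two daily return series.
# The data below is synthetic, generated with a shared shock so the
# two series move together, as oil and stocks did in this period.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 500
shared_shock = rng.normal(0, 1.0, n)   # a common factor moving both markets
oil_returns = pd.Series(shared_shock + rng.normal(0, 0.5, n))
stock_returns = pd.Series(shared_shock + rng.normal(0, 0.5, n))

# 60-trading-day rolling correlation: readings near 1.0 mean the two
# markets are moving in lockstep -- the "codependency" described above.
rolling_corr = oil_returns.rolling(window=60).corr(stock_returns)
print(rolling_corr.dropna().round(2).tail())
```

A sustained reading near 1.0 between two markets that normally move for different reasons is exactly the sort of thing an economist would flag as "highly unusual."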
Jansma is a brilliantly talented writer, but he also has a unique insight into what friends mean to one another, and what it means to be part of a city in which you never quite belong, but can't quite bring yourself to leave. It's a heartfelt novel, tender and painful and cathartic all at once, and even if the characters belong to New York, the story belongs to us all.
Market-driven memoir insists on redemption — on some version of a hero’s journey in three acts: the beginning, the middle, the end. But there’s an argument to be made that this is a false construction — that to author a personal story in this way is to impose a distorting narrative arc over a life, and thus, some would say, to deceive. In Liar, a new memoir by Rob Roberge, the author chooses instead to untangle the lie. Roberge plumbs the mental constructions, obfuscations, omissions, exaggerations, and half-truths he has told himself and others, beginning, in the book, in second grade (in 1972), and ending in 2013, when, undone by his own lack of integrity, he writes of himself, “you walk into a room and the first thing you say is ‘I’m sorry.’”
As with Hope’s highly acclaimed debut novel, Wake, the writing is elegant and insightful; she writes beautifully about human emotion, landscape and weather: “There was no wind. It was as though they were all simmering under the great grey lid of the sky, like water almost brought to boil.”
Like all successful historical novels, The Ballroom tells us a story of the past in order to shed light on the present.
But cancer has also ushered in new ways of being alive. Even when I am this distant from Canadian family and friends, everything feels as if it is painted in bright colors. In my vulnerability, I am seeing my world without the Instagrammed filter of breezy certainties and perfectible moments. I can’t help noticing the brittleness of the walls that keep most people fed, sheltered and whole. I find myself returning to the same thoughts again and again: Life is so beautiful. Life is so hard.
Reading Jeet Thayil’s Collected Poems is a reminder of the singular pleasures of reading the work of a poet from first book to last. You see changing approaches to form — from terse implosive verse to expansive, full-throated song; from a nascent verbal ingenuity to an ability to combine exuberance with exactitude.
After reading the rest of the stories, however, the older meaning of the word hysterical came to the fore and I began to see the introductory story as the anxious thesis of the book. These privileged women have made a career of the domestic arts, and their virulent pride over their daily reality frays and distorts their prim facades. They flare up and blaze, punching out from behind their perfect veneers to ruthlessly protect what is theirs, and they do it with withering wit. They want to matter, regardless of what they do or what they can offer the world; for their domestic salad spinning to be as valuable as CPR. The women of American Housewife are hysterical in both senses of the word, often at the same time.
Over the last few years, a handful of researchers have compiled growing evidence that the same cells that monitor an individual’s location in space also mark the passage of time. This suggests that two brain regions—the hippocampus and the entorhinal cortex, both famous for their role in memory and navigation—can also act as a sort of timer.
Norman Garmezy, a developmental psychologist and clinician at the University of Minnesota, met thousands of children in his four decades of research. But one boy in particular stuck with him. He was nine years old, with an alcoholic mother and an absent father. Each day, he would arrive at school with the exact same sandwich: two slices of bread with nothing in between. At home, there was no other food available, and no one to make any. Even so, Garmezy would later recall, the boy wanted to make sure that “no one would feel pity for him and no one would know the ineptitude of his mother.” Each day, without fail, he would walk in with a smile on his face and a “bread sandwich” tucked into his bag.
The book was Dream of the Red Chamber, also known as The Story of the Stone, written by Cao Xueqin. The critic Anthony West called it “one of the great novels of world literature … to the Chinese as Proust is to the French or Karamazov to the Russians”.
But for the sort of reader who can wait for a fire to get roaring, who can live with a cat who refuses to sit in your lap — in short, a reader who enjoys a house in winter — there are the rewards of beauty and humanity in “Weathering.” So many of the characters, worn down by a life of disappointments, feel as Luke does when, after fruitlessly digging in his garden for ancient coins, he tells Pepper: “In the end you have to let it go.” But one senses that Wood feels differently, as does Pepper. She looks at the old man as he stares despondently into his drink. “Why?” she asks him, an innocent and startling demand. “Why do you?” Why indeed?
For one thing, lower-income people behave more consistently as consumers than more affluent ones. Poorer people tend to value a dollar more consistently, irrespective of the context. It is not simply that those with less money pinch more pennies; it is that they are compelled to value those pennies in absolute rather than relative terms.
Whereas the well-off may dabble in frugality, necessity makes the poor experts in it. To them, a dollar has real tangible value. A dollar saved is a dollar to be spent elsewhere, not merely a piece of token accounting.
The customs officer at Changi Airport in Singapore gestured at my girlfriend’s suitcase as it rolled out from the X-ray machine. He repeated his question for a third and final time: “Are you sure you don’t have anything in your bag that you’d like to tell us about?”
By now, panic had spread across my girlfriend’s face. If we confessed to what was in there, we would be in trouble. If we didn’t and they searched her bag, things would only be worse.
To me, the greatest tragedy of the store’s end is not that the neighborhood is losing another beloved longtime business, and a piece of its living history, but that Contant and McCoy, who are now both in their seventies, are losing, under dismal circumstances, the jobs they’ve held for nearly four decades.
Just over a billion years ago, many millions of galaxies from here, a pair of black holes collided. They had been circling each other for aeons, in a sort of mating dance, gathering pace with each orbit, hurtling closer and closer. By the time they were a few hundred miles apart, they were whipping around at nearly the speed of light, releasing great shudders of gravitational energy. Space and time became distorted, like water at a rolling boil. In the fraction of a second that it took for the black holes to finally merge, they radiated a hundred times more energy than all the stars in the universe combined. They formed a new black hole, sixty-two times as heavy as our sun and almost as wide across as the state of Maine. As it smoothed itself out, assuming the shape of a slightly flattened sphere, a few last quivers of energy escaped. Then space and time became silent again.
The waves rippled outward in every direction, weakening as they went. On Earth, dinosaurs arose, evolved, and went extinct. The waves kept going. About fifty thousand years ago, they entered our own Milky Way galaxy, just as Homo sapiens were beginning to replace our Neanderthal cousins as the planet’s dominant species of ape. A hundred years ago, Albert Einstein, one of the more advanced members of the species, predicted the waves’ existence, inspiring decades of speculation and fruitless searching. Twenty-two years ago, construction began on an enormous detector, the Laser Interferometer Gravitational-Wave Observatory (LIGO). Then, on September 14, 2015, at just before eleven in the morning, Central European Time, the waves reached Earth. Marco Drago, a thirty-two-year-old Italian postdoctoral student and a member of the LIGO Scientific Collaboration, was the first person to notice them. He was sitting in front of his computer at the Albert Einstein Institute, in Hannover, Germany, viewing the LIGO data remotely. The waves appeared on his screen as a compressed squiggle, but the most exquisite ears in the universe, attuned to vibrations of less than a trillionth of an inch, would have heard what astronomers call a chirp—a faint whooping from low to high. This morning, in a press conference in Washington, D.C., the LIGO team announced that the signal constitutes the first direct observation of gravitational waves.
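The sixty-two-solar-mass figure invites a quick back-of-envelope check. Assuming the initial masses reported in the LIGO announcement for this event (roughly 36 and 29 solar masses, which the passage itself doesn't state), the missing mass went out as gravitational radiation:

```latex
% Back-of-envelope energy check on the merger. The final mass
% (62 solar masses) is from the passage; the initial masses of
% roughly 36 and 29 solar masses are from the LIGO announcement.
E_{\mathrm{rad}} = (m_1 + m_2 - M_f)\,c^2
                \approx (36 + 29 - 62)\,M_\odot\,c^2
                = 3\,M_\odot\,c^2
                \approx 5.4 \times 10^{47}\ \mathrm{J}
```

Radiated in a few tenths of a second, that energy corresponds to a peak power of order $10^{49}$ watts, which is what licenses the comparison to the combined output of all the stars.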
The box jellyfish’s eyes are part of an almost endless variation of eyes in the animal kingdom. Some see only in black and white; others perceive the full rainbow and beyond, to forms of light invisible to our eyes. Some can’t even gauge the direction of incoming light; others can spot running prey miles away. The smallest animal eyes, adorning the heads of fairy wasps, are barely bigger than an amoeba; the biggest are the size of dinner plates, and belong to gigantic squid species. The squid’s eye, like ours, works as a camera does, with a single lens focusing light onto a single retina, full of photoreceptors—cells that absorb photons and convert their energy into an electrical signal. By contrast, a fly’s compound eye divides incoming light among thousands of separate units, each with its own lens and photoreceptors. Human, fly, and squid eyes are mounted in pairs on their owners’ heads. But scallops have rows of eyes along their mantles, sea stars have eyes on the tips of their arms, and the purple sea urchin’s entire body acts as one big eye. There are eyes with bifocal lenses, eyes with mirrors, and eyes that look up, down, and sideways all at the same time.
At one level, such diversity is puzzling. All eyes detect light, and light behaves in a predictable manner. But it has a multitude of uses. Light reveals the time of day, the depth of water, the presence of shade. It bounces off enemies, mates, and shelter. The box jellyfish uses it to find safe pastures. You use it to survey landscapes, interpret facial expressions, and read these words. The variety of tasks that eyes perform is limited only by the fecundity of nature. They represent a collision between the constancy of physics and the messiness of biology. To understand how eyes evolved, scientists need to do more than examine their structures. They need to do what Nilsson did with the box jellyfish: understand how animals use their eyes.
The most important contribution of Fox’s argument, perhaps, is his claim that what really deserves to be damned as pretentiousness is not an aiming-up but an aiming-down. “Anti-intellectualism,” he notes, “is a snobbery … The anti-intellectual is often anxious not to be marked as part of an educated elite.” And it is inverse snobbery in all its guises that might deserve to be called pretentious in a bad way.
When it was established that we had a week to get hitched, for visa purposes, a little thrill passed over me. There was no dread with it, no abstraction. We were getting married and there was no time for emotional turmoil, for deep Googling, for dispiriting dress searches or arguments over the guest list. We’d spend a few hundred bucks at most ($500, somehow, in the end) and then it would be done. Time now became a creative constraint, how much we could do with how little. This I could handle. Maybe.
The surgeon, who has spent 15 minutes gently tearing through tissue, suddenly pauses to gesture ever-so-slightly with his tiny scissors. "Do you see what's on this side? That's nerves." He moves the instrument a few millimeters to the right. "And on this one? That's cancer."
Ashutosh Tewari is the head of the urology department at Mount Sinai Hospital in New York City. He is in the process of removing a patient's cancerous prostate, the walnut-sized gland in the delicate area between the bladder and the penis. This surgery—one of three that Tewari performs on an average day—takes place entirely within an area the size of a cereal bowl. Tewari's movements are deliberate and exact. Just a few wrong cuts could make the patient incontinent or unable to perform sexually for the rest of his life.
But Tewari is making those cuts from 10 feet away. With a robot.
“Not everything can survive translation,” said Shira Greene, when confronted for the first time since graduate school with an Italian manuscript and a blank page. “Hence the age-old notion that she who translates is both translator and traitor to the text: traduttore e traditore.”
Translation is the guiding metaphor through which Rachel Cantor explores themes of love, loyalty, and transformation in her second novel, Good on Paper. Drawing parallels between literary translation and romantic partnership, Cantor dives head first into the “chasm between languages” and the chasm between people, to ask: Can one ever bridge the gap?
It tells a simple story — in the best sense of the word — and tells it well. Unlike thrillers that deal in incomprehensible plots and cheap thrills, this is the believable, focused story of a young man trying to escape the consequences of crime and facing hard choices about love, religion and life itself.
For some women, menopause is no big deal. Some say they barely notice it. My mother, long ago, described her menopause this way: “My periods just started gettin’ lighter and lighter, and my harmones settled down, and then one day … pfft! It was over.”
Not me. Not only did menopause change my life, it changed me.
Complaints about the fashion show system, a monthlong twice-yearly four-country treadmill to see clothes six months before they reach stores, have been around for a long time: Fashion week is too tiring, too old-fashioned, too crowded. But while fashion people have largely complained about the effect the system has on their own lives and jobs and creativity, today’s problems are driven by a force even more powerful than simple self-interest: financial interest.
Which is to say, the buying public.
When a 21-year-old Swede named Lena Söderberg became Playboy magazine’s Miss November in 1972 under the name Lenna Sjööblom, there was little to set her apart from other Playmates of the era. She was a classic beauty, demure yet approachable, but otherwise unremarkable in the pantheon of Playboy centerfolds. It wasn’t until a few years later that Lena, with the help of a few researchers at the University of Southern California, would earn herself an unexpected place in history.
According to Film L.A., the organization that helps the film industry book municipal locations, over 80 movies, television shows, music videos, and commercials are shot on or underneath the Sixth Street Viaduct each year. That’s partially because of the bridge’s swooping metal arches, perched on an art-deco concrete platform; and partially because of the river underneath and that access tunnel: if you want to film something set in Los Angeles that makes reference to the city’s automotive culture, or if you’re just looking for a place to shoot a car chase that’s cheaper and more available than a clogged freeway, the channelized, concretized bed of the Los Angeles River is your best choice.
Except that the bridge officially no longer functions that way, as of last week. It’s going away completely. And the river? It’s on its way to becoming a river again.
Though fraught with inelegant-sounding sentences (“We can see the schizophrenic nature of empire in this early wave of globalisation”), Trentmann’s history of five centuries of material culture is impressive in its breadth and scholarship. Anyone with compulsive buying disorder should buy a copy, or two, or three. Ka-ching! Ka-ching! Ka-ching!
In these heady days of crowdfunding, the question often isn't who is asking for money, but who isn't.
When is a prop not a prop? When it’s not doing anything to further the story. On the whole, we don’t want to clutter our stories with meaningless objects. Yes, there are red herrings and MacGuffins, and some use can be got out of them, especially in detective stories and thrillers, but literary fiction is not about coaxing the reader into dark corners or dead ends. Serious literature usually intends to shine a light on the mysterious, the obfuscated, the entangled, and the overlooked.
Never mind that New York is technically not the capital city of America: like Zurich and Bern, the designation is often thought a formality; if Goldsmith is looking for a center, it is one of consumer culture, not sovereign government. Unlike its progenitor, the book contains no original writing — in keeping with Goldsmith’s usual practice — and provides none of the contextual framing that published versions of The Arcades Project offer. Like previous works, this is more mélange than bricolage, placing writings together without effecting change to their individual characteristics.
Yet if Goldsmith feigns to create what are, as he puts it, “boring,” purely conceptual works that hardly need be made at all, his account is not entirely convincing, and Capital challenges it further.
Recently, I found myself with a book contract and a deadline to produce 80,000 words, so I turned to some author friends for advice. Their answers were nearly always the same: Don’t do what I did.
These critically acclaimed, award-winning writers had produced their best-sellers via creative processes that most described as disorganized and nonlinear. Careful plans had crumbled as they had encountered unexpected turns in their research and thinking.
After reading “Wired to Create,” by Scott Barry Kaufman and Carolyn Gregoire, I’m inclined to think that these writers took the right path.
The structure of Olga Grushin’s original new novel, “Forty Rooms,” is ingeniously simple. Over several decades, we follow the Russian-born narrator — an aspiring poet turned American housewife — into the 40 rooms that represent the topography of a privileged, middle-class woman’s life.
Richard Nixon has managed to defy definitive categorization by his multiple biographers because his character resembles the proverbial elephant being examined by blind men: evidence can be found to support any pre-existing bias.
Behold! I give you the problem of Superman. It’s a problem that has less to do with the character himself and more to do with DC Comics, which found itself stuck with a flagship character it thought needed fixing. In trying, it broke him nearly beyond repair.
It’s not just your imagination. Horror films are much scarier than they used to be. Here’s how they do it.
These recent weeks have seen the publication of five books about death: one by a historian; two by hospice workers; one by a widow; one by a man who is dying himself. Several of them quote Dylan Thomas’s “Do Not Go Gentle Into That Good Night” to advocate resilience, then map the fine line between denial and succumbing. In “Death’s Summer Coat,” Brandy Schillace complains, “The modern Westerner has lost loss; death as a community event, and mourning as a communal practice, has been steadily killed off.” Examining rituals of bereavement across cultures and across time, she suggests that everyone else has been better at the rites of farewell than we are. Our postindustrial disavowal of mortality is described by Simone de Beauvoir, who wrote, “For every man, his death is an accident and, even if he knows it and consents to it, an unjustifiable violation.” Schillace, a research associate at the Dittrick Museum of Medical History, points toward the confusion that has emerged in a technological age when brain death, heart death and other definitions becloud our understanding of expiry itself, observing that by current legal definitions, the same person could be alive under American law and dead under British law. We don’t know what death means or even what it is.
Shylock meets his modern doppelganger in the novelist’s playful examination of what it means to be Jewish.
Female orgasms, like male nipples, have no direct biological function. Which means the science gets a bit strange.
Ultimately, A Decent Ride is a book about growing older: What it's like to lose virility and cope with that loss, and what exactly we replace youth with when it departs and leaves a gaping, mocking vacuum in our souls.
In a broad valley devoted largely to the dead, the history museum in Colma — nicknamed the City of Souls — sells T-shirts that read, “It’s Great to Be Alive in Colma!”
It is a town of 1,600 living residents and about 1.5 million dead ones — many of whom, like the 49ers, uprooted and left San Francisco for greener pastures to the south.
It used to be very simple. First came intense training to master French haute cuisine techniques, then a series of apprenticeships in Paris under the world's best chefs, all of whom were themselves classically trained.
Finally, a chef was ready to add a personal touch to the French repertoire and launch a new restaurant under his or her own name.
The world's culinary aristocracy was then recorded in the Michelin Guide, with the best restaurants earning one, two, or exceptionally three stars.
The White House is a place defined by transients — presidents and political appointees who come and go after a term or two.
But Ficklin is a different, more enduring sort: He is the 10th member of his family — all children and grandchildren of a Virginia slave — to have worked in the White House, a long line that stretches back to Franklin Delano Roosevelt’s administration. Ficklin’s uncle Charles got a job as a White House butler in 1939. His father, John Woodson Ficklin, joined the staff a year later and stayed for 43 years.
“At first, you’re like: Why are they stealing the color white? I had to Google it to figure out what titanium dioxide even was,” says Dean Chappell, acting section chief of counterespionage for the FBI. “Then you realize there is a strategy to it.” You can’t even call it spying, adds John Carlin, the assistant attorney general in charge of the U.S. Department of Justice’s national security division. “This is theft. And this—stealing the color white—is a very good example of the problem. It’s not a national security secret. It’s about stealing something you can make a buck off of. It’s part of a strategy to profit off what American ingenuity creates.”
Taking its name from the 1999 satirical film directed by Mike Judge, the group show "Office Space" at Yerba Buena Center for the Arts in San Francisco focuses on the soft power and absurdity inherent in the alienating strategies, and the sometimes productive ambiguity, of the modern workspace.
The architects and planners profiled by Wade Graham in “Dream Cities,” his ambitious study of the forms and ideas of the contemporary city, come in two categories. There are those few, like Jane Jacobs, who drew the deeper lesson of “Utopia,” which is that one must approach the organic interconnectedness of society with humility and deference. A vast majority, alas, are like those who have only dipped into its second half, emerging as incorrigible believers in the power of rational thought, right angles and good intentions to perfect society. In this category fall Daniel Burnham, Robert Moses, Le Corbusier and all those who marched in the cause of urban renewal.
Daniel Pennac’s latest novel doesn’t behave the way novels are meant to behave. It is told in diary form over one man’s long lifetime, but manages to withhold the most basic biographical details along the way. The focus, instead, is on the narrator’s body: the physical body, rather than the person who inhabits it, is hero and subject of this book. Or to be more exact – and the narrator is very exact on this question – it’s about the life-long process of reconciling himself to his body, this “intimate stranger” that is simultaneously himself and a constant mystery to him. Our bodies, he writes in a note to his daughter, are “generous with surprises”. So he anatomises (an apt word) those episodes when his body makes its presence felt: when he is ill (or rampantly hypochondriac, which is frequent), and when he feels particularly bad pain or particularly ecstatic pleasure.
Swooshtika, flashpacking, moobs, swaption: English is awash with new portmanteaus. But what determines whether yours will be a buzzword, or a bum word?
Wisdom from classical Greece: democracy and liberalism are both better off if we understand the difference between them.
Lillian came to my office for a second opinion. Her first nephrologist had just done a kidney biopsy and handed her a diagnosis of fibrillary glomerulonephritis, an extremely rare form of kidney disease, with an incidence of fewer than 50 cases per year. “My doctor said you were the only one who could help me,” she said, trying to muster a smile. I told her the truth: we knew very little about what causes fibrillary glomerulonephritis, and therefore we knew even less about what treatments might help. I suggested a course of the only therapy that had been shown, in case reports, to work for her disease, the monoclonal antibody rituximab, although the rationale for why this drug would work for this specific disease was at best speculative.
Six months later, her local nephrologist sent her back for another second opinion, specifically about whether she should try a second course of rituximab. Her lab results showed no response to the drug. “In fact,” I said, “if anything, your kidneys have gotten worse in the last six months.” Lillian started to tear up. When I offered her a box of tissues, her crying intensified. She soon fell into near hysterics. Her wailing was the only sound in the room. Two medical students were shadowing me that afternoon, and Lillian’s crying was clearly making them uncomfortable. I told her we would re-dose the rituximab. “I think a second round of therapy will help,” I said. “You will get better.” She stopped crying. “You really think so?” she asked. “Yes.” This was not an outright lie, because, if her kidneys failed, she’d get a transplant, in which case she would technically get better. When she left the office, the medical students asked me if I really thought she’d get better. “No,” I said, “but she needs to have some hope right now.” They laughed, but I was serious. And Lillian did get better.
When Philip Seymour Hoffman died of an accidental drug overdose on February 2, 2014 at age 46, it felt like a huge part of the past two decades of cinema had disappeared as well, as if all the wonderful characters he created were on some level buried with the man who played them. A shocked public experienced a profound double loss. They were mourning the Hoffman who took up such formidable real estate in many modern classics. But they were also mourning all the brilliant Hoffman performances to come, which were extinguished with Hoffman’s death.
"Chronology is an illusion, if not a deliberate lie," a character posits in "The Lost Time Accidents," the fourth offering from novelist John Wray. "The steady, one-way current we seem to be suspended in is actually a jumble of spherical 'chronocosms' that can be moved through in any direction, if some great force manages to knock one's consciousness out of its preconditioned circuit."
If you're not quite sure what all that means, never fear. The fine line between hokum and rational thinking is precisely the point of "The Lost Time Accidents," a brick of a book not just because of its length but because of the density of both the prose and the ideas it contains.
As self-help gurus and internet memes continually remind us, our lives are a story we are empowered to write ourselves. Travelers Rest provides a thoughtful take on this idea, interweaving the melancholy stories of Tonio and Julia Addison, their 10-year‑old son Dewey, and ne’er-do‑well Uncle Robbie.
I stopped drinking when my father died. That was in July, on the Sunday that ended the long holiday weekend. It wasn’t from shock or grief; rather, I stopped drinking as a gesture, as a mark of my ambition for his soul. He died drunk behind the wheel in an accident of his own making.
I remember his hair-raising roulette with drinking and driving when I was a child. Coming home from cook-outs or family gatherings, my brother Aristos and I developed a routine around our father’s violent swerves to stay on the road. When my blond Germanic mother stiffened in the front seat and stretched her hand out to the dashboard, her arm a rigid brace, Aristos and I would assume the crash position in back, hoping to be spared should we hit a tree, a cement pylon, a bridge railing, whatever. The better defense, I realized after I’d been away at college for a year, was to avoid my father entirely. I hadn’t spoken to any of my family in twenty years when my mother e-mailed to say what we’d waited for so long to happen finally had.
The month before he was executed, in April 1952, Guo Ching wrote letters to his mother, wife and children to say goodbye.
The letters had only 140 miles to travel, but they would take 60 years to be delivered.
When his daughter finally received her father’s farewell after a protracted negotiation with Taiwan’s government, she was in her 60s, twice his age when he died.
Historical fiction was not—and is not—meant to supplant literature from the period it describes. As a veteran of the Crimea, Tolstoy wrote War and Peace to match his own internal sense of the truth of the Napoleonic wars, to dramatize what he felt literature from that period had failed to describe. The force of his vision, even in translation, may have shifted the benchmark for realism away from authenticity and toward the feeling of it for the reader—a way for the living to argue with history and posterity. Powdered wigs or not, War and Peace is with us still.
Meanwhile, we’ve made our peace, Hem and me. He’s just a writer to me now, and his work matters more than the myth ever did. Of course he’s spawned a legion of imitators and posers. He put in the work. Besides, his pithy macho quips can—sometimes—hold up. Writing what you know, for example. That’s inspired many a reckless idiot to pursue the reckless and the idiotic, but direct experience is only one way of many to know something. And Hemingway’s obsession with finding the “true” bits—it took me time and perspective to figure out he meant more than transcribing reality, but tapping into the emotional wells of human existence and perspective. (That’s not “truth,” exactly, but macho pseudo-philosophies require some malleability.)
Another decision I made right at the beginning and stayed with to the end is one that would be wrong in translating almost anything else. I hesitate to admit it, but I did not read the book through before starting, or even as I went along. I knew the overall shape of the book from Waley's version, devoured in childhood. It is not like a poem so intricately structured that you need to be aware of every word of it before attempting to translate any line of it. I found that reading it for the first time as I translated helped me be caught up in each story: I wanted to know what would happen when I finished each page and started on the next. If it turned out that something later in the episode meant that a correction was needed in an earlier part, that could easily be put right. This questionable but attractive strategy kept me from going stale, and there was no harm in the urgency and freshness that comes from translating something new and unexpected.
Like Elizabeth Bennet, Hilton Als, and Jazmine Hughes, I have four sisters. My original plan for this piece was to see the Tina Fey-Amy Poehler comedy Sisters with my sisters over the holidays and write a kind of conversational film review — a transcription of all of our during-the-movie snarky whispers and post-movie summary judgments. On Sisters, by sisters!
I texted them all about it before my trip home in December, and they were open to the idea — especially since I was paying — though they were not exactly enthusiastic.
The idea behind the Breakthrough Prize seems to be that money and celebrities will make science sexier, and that this, in turn, will entice more talented young people to go into it. Is this really sensible, though? Does someone go through a decade of advanced education and a lifetime of hard work in the hopes that they might win a science lottery and get to shake Russell Crowe’s hand? Meanwhile, although any mid-career scientist would be happy to win such a prize, I suspect that every single winner of the Breakthrough Prize would happily return the money in exchange for a Nobel Prize. The Breakthrough Prize, like the Kavli Prize and the other million-dollar-plus awards being given out around the world, will always be considered a consolation prize.
The print version of Playboy, in other words, is struggling with the conundrum of the Internet, just like every other legacy media enterprise. But say this for the redesign: Even if it fails to increase subscriptions, it makes that deathless dodge “I read it for the articles” a little easier to utter with a straight face.
One of the publishing industry’s only black editors is transmitting ideas from writers on the margins to the mainstream readers who need to hear them.
Ryder has always been trapped in her own anticipatory nostalgia, and the public has always wanted to keep her there.
Child prodigies rarely become adult geniuses who change the world. We assume that they must lack the social and emotional skills to function in society. When you look at the evidence, though, this explanation doesn’t suffice: Less than a quarter of gifted children suffer from social and emotional problems. A vast majority are well adjusted — as winning at a cocktail party as in the spelling bee.
What holds them back is that they don’t learn to be original. They strive to earn the approval of their parents and the admiration of their teachers. But as they perform in Carnegie Hall and become chess champions, something unexpected happens: Practice makes perfect, but it doesn’t make new.
By any measure, Ann Lee, the illiterate daughter of a Manchester blacksmith, led one of the most audacious and improbable lives of the 18th century. Born in 1736, she came of age in the fetid, soul-destroying crucible of English industrialization. But in religion, Lee discovered her native boldness and charisma. She became a prophet and, in 1774, led a small band of followers across the Atlantic. They became known as Shakers. And from a mean cabin in upstate New York they formed a society that would draw thousands into communal villages across much of the United States. Lee did not live to survey her realm. But her social conscience — forged in the bleak shadow of the Manchester mills — animated Shaker communities well into the 20th century.
Irrepressible as ever, the ghost of Ann Lee hovers over every page of Chris Jennings’s uncommonly smart and beautifully written book “Paradise Now.” In a sense, this is hardly surprising. All utopian experiments would seem to invite comparison with the Shakers, whose advocacy of simple living, hard work, shared property and gender equality holds tremendous modern appeal (in certain circles, anyway).
One was working as a certified public accountant. Another had just completed the requirements for a pre-med degree at the University of Chicago. Yet another, a junior employee at Morgan Stanley, walked down 75 flights in the World Trade Center’s South Tower and back into the family food business on Sept. 11, 2001.
These New Yorkers — Thomas Chen, Jonathan Wu and Wilson Tang — are among a few dozen Chinese-Americans who have recently surfaced as influential chefs, determined to begin a new culinary conversation with the food of their ancestors. Independently, they arrived at the same goal: to invent a kind of Chinese-American food that is modern, creative and delicious instead of sweet, sticky and bland.
Let’s say you need some books. Maybe you have recently acquired a big fancy house, boat or plane with a big empty library, and you want to fill it with real books, not those things that look like books but are actually built-in fake book spines engraved with ornate titles.
One lazy solution would be to employ a decorator to acquire an aesthetically pleasing instant collection. Another would be to visit an estate sale and hoover up someone else’s, caveat emptor. Or you could do what the smartest bibliophiles do: Put yourself in the hands of the staff at the London bookstore Heywood Hill, who promise to go to the ends of the earth to hunt down the books you need — the rare, the old and the out of print as well as the newly published — to build your perfect custom library.
For as long as the blue-eyed Shaw sisters can remember, they have known that their parents planned one day to take their own lives.
It was often a topic of conversation. Patricia and Peter Shaw would discuss with their three daughters their determination to avoid hospitals, nursing homes, palliative care units - any institution that would threaten their independence in old age.
Having watched siblings and elderly friends decline, Pat and Peter spoke of their desire to choose the time and manner of their deaths.
She could have been researching her next academic work as a scholar of Early Modern theater; she could have been writing the next in her long line of bestselling romance novels or grading her students’ Shakespeare papers. Instead, Dr. Mary Bly spent her entire Thursday creating a single, delectable chocolate trifle. She transformed herself for the afternoon into one of the Regency heroines from her historical romance novels, following a custard recipe from a centuries-old cookbook. She boiled milk, added sugar, beat the eggs, and combined them. Thickening the custard without corn starch meant whisking it madly over a low heat, and when it was finally thick enough, she added brandy and poured it over the chocolate cake she had spent the morning baking. On top she layered cherries and the whipped cream she had made the day before.
There are plenty of grocery stores near her Upper West Side apartment that carry corn starch, but that’s not the point. Mary Bly thoroughly enjoys the process of creation. In the same way, under the romance-writer pen name Eloisa James, she enjoys writing her books, mixing ingredients together into a delicious happy ending. In the literary world, many would consider romances trifles—sweet, rather than mentally nutritious, but Eloisa is as thoughtful and creative in writing her stories as she is in making her trifle. It’s pure fun for her; if her stories were desserts, she would be licking her fingers as she wrote.
We are conditioned from a young age to think of humanity as somehow separate from Nature. Nature is the blank slate against which humans define themselves, from God’s command in Genesis that humans “have dominion over the fish of the sea, and over the fowl of the air, and over the cattle, and over all the earth, and over every creeping thing that creepeth upon the earth,” to modern environmentalism’s call to preserve it. Nature is the passive recipient of our actions; we are the active, vital force that exerts ourselves on it.
Dogs trouble this duality: they are of nature but no longer belong to it; they are part of human culture and yet remain wholly alien within it. If, as Georges Bataille once wrote, “every animal is in the world like water in water,” this does not apply to the dog, who exists on the borderlands. We still don’t know for sure when and how the wolf became the dog — the dog’s origin is lost in the same mist of time that is our own. It’s almost as if both human and dog appeared at once — as though each depended on the other for its existence. And yet their place will always be at the edge of the camp, or on the doorsill: protective and loyal, and yet steadfastly never quite one of us. You can never hope to maintain a hard and fast distinction between humanity and nature so long as there are dogs.
It may comfort you to know, however, that we are not the first generation to witness the death of great magazine writing. That bell began tolling, some would say, as far back as 1911, when a run of unprofitability forced Samuel S. McClure to sell off McClure’s—founded in 1893, and the birthplace of the muckraking narrative journalism of Ida Tarbell and Lincoln Steffens—to creditors who slowly bled it to death. Sure, the 19th century also produced long-running magazines like National Geographic, Harper’s, and The Atlantic Monthly. But as avid readers watched the likes of Munsey’s and The Century follow McClure’s down the hole, the stench of death was already upon us.
Last August, Anne Rice posted a call to arms — on Facebook, of course — warning that political correctness was going to bring on literary end times: banned books, destroyed authors, “a new era of censorship.” “We must stand up for fiction as a place where transgressive behavior and ideas can be explored,” she proclaimed. “I think we have to be willing to stand up for the despised.” I, a fan of transgressive literature, could not pinpoint why I found her post to be so much more vexing than the usual battle cries of P.C.-paranoiacs. I finally had my answer after reading Han Kang’s novel “The Vegetarian”: What if “the despised” can stand up on their own?
There’s one scene that stands out as being especially difficult. I essentially called it the “Matt sets up the third act” scene, and it’s just a monologue. We had this concept of what the third act is, which is that we’re going to launch Matt into space in a tin can. That’s it. When we explained that that was going to happen, we needed to explain why, and we needed to explain the velocity involved in what’s going to happen, because one of the things that’s hard about filmmaking is that speed can be difficult to convey. For example, if you look at race cars on tracks, you need to see them blowing past something to understand that they’re moving at a high rate. It’s perspective. The problem with launching off the surface of a planet is, we really wanted to sell how dangerous all of this was about to be. It was this exposition that I was struggling with, of just Matt Damon talking.
IN OUR AGE of the selfie and instant upload, the self-portrait has far different cultural and aesthetic values than it had in the past. It thus seems only appropriate that the National Gallery of Victoria, in Melbourne, Australia, should exploit this aesthetic form in the work of the two artists featured in their current exhibition, Andy Warhol|Ai Weiwei (AW|AW). The exhibition, which opened on December 11 and runs until April 24, pairs the work of an iconic American pop artist with that of a contemporary Chinese artist and political activist, and the result is fascinating.
Before Challenger's final mission - listed as 51-L - it was known that the O-ring seals on a previous flight - 51-C - had eroded to a depth of one-third of their radius. Instead of being treated as an unacceptable risk, this little-understood problem was interpreted as "a safety factor of three." Should the interpretation have been more pessimistic?
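(A hedged gloss on that arithmetic, mine rather than the source's: a safety factor normally divides what a part can withstand by the worst load it should ever see, so "three" ought to mean the joint tolerates three times its maximum expected stress. Reading the figure off the damage instead gives

\[ \text{claimed factor} = \frac{\text{full seal radius}}{\text{observed erosion depth}} = \frac{r}{r/3} = 3, \]

which treats erosion as if it were an anticipated load. If the seals were never designed to erode at all, then any erosion means the joint was operating outside its design envelope, and the "factor of three" is no margin at all.)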
Today, a handful of people, Bleecker included, are wondering whether they can forge a deeper connection between science fiction and real-life science, one in which the emotional and imaginative power of sci-fi stories leads engineers and scientists to a fuller understanding of the contours of the future they are working toward.
Twenty-five years after leaving the virtual reality lab, Bleecker articulated his ideas in “Design Fiction: A short essay on design, science, fact and fiction,” which is really a 97-page pamphlet exploring the relationship between science fiction, particularly the 2002 movie “Minority Report,” and technological progress.
David Foster Wallace understood the paradox of attempting to write fiction that spoke to posterity and a contemporary audience simultaneously, with equal force. In an essay written while he was at work on “Infinite Jest,” Wallace referred to the “oracular foresight” of writers such as Don DeLillo, whose best novels — “White Noise,” “Libra,” “Underworld” — address their contemporary audience like a shouting desert prophet while laying out for posterity the coldly amused analysis of some long-dead professor emeritus. Wallace felt that the “mimetic deployment of pop culture icons” by writers who lacked DeLillo’s observational powers “compromises fiction’s seriousness by dating it out of the Platonic Always where it ought to reside.” Yet “Infinite Jest” rarely seems as though it resides within this Platonic Always, which Wallace rejected in any event. (As with many of Wallace’s more manifesto-ish proclamations, he was not planting a flag so much as secretly burning one.) We are now at least half a decade beyond the years Wallace intended his novel’s subsidized time schema — Year of the Whopper, Year of the Depend Adult Undergarment — to represent. Read today, the book’s intellectually slapstick vision of corporatism run amok embeds it within the early to mid-1990s as firmly and emblematically as “The Simpsons” and grunge music. It is very much a novel of its time.
How is it, then, that “Infinite Jest” still feels so transcendentally, electrically alive?
Jo Morgan’s new collection requires and rewards repeated attention. Rereading poetry goes with the territory: a poem you do not want to reread is unlikely to be up to much. But this book is especially challenging. Each time you read – like rubbing a brass or watching mist lift or solving a clue – it becomes clearer, more striking, new things come to light. It is a work to be caught in snatches, in flashes, by stealth, as life itself sometimes is.
If the id had an id, and it wrote poetry, the results might sound like “Widening Income Inequality” (Farrar, Straus & Giroux), Frederick Seidel’s sixteenth collection. The title borrows a current meme, while also suggesting Yeats’s apocalyptic poem “The Second Coming” (“Turning and turning in the widening gyre / The falcon cannot hear the falconer”). Seidel’s satanic refinement is expressed in poems at once suave and vengeful, their garish pleasures linked to the many splendid goods—Ducati motorcycles, bespoke suits, Italian shoes—that they describe. To encounter a poem by Seidel is therefore to be co-opted into his Ricardo Montalban aesthetic of creepy luxury. American poets like to think of their art as open, democratic, all-embracing; few aside from Seidel have imagined the lyric poem to be an exclusive haunt of self-flattering, hedonistic élites. Seidel is securely on the winner’s side of the widening wealth gap; the implication, if we’re reading him, is that so are we. He is the Phi Beta Kappa poet of doomsday, happily escorting the world’s fortunate to a well-appointed abyss, then cannonballing in alongside us.
It is disturbing, at first, to read an autobiographical book in which the author knows he is dying and you know that he will be dead by the end of it. But Kalanithi writes very well, in a plain and matter-of-fact way, without a trace of self-pity, and you are immediately gripped and carried along. The fact that I use the present tense in writing about him shows that the book has taken on a life of its own, as Kalanithi clearly hoped it would. It’s a remarkable book, for many reasons, especially for his description of his transition from all-powerful doctor to anxious patient, and of how he was “so authoritative in a surgeon’s coat but so meek in a patient’s gown”.
The days of the all-powerful critic are over. But that figure — high priest or petty dictator, destroying and consecrating reputations with the stroke of a pen — was always a bit of a myth, an allegorical monster conjured up by timid artists and their insecure admirers. Criticism has always been a fundamentally democratic undertaking. It is an endless conversation, rather than a series of pronouncements. It is the debate that begins when you walk out of the theater or the museum, either with your friends or in the private chat room of your own head. It’s not me telling you what to think; it’s you and me talking. That was true before the Internet, but the rise of social media has had the thrilling, confusing effect of making the conversation literal.
Recently, a young man walked into the bar where I was working, sat down, and told me that I was pretty. It just flew out of his mouth by accident; he’d obviously had a few. His vibe wasn’t slimy or aggressive. He just seemed excited to discover that a woman he found attractive would be opening his next beer. Convention suggests that the most normal and appropriate response from me would be a display of gratitude, but I wasn’t thankful. I just felt instantly beleaguered in a very familiar way.