This tells most of all in the inane and imperious axiom that says ‘Beauty is in the eye of the beholder.’ It’s a well-meaning attempt at democratisation, allowing us all the power to declare beauty even where others might dissent. But this unthinking homily never interrogates the mysterious criteria by which we deem artworks, objects, even ideas, beautiful.
Then there’s the problem of indeterminacy. Sometimes, the word ‘beauty’ aspires to the solidity of a proper noun, grand and true. Other times, it seems a more nebulous term for an elusive kind of experience. We can be careless about the beautiful, shrugging it off as a matter of mere appearance. It is not grave like the stuff of our political lives, or profound like our moral considerations. Certainly, we know to admire the beautiful in its different forms – a painting, a song, a building, sometimes even an act or a gesture – and we might go so far as to believe that our engagement with beautiful things constitutes a deep and meaningful experience, as though it were a momentary pause in the hectic thoroughfare of our lives. But we rarely permit matters of beauty the same seriousness that we customarily grant big ideas such as ‘democracy’ and ‘justice’.
What has become truly necessary is stating the obvious: No work of art, no matter how incisive, beautiful, uncomfortable or representative, needs to exist. Yet the internet, the same force that has increased awareness of social-justice movements, has hyperbolized all entreaties to our fragmented attention spans. It’s now as easy to see all the incredible and twisted ways the world causes suffering as it is to waste a couple of hours scrolling through Twitter. The concerned citizen’s natural response is to prioritize. It’s why so many outlets seem to invoke moral outrage as a growth strategy, and why being told what you need to read or watch starts to seem appealing.
The prospect of “necessary” art allows members of the audience to free themselves from having to make choices while offering the critic a nifty shorthand to convey the significance of her task, which may itself one day be condemned as dispensable. The effect is something like an absurd and endless syllabus, constantly updating to remind you of ways you might flunk as a moral being. It’s a slightly subtler version of the 2016 marketing tagline for the first late-night satirical news show with a female host, “Full Frontal With Samantha Bee”: “Watch or you’re sexist.”
My neighborhood is like a peninsula, except it’s surrounded by graveyards instead of water. The two streets near my place come to dead ends in three places that lead into graveyards, and one that leads back out to the streets. So when I say I live in a graveyard, I guess technically I'm in a plot of land surrounded by graveyards, but it's not like I'm on a path in the middle of a graveyard. Have you ever ridden the L or the J past Broadway Junction and seen that huge graveyard? That’s what I’m next to.
You’re an author looking to make a splash in the literary world. You want to write something so different, so far out of the box, that readers everywhere will sit up and pay attention to your unique voice.
Then it comes to you: write a story from a second-person point of view! You’ve heard countless times before that this is something to avoid. “But rules are made to be broken,” you declare, as you boot up your word processor and begin drafting a story where ‘you’ is the primary pronoun. You soon discover, though, that the second person can be harder than it looks.
The physics of atoms and their ever-smaller constituents and cousins is, as Adam Becker reminds us more than once in his new book, “What Is Real?,” “the most successful theory in all of science.” Its predictions are stunningly accurate, and its power to grasp the unseen ultramicroscopic world has brought us modern marvels. But there is a problem: Quantum theory is, in a profound way, weird. It defies our common-sense intuition about what things are and what they can do.
“Figuring out what quantum physics is saying about the world has been hard,” Becker says, and this understatement motivates his book, a thorough, illuminating exploration of the most consequential controversy raging in modern science.
Academics have found increasing evidence that happiness through adulthood is U-shaped – life satisfaction falls in our 20s and 30s, then hits a trough in our late 40s before increasing until our 80s.
Forget the saying that life begins at 40 – it’s 50 we should be looking toward.
This marvelous book is a history of one of the hardest things to explain: why something did not happen. Histories of non-events are inherently difficult to write, because historians are methodologically committed to sticking close to documentary sources, and things that don’t happen rarely leave an obvious documentary trail. In this case, the non-event that Samuel Moyn describes in his new book, Not Enough: Human Rights in an Unequal World, is the institutionalization of a political ethic of material egalitarianism.