A mild-mannered engineer stands onstage at San Francisco’s Civic Auditorium. A massive video screen looms behind him, showing a close-up of his face in the lower right, with a close-up of his computer display superimposed over his face to the left. Introducing his team, he sounds a bit nervous, saying, “If every one of us does our job well, it’ll all go very interesting, I think.”
He starts by typing text, and then copying and pasting the word “word” multiple times, first a few lines, then paragraphs. He cuts and pastes blocks of text. He makes a shopping list his wife has requested — bananas, soup, paper towels — creating numbered lists, categories and subcategories, using his cursor to move around the document. Narrating as he works, he sounds not unlike Rod Serling. When he makes the occasional self-deprecating joke, we hear genial laughter from the audience.
Today, this presentation would be completely unremarkable. But it’s not 2018 — it’s December 9, 1968. The engineer is Douglas C. Engelbart, founder of the Augmented Human Intellect Research Center at the Stanford Research Institute, and nobody in his audience — or in the history of the world — has ever seen anything like it before.
Doug Engelbart was the first to actually build a computer that might seem familiar to us today. He came to Silicon Valley after a stint in the Navy as a radar technician during World War II. Engelbart was, in his own estimation, a “naïve drifter,” but something about the Valley inspired him to think big. His idea was that computers of the future should be optimized for human needs: communication and collaboration. Computers, he reasoned, should have keyboards and screens instead of punch cards and printouts. They should augment rather than replace the human intellect. And so he pulled a team together and built a working prototype: the oN-Line System. Unlike earlier efforts, the NLS wasn’t a military supercalculator. It was a general-purpose tool designed to help knowledge workers perform better and faster, and that was a controversial idea. Letting nonengineers interact directly with a computer was seen as harebrained, utopian, even subversive. And then people saw the demo.
Many of the boundary lines in our lives are highly literal, and, for the most part, this is how we’ve been trained to think of boundaries: as demarcations, shored up by physical or legal means, that indicate exactly where one thing ends and another begins. Here is the border of your property; here is the border of your body; here is the border of a city, a state, a nation – and to cross any of these boundaries without permission is to transgress. But one of the most significant boundary lines in our lives is not like this, and one piece of ubiquitous technology is making it increasingly permeable and uncertain, at a cost we may only be starting to comprehend.
Here’s a thought experiment: where do you end? Not your body, but you, the nebulous identity you think of as your “self.” Does it end at the limits of your physical form? Or does it include your voice, which can now be heard as far away as outer space; your personal and behavioral data, which is spread across the impossibly broad plane known as digital space; and your active online personas, which probably span dozens of social media networks, text message conversations, and email exchanges?
This is a question with no clear answer, and, as the smartphone grows ever more essential to our daily lives, that border’s only getting blurrier.
This means that in the Home app on iOS, you can long-press the Day & Dusk bulb and manually choose from an array of color options. The colors also now integrate better with Home app features such as scenes and scheduling.
Programming languages are collections of ideas. Learning a programming language means learning to apply the ideas found in it. But sometimes the ideas are hard to isolate.
Language details can obscure those ideas. For example, instances of some Haskell typeclasses are expected to obey algebraic laws. Because I was exposed to those typeclasses first, I assumed this was true of every typeclass, and I misunderstood typeclasses as a way to group types by the laws they satisfy. In fact, a typeclass simply groups types that share a set of operations; any laws are conventions recorded in documentation and tests, not something the compiler enforces.
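To make the distinction concrete, here is a minimal Haskell sketch. Monoid really does carry documented laws; the Describable class is hypothetical, invented here purely for illustration.

```haskell
-- Monoid is a "lawful" typeclass: instances are expected to satisfy
-- algebraic laws, but the expectation lives in documentation and
-- tests, not in the compiler:
--   x <> mempty == x                  (right identity)
--   mempty <> x == x                  (left identity)
--   (x <> y) <> z == x <> (y <> z)    (associativity)

-- Describable is a hypothetical typeclass with no laws at all.
-- It simply groups types that share an operation, and GHC happily
-- accepts any instance.
class Describable a where
  describe :: a -> String

instance Describable Bool where
  describe True  = "yes"
  describe False = "no"

instance Describable Int where
  describe n = "the number " ++ show n

main :: IO ()
main = putStrLn (describe True ++ ", " ++ describe (42 :: Int))
```

Both classes use exactly the same language mechanism, which is easy to miss if the first typeclasses you meet happen to be the lawful ones.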
Catastrophe! How can Apple survive without 5G iPhones until 2020?
Here's how: The same way the company survived being way behind on 3G and LTE technologies, both of which it embraced long after its competitors did.
I've tasted all thirteen of the things you must eat in Singapore, and I enjoy eating eight of them.
What I miss when eating in Singapore, though: a good Mexican restaurant.
~
Thanks for reading.