On Tools and the Aesthetics of Work

In the summer of 2022, an engineer named Keegan McNamara, who was at the time working for a fundraising technology startup, found his way to the Arms and Armor exhibit at the Met. He was struck by the unapologetic mixture of extreme beauty and focused function captured in the antique firearms on display. As reported in a recent profile of McNamara published in The Verge, this encounter with the past sparked a realization about the present:

“That combination of craftsmanship and utility, objects that are both thoroughly practical and needlessly outrageously beautiful, doesn’t really exist anymore. ‘And it especially doesn’t exist for computers.’”

Aesthetically, contemporary digital devices have become industrial and impersonal: grey and black rectangles carved into generically modern clean lines. Functionally, they offer the hapless user a cluttered explosion of potential activity, windows piling on top of windows, command bars thick with applications. Standing in the Arms and Armor exhibit, McNamara began to wonder if there was a way to rethink the PC, to save it from this predictable maximalism.

The result was The Mythic I, a custom computer that McNamara handcrafted over the year or so that followed that momentous afternoon at the Met. The machine is housed in a swooping hardwood frame carved using manual tools. An eight-inch screen is mounted above a 1980s IBM-style keyboard with big clacking keys that McNamara carefully lubricated to achieve exactly the right sound on each strike: “if you have dry rubbing of plastic, it doesn’t sound thock-y. It just sounds cheap.” Below the keyboard is an Italian leather hand rest. To turn it on, you insert and turn a key, then flip a toggle switch.

Equally notable is what happens once the machine is activated. McNamara designed the Mythic for three specific purposes: writing a novel, writing occasional computer code, and writing his daily journal. Accordingly, it runs a highly modular version of Linux called NixOS that he’s customized to offer only emacs, a text-based editor popular among hacker types, launched from a basic green command line. You can’t go online, or create a PowerPoint presentation, or edit a video. It’s a writing machine, and like the antique arms that inspired it, the Mythic implements this functionality with a focused, beautiful utilitarianism.

Read more

We Don’t Need a New Twitter

In July, Meta announced Threads, a new social media service that was obviously designed to steal market share from Twitter (which I still refuse to call X). You can’t blame Meta for trying. In the year or so that’s passed since Elon Musk vastly overpaid for the iconic short-text posting service, Twitter has been struggling, its cultural capital degrading rapidly alongside its infrastructure.

Meta’s plan with Threads is to capture the excitement of Twitter without all the controversy. Adam Mosseri, the executive in charge of Threads, recently said they were looking to provide “a less angry place for conversations.” His boss, Chris Cox, was more direct: “We’ve been hearing from creators and public figures who are interested in having a platform that is sanely run.”

Can Meta succeed with this plan to create a nicer Twitter? In my most recent article for The New Yorker, published earlier this month, I looked more closely at this question and concluded that the answer was probably “no.” At the core of Twitter’s ability to amplify whatever discussions are most engaging to the internet hive mind at any given moment is its reliance on its users to help implement this curation.

As I explain in my piece:

Read more

Edsger Dijkstra’s One-Day Workweek

Within my particular subfield of theoretical computer science there’s perhaps no individual more celebrated than Edsger Dijkstra. His career spanned half a century, beginning with a young Dijkstra formulating and solving the now-classic shortest-paths problem while working as a computer programmer at the Mathematical Center in Amsterdam, and ending with him as a renowned full professor holding a prestigious chair in the computer science department at the University of Texas at Austin.
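The shortest-paths problem mentioned above is still taught in every introductory algorithms course under Dijkstra’s name. As a minimal illustrative sketch (my own example, not drawn from the post, with the graph encoding as an assumption), the algorithm can be written in a few lines of Python using a priority queue:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest-path distances on a graph with
    non-negative edge weights. `graph` maps each node to a list
    of (neighbor, weight) pairs."""
    dist = {source: 0}
    heap = [(0, source)]  # (distance, node) priority queue
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(graph, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```

The greedy insight, that the closest unsettled node’s tentative distance is already final when edge weights are non-negative, is what makes the algorithm both correct and fast.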

During this period, Dijkstra introduced some of the biggest ideas in distributed and concurrent computing, from semaphores and deadlock to nondeterminacy and fairness. In 2003, the year after his death, the annual award given by the top conference in my field was renamed The Dijkstra Prize in his honor.
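To make the semaphore idea concrete, here is a small sketch (my own example, not from the post) using Python’s `threading.Semaphore` as a binary lock; Dijkstra’s original P and V operations correspond to `acquire` and `release`:

```python
import threading

counter = 0
sem = threading.Semaphore(1)  # binary semaphore used as a mutex

def worker(increments):
    global counter
    for _ in range(increments):
        sem.acquire()   # Dijkstra's P operation: enter the critical section
        counter += 1    # only one thread mutates the counter at a time
        sem.release()   # Dijkstra's V operation: leave the critical section

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000
```

Without the acquire/release pair around the shared update, interleaved increments could be lost, which is exactly the class of hazard Dijkstra’s work on concurrency formalized.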

This is all to say that I was intrigued when an alert reader recently pointed my attention to a fascinating observation about Dijkstra’s career. In 1973, fresh off winning a Turing Award, the highest prize in all of computer science, Dijkstra accepted a research fellow position that the Burroughs Corporation created specifically for him. As his colleagues later recalled:

Read more

When Work Didn’t Follow You Home

In a recent article written for Slate, journalist Dan Kois recounts the shock his younger coworkers expressed when they discovered that he had, earlier in his career, earned a master’s degree while working a full-time job. “It was easy,” he explained:

“I worked at a literary agency during the day, I got off work at 5 p.m., and I studied at night. The key was that this was just after the turn of the millennium. ‘But what would you do when you had work emails?’ these coworkers asked. ‘I didn’t get work emails,’ I said. ‘I barely had the internet in my apartment.’”

In his article, Kois goes on to interview other members of Generation X about their lives in the early 2000s, before the arrival of smartphones or even widely available internet. They shared tales of coming home and just watching whatever show happened to be on TV (maybe “7th Heaven,” or “Law & Order”). They also talked about going to the movies on a random weekday evening because they had nothing else to do, or just heading to a bar where they hoped to run into friends, and often would.

Read more

On the Slow Productivity of John Wick

I found myself recently, as one does, watching the mini-documentary featurettes included on the DVD of the popular 2014 Keanu Reeves movie, John Wick, an enjoyably self-aware neon-noir revenge-o-matic shot cinematically on anamorphic lenses.

At the core of John Wick’s success are the action sequences. The movie’s director, Chad Stahelski, is a former stuntman who served as Reeves’s double in The Matrix trilogy and subsequently made a name for himself as a second unit director specializing in filming fights. When Reeves asked Stahelski to helm Wick, he had exactly this experience in mind. Stahelski rose to the challenge, making the ambitious choice to feature a visually arresting blend of judo, jiu-jitsu, and tactical 3-gun shooting. In contrast to the hand-held, chaotic, quick-cutting style that defines the Bourne and Taken franchises, Stahelski decided to capture his sequences in long takes that emphasized the balletic precision of the fighting.

The problem with this plan, of course, was that it required Keanu Reeves to become good enough at judo, jiu-jitsu, and tactical 3-gun shooting not to look clumsy in front of Stahelski’s stable camera. Reeves was game. According to the featurette I watched, he prepared for production by training eight hours a day for four straight months. The effort paid off. The action set pieces were show-stopping, and after initially struggling to find a distributor, the film, made on a modest budget, went on to earn $86 million, kicking off a franchise that has since brought in hundreds of millions more.

Read more

The End of Screens?

Believe it or not, one of the most important technology announcements of the past few months had nothing to do with artificial intelligence. While critics … Read more

On Kids and Smartphones

Not long ago, my kids’ school asked me to give a talk to middle school students and their parents about smartphones. I’ve written extensively on the intersection of technology and society in both my books and New Yorker articles, but the specific issue of young people and phones is one I’ve tackled on only a small number of occasions (e.g., here and here). This invited lecture therefore gave me a great opportunity to bring myself up to speed on the research relevant to this topic.

I was fascinated by what I discovered.

In my talk, I ended up not only summarizing the current state-of-the-art thinking about kids and phones, but also diving into the history of this literature, including how it got started, evolved, adjusted to criticism, and, over the last handful of years, ultimately coalesced around a rough consensus.

Assuming that other people might find this story interesting, I recorded a version of this talk for Episode 246 of my podcast, Deep Questions. Earlier today, I also released it as a standalone video. If you’re concerned about, or even just interested in, what researchers currently believe to be true about the dangers involved in giving a phone to a kid before they’re ready, I humbly suggest watching my presentation.

In the meantime, I thought it might be useful to summarize a few of the more interesting observations that I uncovered:

Read more

Danielle Steel and the Tragic Appeal of Overwork

Based on a tip from a reader, I recently tumbled down an esoteric rabbit hole concerning the writing habits of the novelist Danielle Steel. Even if you don’t read Steel, you’ve almost certainly heard of her work. One of the best-selling authors of all time, Steel has written more than 190 books that have cumulatively sold over 800 million copies. She publishes multiple titles per year, often juggling up to five projects simultaneously. Unlike James Patterson, however, who also pushes out multiple books per year, Steel writes every word of every manuscript by herself.

How does she pull this off? She works all the time. According to a 2019 Glamour profile, Steel starts writing at 8:30 am and will continue all day and into the night. It’s not unusual for her to spend 20 to 22 hours at her desk. She eats one piece of toast for breakfast and nibbles on bittersweet chocolate bars for lunch. A sign in her office reads: “There are no miracles. There is only discipline.”

Read more