Study Hacks Blog

Neil Gaiman’s Radical Vision for the Future of the Internet

Earlier this week, Neil Gaiman was interviewed on Icelandic television. Around the twenty-five-minute mark of the program, the topic turned to the author’s thoughts about the internet. “I love blogging. I blog less now in the era of microblogging,” Gaiman explained, referring to his famously long-running online journal hosted at neilgaiman.com. “I miss the days of just sort of feeling like you could create a community by talking in a sane and cheerful way to the world.”

As he continues, it becomes clear that Gaiman’s affection for this more personal and independent version of online communication is more than nostalgia. He goes on to predict:

“But it’s interesting because people are leaving (social media). You know, Twitter is over, yeah Twitter is done, Twitter’s… you stick a fork in, it’s definitely overdone. The new Twitters, like Threads and Bluesky… nothing is going to do what that thing once did. Facebook works but it doesn’t really work. So I think probably the era of blogging may return and maybe people will come and find you and find me again.”

In these quips, Gaiman is reinforcing a vision of the internet that I have been predicting and promoting in my recent writing for The New Yorker (e.g., this and this and this). Between 2012 and 2022, we came to believe that the natural structure for online interaction was for billions of people to all use the same small number of privately-owned social platforms. We’re increasingly realizing now that it was this idea of centralization itself that was unnatural. The underlying architecture of the internet already provides a universal platform on which anyone can talk to anyone else about any topic. We didn’t additionally need all of these conversations to be consolidated into the same interfaces and curated by the same algorithms.

Read more

Should This Meeting Have Been an Email?

In the context of knowledge work, there are two primary ways to communicate. The first is synchronous, which requires all parties to be interacting at the same time. This mode includes face-to-face meetings, phone calls, and video conferences.

The second way is asynchronous, which allows senders to deliver their messages, and receivers to read them, when each is ready. This mode includes memos, voicemails, and, most notably in recent years, email.

Which communication style is better? This simple question requires a complicated answer.

Read more

The Quiet Workflow Revolution

Starting a few years ago, ads for a web-based software start-up called Monday.com began to show up everywhere online. A subsequent S.E.C. filing revealed that the company spent close to a hundred and thirty million dollars on advertising in 2020 alone, which worked out to over eighty percent of their annual revenue. By the end of this blitz they had generated more than seven hundred million views of their YouTube-based spots — an audience larger than the preceding four Super Bowls combined.

As I report in my most recent article for The New Yorker, Monday.com had good reason to make this aggressive investment:

“Monday.com claims to help knowledge workers collaborate better: ‘Boost your team’s alignment, efficiency, and productivity by customizing any workflow to fit your needs.’ This objective might sound dry in our current moment of flashy social apps and eerie artificial intelligence, but helping organizations manage their workflows has proved to be surprisingly lucrative. Trello, one of the early success stories from this category, was launched in 2011 as a side project by an independent software developer. In 2017, it was purchased by Atlassian for four hundred and twenty-five million dollars in cash and stock. Another workflow-management service, named Wrike, subsequently sold for $2.25 billion. For its part, Monday.com went on to leverage the user growth generated by its advertising push to support a successful I.P.O. that valued the company at over seven billion dollars.”

This sudden shift in the business productivity market away from tools that help you better execute your work (like word processors and email clients), and toward tools that help you better organize your work, is important.

Read more

On Disruption and Distraction

Disruption and disorder have always stalked the human condition. This reality sometimes plays out on the grand scale, as in the brutality of terror and war, and sometimes more intimately, as in the sudden arrival of ill health or a personal betrayal.

Though such upheavals are timeless, our options for response have continued to evolve. The last decade or so has added a new, culture-warping tool to this collection of coping mechanisms: smartphones. Or, to be more specific, the algorithmically optimized content delivered through these devices.

The techno-psycho dynamics at play here are straightforward. The algorithms that drive content curation platforms such as TikTok, Twitter, or Instagram are designed to increase engagement. This is an inherently interactive process: the services decide what to show you by combining what they already learned about you in the past with observations on what seems to be drawing more of your attention in the moment. In a period of disruption, this will, more likely than not, lead you deep into digital grooves that promise to offer some relief from your emotional pain.
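
To make the mechanism concrete, here is a toy sketch in Python. Every name, weight, and number is invented for illustration; no platform publishes its actual ranking code. The point is only the shape of the logic: a long-term profile blended with in-the-moment attention signals.

```python
# Toy illustration (not any platform's actual code) of engagement-driven
# curation: blend long-term interests with whatever is holding attention now.

def rank_candidates(profile, session_dwell, candidates, recency_weight=0.7):
    """Score each candidate topic for the next slot in the feed.

    profile       -- dict mapping topic -> long-term affinity learned in the past
    session_dwell -- dict mapping topic -> seconds spent on that topic this session
    candidates    -- list of topics available to show
    """
    total_dwell = sum(session_dwell.values()) or 1.0
    scored = []
    for topic in candidates:
        long_term = profile.get(topic, 0.0)                       # what the service already knows
        in_moment = session_dwell.get(topic, 0.0) / total_dwell   # what is grabbing you right now
        score = (1 - recency_weight) * long_term + recency_weight * in_moment
        scored.append((score, topic))
    return [topic for _, topic in sorted(scored, reverse=True)]

# In a period of distress, dwell time on upsetting topics rises, so the feed
# serves more of them: the "digital groove" described above.
feed = rank_candidates(
    profile={"cooking": 0.6, "news": 0.3, "outrage": 0.1},
    session_dwell={"outrage": 45.0, "cooking": 5.0},
    candidates=["cooking", "news", "outrage"],
)
print(feed)  # ['outrage', 'cooking', 'news']
```

Nothing in a score like this knows or cares whether the attention it rewards comes from curiosity or from pain; it simply serves more of whatever is holding you.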

This relief can be delivered by drowning out your pain with even stronger emotions. These platforms are adept, for example, at stoking a satisfying fire of anger and outrage: a repeated electronic poking of a psychological bruise. Those who were unlucky enough to wander onto Twitter in the immediate aftermath of the horrific terrorist attack on southern Israel last week witnessed this effect in its full unnerving power.

These platforms are also able to move hard in the other direction and serve up the grim surrender of apocalyptic narratives. This was made apparent during the coronavirus pandemic, when many were lured by their phones into a sense of survivalist despair that left psychic scars that continued to constrain their lives well after the virus’s inevitable transition toward endemicity.

This relief delivered by our phones is not always about amplifying feelings. It can also be delivered in the form of numbness: drips of endless, meaningless, shiny, shallow distraction that take the edge off your distress. TikTok specializes in this style of deliverance: swipe, swipe, swipe, until you temporarily dislocate from the moment.

As we now find ourselves mired in an extended period of unusually heavy disorder, it seems an appropriate time to step back and ask how well smartphones have been serving us in this manner. Has the escape they offered led us to a lasting calm or a sustainable response to our travails? Few believe it has.

In search of a better alternative, I reached out to my friend Brad Stulberg, who earlier this fall published a bestselling book, Master of Change, about how to navigate unavoidable upheaval. (You can also watch my recent podcast interview with Brad here.)

Read more

On Tire Pressure and Productivity

The other day the low tire pressure indicator came on in my car. I didn’t see an obvious flat, so the likely explanation was some combination of colder temperatures and natural pressure loss over time, meaning that there was no immediate danger. Nonetheless, the bright orange warning light on my dashboard injected a steady dose of subliminal anxiety every time I drove.

So, as I was leaving my house this afternoon, somewhat late to catch a matinee showing of The Creator (which I had convinced myself was necessary “research” for someone who writes professionally about artificial intelligence), I decided to take five minutes to pull out the pump and add some air. Soon after I pulled out of my driveway, the pressure warning clicked off.

I’m telling this story because of what happened next: I felt a short-lived but intense surge of satisfaction.

Read more

On Tools and the Aesthetics of Work

In the summer of 2022, an engineer named Keegan McNamara, who was at the time working for a fundraising technology startup, found his way to the Arms and Armor exhibit at the Met. He was struck by the unapologetic mixture of extreme beauty and focused function captured in the antique firearms on display. As reported in a recent profile of McNamara published in The Verge, this encounter with the past sparked a realization about the present:

“That combination of craftsmanship and utility, objects that are both thoroughly practical and needlessly outrageously beautiful, doesn’t really exist anymore. ‘And it especially doesn’t exist for computers.’”

Aesthetically, contemporary digital devices have become industrial and impersonal: grey and black rectangles carved into generically modern clean lines. Functionally, they offer the hapless user a cluttered explosion of potential activity, windows piling on top of windows, command bars thick with applications. Standing in the Arms and Armor exhibit, McNamara began to wonder if there was a way to rethink the PC; to save it from a predictable maximalism.

The result was The Mythic I, a custom computer that McNamara handcrafted over the year or so that followed that momentous afternoon at the Met. The machine is housed in a swooping hardwood frame carved using manual tools. An eight-inch screen is mounted above a 1980s IBM-style keyboard with big clacking keys that McNamara carefully lubricated to achieve exactly the right sound on each strike: “if you have dry rubbing of plastic, it doesn’t sound thock-y. It just sounds cheap.” Below the keyboard is an Italian leather hand rest. To turn it on, you insert and turn a key, then flip a toggle switch.

Equally notable is what happens once the machine is activated. McNamara designed the Mythic for three specific purposes: writing a novel, writing occasional computer code, and writing his daily journal. Accordingly, it runs a highly modular version of Linux called NixOS that he’s customized to offer only emacs, a text-based editor popular among hacker types, launched from a basic green command line. You can’t go online, or create a PowerPoint presentation, or edit a video. It’s a writing machine, and like the antique arms that inspired it, the Mythic implements this functionality with a focused, beautiful utilitarianism.

Read more

We Don’t Need a New Twitter

In July, Meta announced Threads, a new social media service that was obviously designed to steal market share from Twitter (which I still refuse to call X). You can’t blame Meta for trying. In the year or so that’s passed since Elon Musk vastly overpaid for the iconic short-text posting service, Twitter has been struggling, its cultural capital degrading rapidly alongside its infrastructure.

Meta’s plan with Threads is to capture the excitement of Twitter without all the controversy. Adam Mosseri, the executive in charge of Threads, recently said they were looking to provide “a less angry place for conversations.” His boss, Chris Cox, was more direct: “We’ve been hearing from creators and public figures who are interested in having a platform that is sanely run.”

Can Meta succeed with this plan to create a nicer Twitter? In my most recent article for The New Yorker, published earlier this month, I looked more closely at this question and concluded the answer was probably “no.” At the core of Twitter’s ability to amplify the discussions most engaging to the internet hive mind at any given moment is its reliance on its users to help implement this curation.

As I explain in my piece:

Read more

Edsger Dijkstra’s One-Day Workweek

Within my particular subfield of theoretical computer science there’s perhaps no individual more celebrated than Edsger Dijkstra. His career spanned half a century, beginning with a young Dijkstra formulating and solving the now-classic shortest-paths problem while working as a computer programmer at the Mathematical Center in Amsterdam, and ending with him as a renowned full professor holding a prestigious chair in the computer science department of the University of Texas at Austin.
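
For readers outside this subfield: the shortest-paths problem asks for the cheapest route from a starting node to every other node of a graph with non-negative edge weights, and it is the computation underneath every mapping app. Here is a compact Python sketch of the standard textbook solution; it is a modern rendering that uses a binary heap, not Dijkstra’s original 1959 formulation.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source in a graph with non-negative weights.

    graph -- dict mapping node -> list of (neighbor, weight) pairs
    """
    dist = {source: 0}
    heap = [(0, source)]                      # (distance so far, node)
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                          # stale entry; node already settled cheaper
        for neighbor, weight in graph.get(node, []):
            candidate = d + weight
            if candidate < dist.get(neighbor, float("inf")):
                dist[neighbor] = candidate
                heapq.heappush(heap, (candidate, neighbor))
    return dist

# Example: the cheapest way from A to D goes through B and C, not directly.
graph = {
    "A": [("B", 1), ("D", 10)],
    "B": [("C", 2)],
    "C": [("D", 3)],
    "D": [],
}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 6}
```

The elegance lies in the invariant: once a node is popped with its smallest tentative distance, that distance is final, which is what makes the algorithm both simple and fast.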

During this period, Dijkstra introduced some of the biggest ideas in distributed and concurrent computing, from semaphores and deadlock to nondeterminacy and fairness. In 2003, the year after his death, the annual award given by the top conference in my field was renamed the Dijkstra Prize in his honor.

This is all to say that I was intrigued when an alert reader recently directed my attention to a fascinating observation about Dijkstra’s career. In 1973, fresh off winning the Turing Award, the highest prize in all of computer science, Dijkstra accepted a research fellow position that the Burroughs Corporation created specifically for him. As his colleagues later recalled:

Read more