December 18th, 2016 · 87 comments
The Curmudgeonly Optimist
People are sometimes confused about my personal relationship with digital communication technologies.
On the one hand, I’m a computer scientist who studies and improves these tools. As you might therefore expect, I’m incredibly optimistic about the role of computing and networks in our future.
On the other hand, as a writer I’m often pointing out my dissatisfaction with certain developments of the Internet Era. I’m critical, for example, of our culture’s increasingly Orwellian allegiance to social media and am indifferent to my smartphone.
Recently, I’ve been trying to clarify the underlying philosophy that informs how I think about the role of these technologies in our personal lives (their role in the world of work is a distinct issue that I’ve already written quite a bit about). My thinking in this direction is still early, but I decided it might be a useful exercise to share some tentative thoughts, many of which seem to be orbiting a concept that I’ve taken to calling digital minimalism.
December 7th, 2016 · 50 comments
The Tally Problem
When I was writing Deep Work I was a heavy user of deep work tallies: a record kept each week of the total hours spent in a state of unbroken concentration (see above).
This strategy provides concrete data about how much deep work you actually accomplish, and the embarrassment of a small tally motivates a more intense commitment to finding time to focus.
I’ve written about this idea on this blog (e.g., here and here) and featured it in the conclusion of Deep Work, and for good reason: it works well — especially as compared to no tracking at all.
Over the past year or so since publishing my book, however, I’ve found myself drifting from this particular productivity tool.
I found it insufficient to support the long periods of deep work (think: 4–7 consecutive hours, multiple times a week) that my increasingly complicated pursuits as a professional theoretician with heady aspirations require.
The problem was timing.
By the time the average week started, I had already agreed to enough meetings, interviews, appointments and calls in advance that no such long unbroken periods remained. This was true even after I drastically reduced these incoming requests with sender filters and my attention charter.
As I found myself repeatedly frustrated with the fragmented nature of my weeks, I knew something had to change…
November 30th, 2016 · 29 comments
The Other MIT in My Life
One of the most persistent and popular strategies in the online productivity community is the notion of tackling your most important thing (MIT) first thing in the morning.
The motivation is self-evident. Our days are increasingly filled with distraction and unexpected disruptions. If you make a point of doing one important thing before exposing yourself to that onslaught, you can ensure that you make continual progress on things that matter.
I’m not sure where the idea originated. My research suggests it was adapted over a decade ago from Julie Morgenstern’s book Never Check Email in the Morning by Lifehacker editor Gina Trapani. I first heard about MITs through Leo Babauta (a major inspiration) in the early days of Study Hacks, and I continue, to this day, to hear people talk about their commitment to the strategy.
Here’s the thing: I don’t want to dismiss this advice, as I know it has helped many people transition from chaos to less chaos. But I do want to urge those who are serious about their effectiveness to look beyond this suggestion.
It’s amateur ball. The pros play a game with more serious rules…
November 23rd, 2016 · 31 comments
My Curmudgeonly Musings Go National
On Sunday, I published an op-ed in the New York Times arguing that social media can cause more harm than good for your career.
The core of my argument is that the professional benefits of social media are being overemphasized (I don’t buy this idea that suddenly Twitter and Facebook are the main channels through which talent is recognized and opportunities spread), while the professional costs of social media are being underemphasized (see: Deep Work).
As indicated in the above screenshot, this generated some discussion.
Of the different reactions that made it onto my radar, the one I found most interesting was the question of how to define “social media” in the context of these types of cost/benefits analyses. (See, for example, the thoughtful self-analysis in this Hacker News thread.)
I think it’s worth taking the time to clarify my thinking on this issue.
November 16th, 2016 · 24 comments
Earlier this month, a group of researchers from Albert-Laszlo Barabasi’s circle of network scientists published an important paper in the journal Science. Its nondescript title, “Quantifying the evolution of individual scientific impact,” obscures its exciting content: a massive big-data study that dissects the publication careers of over 2800 physicists to determine the combination of factors that best predicts their probability of publishing high impact papers.
As you might expect, this endeavor caught my attention.
A high-level summary of the researchers’ results highlights two major findings:
November 11th, 2016 · 22 comments
Earlier this summer, the beloved writer Neil Gaiman was a guest on Late Night with Seth Meyers to promote The View from the Cheap Seats.
At one point in the interview, Meyers asked Gaiman about boredom. Here was Gaiman’s response:
“I think it’s about where ideas come from, they come from day dreaming, from drifting, that moment when you’re just sitting there…”
“The trouble with these days is that it’s really hard to get bored. I have 2.4 million people on Twitter who will entertain me at any moment…it’s really hard to get bored.”
What’s the solution? Gaiman adds:
November 7th, 2016 · 27 comments
The Lesser of Two Evils
In the early fall of 1848, a little-known congressman from the frontier of Illinois set off to Massachusetts to address fellow members of his political party, the Whigs.
His name was Abraham Lincoln.
To put Lincoln’s trip in context, it’s important to remember that the issue dominating the 1848 presidential election was the expansion of slavery into the new territory won in the Mexican War. The Democratic candidate, Lewis Cass, was in favor of extending slavery to these new territories. The stance of the Whig candidate, Zachary Taylor, was less clear, though it was generally assumed he would oppose the expansion.
This assumption was not enough, however, for the strongly anti-expansion Massachusetts Whigs. Taylor was a slaveholder, and his refusal to definitively reject expansion made him, in their eyes, a sub-optimal presidential candidate — so they refused to support his nomination. During the summer of 1848, they were riled up by Charles Sumner, a particularly well-spoken and energized young man who was pushing his fellow party members to vote instead for a dark horse third-party candidate, Martin Van Buren, who was emphatically against slavery.
Here’s Sumner talking to the Whigs in Worcester in June, 1848:
“I hear the old saw that ‘we must take the least of two evils’…for myself, if two evils are presented to me, I will take neither.”
This should sound familiar.
October 19th, 2016 · 21 comments
The Bionic Office
A couple of weeks ago, I wrote about Joel Spolsky’s claim that Facebook’s massive open office is scaring away talent. The comments on the post added many interesting follow-ups; e.g., a pointer to a recent podcast episode where a Facebook developer claims the office is rarely more than a third full, as people have learned to stay home if they want to produce anything deep.
A critique of open offices, however, inspires a natural follow-up question: what works better?
For one possible answer we can turn once again to Spolsky.
Back in 2003, when Spolsky was still running Fog Creek, they moved offices. Spolsky blogged about his efforts to work with architect Roy Leone to design “the ultimate software development environment.”
He called it the bionic office. Here’s a picture of a standard programmer’s space from the outside: