September 11th, 2017 · 23 comments
Contemplating the Importance of Contemplation
Franklin Foer has a new book coming out this week. It’s titled, World Without Mind: The Existential Threat of Big Tech.
I haven’t read it yet, but this morning, on returning from a family camping trip, I read Foer’s essay in today’s Washington Post and a recent interview with The Verge (as, of course, there’s no better time to contemplate the existential threat of technology than right after a weekend in the woods).
According to the interview in The Verge, Foer writes in the book: “the tech companies are destroying the possibility of contemplation.”
This premise is one I obviously support, having written an entire book on why we should fight to retain our diminishing ability for sustained attention.
But whereas my main issue with digital distraction was limited to issues of personal satisfaction and productivity, Foer, in elaborating his contemplation quote, goes much broader in his concern:
Read more »
September 1st, 2017 · 20 comments
Not Open to Openness
Apple’s new Cupertino headquarters cost $5 billion (see above). One of its prominent features is a massive open office space in which many Apple engineers sit on benches at long shared work tables.
As Apple aficionado John Gruber revealed in a recent episode of his podcast, not everyone is happy with this decision.
“I heard that when floor plans were announced, that there was some meeting with Johny Srouji’s team,” said Gruber, before explaining that Srouji is an important senior vice president in charge of Apple’s custom silicon chips.
Srouji, to put it politely, was not pleased with the idea of moving his team to a cacophonous, distracting, cavernous open office.
As Gruber tells it:
Read more »
August 27th, 2017 · 42 comments
When Writing is More than Writing
As a professor who also happens to opine publicly about productivity, I’m often invited to stop by dissertation bootcamps — a semi-annual ritual at many universities where doctoral students gather to hear advice and work long hours on their theses in an atmosphere of communal diligence.
Something that strikes me about these events is the extensive use of the term “writing” to capture the variety of different mental efforts that go into producing a doctoral dissertation; e.g., “make sure you write every day” or “don’t get too distracted from your writing by other obligations.”
The actual act of writing words on paper, of course, is necessary to finish a thesis, but it’s far from the only part of this process. The term “writing,” in this context, is being used as a stand-in for the many different cognitive efforts required to create something worthy of inclusion in the intellectual firmament of your discipline.
Read more »
August 20th, 2017 · 55 comments
The iGen Problem
Many people recently sent me the same article from the current issue of The Atlantic. It’s titled, “Have Smartphones Destroyed a Generation?”, and it’s written by Jean Twenge, a psychology professor at San Diego State University.
The article describes Twenge’s research on iGen, her name for kids born between 1995 and 2012 — the first generation to grow up with smartphones. Here’s a short summary of her alarming conclusions:
“It’s not an exaggeration to describe iGen as being on the brink of the worst mental-health crisis in decades. Much of this deterioration can be traced to their phones.”
I won’t bother describing all of Twenge’s findings here. If you’re interested, read the original article, or her new book on the topic, which comes out this week.
The point I want to make instead is that in my position as someone who researches and writes on related topics, I’ve started to hear this same note of serious alarm from multiple different reputable sources — including the head of a well-known university’s mental health program, and a reporter currently bird-dogging the topic for a major national publication.
In other words, I don’t think this growing concern about the mental health impact of smartphones on young people is simply nostalgia-tinged, inter-generational ribbing.
Something really scary is probably going on.
My prediction is that we’re going to see a change in the next two to five years surrounding how parents think about the role of smartphones in their kids’ lives. There will be a shift from shrugging our shoulders and saying “what can we do?”, to squaring our shoulders and asking with more authority, “what are we going to do?”
(Photo by Pabak Sarkar)
August 14th, 2017 · 58 comments
The Reading Writer
As a writer I’m required to read lots of books, especially when ramping up a new project, as I am now. The picture above, for example, shows the books I’ve purchased in just the past two days.
I’ve already finished one of them.
My approach to the books I process in my professional life is quite different than my approach to the books I savor in my personal life. The former requires the ruthlessly efficient extraction of key ideas and citations, while the latter unfolds as a slower, more romantic endeavor.
I thought it might be interesting to briefly reveal the method I’ve honed over the years for my professional reading. It’s simple, and the basics should sound familiar to any serious nonfiction reader, but it has served me well.
Here’s the strategy:
Read more »
August 9th, 2017 · 26 comments
The Disconnected Life
Aziz Ansari recently deleted the web browser from his phone and laptop. He also stopped using email, Twitter and Instagram.
As he explained in an interview with GQ, when he gets into a cab, he now leaves his phone in his pocket and simply sits there and thinks; when he gets home, instead of “looking at websites for an hour and a half, checking to see if there’s a new thing,” he reads a book.
Here’s how he explains his motivation:
“Whenever you check for a new post on Instagram or whenever you go on The New York Times to see if there’s a new thing, it’s not even about the content. It’s just about seeing a new thing. You get addicted to that feeling. You’re not going to be able to control yourself. So the only way to fight that is to take yourself out of the equation and remove all these things.”
He was worried when he first deleted his web browsers that he would suffer from not being able to look things up. He soon stopped caring.
“Most of the shit you look up, it’s not stuff you need to know,” he explains.
The journalist interviewing Ansari for GQ reacts to this answer with incredulity. “What about important news and politics?”, he asks.
“Guess what?”, Ansari replies. “Everything is fine! I’m not out of the loop on anything. Like, if something real is going down, I’ll find out about it.”
Read more »
July 24th, 2017 · 4 comments
Top Performer is an eight-week online career mastery course that I developed with my friend and longtime collaborator Scott Young. It helps you develop a deep understanding of how your career works, and then apply the principles of deliberate practice to efficiently master the skills you identify as mattering most. Over the past four years we’ve had over two thousand professionals go through this course, representing a wide variety of different fields, backgrounds, and career stages.
We open the course infrequently for new registrations (usually twice a year). It’s that time again: the course is open for registration this week (registration closes Friday at midnight Pacific time).
If you’d like to learn more about the course, how it works or whether it’s right for you,* see the registration page here.
If you have any questions about the course, Scott’s team will be happy to answer them here: firstname.lastname@example.org
* To emphasize the obvious: the course is definitely not for everyone. It’s expensive and targeted at those at a stage in their career where they’re able and willing to invest more seriously in advancement. I might send one or two additional notes about the course this week, but will then return to my regularly scheduled programming.
July 21st, 2017 · 19 comments
An Insightful Life
Claude Shannon is one of my intellectual heroes.
His MIT master’s thesis, submitted in 1936, laid the foundation for digital circuit design. (My MIT master’s thesis, submitted 70 years later, has so far proven somewhat less influential.)
His insight was simple. The wires, relays and switches that made up the types of complex circuits he encountered at AT&T could be understood as the terms and operators of logic statements expressed in the boolean algebra he encountered as a math major at the University of Michigan.
Though simple, this insight had a huge impact. It meant that circuits could be designed and optimized in the abstract and precise language of mathematics, and then transformed back to soldered wires and finicky magnetic coils only at the last step — enabling staggering leaps in circuit complexity.
But he wasn’t done. A decade later, inspired in part by his wartime research efforts, Shannon developed information theory: a mathematical framework that formalizes both techniques and fundamental limits for reliably transmitting information over noisy channels.
(For a popular treatment of this theory, see this or this; for a technical introduction, I recommend this guide).
Put another way, Shannon’s master’s thesis laid the foundation for digital computers, while his information theory paper laid the foundation for digital communication.
Not a bad legacy.
Read more »