The Empty Sky Paradox: Why Are Stars (in Your Professional Field) So Rare?

[Photo: Dagstuhl castle]

The Empty Sky Paradox

In many fields, people are eager to produce top results. A non-trivial fraction of the Internet is dedicated to tips and hacks for accomplishing this exact goal.

So why are so few people stars?

This past week provided a good opportunity to reflect on this question. I attended a Dagstuhl seminar on wireless algorithms, which means I spent a week in a castle (pictured above), tucked away in rural western Germany, working with top minds in my particular niche of theoretical computer science.

Here’s what I noticed:

In theory, the people who consistently produce important work seem to be those who take the time to decode the latest, greatest results in their subject area.*

Only when you’re at the cutting edge are you well-positioned to spot and conquer the most promising adjacent intellectual territory (for more detail on this idea, see Part 3 of SO GOOD).

This sounds like simple advice — stay up to date on the latest work! — but most practicing researchers probably don’t follow it. Why? Because this turns out to be incredibly hard work.

(These results are tricky, and presented in short conference papers where key mathematical steps are elided, requiring days [and sometimes much more] to decode.)

This brings me back to the general question of why most fields have so few stars. The answer, I conjecture, is that most fields are similar to theoretical computer science in that the path to becoming a standout includes a prohibitively difficult step. It’s this step that limits stars, as most people simply lack the comfort with discomfort required to tackle really hard things.

At some point, in other words, there’s no getting around the necessity to clear your calendar, shut down your phone, and spend several hard days trying to make sense of the damn proof.

(Photo by Nic McPhee)

###

* This is a skill that I’ve been systematically developing for the last three to five years. I’m better than I was, but not yet as good as I want to be. I can attest from personal experience that these proof decoding efforts: (a) are extremely difficult — deep work purified to its most stringent form; (b) are crucial for producing useful results; and (c) get easier (though, quite slowly) with practice.

19 thoughts on “The Empty Sky Paradox: Why Are Stars (in Your Professional Field) So Rare?”

  1. In economics, new results are published as working papers, which are often more detailed than the eventual journal articles. Of course, there are presentations before that, so it also helps to go to seminars and conferences. I think there is also scope for novelty from people bringing ideas from other areas to address questions (which is what I often try to do); not all advances are based on decoding the latest results in detail and building on them incrementally.

    I think, based on observing people, that stars in my field need a combination of traits that rarely occur together: intuition/creativity, logical ability, attention to detail, conscientiousness, a high energy level, good interpersonal skills, and the ability to focus on what is important. Most people have only a couple of those. And you need time free from heavy teaching or administrative loads. I just completed two years with a heavy admin load, and it makes such a difference time-wise.

  2. I doubt it’s “stay up to date on the latest work.” I suspect it’s “keep up with the latest fads and promote yourself within them.” Not so much hard work as simply giving up on caution, good sense, and integrity.

    It all depends on what you want to be: a Henry Goddard, who flashed across the skies of fame with The Kallikak Family (1912), or a G. K. Chesterton, author of Eugenics and Other Evils (1922).

    Here’s what Wikipedia says about the star-quality Goddard:

    “He was the leading advocate for the use of intelligence testing in societal institutions including hospitals, schools, the legal system and the military. He played a major role in the emerging field of clinical psychology, in 1911 helped to write the first U.S. law requiring that blind, deaf and mentally retarded children be provided special education within public school systems, and in 1914 became the first American psychologist to testify in court that subnormal intelligence should limit the criminal responsibility of defendants.”

    But all that turned sour. Here’s what Wikipedia concludes the rest of his life was like:

    “By the 1920s, Goddard had come to candidly admit that he had made numerous errors in his early research, and regarded The Kallikak Family as obsolete. It was also noted that Goddard was more concerned about making eugenics popular rather than conducting actual scientific studies. He devoted the later part of his career to seeking improvements in education, reforming environmental influences in childhood, and working toward better child-rearing practices. But others continued to use his early work to support various arguments with which Goddard did not agree, and he was constantly perplexed by the fact that later generations found his studies to be dangerous to society. Henry Garrett of Columbia University was one of the few scientists to continue to use The Kallikak Family as a reference.”

    Henry Garrett, if you’ve not heard, rose to become President of the American Psychological Association in 1946, but finished out his life defending school segregation and opposing ‘race mixing’ at the University of Virginia.

    In contrast, G. K. Chesterton went so much against the flow of his era that he almost had to title his book Eugenics and Other Evils to make clear he was against what in 1912 the NY Times called a “wonderful new science.” He spent much of his life as a writer being the ‘Catholic reactionary’ foil to the media stars H. G. Wells and George Bernard Shaw.

    The contrast is even more jarring if you look at the long-term impact. I published a textbook edition of GKC’s Eugenics and Other Evils that’s still used today in college courses. It’s so ‘right on’ in its predictions that it is still worth studying. He was even right in his prediction that it was Germany where eugenics would bear its bitterest fruit. In 1922, he was warning of 1942. And in the 1932 Illustrated London News he warned that Germany would find itself a dictator and plunge Europe into war over a border dispute with Poland, which is precisely what happened seven years later.

    I suspect much the same is now happening with global warming. The flag they were waving so triumphantly a decade ago has begun to look a bit battered, much like eugenics, riding high in the 1910s, was looking a bit dog-eared by the late 1920s.

    In a few decades, historians will note that, for some of these warmest champions, the work that brought them fame is only used as a textbook by some quirky Columbia professor much like the Henry Garrett above.

    I’m not sure which is worse, to be a Henry Goddard and lament the harm done by the ideas of your youthful vigor or to be Henry Garrett and continue to champion them long after they’d become recognized as despicable.

    I even found a relevant poem called “The Great Man” in a long-ago book that says:

    But fame is deceitful, uncertain its ray,
    And to-morrow oft glooms on the star of today;

    I’ve given you a bit of poem, so it must be time to end.

    –Michael W. Perry. Henry Goddard and a host of others, including Chesterton and Wells, give their POV about eugenics and related issues in my Pivot of Civilization in Historical Perspective (2001).

  3. Cal,
    Your use of the word “star” reminds me of a well-researched book by Robert Kelley called “How to Be a Star at Work.” The book is based on years of research at Bell Labs aimed at determining what predicts stars’ achievement. Here, “stars” are defined as engineers considered excellent by customers, peers, and management. Initially, the project aimed to test all engineers across every conceivable dimension, from SAT scores to college ranking, to see if there was a discernible pattern among the stars. It turned out that there wasn’t. So Robert Kelley and his team observed and interviewed all the stars at Bell Labs and discerned a “star” framework that is the basis of the book. Initiative, for instance, is the first and most important “star” quality.

    All the best,

    Blake

  4. Hi Cal,
    I have been reading your blog for quite a long time now. I must say you have some absolutely useful advice on this blog.

    Thank you!

  5. This sounds like it might hold for people who are stars to other people within a particular field. To be a star in general, I think it would be necessary to “translate” those cutting edge academic papers into something captivating to the public.

    Cutting edge of research + (ability to translate it + willingness to do so) = stardom?

  6. Blake,

    If you’ve been reading Study Hacks for a while, you would surely know that Cal hardly encourages the virtue of “initiative.” It is certainly important, but it is as far as possible from being the “most important” when it comes to making an impact.

  7. Cal, I disagree. You don’t become a star by learning the works of others and making marginal improvements on their ideas. You become a star by looking within yourself for your own original ideas.

    It was economics Nobel Prize winner James Tobin who said something to that effect.

    • Original ideas don’t “exist”. What I mean by that is that every new idea is composed of multiple old ideas. For example, think about the invention of the airplane. Was it a brand-new idea? I don’t think so. This idea was just a combination of the flight of birds with the mechanical apparatus of machines.

      The bottom line is, no idea is an island.

  8. Thanks for the post Cal.

    In the long run, what is most effective – studying the “latest and greatest”/”cutting edge” material or being rooted in the “classics” of a particular field?

    • I have been a long-time fan of Cal.

      Here is my humble opinion…

      There are only 2 factors that matter when it comes to achievement: deep work and distraction.

      You want to maximize deep work, and minimize distraction. Sounds simple, but this is a life-long struggle to strike the balance between these two forces.

      Getting back to your original question: what you have to ask yourself is, “Does reading the ‘classics’ or the cutting-edge research fall into the category of deep work or distraction?”

      I know this sounds stupidly simple, but it really is the formula for achievement. It applies to all fields. Think of marathon running. Winning the Olympic marathon is fairly simple: just run 100 miles a week for 15 years and you will win the race. The same applies to academia or any other domain.

      I am sorry, but we all just have to do the work. 🙂

  9. I’m enjoying these comments. There’s an idea showing up in several different places here that I wanted to quickly address: incrementalism. There seems to be a sense in some comments that taking the time to understand other people’s work means you will only produce marginal, incremental improvements that are hard but do not really advance human knowledge much. This doesn’t match my experience. Big breakthroughs — the type that catch people off guard — more often than not come from a commitment to really understanding cutting-edge tools. If you read Richard Feynman’s writing on research, for example, you’ll see that his “genius” came from constant reading and understanding of others’ work. This meant he knew what problems were open, why they were hard, and what a solution would look like. It also meant he kept learning new tools he could try to apply to these open problems. Einstein was similarly a voracious reader. We know him for relativity, but he published hundreds of influential papers that started from a premise of really understanding the cutting edge, then having insight on how to advance it. Coming back down to earth, the theory students at MIT who I observed having the most meteoric rises were those who put in the time to understand the latest and greatest. Deep understanding yields deep insight.

    • Cal – thanks for the additional thoughts.

      I’m paraphrasing Francis Bacon (Of Studies): reading makes us full, conference (or discussion) makes us ready, writing makes us exact. Learning, understanding, and ultimately output come from a great volume of input.

    • I agree that studying others’ work is a great way to learn the tools required to answer difficult research questions. It may very well be the only way.

      I don’t know about Computer Science, but in Economics, formulating an original research question (in plain English, not math) is the first and most important step. Without a good question, the results you find are not interesting. Once the question is established, you then go find out what tools you need to answer it. Simply put, questions come before tools. Maybe in Computer Science (and the physical sciences), tools come before questions, and that is why you benefit more from intensive literature reviews before starting your own work?

  10. I’ve read your blog for a long time, but felt moved to comment today. I think the spark that bridges the gap between “understanding others’ work/knowing how to solve the problems that have been solved” and “big breakthroughs”–what makes a person contribute more than incrementalism–is imagination (or possibly something more narrowly defined, like “creatively visualizing solutions”). It’s not my field at all, but I understand that Tesla could “see” his machines years before he could bring them into being.

    But what allows people to be creative/visualize/connect the dots in this big way? Doing the work, doing the work, doing the work.

