When it comes to ability, conventional wisdom says age matters. At one end of the spectrum is the belief that mental sharpness peaks early, a viewpoint reinforced by the college-aged entrepreneurs currently raising billions in Silicon Valley. Meanwhile, there exists a parallel narrative, one that equates experience with intelligence (consider that we've never elected a U.S. president younger than 43, and aren't legally allowed to elect anyone under 35).
From a cognitive standpoint, which position holds more sway? Do our intellectual abilities peak in our mid-20s, or do they continue to ripen with age?
According to a recent study published in Psychological Science, the answer is yes to both.
When neuroscientists at MIT and Massachusetts General Hospital gathered data from nearly 50,000 individuals aged 10 to 71 on a wide variety of online cognitive tasks, they found substantial variation in the relationship between age and intellectual abilities. Performance on processing speed, for example, peaked and began to decline early, around high school graduation; verbal and visual working memory plateaued in the mid-30s; emotion perception gradually crested in late middle age. Vocabulary, finally, continued to climb steadily into the 60s and beyond.
"These findings motivate a nuanced theory of maturation and age-related decline, in which multiple, dissociable factors differentially affect different domains of cognition," the authors write. "On the practical side, not only is there no age at which humans are performing at peak on all cognitive tasks, there may not be an age at which humans perform at peak on most cognitive tasks."
This view of human ability – one that recognizes the existence of multiple, co-existing intelligences, each with its own life cycle – helps explain why many personal renaissances arrive later in life: Harland Sanders started KFC in his 60s, Ray Kroc opened McDonald's in his early 50s, and Steve Jobs spearheaded the iPod, iPhone, iPad and iMac after the age of 45.
Of course, certain abilities do, in fact, favor the young. “A lot of what it comes down to — are you cognitively able to do it?” James C. Kaufman, a professor of educational psychology at the University of Connecticut, recently told The New York Times. “Most software developers don’t suddenly start at 60.” When you attempt to develop a new skill at an age when the relevant ability is already waning, the deck is stacked against you. “It is generally very difficult to get a late start in a field that requires lots of fluid intelligence from the get-go,” Dean Keith Simonton, a professor of psychology at the University of California, Davis, told the outlet.
This, of course, gets at the root of Silicon Valley's youth obsession; some of the bias is founded in fact. But the blanket statements often bandied about – "people under 35 are the people who make change happen…people over 45 basically die in terms of new ideas," venture capitalist Vinod Khosla famously pronounced – are oversimplifications.
Engineering, programming and coding are skills ideally acquired before middle age, but entrepreneurship itself is not a profession reserved for the young. A recent report from the Kauffman Index of Entrepreneurial Activity found that the highest rate of entrepreneurial activity in the U.S. occurs in the 45-54 age bracket, with a considerable amount occurring in the 55-64 demographic. Meanwhile, when a team of researchers looked into the backgrounds of 502 successful American engineering and technology companies founded between 1995 and 2005, they discovered that the median age of the founders was a nearly middle-aged 39.
In fields like law, psychoanalysis and entrepreneurship, lead time matters; peaks often crest late. It's worth stepping back, then, to recognize the shifting abilities that course through us: there is no magic formula, or magic age, when it comes to starting a business.