I’ve lived in Silicon Valley for most of my life, but as I enter my fifties, I worry that I may be too old for it. My wife and I are both California natives, and we returned here to raise our children. I love the Valley’s cerebral, unapologetic geekiness, the tremendous cultural and national diversity, the absurdly good food and weather, and the heart-stoppingly beautiful terrain.
It’s not like I’m slowing down: I’ve published two books in the last three years (the most recent one has been pretty well-reviewed), I do better work now than I ever have, I have a wider range of skills, and (I like to think) I’m wiser. But as Bloomberg Businessweek recently noted, while the median age of U.S. workers is 42, here in Silicon Valley it’s around 31. Age does not give you an advantage here.
In a region that talks constantly about “the war for talent” and throws perks and stock options at workers at video-game speed, it’s not that older workers are too expensive. The Valley skews young because it assumes that the young are natural innovators, and “gray hair and experience are really overrated,” as HubSpot CEO Brian Halligan told the New York Times.
But the Valley is just an extreme example of a much bigger trend. Over the last couple of decades, in many industries, we’ve come to see age and experience not as sources of wisdom, but as liabilities. Indeed, by the time Facebook co-founder Mark Zuckerberg declared in 2007 that “Young people are just smarter,” and a few years later venture capitalist Vinod Khosla said, “People over 45 basically die in terms of new ideas,” they were repeating an idea that evolved so slowly, and along so many fronts, that it has come to seem natural, inevitable, and self-evident.
Ideas like these are powerful because they shape our thinking without our recognizing their effect on us. But they’re only powerful if we don’t ask where they came from and why we believe them.
For most of human history, age and experience were assumed to bestow wisdom, and wisdom was assumed to be a good thing. Youthful genius has been recognized since ancient times, but it wasn’t wisdom’s competitor. A figure like Isaac Newton, whose breakthroughs in optics, calculus, and physics all came in his twenties, or brilliant young poets like Thomas Chatterton and Rupert Brooke, were seen as possessed of an inborn, uncontrollable, even divine genius. (Youthful accomplishment was also tinged with tragedy: Chatterton died at seventeen, Brooke at twenty-seven.)
Besides, youthful genius hardly figured outside science and art. A mathematician like Evariste Galois might do paradigm-shattering work by twenty (when he was killed in a duel), but building fortunes and businesses required years of patience, prudence, and occasional boldness. In larger enterprises, you rose, not leapt, to the top: you paid your dues, did the work, and waited your turn. In politics, a brilliant start was a sign of a promising career: Thomas Jefferson and William Gladstone were both recognized as stellar minds, but their political careers still unfolded over decades. The professions required both up-to-date knowledge and experience: a great physician or lawyer had to know the latest developments in their fields, but also possess the maturity that came only from years of practice. Across all these realms, greatness required maturity and wisdom, which could not be learned, only acquired.
Then two things happened that upended this age-old system.
First, the value of youthful genius increased.
From the 1980s, in fields like finance and high tech, it was no longer necessary to pay your dues and wait your turn. On Wall Street and in Silicon Valley, success was now determined by your ability to leverage your technical skills or market expertise before they became obsolete. Young, brash entrepreneurs like Steve Jobs and Bill Gates trounced older, slower executives at IBM and Hewlett-Packard. Their companies, in turn, hired “digital natives” whose intimate, brain-altering familiarity with computers and the Internet made them self-evidently preferable to older workers who had only seen computers in, say, elementary or middle school. Then along came a new generation of foolish and hungry CEOs, empowered by their technical knowledge and liberated by their inexperience, too naive to realize that they couldn’t change the world, and not weighed down by lives, kids, or other distractions.
The result is that the world’s most influential and profitable companies look less like medicine or law, and more like the science-fiction movie Logan’s Run: they’re great until you’re 30, and then you disappear.
Second, age and experience became liabilities.
You saw this most vividly in Silicon Valley, but high-tech industries weren’t the only ones in which age was becoming a burden. Across manufacturing, industry, retail, and services, business process reengineering (one of the great consulting fads of the 1990s) argued that business as usual was, by definition, inefficient. New technology would let you automate functions and replace experienced but expensive middle-aged workers with inexperienced, cheaper young labor. In a fast-changing, rapidly globalizing world, business as usual was the kiss of death, conventional wisdom a drag on profitability, and those who defended either — or who couldn’t adapt to the new world, or were too expensive — had to be thrown overboard in favor of fresh thinking and cheap overseas labor. It wasn’t just workforces that skewed young: in advertising, Tom Goodwin notes in a recent essay, the ideal worker became a tech-savvy 24-year-old who could design ads for… well, other 24-year-olds.
Asking why we believe that experience and wisdom are liabilities, and that youth offers unique, decisive advantages, is more important now than ever. People are living longer. In economies driven by brains rather than brawn, workers can have longer careers. In countries where retirement systems are stressed, they’ll need to work to avoid poverty. Yet the number and ability of older workers have increased at exactly the same time that we’ve devalued them.
Companies should also recognize that they lose by worshipping youth and discounting experience. Dan Lyons explains what happens when venture capitalists “let young founders go it alone” and run companies rather than pair youthful founders with industry veterans:
The consequences have been predictably disastrous. Young male founders hire young male employees, and spend huge money building kooky office frat houses…. This huge, dynamic industry, which is generating so much wealth, has walled itself off from most of the workforce, telling millions of people that they cannot participate. This situation obviously shortchanges a lot of workers, but it also hurts tech companies by depriving them of talent.
There’s also a case to be made that discarding or excluding older workers deprives an industry of valuable talent and experience, and actually makes it narrower, less innovative, and even less entrepreneurial. Tom Goodwin argues that the disappearance of an older generation of executives (“Living in New York and working in advertising I rarely see people over the age of fifty,” he says) has meant that the industry as a whole is taken less seriously by its clients, overestimates the novelty of every new technology, and lacks the perspective to differentiate noisy events from deep and truly meaningful changes. Aziz Shamim argues that rather than creating products to eliminate disease, end poverty, or educate the poor, today youthful “tech culture is focused on solving one problem: What is my mother no longer doing for me?” A decade ago, business professor Vivek Wadhwa studied tech company founders, and found that the most successful were in their late thirties or older. More recently, work by the Kauffman Foundation concluded that successful entrepreneurs are actually getting older: they’re increasingly likely to be in their fifties or even sixties.
Further, history teaches us that creative industries are exactly the ones in which people are able to make contributions throughout their lives. As David Galenson notes, creatives come in two kinds: young geniuses who make conceptual breakthroughs, and old masters whose work matures over decades. This holds true in art, music, film, and even economics. The world would be poorer if it only recognized Van Gogh and Picasso, and ignored Rembrandt and Vermeer.
Finally, we should recognize that creative lives can be long, and that even youthful prodigies have surprising second acts. Clint Eastwood’s legacy will probably rest less on his roles as Rawhide’s Rowdy Yates or “Dirty Harry” Callahan, and more on his work as a director, which has been his main focus since he turned seventy in 2000. Had Steve Jobs not had a second act in his fifties, the world might never have seen the iPod, iPad, and iPhone. No one ever argued that Lord of the Rings would have been awesome if only it had appeared when J.R.R. Tolkien was in his twenties, rather than his sixties.
For all these reasons, it’s high time to rethink our preference for youth and disdain for wisdom. The world needs, and should be able to make room for, both. Let’s hope it happens in time to help me get my next job.
Originally published at medium.com