I’ve been writing lately about the modern technology career. These jobs require a great deal of ongoing education, yet the career ladder seems too short. By mid-career, the way forward is unclear, and we run into age discrimination.
Age discrimination is more than just bias; there are systemic forces at play. My Windows development experience in Visual C++ 6 is not as relevant as it once was. At the same time, my lifestyle costs have grown since my twenties, and with experience comes an expectation of higher pay. Yet new technologies keep emerging, and the technologies of previous decades fade in relevance. Is it really any surprise that finding a job gets harder after ten or fifteen years of experience?
It’s easy to be envious of new graduates of MIT, Berkeley, and Carnegie Mellon. They get to go work for some hip company like Yahoo, Groupon, or Google with its office ball pit. But perhaps we have the story wrong, or at least we’re looking at the wrong side of it.
Let's Change the Discussion
What if we looked at a software developer’s first programming job not as the goal but as something like a doctor’s residency: the first rung on the ladder? After residency, you get to specialize.
My friend Tessa Welsh once told me that even though she began with a programming job right out of college, her goal was project management (PM). She had to take a programming position because she was fresh out of school and could not get a PM job. Four years later, a PM position opened up at her company, she applied, and she’s been a project manager for ten years now.
For some, programming is a calling, but for many more it may be one part of a much more complex career. If the half-life of a programming career is ten to fifteen years, then within ten to fifteen years half of all new programmers will have moved on to something else.
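To spell out the arithmetic behind that half-life framing (a minimal sketch; the exponential-decay model and the twelve-year figure are illustrative assumptions, not measured data): if a cohort of programmers thins out with half-life T, the fraction still writing code after t years is

    f(t) = (1/2)^(t/T)

With T = 12, half of a graduating class remains at year 12, a quarter at year 24, and only an eighth by year 36.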