A long time ago, in a galaxy far, far away (Stackoverflow.com, November 2008), a university junior posted this question:
Most jobs speak of C#, Visual C++, .NET, Java, etc. Whereas I am mainly using Java, C++, Perl, and Python and programming to standard Unix conventions, would I be better off ditching Linux and spending my last year of university brushing up on Windows-based technologies, languages, and APIs? Would this increase my chance of getting into the industry?
A fellow named Andy Lester responded with this:
I suspect that your “most jobs” observation is from looking in the wrong places.
Whether or not “most jobs” are using MS technologies, would you WANT to work with MS technologies? If you went and boned up on your .NET and Visual C++ and had to use Windows all day, would that be the kind of job you wanted? If not, then it doesn’t matter if that’s what “most jobs” call for, because those aren’t the jobs for you.
There are not hundreds of jobs out there available for you that are a good match for you, and for which you are a good match. Don’t worry about the broad playing field of the job market, but instead focus on the jobs that DO interest you.
I think this is stupendously bad advice. Of course you should bone up on Microsoft technologies. The chances of making it through a 40-year career in technology without having to work with MS stuff are slim to none. Of course, the real answer is…focus on what you’re learning in school first.
I had long since forgotten all about this exchange, but recently, while I was busy doing something else, Andy replied:
Ben’s right, you’re likely to have to use Microsoft technologies, if that’s how you want your career to take you. What I think we’re seeing here is the difference in viewpoints between someone like Ben who seems to think primarily in terms of maximum salary and maximum employability, and someone who thinks about the importance of loving what it is that you do for a job.
He then spends the rest of his post regurgitating his original point: focus on the jobs that interest you.
First, to Andy: I don’t think your point is a bad one, but I still think it was a bad answer to a 21-year-old kid trying to figure out how to make himself marketable. Of course you should be brilliant at the things that interest you, but it’s also wise to be familiar with tools and techniques that don’t interest you, because they may very well sustain a situation in which you get to do the things that do.
My college situation was very similar to the OP’s: I went to a good computer science school, but the tools we used revolved entirely around Java, C, and Unix. I never learned any of the Win32 API, and only after my junior year did I really get a chance to learn anything about Microsoft developer technologies. That was a huge deal: it broadened my horizons, even though my first job out of school turned out to be a realtime embedded systems job. But you know what? After a few years I decided I was no longer terribly interested in what I was doing at that job, and because I was familiar with tools I didn’t typically use, I had the flexibility to quit and find another job pretty quickly.
Another reason why it would have been smart for the OP to bone up a bit on .NET development or the Windows APIs is simply that he may not really know what he is interested in. If all you’ve ever done is Java on Linux, how could you? It could also be that what he’s really interested in is distributed algorithms (or any technology-agnostic application domain), in which case the particular platform or toolchain involved is immaterial. If that were the case, then it’s a no-brainer: get familiar with new stuff before you graduate, and you’ll be more attractive to potential employers.
So, simply put: my point wasn’t that you shouldn’t pursue the things that interest you, but that it’s wise (particularly for a new grad) to round out your skill set to increase the odds that you’ll get to do the things that interest you.
Just ask a Lisp programmer.