Having attended interviews for programming jobs and sat in on other people’s interviews, I had this thought: maybe we’re expecting too much of fresh graduates, given what is taught at university? Now, I understand what a difficult and expensive process it is for companies to find and hire good candidates. But we expect fresh graduates to know how programming in industry works without asking where they are supposed to have picked up those skills. We assume they were either taught that sort of thing at university, or that they picked it up themselves by being passionately involved in programming on their own or freelancing for companies. The problem is: how many universities teach the things that companies want? And how many university graduates are involved in programming beyond university coursework or projects? It would be interesting to see surveys and studies on the gap between what employers want and what university curricula teach.
From my experience (having gone through a 4-year undergraduate degree in Computer Engineering), I have to say that the “learning how to learn” approach to teaching was good, but it fell short of preparing us for jobs that required some rather concrete knowledge of programming for industry. Some of the concepts I wish our courses or projects had covered are:
1. Automated build processes and tools (e.g., Ant, Maven)
2. Source control (check-ins, commits, updates, merges…)
3. Web programming (web servers, application servers, current frameworks)
4. Introductions to current industry concepts such as dependency injection, SOA, ESBs, etc.
5. Bug trackers
6. Scripting languages (shell, Groovy, etc.)
7. Examples of industry software (databases, business apps, web apps, data warehousing, etc.)
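Few of these need a whole course to introduce. As a hypothetical illustration of the everyday “glue” scripting in item 6 that graduates are expected to pick up on the job, here is a minimal shell sketch that scans a source tree for TODO markers; the directory and file names are invented for the example:

```shell
#!/bin/sh
# Hypothetical example: everyday glue scripting of the kind industry expects.
# Build a tiny throwaway source tree (names invented for this sketch).
rm -rf /tmp/demo_src
mkdir -p /tmp/demo_src
printf 'int main(void) { /* TODO: handle errors */ return 0; }\n' > /tmp/demo_src/a.c
printf 'int helper(void) { return 1; }\n' > /tmp/demo_src/b.c

# List every file containing a TODO marker, then count the matches.
files=$(grep -l 'TODO' /tmp/demo_src/*.c)
count=$(printf '%s\n' "$files" | grep -c .)
echo "Files with TODOs: $count"   # prints "Files with TODOs: 1"
echo "$files"
```

A five-minute task like this is rarely set as coursework, yet it is exactly the kind of thing a new hire is asked to do in their first week.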
Of course, you could probably devote a whole course to each of these, and the objective of universities is not to do that, but introducing these concepts through examples or project courses would have been very beneficial. It may be that many professors have not worked in industry for years, or have worked on projects that involved none of these. In that situation, I’m not sure who is best placed to initiate enhancements to the curriculum, but I would imagine those involved in professional associations and accreditation boards need to drive them. Until that happens, more and more candidates will face interviews having to defend why they did not learn all this themselves. Even if somebody passionate about programming picks it all up on their own, that does not describe the majority of students, and so we are leaving behind a large group of people who would otherwise have been the next generation of engineers and programmers.

For the money spent on getting a university degree these days, I feel universities could do a lot better. And if they are not equipped to, there is probably a market for companies to enter that WOULD create the people with the skills they need. Imagine a software company that takes in a class of programmers each year, charges a fraction of the tuition paid to universities, and teaches 4-5 courses covering the basic knowledge required to work in industry. I would say a year’s worth of training (free or chargeable) is a win-win for both fresh graduates and companies. Now that would make universities wake up and stop being complacent about the business they are in. And it would make the process of hiring a lot easier, knowing that candidates have received industry-specific training, which could set the standard for educational and vocational institutions claiming to ready students for industry.