Missing a piece of the Java jigsaw – Are students learning the right languages at university?
Interesting insights have arisen off the back of the 2018 HackerRank Student Developer Report, which surveys the youngest segment of the developer community.
At the forefront of what has been observed is the fact that self-teaching remains deeply entrenched in any developer's DNA. Whilst many computer science students are expanding their knowledge base at university, more than 50% of them say that a portion of their understanding comes from teaching themselves, and nearly a third are entirely self-taught.
Such information may imply that technology is evolving at a greater pace than the computer science courses designed to teach it. YouTube is seen as a prominent resource, and students utilise it more than they tap into the minds of professionals.
Such a conundrum further pushes students to teach themselves what they need to know to find employment. Students are schooled in alternative languages, such as Python, C++ and C#, but these are not as sought after and sit further down an employer's wish list.
Continuing with what is not taught on the curriculum: frameworks. In general, it is felt that these are something you learn on the job. Node.js, AngularJS and React are the top three, and they show a massive gap between what students know and what employers need them to know.
In conclusion, the way that students learn is multi-faceted. The classroom is clearly delivering skills and upskilling people, often putting them into the right mindset, but curriculums seem slow to react. Graduates are therefore also teaching themselves and gaining on-the-job experience. It is an ever-evolving pursuit of upskilling, one that will continue for the entirety of their careers. Ironically, this skill of learning for oneself is perhaps the most important one taught.