Roman Suzi
Apr 3, 2021


(joke) First of all, university may already be too late to learn programming; it should have happened earlier. (/joke) University years are probably better suited for networking and getting used to scientific ways of thinking than for learning software development. It just happens that the formal side is still important for HR, but on the other hand it's much easier for HR to check an educational background than to assess how good at programming you really are. Everyone is a fast learner and a team player according to resumes.

I'd add a number one hint from a recruiter's perspective: make your contribution to Open Source and show your portfolio.

One comment about microservices and skills becoming irrelevant in 2 years: at the deeper level there really are not that many changes. From the past couple of decades I can mention only a few: the real need for concurrent programming, the machine learning and AI breakthrough, and probably functional programming on the rise (I do not yet count the low-code hype here). Everything else is just programming languages, frameworks, and platforms coming and going. Most of the theoretical foundations of today's mainstream tech, presumably studied by CS majors, are from the 20th century (it takes some good ideas decades to reach the mainstream).

For self-learners it still makes sense to take some crash courses on what computer science is about. Then maybe the inferiority complex will disappear for good.
