Two weeks before the start of a new semester seems like a good point to reflect on some academic choices I have made, and to try to unpack and understand my own decisions.
The background - I entered university with a double degree in Computer Science and Business, without being too clear on the rationale for this choice. Having been undecided on what course to pursue until rather late, in retrospect taking the double degree seems like hedging my bets and keeping my options as open as possible. My rationale, at that point, was to avoid being classed as a stereotypical code monkey, and perhaps to stand out from the legions of Computer Science graduates entering the workforce. Being in India, where Computer Science seemed like the undergraduate degree of choice for nearly a majority of my peers, made me acutely aware of the need to stand out from the competition.
Early in my CS education, I discovered the vast amount of knowledge - both technical and cultural - associated with the field. I quickly realised that programming, even with data structures and algorithms, was but the proverbial tip of the iceberg. There are various technology stacks - mobile (Android, iOS, Windows Phone), web (LAMP, MEAN, Backbone, Ruby on Rails …), different programming paradigms (procedural, functional, OOP), different database systems (MySQL, MongoDB …), and network protocols. I was introduced to the basics of operating systems, to security issues, and to the principles of software design. Beyond this technical knowledge, there was the associated culture to dive into. I was introduced to Hacker News (the procrastination site that makes you feel productive), to xkcd, to the debates on editors, tabs vs spaces, indentation, open source culture, and the many other topics on which every programmer is expected to have an informed opinion. And I loved it all.
I had not exactly been a tech Luddite before university, but the vast majority of this knowledge was completely new to me. Learning anything new was exciting and only piqued my curiosity further. As such, it frustrated me that I did not seem to have enough time, energy, or motivation to devote to learning more about many of these topics. I believed (and still do) that too much of my understanding is basic or superficial, especially compared to that of my peers, many of whom have been programming for years. In this scenario, my business degree became a convenient excuse and scapegoat for my laziness in exploring more of computing. My default attitude towards my business degree was one of regret: I believed it was holding me back from fully exploring computing, and that I was only keeping it because of the academic complications that dropping it would cause.
It did not help that many of my peers in Computing (and, sometimes, I myself) looked down on Business school with an air of intellectual arrogance. Here we were, making things that were going to change the world and make it a better place, while the pretentious folks over at Business focused on polish and fluff. It is not even an exclusively university phenomenon: tech culture at large looks down on management and bureaucracy, especially of the non-programming kind. In the words of the guy from HBO's Silicon Valley, the attitude seems to be that 'Jobs was a poseur, he didn't even code'.
However, I have come to realize that the dichotomy between the fields is not very sharp - there are important technical skills to be learnt from Business, and industry-relevant soft skills in Computing. Skills like reading a balance sheet or running regression and correlation analysis in Excel are immensely applicable and useful. Further, dismissing the 'softer' areas of study like Management and Organization or Marketing as common sense dressed up in fancy frameworks, I believe, misses the point. After all, programming can be described as logic dressed up in syntax, but that does not mean there is no learning curve. In both cases, the establishment of a common language (frameworks and theories in the former, programming languages in the latter) provides a way to organize simple ideas together to achieve results. Conversely, 'non-technical' skills and processes like documentation, design patterns and principles, and architecture design are highly important for working productively on any software project of some magnitude. As such, I find it highly blinkered when friends of mine acknowledge the importance of design and documentation in code, and in the same breath disparage business for not being 'real work'. Indeed, the opposite attitude - business students dismissing computing students as dweeby nerds - seems to be much less prevalent, thanks in part to pop culture embracing and assimilating nerd culture and in part, I guess, to Singapore being highly technophiliac.
A brilliant programmer who refuses to understand or appreciate business considerations and attitudes is as handicapped as a management suit who forces unrealistic expectations and pressures on engineers. A big reason I am sticking with my Business degree is the hope that it provides me the impetus to rise above these stereotypes, and helps me better know the world that I am supposed to change and disrupt with technology.

tags: university - education - business - computing