Up to now, I’ve experienced two internships in industry (at Runtime Verification and Galois) and several academic (student) research positions (at Reed College, Carnegie Mellon University, and the University of Maryland, where I am currently a PhD student). I also commonly read the formal and informal writing of many academic and industry computer scientists.
Firstly, since computer science is a relatively new field, it has not been cleanly separated into distinct disciplines. Computer science, as the term is commonly used, includes a huge variety of entirely separate subjects:
- Software engineering
- Cryptography (basically math)
- Machine learning
- Type Theory (math but different)
You could argue that they all share a basis of making computers do things with algorithms, but this is analogous to arguing that philosophy and chemistry might as well be combined into one field because they both share a basis of describing the principles of observable phenomena.
My point is that computer science is an extremely broad field, and due to this it is difficult to develop useful generalizations about the entire field.
That being said, here are some generalizations about the situations of academic and industrial computer science. They are just my observations, drawn from my own experiences and from the arguments I’ve generally heard from others, not deeply researched conclusions about specific subfields.
In fields like chemistry and physics, in which I have no special knowledge, there are clear distinctions between the academic trajectory and the industrial trajectory for a rising student/worker. In chemistry you either prepare to be a chemical engineer, or you prepare for chemistry research. In physics you either prepare to be some sort of engineer or you become a theoretical physicist. I think there are similar dynamics in other scientific fields, such as biology, psychology, etc.
I don’t see this dynamic strongly in computer science; there is much less distinction between the research-type work being done in academia and industry. Academic computer scientists are still usually hardcore programmers, and could easily pick up a software engineering job at any time. In fact, academic research often involves programming projects. Large software companies such as Google, Facebook, Amazon, Apple, etc. employ many researchers from advanced positions in academia, and this is not merely a degree requirement, as in medicine, because these researchers continue doing similar research in groups like Google’s AI research groups or Amazon’s automated reasoning research groups.
However, there are still serious differences between the environments of academia and industry, even in computer science.
In industry, goals are important and specific. An aphorism I heard at Galois captures this:
> Industry contracts have hard requirements, but negotiable deadlines. Government contracts have hard deadlines, but negotiable requirements.
I’d say that this dynamic is even more pronounced between industry and academia, where academia works like the government but with even more layers removed from requirements evaluation (such as evaluation by the military, the source of most of the government contracts taken by Galois). The obvious reason is that the projects taken on in industry almost always have a short-to-medium-term product in mind. The product is usually client-facing, but could also be for internal use. In any case, the application for the project is almost always in mind when the project is started, even if the ultimate revenue source is further downstream. This sets up projects for incremental evaluation against specific goals, by people external to the project, and against similar projects. The largest industrial research groups may have long-term plans to apply research that are less specific, but on average are still much more specific than the average academic grant application. Industrial computer scientists are primarily evaluated on the usefulness of applications of their research/work, and on their ability to meet predefined goals. Industrial researchers get hired to work on specific research that the company already planned on undertaking.
In academia, goals are nebulous and based more on community interest than practicality. Funding is dictated by (usually government-funded) grants, which are awarded based on the community prestige of the applicants, diversity initiatives, and the writing quality of the grant application. Grant applications are best when they promise as much as possible within believability, addressing every aspect of the grant description. If the grant is received, then there is very little evaluation of how closely the funded research matches the grant application, other than how many publications resulted that at least fall within the description. Academic researchers are primarily evaluated on the quantity of their publications, weighted by the prestige of the journals they were published in. Academic researchers are primarily intrinsically motivated, by their interests and their own views on practicality, and very rarely by applications of their work. The significance of an academic’s work is almost entirely dependent upon how interesting other researchers in their microfield deem it. And these are very small communities — there aren’t very many academics in total, and most long-term academics are extremely specialized. I think this is because long-term academics eventually get tenure (otherwise they would probably leave academia), which guarantees them a job, so they never face a forced transition that would push them to acquire research knowledge in other subfields. Moreover, depth of expertise is much more rewarded than breadth, because research is always focused on the very edge of current knowledge, which tends to require specialization in that direction.
So here’s a question: should the government focus more on funding computer science research through academia (as it does currently) or through industry (which it does to some extent, but less so than through academia)?
What works great in industry, as I’ve described, is when there are specific goals. Academic researchers transition to industry exactly when companies find reasons for that research to be done for use in some application they can profit from. Sometimes, this will overlap with something that’s already being done in academia. A primary example of this is machine learning, which originated in academia but has since found huge support in industry, as it has become very profitable for large companies like Google to develop. The benefit to society of industrial research is that the company creates value, for which it is partially compensated in the form of profit, while the rest is accessible to society. Sure, much research is proprietary, but it’s turned out in the realm of modern computer science that research developments are often more useful to the companies themselves when open-sourced, for many reasons. This often results in companies publishing alongside academics in journals. What the company keeps proprietary is computer capital (computing power) and human capital (skilled programmers and researchers).
But, as I described, there’s a lot of research in academia that has very general goals and nebulous evaluation. This kind of primarily-academic research does not have a product in mind. Of what value is it then? In the ideal case, this can be the sort of basic science research that benefits society in broad ways whose value is hard for any single company to capture. As an example I’m familiar with, the development of type theory was initiated as a mathematical exercise to address Russell’s paradox and to encode mathematical propositions in machine-checkable form. A half-century along, these abstract foundations have found their way into many industrial-strength programming languages, including Scala, TypeScript, Rust, and (if you’re generous) Haskell and OCaml. An extremely general argument for basic science research is that it pays the activation cost for unpredictable projects in the future that can’t be reliably pursued with profit in mind. A small start-up that takes advantage of innovative new ideas would be much less feasible if it first had to do all the basic research required to get to the point of actually making something useful to profit from. I admit that it’s very difficult to support this argument, in a similar way to how it’s very difficult to support funding pure mathematics research. Sure, some of it will eventually be used to create value, but it really seems like a huge amount of it is just academic masturbation with no conceivable practical use ever.
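To make the “propositions in machine-checkable form” idea concrete, here’s a minimal sketch in Lean (one modern descendant of that line of work); the theorem name is my own invention, chosen just for illustration:

```lean
-- Propositions-as-types: a proposition is a type, and a proof is a
-- program of that type. Here, a proof of "p → (q → p)" is literally
-- a function that takes a proof of p, ignores a proof of q, and
-- returns the proof of p. The compiler machine-checks it.
theorem const_imp (p q : Prop) : p → q → p :=
  fun hp _ => hp
```

The same correspondence is what lets languages like Haskell and OCaml treat rich type systems as lightweight proof systems.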
In short, research that has a specific value-adding goal in mind is definitely best done in industry, and basic science research is probably best done in academia. But there are a lot of serious problems with academia, and I don’t see any good practical solutions. It seems to be an inherent part of the academic style of research, where goals and justifications are so nebulous, that the primary motivations for research become prestige within a micro-community, publishing pressure, and securing tenure, so that researchers no longer have to worry about being evaluated and can just focus on what interests them. I think this style certainly does produce basic science research, but it’s so obviously inefficient and, from the inside, feels really toxic to me. It almost feels corrupt, especially since everything is funded almost entirely by the public. Academics are so far removed from their source of sustenance, and also so far removed from those who would ever want to use their work to make something that actually gets evaluated outside the community — it’s extremely insular. In this way, academia exists as a sort of parallel world divorced from the economic and social realities of the non-academic world. Some people love this, and I can empathize with the view. That’s what interested me in theoretical computer science and going into a PhD program — the focus on researching interesting ideas above all else and not worrying about the often frustratingly manual process of turning a good idea into a good product. But, since becoming a PhD student, I quickly came to realize that this sort of dynamic doesn’t result in the kind of environment that I want to work in as a career. I haven’t planned to drop out yet, but I easily would if I found a reasonable job opportunity where I could do interesting work in industry.