What language should be taught as the first language on a university computer science degree course?
A friend of mine asked me for my opinion on which programming language should be taught in the introductory computer science class at university.
I remember this debate from when I was at university 20 years ago, and it has no doubt been raging for centuries. (This may not be literally true, or rather, it is just not yet literally true at the time of writing...) This is what I reckon. What do you reckon?
It should teach important aspects of programming. They don't even all have to be good ones: perhaps the student concludes that static typing isn't good (e.g. if the student is a fool). I'm not the biggest fan of threads as a model of concurrency either. But the more things you're exposed to, the better: static typing, generics, threads, the actor model, recursion, iteration, object-oriented programming, functional programming, manual memory management, garbage collection, remote procedure calls, serialization, using version control, and so on. Perhaps no one language can teach all of these, in which case there need to be multiple languages :)
It should be free, so everyone can install it on their own computer. Telling students to "buy a Visual C++ license" discriminates against those who have no money.
It should be relatively easy to install. Have you installed LaTeX recently? Or Ghostscript? OK, these are not programming languages (that's a blatant lie; both are programming languages), but requiring people to mess around with incredibly complex installations is no good when they don't know much yet. Back in the 90s I had to alter source code just to get language implementations to run; requiring the student to already be able to program before they can learn to program is not a strategy deserving of an award.
You should be able to program interesting things with it. If all you do is implement a few algorithms, it's difficult to understand where they fit into programming unless you already have the experience. Perhaps JS is not bad from this perspective.
One thing one of our professors said was that they deliberately chose a language no student already knew. Some people were complete beginners, while others had been nerds from a young age; picking an unfamiliar language gave the students cohesion and made it possible to teach one class to people with such varied experience. I think that wasn't a bad idea. I felt confident (arrogant, really) with C, then was hit in the first week with functional programming, which I'd never seen before. It brought me back to reality; I realized: right, there is a lot for me to learn here.
Not too much "boilerplate", like Java's class and main method, or C's pile of #includes. The impression should be conveyed that the programmer is in control of the computer, not the other way around, and a huge amount of shite the beginner doesn't understand conveys the opposite. At some point the programmer will realize they are just a powerless slave to the machine, but that moment should be delayed for as long as possible. (Plus, beginners will copy/paste the boilerplate wrongly, nothing will work, and they won't know why.)
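To see the point about ceremony: printing one line in Java takes all of this, and every line except one is something the beginner is told to "just copy":

```java
public class Hello {                            // why a class? "just copy it"
    public static void main(String[] args) {    // why public? static? String[]?
        System.out.println("Hello, world");     // the only line they actually wanted
    }
}
```

Compare that with a language where the whole program is the one line you care about.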
No "magic" or "action at a distance" (or even "convention over configuration"), because when something doesn't work, there will appear to be no way to debug it. For example, in JUnit, name a method testXxx and it runs; name it TestXxx and it silently won't. It's hard enough for experienced people to get to the bottom of such things without resorting to Stack Overflow, asking mates, or reading the framework's source code, so it's going to be impossible for someone new to the field.
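The JUnit convention in question can be sketched in a few lines. This is a toy imitation of JUnit-3-style test discovery, not JUnit itself: reflection finds methods whose names start with lowercase "test" and runs them, and anything else is silently skipped, which is exactly the action at a distance a beginner can't debug.

```java
import java.lang.reflect.Method;

// Toy sketch of JUnit-3-style discovery (NOT real JUnit): only methods
// whose names start with lowercase "test" are found and invoked.
public class MagicRunner {
    public static class MyTests {
        public void testAddition()    { System.out.println("ran testAddition"); }
        public void TestSubtraction() { System.out.println("ran TestSubtraction"); } // silently never runs
    }

    public static int runTests(Class<?> c) {
        try {
            Object instance = c.getDeclaredConstructor().newInstance();
            int ran = 0;
            for (Method m : c.getMethods()) {
                if (m.getName().startsWith("test")) { // the "magic" naming convention
                    m.invoke(instance);
                    ran++;
                }
            }
            return ran;
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(runTests(MyTests.class) + " test(s) ran");
    }
}
```

Nothing tells the author of TestSubtraction that it was never executed; the test suite just quietly passes.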
I reckon these things are not important:
Being used in the real world, i.e. industry. Fashion changes fast, and anyway every job uses a different language (PHP, Java, JS, C#, ...), so you can't be an expert in them all. Besides, a computer science course should be about teaching principles, not just getting a job (even if universities sometimes seem not to understand that...)
Whether the language was created recently. That would be like saying the book 1984 was written in the past, so it teaches us nothing about today and we should ignore it.
OK, so what languages satisfy the above constraints? I think if I were teaching an introduction to programming, I would be happy to teach:
Forth - A stack-based language; nobody will have seen anything like it before. Few gotchas, and easy to start with. But it doesn't teach many useful concepts: the main concept it does teach, stack-based languages, is not a useful concept :)
Ada - Deliberately designed to have few gotchas (for use in safety critical systems), very strong type system, basically a "normal" language (good), but nobody will know it (good).
XSLT - OK, I dunno about this one, but it has few gotchas, and you could develop "real" programs in the form of reports generated from data. But it's too weird and doesn't teach that many concepts. And it might lead to a high student suicide rate.
Erlang - A completely weird model of threads (and I mean weird in a good way), and few people will know it.
Prolog - Logic programming is something most people won't have seen. Not many gotchas.
Go - At least you can program real stuff with it. Not many gotchas. Not too mainstream.
Assembler - Might be a radical choice, but it teaches fundamentals like memory layout and pointers; if you don't understand that stuff, you're never really going to deeply get Java etc.
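Forth's stack model, mentioned above, can be sketched in a few lines. This is an illustration in Java, not real Forth (which has a dictionary of words, a return stack, and much more): numbers are pushed onto a stack, and each word pops its operands and pushes its result.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Minimal sketch of Forth-style stack evaluation (illustration only).
public class ForthSketch {
    static long eval(String program) {
        Deque<Long> stack = new ArrayDeque<>();
        for (String word : program.trim().split("\\s+")) {
            switch (word) {
                case "+":   stack.push(stack.pop() + stack.pop()); break;
                case "*":   stack.push(stack.pop() * stack.pop()); break;
                case "dup": stack.push(stack.peek()); break; // Forth's DUP word
                default:    stack.push(Long.parseLong(word)); // a number: push it
            }
        }
        return stack.pop();
    }

    public static void main(String[] args) {
        // "3 4 + 2 *" is (3 + 4) * 2 in postfix notation
        System.out.println(eval("3 4 + 2 *")); // prints 14
    }
}
```

There is no operator precedence, no parentheses, and almost no syntax, which is exactly why a beginner can hold the whole evaluation model in their head.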
I used to help out at uni teaching non-computer-science students programming, and there was a student there who really didn't get variables or "for" loops. I tried really hard to explain them to her; maybe she got some way towards understanding (or maybe not). I don't reckon she could have handled Integer vs int, or a bunch of #includes just to get something written on the screen, or == between Integers working with low numbers only.
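That last gotcha is real, and worth spelling out: Java caches boxed Integer objects for values from -128 to 127 (by default), so == (which compares references) happens to "work" for small numbers and silently stops working for larger ones:

```java
// Java caches boxed Integers in [-128, 127] by default, so == (reference
// equality) gives the "right" answer only for small values; .equals()
// actually compares the numbers.
public class IntegerCacheDemo {
    public static void main(String[] args) {
        Integer a = 127, b = 127;
        Integer c = 128, d = 128;
        System.out.println(a == b);      // true:  both refer to the cached object
        System.out.println(c == d);      // false: two distinct boxed objects
        System.out.println(c.equals(d)); // true:  value comparison
    }
}
```

Imagine explaining to a student who is still struggling with "for" loops why 127 == 127 but 128 != 128.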
So maybe, really, Ada. It can interface with C code via GNAT (gcc's Ada front end), so you might even be able to develop useful programs with it. And it teaches many concepts, in a good way.
What do you reckon?