cross-posted from: https://lemmy.ml/post/116237
Gerald Jay Sussman on Flexible Systems, The Power of Generic Operations
Dr. Sussman’s answer to “why did MIT switch from Scheme to Python in its curriculum”, in a nutshell, is that MIT was originally interested in teaching how the computer works, from the high-level programming API down to the circuit level.
But at some point the world changed, and he noticed his students were spending most of their time reading manual pages. Computers had become so much more complex that there was no practical benefit to learning them at a lower level; everything was done at a higher level of abstraction. Even computer chips were so huge and complex that no single person knew how they worked, and the best you could hope for was individuals specializing in parts of the circuitry. Computer engineering became more of a “science”, in that you learned a system’s nature through experimentation rather than by analyzing it down to its fundamental components.
So they decided to teach programming at a higher level, and eventually, after 10 years, they decided to use Python so that they could introduce people to the high-level scientific concepts they would need to become expert computer scientists.
There is a practical benefit to learning how computers work at a lower level, because the computers aren’t going to just build themselves. Yet.
I agree there is a benefit, but it depends pretty heavily on what your career goals are.
I think learning the lower-level details of how computers work is more of a specialization nowadays – people who specialize in digital circuit design would want to learn about this, and could get a career in designing FPGAs and ASICs. A layer of abstraction above that, you have people who specialize in operating systems, who could get a job in kernel development. Above that, you have servers and databases; above that, front-end app development; and above that, data science and artificial intelligence.
Each of these layers has lots of domain-specific knowledge. But the purpose of an undergraduate course is to prepare students for choosing a specialization, not to teach them all the domain-specific knowledge they would need for every possible career path; there isn’t enough time for that.
Learning how computers work in university was invaluable and helped me make my career choices, especially in regards to embedded systems engineering, which you seem to have forgotten about.
Not everyone in every college program takes every single course that is offered. There’s plenty of material, and surely enough interest, for at least one embedded course that would cover the basics.
Heck, all you need is one tight real-time requirement and suddenly how the computer works is important.
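To make that concrete, here’s a toy sketch (my own example, with an assumed 1 ms deadline, not anything from the talk): try to hit a periodic deadline from ordinary user-space Python and measure the jitter. Where the overshoot comes from – scheduler preemption, caches, garbage collection – is exactly the “how the computer works” knowledge in question.

```python
import time

# Toy sketch: attempt a 1 ms periodic loop and record the worst
# overshoot past each deadline. The 1 ms figure is an assumed
# requirement, chosen purely for illustration.
PERIOD = 0.001
deadline = time.perf_counter()
worst = 0.0
for _ in range(1000):
    deadline += PERIOD
    # Busy-wait: time.sleep() granularity is far too coarse for 1 ms.
    while time.perf_counter() < deadline:
        pass
    worst = max(worst, time.perf_counter() - deadline)

print(f"worst overshoot: {worst * 1e6:.1f} microseconds")
```

On a general-purpose OS that worst case can spike unpredictably whenever the loop gets preempted, which is why hard real-time work pushes you toward RTOSes, interrupts, and dedicated hardware.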
For many modern systems, the solution to that is to have a whole separate CPU (generally just a basic coprocessor).
That’s not computer science. That’s electrical engineering (or, in some schools, computer engineering).
Computer engineering covers the intersection of EE and CS, but bootloaders and device drivers are still within the realm of software engineering.
Software engineering isn’t typically offered as a degree, and where firmware falls depends on the school/region of the world you live in… But computer science is categorically not software engineering. People often incorrectly conflate the two, but they are distinct.
Does that mean they considered Lisp to be low level?
“Does that mean they considered Lisp to be low level?”
I started writing a reply to you and realized you bring up a good point which I can’t really answer.
I am pretty sure Sussman maintains that Scheme (and Lisp) are “high-level” languages, but Scheme is a fairly pure implementation of the lambda calculus with a few practical features added to it. And to me, understanding the lambda calculus is kind of a bare minimum for understanding the fundamental mathematical theory of computing. So putting it that way, getting rid of Scheme from the computer science curriculum seems to me kind of like getting rid of integral calculus from the engineering curriculum.
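To illustrate what “a fairly pure implementation of the lambda calculus” means in practice, here’s a toy sketch (my own, not anything from the MIT course) of Church numerals – the lambda-calculus encoding of natural numbers – written in Python, since that’s where the curriculum ended up. Note that everything below is built from nothing but single-argument functions:

```python
# Church numerals: the number n is "apply f to x, n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting function applications."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # prints 5
```

The point isn’t that anyone computes this way; it’s that arithmetic falls out of function abstraction and application alone, which is exactly the theory-of-computing foundation being argued about here.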
Maybe to give Sussman and the MIT staff the benefit of the doubt here, it doesn’t make sense for people going into AI to learn about lambda calculus when statistics is far more important for that field. Maybe they should separate AI and computer science entirely and go back to teaching computer science people about computer science, and teach AI people about statistics? Maybe AI should be considered a field of statistics and not computer science?