Alan Kay: how I would teach Computer Science 101

Original author: Quora
  • Translation
“One of the reasons to actually go to university is to go beyond simple vocational training and instead get hold of deeper ideas.”



Let's think a little about this question. A few years ago I was invited to lecture by Computer Science departments at a number of universities. Almost by accident, I asked my first audience, made up of undergraduates, graduate students, and professors, for their definition of Computer Science. All of them could offer only an engineering definition. I repeated the question at every new place, and everywhere got similar results.

Another question was: “Who is Douglas Engelbart?” Several people said: “Wasn't he somehow connected with the computer mouse?” (That really disappointed me, because my research community had put in a great deal of effort so that this question could be answered with two or three mouse clicks, and so that the answer would amount to far more than “Engelbart was somehow connected with a computer mouse”.)

Part of the problem was a lack of curiosity, part was the narrowness of personal goals that had nothing to do with learning, part was having no idea of what this science actually is, and so on.

For several years I have worked part-time in the Department of Computer Engineering at the University of California (formally I am a professor, but I do not have to attend department meetings). Periodically I teach classes, sometimes to freshmen. Over the years, the already low level of curiosity about Computer Science has declined noticeably further (while its popularity has grown, since computing is seen as the road to a well-paid job if you can program and hold a certificate from a top-10 school). Accordingly, not a single student has ever complained that C++ is the first language at the University of California!

It seems to me that we face a situation in which both “Computer” and “Science” have been hollowed out by weak popular notions in order to create a new term, a kind of designer label on jeans, that sounds good but is fairly empty. A related term that was destroyed in a similar way is “software engineering”, which likewise did not draw on the best ideas of “programming” and “engineering” but simply glued the words together (this was done deliberately in the sixties, when the term was coined).

One reason to actually go to university is to go beyond simple vocational training and instead get hold of deeper ideas. It seems quite reasonable to me for an introduction to the field to try, with examples wherever possible, to have students deal with real problems and begin to understand what is really interesting, important, and central in this area.

First graders are delighted when they are shown how one ruler placed on top of another becomes an adding machine with which they can beat the fifth graders at adding numbers with fractional parts. Then they will happily take part in designing improved adding machines. They have touched a real computer: a physical and mental tool that helps us think. And they have learned a genuinely effective way to represent numbers, more effective than the one taught in schools!

They were able to combine their solid idea of “adding” as “accumulation” with something that has powerful new properties. They programmed it so that it could solve a variety of problems.

They also extended it, and so on. This is not a digital computer, and it is not a stored-program computer. But it is the essence of a computer, in the same way that the Antikythera mechanism is the essence of a computer and of computing in general.
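To make the ruler-on-ruler idea concrete, here is a minimal sketch (my own illustration in Python, not something from the original answer; the function names are invented) of sliding one ruler along another, and of treating addition as accumulation:

```python
# A toy model of two rulers: slide the top ruler so its zero sits at position
# `offset`, then read the bottom ruler under the top ruler's mark `mark`.
# The reading is offset + mark, which is exactly what the physical adder does.
def slide(offset, mark):
    return offset + mark

# "Adding as accumulation": each result becomes the offset for the next slide,
# so a chain of slides sums a whole list of numbers, fractional parts included.
def accumulate(marks):
    total = 0.0
    for mark in marks:
        total = slide(total, mark)
    return total

print(accumulate([2.5, 3.25, 1.125]))  # 6.875
```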

[Image: the Antikythera mechanism]

How far can we go, and how much can be done, before everything gets out of hand and we lose ourselves in abstractions? I have always been partial to the characterization by Alan Perlis, the first Turing Award laureate and possibly the person who coined the term “Computer Science”, who said in the 1960s: “Computer Science is the science of processes.” All processes.

For the sake of Quora, let's not try to push this further or turn it into religious dogma. Let's simply use Al Perlis's idea to think better about our field, and especially about how to teach it. Now we need to look at the modern meaning of “science”, and Perlis was quite sure it should not be diluted with older meanings (such as “collecting knowledge”) or usages (such as “library science” or even “social science”). By “science” he meant understanding phenomena by building models/maps that try to depict, “track”, and predict those phenomena.


I have given several interviews about how the best maps and models can often fit on a T-shirt, the way Maxwell's equations do. The analogy is that there can be a “science of bridges”, even though most bridges are made by people. But as soon as a bridge is built, it exhibits phenomena that scientists can study; from bridges one can build models of many kinds and form comprehensive and useful “theories of bridges”. The fun part is that you can then design and build new bridges. (I have already mentioned that there is hardly anything more fun than scientists and engineers working together to solve big and important problems!)

Herbert Simon, winner of both the Turing Award and the Nobel Prize, called all of this “the sciences of the artificial” (and wrote an excellent book of the same name).

Let me give you an example. In the 1950s, companies and universities were building stored-program computers and starting to program them, and there was a special moment when Fortran appeared in 1956. It was not the first high-level language, but it was perhaps the first one done well enough that it came to be used in many different areas, including many that had previously been handled only in machine language.

All this gave rise to "phenomena."

[Image: John McCarthy]

Lisp's history is more complicated, but John McCarthy became interested in finding a “mathematical theory of computation” and was determined to make everything fit together cleanly. The eval function that interprets Lisp easily fits on a T-shirt! Compared with a whole “programming system”, it is tiny. More importantly, this “theory of computation” carried more powerful ideas than Fortran did. It was a better bridge idea!
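To give a feel for how small such an evaluator can be, here is a rough sketch in Python (the article itself contains no code) of an eval for a tiny Lisp-like subset; the supported forms and names are my own simplifications, not McCarthy's original definition:

```python
import operator

def standard_env():
    # A handful of primitive procedures, just enough for small examples.
    return {'+': operator.add, '-': operator.sub,
            '*': operator.mul, '<': operator.lt}

def eval_expr(x, env):
    """Evaluate a tiny Lisp-like expression written as nested Python lists."""
    if isinstance(x, str):                 # symbol: look it up in the environment
        return env[x]
    if not isinstance(x, list):            # number: self-evaluating
        return x
    op, *args = x
    if op == 'quote':                      # (quote exp) -> exp, unevaluated
        return args[0]
    if op == 'if':                         # (if test then else)
        test, then, alt = args
        return eval_expr(then if eval_expr(test, env) else alt, env)
    if op == 'lambda':                     # (lambda (params) body) -> closure
        params, body = args
        return lambda *vals: eval_expr(body, {**env, **dict(zip(params, vals))})
    proc = eval_expr(op, env)              # otherwise: apply a procedure
    return proc(*[eval_expr(a, env) for a in args])

# ((lambda (n) (* n n)) 7)  =>  49
print(eval_expr([['lambda', ['n'], ['*', 'n', 'n']], 7], standard_env()))
```

The point is not the particular forms chosen, but that the whole interpreter is a page of readable definitions rather than a monolithic “programming system”.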

Lisp's miniaturization allows the whole idea of programming to be grasped almost at a glance, and at a deeper level, in a way that seems simply impossible when you are staring at huge artifacts (this is one of the reasons scientists like their mathematics compact and powerful). The mathematics used here is a new mathematics, because it admits concepts such as “before” and “after”, and this leads to a “logic of variables” that lets you keep both functional dependence and a logical chain of reasoning while still allowing for state and the passage of time. (This is still not well understood today in the rough-and-tumble world of everyday programming.)

Lisp, acting both as a powerful programming language and as a metalanguage able to express its own theory, is an example of real computer science. If you learn it and other things like it, you can think more deeply and be more in charge of your own destiny than if you had simply learned to program in Fortran or its modern equivalents (...and you can still get close to the programmers!).

You will also learn much more about the special kinds of design that computing needs (for example, it is usually under-appreciated that computing often requires going beyond the existing computing environment: one of the special characteristics of stored-program computing is that program text is not just material for a program, but the stuff of a brand new computer).

Another reason for choosing Perlis's definition is that, on the whole, computing has much more to do with creating many kinds of systems than with algorithms, “data structures”, or even programming as such. For example, a computer is a system, computing is a system, a local area network and the Internet are systems, and most programs ought to be better systems than they are (the old style of programming has persisted from the 1950s to the point where it now seems that this is simply what programming must look like; nothing could be further from the truth).

The Internet is a good example: unlike most programs today, the Internet does not have to be stopped in order to repair or improve something. It is, by our own intention, more like a biological system than like what most people think of as a computer system. And it is far more scalable and reliable than almost every software system in existence today. This is really worth thinking about before teaching beginners much weaker concepts!

So, what we need to do in the first year of Computer Science is take into account what students can actually do at the very beginning, and then try to stay within their “cognitive load” while helping them get to what really matters. It is very important to “stay real” and to find approaches that are intellectually honest and suitable for those who are just starting to learn. (Please don't teach bad ideas just because they seem a little simpler; many bad ideas really are simpler!)

Students should start by building something that has the many important characteristics discussed here. It should be a system of several dynamically interacting parts, and so on. A good way to decide which programming language to use is simply to check whether it can handle something with thousands of interacting parts! If it cannot, you should find one that can. The worst thing you can do is put students on a path of fluency in something too weak, because it will badly limit their ideas about scale. That simply kills them, and we want to grow them, not kill them.
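As a rough illustration of what “thousands of dynamically interacting parts” can look like even in a beginner-sized program, here is a hedged sketch in Python of many tiny agents passing messages to random neighbours each step; the agent model and all names are invented for this example, not a prescription from the original answer:

```python
import random

class Agent:
    """A tiny independent part: it holds a value and reacts to messages."""
    def __init__(self, ident):
        self.ident = ident
        self.value = random.random()
        self.inbox = []

    def send(self, other):
        # Tell a neighbour about our current value.
        other.inbox.append(self.value)

    def step(self):
        # React to everything received since the last step: drift toward the average.
        if self.inbox:
            self.value += 0.1 * (sum(self.inbox) / len(self.inbox) - self.value)
            self.inbox.clear()

# A "system" of a couple of thousand interacting parts.
agents = [Agent(i) for i in range(2000)]
for tick in range(50):
    for a in agents:
        a.send(random.choice(agents))      # each part talks to a random other part
    for a in agents:
        a.step()                           # then each part updates itself

# The spread of values shrinks over time as the parts influence one another.
print(min(a.value for a in agents), max(a.value for a in agents))
```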
