Blue. No! Yellow! - or - Do new programming languages speed up development?

Original author: Robert C. Martin (Uncle Bob)
  • Translation
What language was used to write the very first programs for the very first stored-program computers?

Binary machine language, of course.

Why?

Obviously, because there was no symbolic assembler yet. The first programs had to be written in binary code.

How much easier is it to write programs in assembler than in binary machine language?

A lot easier.

Can you give me a number? How many times easier?

Well, good grief, the assembler does all the hardest "routine" work for you. That is: it calculates all the physical addresses. It composes all the physical machine instructions. It makes it impossible to issue physically unrealizable instructions, such as addressing outside the address space. And then it produces easily loadable binary output.

The savings in effort are huge.

How much? Can I have a number?

OK. If I were writing a simple program - say, printing the squares of the first 25 integers - in assembler on an old machine like the PDP-8, it would take me about two hours. If I wrote the same program in binary machine language, it would take twice as long.

I say twice as long because I would first write the program in symbolic form on paper, and then assemble the machine code by hand, also on paper. After that I would have to key the binary into the computer by hand as well. All that extra work would take about as long as writing the program in the first place. Perhaps longer.
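For scale, here is what that same toy program looks like in a modern language - a Ruby sketch, effectively a one-liner, versus two hours of PDP-8 assembler or four of raw binary:

```ruby
# Print the squares of the first 25 integers -- the example program
# from the discussion above, in modern Ruby for comparison.
squares = (1..25).map { |i| i * i }
puts squares.join(" ")
```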

Fair enough. So a symbolic assembler cuts the work in half?

Actually, I think it saves much more. Squaring integers is a fairly simple program. The bigger the program, the harder it is to assemble and load by hand. I believe the savings actually grow with the size of the program. For large programs, the time savings are enormous.

Please explain.

Well, suppose you need to change one line of a program written in symbolic assembler. That takes me about 20 minutes on an old PDP-8 with paper tape. But with manual assembly I would then have to recompute all the addresses and reassemble all the machine instructions by hand. Depending on the size of the original program, that's hours gone. Keying it all back in would take no less time.

I could save some time by segmenting the program into modules loaded at fixed addresses with free space between them. I could save a bit more by writing a small program to help load the big one. Even so, the "routine" overhead would remain very, very high.

Fine. But still, can I have a number? On average, how much easier is using an assembler than writing a program in binary?

Okay. I suppose we can call it about 10 times.

In other words, a symbolic assembler lets one programmer do the work of ten programmers working in binary code?

Yes, that's probably close to the truth.

If a symbolic assembler cut the effort by about a factor of ten, how much did Fortran cut it?

A fair amount. Remember, we're talking about the '50s; Fortran was pretty sparse back then. In other words, it was little more than a symbolic assembler for the symbolic assembler - not sure if you see what I mean.

Does that mean it reduced the effort by another factor of ten?

Of course not! The "routine" overhead of a symbolic assembler just wasn't that high. I'd say Fortran reduced the effort comparatively little. Maybe about 30%.

In other words, 10 Fortran programmers can replace 13 assembler programmers?

If you want to look at it that way, then yes, it seems so.

Moving on: how much time does a language like C save compared to Fortran?

Well, C carries a bit less "routine" overhead than Fortran. In old Fortran you had to keep track of things like statement numbers and the ordering of COMMON statements. There were also GOTOs scattered in unbelievable numbers throughout the code. C is a much more comfortable language to program in than Fortran 1. I'd say it reduced the effort by about 20%.

Good. That is, 10 C programmers can replace 12 Fortran programmers?

Well, that's only a guess, of course, but I'd say it's a reasonable one.

Good. Now: how much did C++ reduce the effort relative to C?

Hold on, let's stop for a moment. We're overlooking a much bigger effect here.

Really? What exactly?

The development environment. I mean that in the '50s we used punch cards and paper tape. Compiling a simple program took at least half an hour - and that's if you could get time on the machine at all. But by the late '80s, when C++ became popular, programmers kept their programs on disk, and compiling a simple program took only two or three minutes.

Is that a reduction in effort? Or just a reduction in waiting time?

Ah. I see what you're asking. Yes, back then you spent a long time waiting on the machine.

A request: when you give your effort estimates, please leave out the waiting time. I'm only interested in the savings that come from the language itself.

Right, understood. So, you asked about C++. Honestly, I don't think C++ reduced the effort all that much. There was something, of course, but I'd put it at no more than 5%. The point is that the routine overhead in C was already small, so the relative savings from moving to C++ couldn't be large.

At 5%, that means 100 C++ programmers can replace 105 C programmers. Is that really so?

Broadly, yes. But only for small and medium-sized programs. For large programs, C++ brings some additional benefits.

What kind?

That's rather hard to explain. But the point is that the object-oriented features of C++, polymorphism in particular, made it possible to split large programs into independently developed and deployed modules. And for very large programs, that significantly reduces the routine overhead.
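That decoupling can be sketched in any object-oriented language; here is a minimal Ruby illustration (the class and method names are invented for the example). The `report` code depends only on the shape of the object it is handed, so new formatter modules can be developed and deployed without touching it:

```ruby
# Two interchangeable "modules": report() knows nothing about which
# concrete formatter it receives -- that is the polymorphic seam.
class TextFormatter
  def format(items)
    items.join(", ")
  end
end

class HtmlFormatter
  def format(items)
    "<ul>" + items.map { |i| "<li>#{i}</li>" }.join + "</ul>"
  end
end

def report(formatter, items)
  formatter.format(items) # dispatches to whatever was plugged in
end

puts report(TextFormatter.new, [1, 2, 3])
puts report(HtmlFormatter.new, [1, 2, 3])
```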

Can I have a number?

Well, you do seem intent on twisting my arm... Considering how many really large programs were written in the '80s and '90s, I'd say that overall C++ reduced the effort by maybe 7%.

That didn't sound very confident.

True. But let's use that value. 7%.

Good. So 100 C++ programmers can replace 107 C programmers?

That's what I seem to have said. Let's go with it.

How much time does Java save compared to C++?

Hard to say. It saves some. Java is a simpler language. It has automatic memory reclamation ("garbage collection"). It has no header files. It runs in a virtual machine. It has many virtues. And a few flaws.

What about the numbers?

I have a feeling we're on shaky ground here... But since you're pressing me: I'd say that, all else being equal (which it never is), working in Java gets you a 5% reduction in effort compared to C++.

So 100 Java programmers can replace 105 C++ programmers?

Yes! Although, no. That's not right. The variance is too large. If we randomly picked 100 Java programmers and compared them with 105 C++ programmers picked the same way, I wouldn't dare predict the outcome. To see a real effect, you need far more programmers.

How much more?

At least two orders of magnitude.

In other words, 10,000 randomly selected Java programmers can replace 10,500 similarly selected C++ programmers?

Perhaps so.

Very well. How much does a language like Ruby reduce the effort compared to Java?

Oh dear. (Sighs.) Where do I even start? Look, Ruby really is a lovely language. It's both simple and complex, elegant and quirky. It's much slower than Java, but computers are so cheap these days that...

Sorry, but that's not what I'm asking about.

You're right. I know. So, the main area where Ruby takes less effort than a language like Java is types. In Java you have to build a formal type structure and keep it consistent. In Ruby you can play fast and loose with types.

That sounds like a productivity gain.

Not quite. It turns out that the ability to play fast and loose with the type structure gives rise to a class of runtime errors that Java programmers simply don't encounter. So Ruby programmers carry a heavier testing and debugging load.
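A tiny Ruby illustration of that class of errors - the kind a Java compiler would have rejected before the program ever ran (the `double` function is invented for the example):

```ruby
# No declared parameter type, so every call "type checks" in Ruby.
def double(x)
  x * 2
end

puts double(21)    # integers work: 42
puts double("ab")  # strings also "work": string repetition gives "abab"

# The mistake surfaces only when this line actually executes:
begin
  double(nil)
rescue NoMethodError => e
  puts "runtime failure: #{e.class}"
end
```

In Java, the equivalent mistakes would be compile-time errors; in Ruby, only a test that exercises the bad call will catch them.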

In other words, do the two effects cancel out?

It depends on who you ask.

I'm asking you.

Okay. I'd say the effects don't fully cancel. Working in Ruby takes less effort than working in Java.

By how much? 20%?

People used to think so. Indeed, back in the '90s many believed Smalltalk programmers were several times more productive than C++ programmers.

You're confusing me. Why bring up those languages?

Because C++ is fairly close to Java, and Smalltalk to Ruby.

I see. So Ruby reduces the effort severalfold compared to Java?

No, most likely not. If you look back at the '90s, the waiting-time problem was still quite pronounced. Compiling a typical C++ program took several minutes. Compiling a Smalltalk program took practically zero time.

Zero?

Practically, yes. The thing is, with languages like Java and C++ you have to do a lot of work to get all the types to agree. With Smalltalk and Ruby there is no such problem. So in the '90s the difference was minutes versus milliseconds.

I see. But since all of that is just waiting time, we can leave it out.

Not quite. You see, when compile time is nearly zero, it gives rise to a different programming style and discipline. You can work in a very short cycle - seconds instead of minutes. That gives extremely fast feedback. With long compile times, fast feedback is impossible.

Does fast feedback reduce effort?

Yes, to a degree. When your cycles are extremely short, the "routine" overhead within each cycle is tiny. The mental bookkeeping you have to carry shrinks. Lengthening the cycles increases the "routine" overhead - and nonlinearly.

Nonlinearly?

Yes, the "routine" overhead grows out of proportion to the cycle time. It might grow as, say, O(N^2). I don't know. But I'm fairly sure the relationship is nonlinear.

Wonderful! So Ruby wins!

No. And that's exactly the point. Thanks to the hardware improvements of the last twenty years, Java compile time has dropped to nearly zero. A Java programmer's cycle time is no longer (or at least shouldn't be) any greater than a Ruby programmer's.

Please clarify.

I'm saying that programmers who follow the short-cycle discipline will see only a small difference in effort between Java and Ruby - if they see any at all. The difference will be so small it will be hard to measure.

An immeasurable difference?

I believe that to get a statistically reliable result on that difference, you would have to run experiments with thousands of programmers.

But you said earlier that Ruby reduces the effort compared to Java.

I think it does, but only when the cycle time is long. If the edit/compile/test cycle is kept very short, the effect becomes negligible.

Zero?

Of course not; more like 5%. But the variance will be gigantic.

So 10,500 short-cycle Java programmers do the same work as 10,000 short-cycle Ruby programmers?

Add another order of magnitude to the sample size, and I'd venture to agree.

Are there languages better than Ruby?

You might get another 5% from a language like Clojure, since it is, on the one hand, quite simple, and, on the other, functional.

You give a functional language only 5%?

No, I'm saying that the short-cycle discipline all but erases the productivity differences between modern languages.

If you work in short cycles, it hardly matters which modern language you use.

So: Swift? Dart? Go?

Irrelevant.

Scala? F #?

Irrelevant.

In other words, we've hit the ceiling. No future language will be better than what we have now.

Not quite. I'm only saying that we're on a curve of diminishing returns. No future language will give a 10x win, the way assembler did over binary code. No future language will cut the effort by 50%, or 20%, or even 10% compared to the languages we already have. The short-cycle discipline has shrunk the differences to the point where they are practically unmeasurable.

Then why do new languages keep appearing?

It's the search for the Holy Grail.

So, in the end, it's just a question of your favorite color.



Translator's note: the title of this post and its theme reference a scene from the film "Monty Python and the Holy Grail," in which the Knights of the Round Table must each answer three questions to cross the Bridge of Death.
