If We Were to Invent a Programming Language for the 21st Century

Original author: Oleksandr Kaleniuk
  • Translation
The author discusses the problems of modern programming languages and the ways in which we might correct their shortcomings.


In the last 18 years alone, people have come up with many new languages, of which the most popular are probably Swift, Kotlin and Go. At the same time, the distinctive feature of a 21st-century programming language is the absence of any distinctive features. The most pleasant thing about working with such a language is that you can spend a weekend learning it and at the end declare that you have mastered a popular novelty, without actually having learned anything new. There really is nothing new in them. All modern languages are built on some correct and proven formula, whose name is most likely Objective-C, Java or C.

This “lack of novelty” can be considered a valuable feature in itself, but the situation raises one question: are these really the languages of the new, 21st century, or are they just a reflection of the bad programming habits of the 20th?

If I were inventing a language, I would not try to fix the past; I would try to create something that works well under modern conditions but is also able to evolve and withstand the test of time. If that requires radical design decisions, then so be it.

Down with the syntax!


The syntax of modern languages reflects an attempt to squeeze the freedom of chalk and blackboard into the chains of ASCII. Some elements of the notation, such as arithmetic signs or parentheses, are perceived more or less naturally. But a number of other notations are justified by nothing except the effort saved in pressing teletype buttons.

Entering text from a keyboard is no longer difficult, and we are not obliged to put ourselves in a position where the meaning of the syntax has to be guessed. A piece like (($:@(<#[) , (=#[) , $:@(>#[)) ({~ ?@#)) ^: (1<#) is a very short and succinct recording format (this is, by the way, a real piece of code in a real language), but it does not improve readability in any way. And, more importantly, it is hard to google or to search for on Stack Overflow.

The same can be said about cryptic function names, numeric return codes and attributes with obscure meanings. They served well in the past, saving precious space on punch cards, but today it is time for them to take a well-deserved rest.

Something like

FILE * test_file = fopen("/tmp/test.txt", "w+");

must be transformed into

create file /tmp/test.txt for input and output as test_file

We do not need all these brackets, quotes, asterisks and semicolons (unless, of course, they genuinely make the idea clearer). Syntax highlighting is fully capable of replacing syntax itself.
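The idea can be approximated even in today's languages. Here is a hypothetical sketch in Python — `create_file` and its parameter names are my invention for illustration, not an existing API:

```python
def create_file(path, for_input=False, for_output=False):
    """Open a file using readable flags instead of a cryptic mode string.

    Hypothetical wrapper: the names here are invented for illustration.
    """
    if for_input and for_output:
        mode = "w+"          # create, then both read and write
    elif for_output:
        mode = "w"           # create for writing only
    else:
        mode = "r"           # existing file, read only
    return open(path, mode)

# Reads almost like the proposed sentence:
#   create file /tmp/test.txt for input and output as test_file
test_file = create_file("/tmp/test.txt", for_input=True, for_output=True)
test_file.write("hello")
test_file.seek(0)
print(test_file.read())      # hello
test_file.close()
```

The intent of `"w+"` is invisible at the call site; naming the flags makes the call self-describing at the cost of a few more keystrokes — exactly the trade the article argues for.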

Some things are abundant in the 21st century: parsing speed, computer memory, online search. Other resources are still expensive: development time, the programmer's memory, the effort spent learning the quirks of a language. Changes to the rules of writing code should spend more of the cheap resources in order to save the expensive ones.

Down with the built-in types!


You are probably familiar with the paradoxes of JavaScript. For example, this one:

> 10.8 / 100
0.10800000000000001

This result is typical not only for JavaScript, and it is not a paradox at all, but a sample of perfectly correct conformance to the universally respected IEEE 754. This implementation of floating-point numbers is found in almost every architecture. And it is not so bad, considering that we are trying to cram an infinite set of real numbers into 32, 64 or 256 bits.

What mathematicians consider impossible, engineers achieve by sacrificing common sense for the sake of practical implementation. Floating-point numbers in the IEEE interpretation are not really numbers at all. Mathematics requires addition to be associative; float and double do not always preserve this property. Mathematics requires the real numbers to include the integers; this does not hold even for a float and a uint32_t of the same size. Mathematics requires the real numbers to have a single zero element. Well, in this respect the IEEE standard exceeds all expectations, because floating-point numbers have two zeros instead of one.
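These violations are easy to reproduce. A minimal sketch in Python, whose float is an IEEE 754 double, so the same effects appear here as in JavaScript:

```python
import math

# Addition is not associative: grouping changes the result.
left = (0.1 + 0.2) + 0.3   # 0.6000000000000001
right = 0.1 + (0.2 + 0.3)  # 0.6
print(left == right)        # False

# The division from the article: 10.8 / 100 is not exactly 0.108.
print(10.8 / 100 == 0.108)  # False

# Two zeros: they compare equal, yet carry different signs.
print(0.0 == -0.0)                 # True
print(math.copysign(1.0, -0.0))    # -1.0
```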

Floating-point numbers are not alone in having such quirks; the built-in integers are implemented no better. Do you know what happens if you add these two 16-bit numbers?

0xFFFF + 0x0001

No one can give an exact answer. Intuition suggests that the overflow will produce 0x0000; however, that outcome is not dictated by any international standard, only by the conventions of C and the x86 processor family. The result could just as well be 0xFFFF, or an interrupt could be triggered, or a special overflow bit could be stored in some special place.

These situations are often not specified anywhere at all, and the rules for handling them differ from language to language. If the oddities of floating point are at least fixed by a standard, the outcome of integer overflow is, in principle, unpredictable.

Instead, for numerical calculations I would suggest introducing numeric types of a defined width, with a fixed point and with standardized behavior on loss of precision or when a result goes beyond the upper or lower bound. Something like this:

1.000 / 3.000 = 0.333
0001 + 9999 = overflowed 9999
0.001 / 2 = underflowed 0

It should not be necessary to write out all the trailing zeros: their presence should be implied by the definition of the type. But it is important to be able to choose the maximum and minimum bounds yourself, rather than depending on the processor architecture.
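Such a type can be sketched in a few lines of Python; the scale, bounds and names below are my illustrative assumptions, covering only non-negative values: numbers are stored as integers scaled by 1000, division truncates lost precision, and addition saturates instead of wrapping.

```python
SCALE = 1000            # three fixed decimal places
MAX = 9_999_999         # largest representable value: 9999.999

def fx(x):
    """Convert a Python number to the fixed-point representation."""
    return round(x * SCALE)

def fx_add(a, b):
    """Addition that saturates at MAX instead of wrapping around."""
    return min(a + b, MAX)

def fx_div(a, b):
    """Division that truncates lost precision (underflows to 0)."""
    return a * SCALE // b

def fx_str(v):
    """Render the scaled integer back as a decimal string."""
    return f"{v // SCALE}.{v % SCALE:03d}"

print(fx_str(fx_div(fx(1.0), fx(3.0))))   # 0.333
print(fx_str(fx_add(fx(1), fx(9999))))    # 9999.999 (saturated)
print(fx_str(fx_div(fx(0.001), fx(2))))   # 0.000 (underflowed)
```

A real type would also have to standardize rounding and signed behavior, but even this sketch shows the point: every loss of precision and every overflow is handled by an explicit, documented rule rather than by whatever the processor happens to do.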




Wouldn't such calculations be slower? Yes, they would. But ask yourself: how often do you program high-performance computations? I believe that unless you are a specialist in that area, very rarely. And if you do work on such tasks, you use specialized hardware and compilers for them anyway. As far as I can tell, the typical 21st-century programmer rarely solves differential equations.

However, nothing prevents keeping the fast, complicated and unpredictable built-in types of the past available as an alternative, just not as the default.

Down with the practice of metalanguages!


There are wonderful languages that were invented not to perform tasks, but to create languages that can perform them: Racket, REBOL and Forth, to name a few. I like them all, and playing with them is pure pleasure. But, as you have probably guessed, the pleasure of working with a language is not the main criterion that makes it universal and popular.

The ability to create new languages within a language is a very powerful tool that pays off in full in solo research work. Unfortunately, if the code must be understood by someone other than its author, those other people will have to be taught the new internal language in addition to the main one. And this is where the problems begin.

People want to get the task done, not to learn a language that will help do the job exactly once and never be useful again. For outsiders, mastering your language is an investment that will hardly pay off, while studying something standardized is an investment for life. They would sooner rewrite your code than learn it. This is how countless dialects for a single application area are born: people argue about aesthetics, ideology, architecture and other unimportant things, and millions of lines of code are written only to sink into oblivion within a few months.

The Lisp guys went through this in the 80s. They understood that the more applied language elements are standardized, the better. That is how Common Lisp was born.

And it was huge. The INCITS 226-1994 standard runs 1,153 pages. That record stood for 17 years, until C++ broke it with the ISO/IEC 14882:2011 standard (1,338 pages). But C++ carries very heavy legacy baggage and was not always that big; Common Lisp was created mostly from scratch.

A programming language does not have to be that huge. It just needs a good standard library filled with all sorts of useful things, so that people do not have to reinvent the wheel.

Of course, keeping the balance between size and usability is not easy; the experience of C++ has shown in practice just how difficult it is. I believe that to achieve the necessary balance, the language of the 21st century should be tailored to a specific application domain. Since most problems now arise in the field of business applications, the language should probably focus on business tasks rather than on game development or web design.

So...


The language of the 21st century should be business-oriented, read like plain language and not depend on built-in types. The great news is that such a language already exists! Can you guess which one?

Yes, this is COBOL.

It is one of the first high-level languages, today mostly forgotten. I must admit that I deliberately described the ancient features of COBOL as ultramodern and incredibly promising, and I did it to make one point: code is not written by language features. It is written by you.

It is naive to think that the language is responsible for the quality of the code, and that adding (or removing) a few gadgets can automatically improve everything. In their time, programmers disliked Fortran and COBOL, so they invented C++ and Java, only to arrive, 20 or 30 years later, at a situation where everything is disliked all over again.

My feeling is that the root of the problem lies somewhere in sociology and psychology, not in programming. Do we really dislike the languages? Or are we simply unhappy with the environment we work in? Windows is vulnerable, Visual Studio is too slow, and there is no way to exit Vim. It is these things that cause the discontent, not the creative process itself.

But a culprit must always be found. Being software engineers, and therefore partly responsible for how lousy software is, we are not going to blame ourselves, are we? So we look for flaws in the tools. We will keep inventing new COBOLs until the day the sun shines brighter, the birds sing louder, and Windows boots in two seconds.

But, most likely, this day will never come.

Therefore, if I wanted to invent a programming language of the 21st century, I would instead try to find a new approach to responsibility, or a new way to better master the tools we already have. I would pay more attention to the essential details and ruthlessly get rid of unnecessary complexity. Beyond the languages that come in and go out of fashion, there are fundamental things that always deserve rethinking.

