The evolution of altruism and P2P

I recently listened to a radio broadcast about the evolution of altruism. It discussed how the “altruism gene” withstands natural selection, and this led me to think about what features modern network applications need in order to “survive” natural selection themselves.

At first glance it seems that an individual who sacrifices itself for another leaves no offspring, and therefore its altruistic genes should not be preserved.

Even the assumption that group selection works (self-sacrifice for the sake of the group improves the group's survival) does not resolve the problem, because group selection requires very special conditions, and there is competition within the group as well.

How does the altruism gene survive?


A positive factor for the survival of the altruism gene is self-sacrifice in favor of a carrier of the same or a closely related gene.
As a result, although individual carriers die, the altruism gene itself survives, and even outcompetes the selfish one, because more individuals work in a coordinated manner.

The pinnacle of altruism between unicellular organisms is the multicellular organism, in which all cells share a single genetic code, only the germ cells reproduce, and the rest ensure their survival. Intercellular altruism goes so far that routine cell death (white blood cells, cells of the gastrointestinal tract) occurs regularly.
Further development along this path: bees, for example, are multicellular organisms with the same genetic code, ready to sacrifice themselves for the sake of the colony.


The genetic code can be understood as a recorded program of action for its carrier.
If, in a certain situation, an organism sacrifices itself for a relative because of genetic programming, and the relative would do the same, this can be thought of as a network of nodes running the same installed software and implementing a common interaction protocol, of which self-sacrifice is a special case.

It turns out that if the code includes recognition of kin and special interaction with them, then all the kin together form a higher-order system, which dramatically increases survival efficiency and promotes reproduction.

On the other hand (as mentioned in the same broadcast), if a multicellular organism is assembled from cells with different genetic codes, the result is “not great.”

There are examples of unicellular organisms of the same species but with different genetic codes that, in hard times, combine into a single fruiting body. Those that form the stalk of this body do not reproduce.
This creates negative selection against the altruists: it does not pay to end up in the stalk, even though the stalk is vital for the survival of the fruiting body as a whole.

Through natural selection, “cheater” strategies appear by which cells end up only in the cap. As a result, as I understand it, this mode of survival, the fruiting body, has to be “invented” anew each time after the stalk-formers die out. That is why multicellular organisms with a complex structure cannot arise from cells with different genetic codes.


Society


A similar approach can be applied to human society.

First of all, the clan structure of society immediately becomes clear.
It also becomes clear why a complex, efficiently organized society is impossible to build.
There are clans at the top and clans at the bottom, and the upper ones constantly try to split the lower ones into separate individuals and manage that population in various ways, so that no competitors arise.
Acting in the interests of all mankind (ending up in the “stalk”) leads to extinction; acting only in the interests of one's own clan, at best, does not develop the system as a whole.
The “fruiting body” falls apart.

From people “in their pure form” it is impossible to build a sufficiently large and complex system whose effectiveness would be dramatically enhanced by cooperation.

If we consider not only genetic but also other kinds of information carried by humans, the situation looks slightly better.

Shared acquired habits, a common upbringing, an idea that captures the mind: all of these act just like shared DNA.

However, a person is unreliable as a carrier in this sense.
First, ideas have to be “installed” into consciousness slowly, through upbringing and education, individually each time and with no guarantee of the result.
Second, a person can forget.
Third, the underlying genetic altruism programs operate only within the family circle (which is very small), while programs of competition operate toward everyone else.
Fourth, the resources of human consciousness are very limited.

It turns out that some structure of society and division of labor do emerge, but their complexity and effectiveness cannot develop far enough.

Throughout its history, mankind has tried to develop a code that everyone would execute. Laws were created and their enforcement was monitored; training and propaganda were conducted. The code was written on media external to the person (books), making it more stable and common to everyone. Where this worked, the effectiveness of people's joint work increased. But the code was still executed by people, with all the side effects described above.

Solution


Over the past decades, another place has appeared where such a code can be written: the personal computer. And not just written, but also executed autonomously, as with DNA.

Compared to using human consciousness as a carrier, this approach has overwhelming advantages.
First, the application installs quickly.
Second, a one-hundred-percent match of the code across all users is easily achievable.
Third, everyone with the same code installed is recognized as “one of us,” regardless of their number, so there is no limit on the number of users.
Fourth, the code and the information it handles can be far more complex.
Fifth, performance is orders of magnitude higher.

A serious increase in efficiency is possible because the actions of other nodes are predictably useful for the common cause. Here “altruism” means providing one's resources and acting in accordance with previously concluded agreements: protocols implemented in code and aimed at mutual support.

A hypothesis suggests itself: to increase efficiency in any field, network software should be developed for it on the principle of universal p2p cooperation.

To clarify: it is not the actual executable code of the programs that must be identical, but the interaction protocols they support.
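This distinction can be illustrated with a toy sketch: two independently written node implementations remain “kin” as long as they speak the same message protocol. The protocol format, version number, and function names here are invented purely for illustration.

```python
import json

# A minimal invented "protocol": every message is JSON with a version,
# a type, and a payload. Only this wire format must be shared.
PROTOCOL_VERSION = 1

def make_request(payload):
    return json.dumps({"v": PROTOCOL_VERSION, "type": "echo", "payload": payload})

# Implementation A of a node.
def node_impl_a(raw):
    msg = json.loads(raw)
    assert msg["v"] == PROTOCOL_VERSION
    return json.dumps({"v": PROTOCOL_VERSION, "type": "echo_ok", "payload": msg["payload"]})

# Implementation B: different internal code, same wire behavior.
def node_impl_b(raw):
    parsed = json.loads(raw)
    if parsed.get("v") != PROTOCOL_VERSION:
        raise ValueError("incompatible protocol version")
    reply = {"v": PROTOCOL_VERSION, "type": "echo_ok", "payload": parsed["payload"]}
    return json.dumps(reply)

# Both implementations interoperate: any node can talk to any other.
req = make_request("hello")
assert json.loads(node_impl_a(req)) == json.loads(node_impl_b(req))
```

Because both implementations accept and emit the same messages, the network treats them as identical nodes even though their executable code differs.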

Software system survival


Let's try to imagine what features a software system must have in order to survive.

The survival and reproduction of a system depend on how well it is adapted to its environment. That environment is people: computer users.
They decide whether a copy appears on, or is removed from, their computer based on how they assess the effectiveness of the system's functionality, its ease of use, and their ability to control how the program operates.

If a program is organized on the p2p principle with unified protocols, its gain in efficiency over “ordinary” competing programs will be very significant, so using the new software will be genuinely profitable.

Bitcoin initially had a single program installed on all nodes, working both as a client and as a miner. Only later, when the network grew, did “cell differentiation” take place: clients and miners effectively became different types of nodes, although they still carry the complete “genetic code.”

On the other hand, attempts to create mass p2p systems in which a person must constantly make decisions himself (assign ratings, allocate resources, evaluate results, tune settings, and so on) are, in my opinion, doomed to failure.
If the course of action is not baked into a single protocol common to all, but delegated to a person, there will be no corresponding increase in efficiency (human shortcomings are described above), and therefore users will have no interest in such a system.

The recently launched Ripple payment system forces the user to form his own circle of trust and keep it up to date. This is an attempt to transfer risk management to end users, which is equivalent to each node having a different “genetic code” defining the program's behavior, with all the ensuing consequences.
The correct implementation would be a single program code containing a system-wide, automatic risk-management algorithm that is transparent to users. Most users, however, will never try to understand it; they will simply run the program.
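What a “system-wide, automatic” algorithm might look like can be sketched as follows. This is a hypothetical toy, not Ripple's actual mechanism: the field names, the smoothing constants, and the scoring rule are all invented. The point is only that every node runs the same function over the same shared history, so no user has to maintain a personal circle of trust.

```python
# Hypothetical sketch: every node computes the same risk score from the
# shared transaction history, instead of each user curating trust by hand.

def risk_score(history, counterparty):
    """Return a risk score in [0, 1]; higher means riskier."""
    txs = [t for t in history if counterparty in (t["from"], t["to"])]
    if not txs:
        return 1.0  # unknown counterparty: maximum risk
    disputed = sum(1 for t in txs if t["disputed"])
    # Dispute rate, smoothed so a single transaction is not decisive.
    return (disputed + 1) / (len(txs) + 2)

history = [
    {"from": "alice", "to": "bob", "amount": 10, "disputed": False},
    {"from": "carol", "to": "bob", "amount": 5, "disputed": False},
    {"from": "dave", "to": "eve", "amount": 7, "disputed": True},
]

# Because the algorithm and the history are shared, every node arrives
# at the same score for the same counterparty.
assert risk_score(history, "bob") < risk_score(history, "eve")
```

A user who wants to can read the scoring rule; everyone else just runs the program and gets identical behavior.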


From the point of view of ease of use, competition will be won by systems that, once installed, simply work and require no further attention. They may load the processor and the network and take up disk space, but if the system does not demand actions the user considers “extra” (such as the settings, ratings, and so on mentioned above), that is a decisive plus for convenience.

Management and control

There is also an analogy with biological systems.

Adaptation to users' needs should occur through “mutations”: changes in functionality that are made in one place and either survive natural selection or do not. There should be many such mutation sites, which means the system must be open source.

Changes respond to users' need for particular functionality or for greater efficiency. Let us call the part of the code that implements particular functionality an extension (by analogy with browser extensions).
Natural selection among extensions should be performed by an automatic, network-wide check (so that the cooperative effect appears) for compliance with user requirements. In other words, the user does not choose the implementation; he specifies requirements for the results of its work.

A requirement for a file-sharing network, from the user's perspective, might sound like: download a one-gigabyte file in five minutes. By this criterion, torrents won the natural selection.
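Requirement-driven selection can be sketched like this. Everything here is invented for illustration (the candidate names, the measured timings, the requirement format); the idea is only that the user states a constraint on the result, and the system picks whichever implementation satisfies it.

```python
# Hypothetical sketch: the user states a requirement on the *result*
# ("1 GB within 300 seconds"); the system selects an extension that meets it.

requirement = {"size_gb": 1, "max_seconds": 300}

# Measured results for candidate extensions (e.g. from earlier runs).
measured_seconds = {
    "http_single_source": 900,
    "torrent_swarm": 240,
}

def select_extension(requirement, measured_seconds):
    # Keep only implementations that satisfy the requirement,
    # then prefer the fastest of those.
    ok = {name: t for name, t in measured_seconds.items()
          if t <= requirement["max_seconds"]}
    if not ok:
        return None  # no implementation meets the requirement yet
    return min(ok, key=ok.get)

assert select_extension(requirement, measured_seconds) == "torrent_swarm"
```

If the measurements are collected network-wide, every node converges on the same choice, which is exactly the “cooperative effect” described above.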

Even if the system has settings, they must have sensible default values, and it helps to offer several operating modes, something like: simple, advanced, and expert.

The user's control over the system consists in the fact that, should he want to (mainly when something displeases him), he can study how it works at whatever depth of immersion he chooses and change the parameters he understands.
The extreme case is the ability to remove a particular extension or forbid its use.

User requirements


We have examined what the system gives the user. Now let us define what the user gives the system.
In short: the resources of his computer and the input of information.

The user's natural desire is simply to consume the system's functionality and turn off the computer.
To prevent this, the system must lift restrictions or offer additional functionality in exchange for certain actions.

Torrent trackers, especially early in their development, imposed strict requirements on the ratio of downloaded to uploaded data and limited downloads if the user did not stay on the distribution long enough.
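A tracker-style ratio rule is simple enough to sketch in a few lines. The threshold value here is an invented example, not any real tracker's policy:

```python
# Minimal sketch of a ratio rule: downloading is allowed only while
# uploaded/downloaded stays above a threshold. 0.5 is an example value.

MIN_RATIO = 0.5

def may_download(uploaded_bytes, downloaded_bytes):
    if downloaded_bytes == 0:
        return True  # new users may always start downloading
    return uploaded_bytes / downloaded_bytes >= MIN_RATIO

assert may_download(0, 0)           # fresh account
assert may_download(600, 1000)      # ratio 0.6: allowed
assert not may_download(100, 1000)  # ratio 0.1: blocked until the user seeds
```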

Bitcoin implements such a mechanism in the form of mining: coins are obtained for activity that maintains the system's existence.


Considering a sufficiently general case, we can formulate the following mechanism.

The system has (or can create) information resources that affect the user's real life: money, a rating or a diploma, portfolio items, and so on.
The system offers these resources, according to a common transparent algorithm, in exchange for activity that benefits the system.

Necessary tasks might include:
- processing transactions (as in Bitcoin);
- confirming facts about the real world;
- writing new system code;
- necessary activities in the real world, confirmed by other users;
- storing system data.
Such an approach requires identifying the user and keeping his history, and in some cases deanonymization.
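The “common transparent algorithm” above can be sketched as a shared tariff: every node credits the same internal resource for the same completed task. The task names, reward values, and the unit of “points” are all invented for illustration.

```python
# Hypothetical sketch: one tariff, shared by all nodes, converts useful
# activity into an internal resource. All names and values are invented.

REWARDS = {
    "process_transaction": 1,
    "confirm_real_world_fact": 5,
    "store_data_per_gb_day": 2,
}

def settle(task_log):
    """Compute each user's earned points from a log of completed tasks."""
    balances = {}
    for user, task in task_log:
        balances[user] = balances.get(user, 0) + REWARDS[task]
    return balances

log = [
    ("alice", "process_transaction"),
    ("alice", "store_data_per_gb_day"),
    ("bob", "confirm_real_world_fact"),
]
assert settle(log) == {"alice": 3, "bob": 5}
```

Because the tariff is part of the common code, any node can recompute anyone's balance from the shared log, which is why the approach requires identification and history-keeping.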

So what should an effective system be?


1. Distributed.
2. Without privileged nodes.
3. With a single code on all nodes.
4. Resistant to the destruction of nodes.
5. A newly installed copy should be operational immediately, with default settings.
6. No competition between nodes, only cooperation and work for the common good.
7. Open source.
8. System settings should be formally defined requirements, based on which the executable code is selected automatically.
9. The system should stimulate users toward systemically useful activity by allocating resources for it.

Practical implementation


Examples of such systems: the BitTorrent file-exchange protocol, the Bitcoin cryptocurrency, and Skype Internet telephony.

Downloading “from torrents” is possible at very high speed because all the other nodes are ready to upload.
The cooperation of Bitcoin network nodes yields hitherto unprecedented reliability and resistance to attack. No one can simply create new bitcoins; accounts cannot be seized; transactions are very cheap; storage is easy. These properties significantly surpass “ordinary” implementations of the same functionality.

Skype has significantly reduced communication costs.

Disadvantages of Existing Systems

Skype itself is not completely decentralized: it has a certain set of nodes on which the installed software differs from that on the majority (for example, nodes that maintain user lists). In addition, its source code is closed.

One of the main complaints about Bitcoin is the absence of backing by real resources. Its supporters respond that other currencies are backed no better. This is true, but it is not an answer to the demand for such functionality: it is implemented neither in Bitcoin nor in ordinary currencies.

Ripple was discussed above.

What systems are needed?


Those, of course, that implement functionality users need. Here the field is almost untouched.
You can point at almost any information process in society and see that a p2p implementation of it would be much more effective from the majority's perspective (though not from the perspective of those who currently steer the process).

Here are a few examples offhand.

System of storage of socially significant information

The main requirement: the information must be impossible to “cut down with an axe,” that is, indestructible. The Internet generally copes with this function, and where it does not, Google's cache comes to the rescue. However, complete and automatic decentralization is still a long way off. Information is stored either directly by people (which, as we established above, is unreliable) or by search-engine robots on their servers (automatically, but not distributed enough).
Bitcoin is another example of truly distributed information storage.
An interesting implementation of this idea on top of Bitcoin is “Proof of existence.”
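The idea behind “Proof of existence” can be sketched in a few lines: store only a document's hash, and later anyone can verify that exactly this document existed. The toy `registry` set below stands in for the distributed store (the real service embeds the hash in a Bitcoin transaction).

```python
import hashlib

# Toy sketch of the "Proof of existence" idea: the network stores only a
# document's SHA-256 hash, which later proves the document existed in
# exactly this form. The in-memory set stands in for the distributed store.

registry = set()

def register(document: bytes) -> str:
    digest = hashlib.sha256(document).hexdigest()
    registry.add(digest)
    return digest

def verify(document: bytes) -> bool:
    return hashlib.sha256(document).hexdigest() in registry

register(b"my contract, version 1")
assert verify(b"my contract, version 1")
assert not verify(b"my contract, version 2")  # any change breaks the proof
```

Note that the document itself never leaves its owner; only a 64-character fingerprint is published.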

Real cryptocurrency collateral

Only real values can back a currency. But how do you attach something material to a purely informational object?
The same way as in ordinary life: there are real-estate cadastres, vehicle registries, and the like, down to consignment notes.
All of this should be stored in a common distributed database and integrated with the cryptocurrency.
In a sale, the transfer of ownership must occur simultaneously with the flow of money in the opposite direction. The entire workflow, conclusion of contracts, and so on, is the same as now, but inside a p2p network with special software. The full history of both monetary transactions and the movement of real assets is available to all network users, just like Bitcoin's transaction history. Based on this information, a system-wide algorithm should determine the need for, and method of, issuing or destroying money.
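The “simultaneous” exchange can be sketched as one atomic operation over a shared ledger: both legs are validated first, then applied together, so neither party can end up holding both the asset and the money. The ledger structure and names below are invented for illustration.

```python
# Sketch of an atomic sale in a shared ledger: the asset leg and the money
# leg are applied as one transaction. Structures are invented for illustration.

money = {"buyer": 100, "seller": 0}
assets = {"apartment-42": "seller"}  # asset id -> current owner

def atomic_sale(asset_id, seller, buyer, price):
    # Validate both legs before changing anything.
    if assets.get(asset_id) != seller:
        raise ValueError("seller does not own the asset")
    if money.get(buyer, 0) < price:
        raise ValueError("buyer cannot pay")
    # Apply both legs together; no half-completed state is ever visible.
    assets[asset_id] = buyer
    money[buyer] -= price
    money[seller] += price

atomic_sale("apartment-42", "seller", "buyer", 100)
assert assets["apartment-42"] == "buyer"
assert money == {"buyer": 0, "seller": 100}
```

In a real p2p system the validate-then-apply step would be enforced by the common protocol on every node, not by a single function, but the invariant is the same.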

Documents

Why carry papers around at all, if a person's identity can be verified at any moment from a distributed database?

Production and Standards

If the path of material assets is traced in the system from the quarry to the store, then by analyzing the resources used one can draw conclusions about the technologies employed and stimulate or suppress their use in monetary terms. Standards are set by users' requirements, and real technological processes adapt to them.

Conclusion


Building distributed systems with a single code will allow humanity to make a quantum leap in the effectiveness of its activities. In essence, this is a solution to the prisoner's dilemma at the scale of society as a whole.

The process is only beginning, but results are already visible. Let's see what comes next.
