On driverless cars, or why I do not want to live in a "smart home"
Lately my news feed has been reporting, over and over and with great optimism, on the latest successes of the driverless car. First it simply learned to drive on a road, then it learned to share that road with its brethren, then it began to distinguish a moose from a pedestrian running across the road - and to make no difference between them, avoiding a collision with either just as carefully. The latest reports from this front cheerfully announce that in some US states driverless cars are already being allowed onto public roads. Automakers are not far behind: they have finished concepts and even working prototypes, and they are already thinking about mass production.
And this is exactly the point where it would be worth slowing down and thinking twice...
Driverless cars are still taking their first steps in this world, but their older brothers - unmanned aircraft - are already plying our skies. And not just plying them: they actively intervene in what happens on the ground - they shoot at terrorists, for example. Smart homes long ago stopped being science fiction, or even an innovation. And yet one simple question, legal in its essence, remains unresolved to this day - a question buried in one of the clauses of the license agreement.
Behind all these drones, smart homes, ASIMO and the other achievements of robotics stands a program. Big and smart or small and stupid, but it is always there. It is this program that analyzes the incoming data and makes decisions on its basis. And this program comes with a license agreement. Which says: the developer is not responsible..., does not guarantee..., does not accept claims...
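To make this concrete, here is the warranty disclaimer from the widely used MIT license; nearly every EULA, from an ERP system to firmware, carries similar boilerplate:

```
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY ...
```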
The closest example is enterprise management systems (1C, Axapta and whatever else is out there). These systems are a complete analogue of the programs serving robots: they collect information about the state of the enterprise, analyze it and, on the basis of their calculations, offer some decision. Notice: I said "offer", not "make". If such a system writes "discontinue Model A" on the director's monitor, production of that model does not stop automatically - the director has every opportunity to double-check the system's calculations by hand, consult knowledgeable people, or simply ignore the system's suggestion on the strength of his rich experience and "sixth sense". Enterprise management systems do not actually manage anything. Moreover, if the system makes a mistake in its calculations - if, like Excel, it fails to add two numbers correctly - the consequences for the company can be catastrophic, up to total collapse and bankruptcy. And the manufacturer of the system will refuse to accept any claims: it guarantees nothing and bears no responsibility. Accountants, whom "advanced admins" love to laugh at, understand this very well - that is why they recheck the super-mega-cool system's figures on an abacus, because it is they alone who will bear the responsibility (criminal responsibility, in the case of an error).
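The jab about "failing to add two numbers" is not pure rhetoric, by the way. Ordinary binary floating-point arithmetic - the kind almost every program, Excel included, runs on - really does produce results like this:

```python
# Binary floating point cannot represent 0.1 exactly, so even a
# trivial sum drifts away from the value a human expects.
a = 0.1
b = 0.2
print(a + b)         # 0.30000000000000004
print(a + b == 0.3)  # False
```

A human glancing at the report catches such nonsense; a program acting on its own arithmetic does not.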
The whole difference between such systems and the systems that control robots is that the latter do not offer their decisions - they make them. An unmanned fighter jet can add two numbers incorrectly, but it will not suggest to any pilot that it change course and open fire on children running around a school stadium - it will simply do it, because there is no pilot. A driverless car can add two numbers incorrectly, but it will not suggest to any driver that it drive into a square full of people - it will knock them down like pins, because there is no driver. A smart home can add two numbers incorrectly, decide that your bathtub holds ten times more water than it does and that it must be filled to the brim - and it will not give you advice, it will simply open the taps all the way, because the neighbors downstairs are not accounted for in its program.
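A rough sketch of this difference, with made-up names (plan_action, advisory_system, autonomous_system - not any real control API): both kinds of system compute exactly the same decision from exactly the same flawed data; only what happens next differs.

```python
def plan_action(sensors: dict) -> str:
    """Both kinds of system compute a decision from incoming data."""
    # A miscalculation here (the "wrongly added numbers") corrupts the decision.
    if sensors["estimated_bath_volume_l"] > sensors["water_poured_l"]:
        return "keep taps open"
    return "close taps"

def advisory_system(sensors: dict) -> None:
    # ERP-style: the decision is only *offered*; a human can ignore it.
    print("Recommendation:", plan_action(sensors))

def autonomous_system(sensors: dict, actuate) -> None:
    # Robot-style: the same decision goes straight to the hardware.
    actuate(plan_action(sensors))  # nobody is asked

# The bath example: the volume estimate is off by a factor of ten.
readings = {"estimated_bath_volume_l": 1500, "water_poured_l": 150}
advisory_system(readings)                   # a person sees the nonsense and stops
autonomous_system(readings, actuate=print)  # the taps just stay open
```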
One may object that modern cars are already stuffed with electronics, and that this electronics can fail at any moment. True. It is also true that if an examination shows the accident was caused by a hardware failure, the car's manufacturer will be held responsible. But if the accident was caused by crashed software, the carmaker is in the clear - the car itself was in working order. And the software manufacturer is in the clear a priori - because it guarantees nothing, bears no responsibility and simply does not accept claims. So who answers?
In fact, the question of responsibility ought to stand like a concrete wall in the path of mass production of all these drones, robots and smart homes. Scientists keep proudly reporting ever newer solutions to problems, ever more advanced improvements, milestones reached and achievements unlocked. But when your smart home turns out to be an idiot - who answers? Whom do you take to court when your state-of-the-art driverless car plows into a bus stop full of people?
The habit of clicking the "I agree" button without reading may serve us poorly in the future...