Expert opinion: Quantum computers, quantum engineering and quantum

    In April of this year, as part of the project "Collective Phenomena in Quantum Matter," led by the distinguished scientist K.B. Efetov, we were visited by an internationally renowned scientist: A.M. Zagoskin, Reader in quantum physics at the Department of Physics of Loughborough University, UK, and one of the founders of D-Wave Systems Inc. (1999, Canada), the company that released the world's first adiabatic quantum simulator.

    Our university could not pass over such a significant event. A.M. Zagoskin gave a lecture, which we recorded and edited into a video that can be viewed here. This material will undoubtedly interest the target audience! We also could not miss the opportunity, and asked the professor to take part in our now-traditional GT column, "Expert Opinion".

    Alexander Markovich gladly agreed to write a popular-science note about quantum computers and quantum engineering specifically for our corporate blog on GT. We are sure that a large part of the audience that will find this material interesting is concentrated right here on GT!

    Tomorrow we will also publish this material on our portal, where, perhaps, a real scientific discussion will unfold between our experts, young researchers, and the author. Some of our most respected readers will probably want to join the discussion on our portal.

    Quantum computers, quantum engineering and quantum


    In science-fiction novels, the main thing was radio. With it, the happiness of mankind was expected. Radio exists, but there is no happiness.
    Ilya Ilf, "Notebooks".

    If the word "revolution" or "quantum" appears in a newspaper article about science, it is usually not worth reading further. Nevertheless, the results obtained over the past twenty years may in time justify the high-flown title of "the second quantum revolution".

    The "first quantum revolution" of the middle of the last century led to the creation of nuclear weapons and nuclear energy, semiconductor electronics, masers, lasers and superconducting devices. Without these technologies, the current level of civilization would not have been possible. Nevertheless, they rely on the simplest quantum effects, in the sense that they can be described by a small number of variables even when a macroscopic number of particles is involved. Thus, the quantum theory of solids operates mainly with one- and two-particle Green's functions. Superconductivity is a good example: despite its essentially quantum nature, and the fact that a macroscopic number of electrons enters the superconducting condensate, an ordinary superconductor is well described by a single "wave function" of coordinates and time.

    The "second quantum revolution" is based on more fragile effects, such as quantum entanglement over macroscopic distances and quantum superposition of macroscopically distinct states. Of course, quantum theory contains no prohibition on their existence, as the famous Schrödinger cat illustrates. However, until relatively recently, their realization remained, firstly, practically impossible and was, secondly and therefore, considered irrelevant. The Copenhagen interpretation, with its fundamental (albeit of unknown location) boundary between the micro- and macro-worlds, served well enough here.

    The situation began to change after Feynman emphasized that it is fundamentally impossible to efficiently simulate quantum systems on classical computers, simply because the dimension of the corresponding Hilbert space grows exponentially with the size of the system. This does not contradict the achievements of quantum field theory, solid-state theory, and so on: success was achieved precisely in those problems that could be reduced to the simultaneous consideration of a relatively small number of degrees of freedom. (For example, when the behavior of a macroscopic quantum system is described in terms of an almost ideal gas of quasiparticles, and correlations above second or third order are insignificant.)
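    To make Feynman's point concrete, here is a minimal sketch (my own illustration, not from the article) of the memory needed just to store the full state vector of an n-qubit system on a classical computer: the Hilbert space has dimension 2^n, so a dense representation needs 2^n complex amplitudes, 16 bytes each at double precision.

```python
def state_vector_bytes(n_qubits: int) -> int:
    """Bytes required for a dense complex128 state vector of n qubits.

    The Hilbert space of n qubits has dimension 2**n, and each complex
    amplitude at double precision occupies 16 bytes.
    """
    return (2 ** n_qubits) * 16

# 30 qubits already need 16 GiB; 50 qubits need ~16 million GiB;
# 100 qubits exceed any conceivable classical memory.
for n in (10, 30, 50, 100):
    print(f"{n:3d} qubits: {state_vector_bytes(n) / 2**30:.3e} GiB")
```

This is why a system of even a hundred qubits, mentioned later in the text, lies beyond direct classical simulation.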

    The operation of a digital quantum computer essentially relies on precisely those states of a quantum system that classical computational methods cannot handle. This aroused interest in their experimental realization, despite considerable skepticism even from people like Tony Leggett, who received the 2003 Nobel Prize for his fundamental work on the theory of macroscopic quantum tunneling and superposition. Indeed, "cat" states of the form |0000…0⟩ + |1111…1⟩ are very fragile compared to the factorized state Π_j (|0⟩ + |1⟩)_j. The study of the mechanisms and rate of their destruction ("the problem of the quantum-classical transition") continues, having unexpectedly turned from a rather abstract, semi-philosophical branch of the foundations of quantum mechanics into almost an engineering discipline. But the main thing is that such states turned out to be much more stable than expected, and that so far no experimental indications of a fundamental prohibition on them have appeared.
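    The contrast between the two kinds of state can be shown in a few lines of NumPy (an illustrative sketch of my own, not from the article): build the n-qubit "cat" (GHZ) state |00…0⟩ + |11…1⟩ and the factorized state Π_j (|0⟩ + |1⟩)_j, then compute the purity Tr(ρ²) of a single qubit's reduced density matrix. Purity 1 means the qubit is unentangled with the rest; purity 1/2 means maximal entanglement.

```python
import numpy as np

def ghz(n: int) -> np.ndarray:
    """The n-qubit cat state (|0...0> + |1...1>) / sqrt(2)."""
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = psi[-1] = 1 / np.sqrt(2)
    return psi

def product_plus(n: int) -> np.ndarray:
    """The factorized state prod_j (|0> + |1>)_j / 2^(n/2)."""
    plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
    psi = np.array([1.0 + 0j])
    for _ in range(n):
        psi = np.kron(psi, plus)
    return psi

def single_qubit_purity(psi: np.ndarray, n: int) -> float:
    """Purity Tr(rho^2) of qubit 0 after tracing out the other n-1 qubits."""
    m = psi.reshape(2, 2 ** (n - 1))
    rho = m @ m.conj().T          # reduced density matrix of qubit 0
    return float(np.real(np.trace(rho @ rho)))

n = 6
print(single_qubit_purity(ghz(n), n))           # 0.5: maximally entangled
print(single_qubit_purity(product_plus(n), n))  # 1.0: no entanglement
```

Losing a single qubit of the cat state destroys the superposition entirely (the reduced state is a classical mixture), whereas the factorized state is unaffected; this is the fragility the text describes.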

    As a result, since the end of the last century such progress has been made in the fabrication and control of essentially quantum artificial structures, from individual superconducting qubits to the current D-Wave processors, that theory has fallen behind. Current theoretical and computational methods do not allow us to predict, analyze, or simulate the behavior of such structures.

    By "essential quantumness", for lack of a better term, I mean the existence in the system, at any given time, of a sufficiently large number of degrees of freedom in a state of quantum superposition. The quantum state vector of such a system "lives" in a very large Hilbert space. In the general case, such a vector can be neither simulated nor measured. Even a hundred qubits is practically the limit. No wonder that coherent quantum behavior in D-Wave processors was directly demonstrated only for a group of about a dozen qubits, for which it was possible to measure the quantum state and build a quantitative model against which the measurement results could be compared. The behavior of a five-hundred- or thousand-qubit processor has to be characterized by indirect results.

    Thus, even if there is no fundamental prohibition on the existence of arbitrarily large "cats," and it is in principle possible to create a universal quantum computer able to efficiently simulate the behavior of large, essentially quantum systems, the road to it is still blocked. Quantum systems that can still be modeled by existing methods are too small and, as a universal quantum computer, useless. And we cannot build a system of sufficiently large size, because it is impossible to predict or characterize its behavior. Moreover, such a system will be quite complex, and the task of its design, manufacture, characterization, debugging and operation is already an engineering task. Quantum engineering needs to be created.

    One definition of engineering is the creation of reliable structures from unreliable elements. In our case, entirely new levels of unreliability appear, due to the notorious fragility of "cat" states, which cannot be compensated for by duplicating systems or verifying individual structural elements. Standard engineering methods are not enough. But the general engineering approach, with its focus on results, its use of order-of-magnitude estimates, phenomenological models, heuristics and intuition, and its habit of satisfying incompatible requirements, can be fruitful where first-principles calculation is impossible.

    Engineering can be divided into component engineering, structural engineering and systems engineering. In our field, the first concerns individual quantum bits, their small arrays and the corresponding control circuits. Here, by and large, everything is clear: the theory is well tested in experiment for a variety of implementations of these devices. The last should deal with the integration of quantum and non-quantum devices into large systems, and this is not yet relevant, because the middle link is missing. Structural engineering, by definition, predicts the properties of a structure from the properties of its structural elements. This is exactly what we cannot do even for existing structures, and it is here that efforts need to be concentrated.

    Of course, neither "quantum engineering intuition" nor quantum engineering itself can develop other than through the regular application of quantum theory to the development and testing of new, essentially quantum devices. A universal quantum computer is not the only such device and, apparently, not the most interesting or useful one. (Although it, or rather the frightening prospect of building it and using it to break RSA encryption, played a key role in attracting attention and money to this field of research.) More realistic are, for example, quantum optimizers like D-Wave's: in essence analog devices, a kind of quantum slide rule, tuned to a fairly accurate solution of a limited but important class of problems. Quantum metamaterials are also interesting: artificial media with a sufficient degree of quantum coherence, with predicted curious properties and possible uses as sensors or for image processing. In a word, humanity will find nuts to crack with these royal seals. The main thing is to try to make them. And success in the creation and application of essentially quantum devices will be the very social practice that is the criterion of truth, and it will do more for our understanding of quantum theory than any number of university lectures and, especially, popular books and television programs.

    The importance of this topic is now gradually gaining recognition. After the initial hype and inflated expectations, and several years of natural cooling and skepticism, an upsurge has followed. Its signs are the recently allocated £250 million for quantum technology in the UK, the just-promised €1 billion in the European Union, investments by players such as Google and NASA in D-Wave Systems, a press conference by the Canadian Prime Minister at a research institute at which he (briefly and incorrectly) explained to reporters how a quantum computer works, and so on. The interest of financiers, businessmen, politicians and the military is understandable, although, as always, the benefits and troubles of a fundamentally new technology will not lie where they are expected.

    As for the dangers, the main threat of "second-wave" quantum technologies to, for example, the global financial sector is not at all that someone will start cracking codes wholesale in order to transfer money from other people's accounts or steal economic secrets. The main threat, for whose realization not even a universal quantum computer is needed, is that it will become possible to manage the global economy in real time, which would simply eliminate the entire financial sector as completely unnecessary.

    upd:


    As for the benefits: no Grover algorithm, for example, will be able to clear away the heaps of informational garbage in which the world is drowning (from Facebook feeds to the "improvement" of the quality of scientific research and higher education through transparent reporting, global rankings and other "sharing of best practice"), unless the generation of this garbage is itself reduced. Returning to the epigraph: no technology by itself solves the problems of mankind or brings it happiness. The main benefit, as always, will be the expansion and deepening of our understanding of the laws of nature.
