Quantum Hashing: a Lecture at Yandex
Farid Mansurovich Ablaev is the head of the Department of Theoretical Cybernetics at Kazan Federal University. Visiting Yandex's Moscow office, Farid Mansurovich spoke about algorithms that could potentially run on quantum computers. Such devices are still very rare, and even the most advanced companies have not really mastered them. But by the time they start getting cheaper, specialists will have had time to learn to use them.

One area that quantum systems could radically change is digital signature mechanisms. The talk presents a hashing algorithm that is radically superior to its analogs for classical computers. Below the cut is a detailed transcript with slides.
Most of the slides are in English; I hope my colleagues will excuse me. It is not always possible to translate everything, and not always necessary.
This talk is based on the works listed here. With our first paper (I am a mathematician myself) we were invited to the journal Laser Physics Letters, which, unlike high-ranking mathematical journals with impact factors slightly below 1, has an impact factor of 3.2. Nowadays, when Hirsch indices and the like are all taken into account, publishing in such physics journals turns out to be useful.

One of the latest results was presented at the Algebra in Computational Complexity seminar at the German research center Dagstuhl. You can find those results there; the preprints are on arXiv, naturally.
I'll start with the motivation. Daniel joked that there will soon be a quantum computer and we will all use it. In Kazan there is a group that builds computers, and its director, Viktor V. Dyachkov, from Tambov and very energetic, greets me every time with: "Farid, when will we be selling quantum computers?"

The Canadians are already selling one. You know about the D-Wave computer; it is periodically tested. The Canadians announced that it was first 500 qubits, and now 1024. Periodically the tests show that everything it does can be done on a classical computer; then again it looks like a good quantum computer. For now things remain in that state.
We can say that individual qubits, whether 500 or 1024 or more, can probably be fabricated. But a quantum computer becomes good, productive, truly powerful only if those qubits are in an entangled state. Meaningfully, this is a situation where the system can be programmed so that the qubits feel each other. That is almost impossible to achieve.

If it were possible to create such a quantum register (never mind a full quantum computer) with on the order of 500 entangled qubits, it would guarantee performance at which one can simultaneously work with 2^500 states. That is comparable to the number of particles in the universe, and that is good. Such a system would beat all existing computers.
The Shor algorithm mentioned here is the first such result, from 1994. Surely many of you know that if you ask Google or Yandex for the "Quantum Algorithm Zoo", it immediately returns a page with more than 50 quantum algorithms that are an order of magnitude better than existing classical algorithms. So there is already a set of algorithms that could potentially be run on quantum machines. The first is Shor's factoring algorithm. All the excitement in this field began when Shor's algorithm appeared: efficient factoring means that the classical RSA scheme will be broken by a quantum computer.
In response, a whole trend immediately appeared in the cryptographic community: post-quantum cryptography. There is a book on it, indicated here, in which the community proposed a whole toolbox intended to withstand a potential quantum computer.

Cryptographers also note that, for example, hash-based digital signature schemes, such as Lamport's signature, will not be broken by a quantum computer either.
Let me recall what a Lamport digital signature is. It is a fairly simple scheme, proposed in 1979. Leslie Lamport is known as the creator of LaTeX, and he is among the most cited people in theoretical and practical computer science, cited outrageously often largely thanks to LaTeX.
Lamport's digital signature is a scheme that lets you sign the bits 0 and 1. A word w is associated with 0, and another word v with 1; these are the two private keys. Alice computes the values f(w) and f(v) of a one-way function f and sends this pair to Bob. If Alice wants to sign a 1, she sends Bob the original word v together with the 1, and Bob, quickly computing f(v) from v, checks that it matches.

One-wayness, very roughly, means the following: given the value of the function it is hard to find the argument, while given the argument the value is easy to compute. On this basis you can do the same with long messages.
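The one-bit scheme just described can be sketched in a few lines. Here SHA-256 stands in for the one-way function f, and the 32-byte random secret keys are illustrative choices, not something specified in the talk.

```python
# A minimal sketch of Lamport's one-bit signature.
# Assumption: SHA-256 plays the role of the one-way function f.
import hashlib
import os

def f(x):
    """The one-way function (assumed here to be SHA-256)."""
    return hashlib.sha256(x).digest()

def lamport_keygen():
    """Secret keys w, v for the bits 0 and 1; public keys are f(w), f(v)."""
    w, v = os.urandom(32), os.urandom(32)
    return (w, v), (f(w), f(v))

def lamport_sign(bit, secret):
    """To sign a bit, reveal the corresponding secret word."""
    w, v = secret
    return v if bit == 1 else w

def lamport_verify(bit, signature, public):
    """Bob recomputes f on the revealed word and compares."""
    fw, fv = public
    expected = fv if bit == 1 else fw
    return f(signature) == expected
```

A key pair signs one bit once; signing longer messages repeats this construction per bit.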
The next step useful to us: recall what an ε-universal hash family is.
Take a set of N hash functions. By a hash function we mean a function that compresses words of length K into words of length M, with K > M. Collisions are possible here: situations where two different words map to the same image. This happens precisely because K > M.

We collect a set F of such hash functions. F is called ε-universal if the following holds: when f is chosen uniformly at random from the set, the probability that two fixed different words get the same image is small, at most ε. Such a family is called an ε-universal hash family. In the case where ε equals one over the size of the range, it is called simply a universal family.
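The definition can be checked by brute force on a toy family. The concrete family ((a·x + b) mod p) mod M below is an illustrative choice, not one from the talk: for p = 11, M = 4 the worst-case collision probability over the family comes out to 2/11, below the 1/M needed for near-universality.

```python
# Brute-force check of epsilon-universality on a toy family
# (illustrative assumption: H = { x -> ((a*x + b) mod p) mod M }).
from itertools import combinations

p, M = 11, 4
H = [(a, b) for a in range(1, p) for b in range(p)]  # a != 0

def h(params, x):
    a, b = params
    return ((a * x + b) % p) % M

def max_collision_prob(domain):
    """Worst collision probability over all pairs of distinct words."""
    worst = 0.0
    for x, y in combinations(domain, 2):
        collisions = sum(1 for params in H if h(params, x) == h(params, y))
        worst = max(worst, collisions / len(H))
    return worst

eps = max_collision_prob(range(p))  # the family is eps-universal
```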
Using ε-universal hash families, you can arrange digital signature schemes. I will not dwell on that in detail. I'll cover the mathematical part and tell you where it is implemented and which communities work on it in Russia.
The central idea: we generalize the notion of hashing and of hash families to the quantum case. The point is that we map words into quantum states, into an ensemble of qubits. We want two requirements to hold. First, collisions must be rare: meaningfully, the slightest change in the argument word should lead to a significant change in the hash image, just as is required classically. Second, of course, the function must be one-way.

Regarding one-wayness, I jumped back to the classical setting. The notion of a one-way function is, roughly speaking, connected with the P ≠ NP problem: if that condition holds, then one-way functions exist; otherwise we work with potentially one-way functions. We all know about this problem.
We propose a construction that, in a certain sense, lets us build optimal quantum hash functions, and that, using classical error-correcting codes, yields whole families of different quantum hash functions. This is a contribution to post-quantum cryptography.

If Shor's algorithm attacks classical cryptography, the quantum part here, in a sense, strengthens the fight against the future quantum computer.
Very briefly, what a qubit is. A qubit is a unit vector in a two-dimensional complex Hilbert space, with coordinates a_0 and a_1 that are complex numbers whose squared amplitudes sum to one. A complex number is itself two-dimensional, so formally this is a point in four dimensions. But the normalization constraint, together with the irrelevance of the global phase, reduces the dimension, and as a result a qubit can be represented as a vector in three-dimensional space.

This means it has two degrees of freedom: the angle φ and the angle θ. Everything is written in polar coordinates: the cos θ and sin θ components are called the real amplitudes, and the factor e^{iφ} is called the phase.

|a_0|² and |a_1|² are the probabilities that we get 0 or 1 if we measure the qubit.
In a sense, a good analogy for a qubit is a flipped coin: while it is in the air, it is simultaneously 0 and 1. Once it has landed and we have measured it, the classical probabilistic bit ceases to exist as such and appears as either 0 or 1. You can flip two coins, but they will be independent. A system of quantum bits, if they are entangled, is something that has no analogue in the classical world.
How can a word be encoded in a single qubit? We interpret the binary word as a number. (For now this concerns only the real amplitudes; there is no phase part.) If w is such a number and we treat it modulo 2^k, then the word is encoded into a unit vector determined by an angle that depends on the word we read. Everything works out. Here is a way to encode a word in one qubit.
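The encoding just described, with the word read as a number and turned into a rotation angle, can be sketched as follows (real amplitudes only, as in the talk; the specific angle convention 2πw/2^k is the natural reading of the slide).

```python
# Encoding a k-bit word into one qubit with real amplitudes:
# the word is read as a number w and mapped to the angle 2*pi*w / 2^k.
import math

def encode(word_bits):
    """Return the (real) amplitudes of |0> and |1> for the given word."""
    k = len(word_bits)
    w = int("".join(map(str, word_bits)), 2)  # binary word as a number
    theta = 2 * math.pi * w / 2**k
    return (math.cos(theta), math.sin(theta))

a0, a1 = encode([1, 0, 1])  # w = 5, k = 3
```

The resulting vector is always a unit vector, as a qubit state must be.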
Physically, such an encoding is a one-way function. There is a result of Alexander Semenovich Holevo, who works at the Steklov Institute. In the late 1960s he showed that if we have an ensemble of s qubits, then the maximum amount of classical information that can be extracted from it is s bits. I am speaking very loosely, on my fingers; there is precise language behind this.

So if we have one qubit and have packed a word into it, then from words of length k we can extract only 1 bit of classical information. This is exactly physical one-wayness: two words are encoded in different states, but only that much information is extractable.
What is bad here? Two different words may be encoded into close states and may appear indistinguishable. One qubit is not enough.

Everything I said can be rewritten in the language of the phase. Instead of real amplitudes, take the basis vector |0⟩ with one amplitude and |1⟩ with the other, and put the information into the phase. One can work with the phase, but it will not be needed here.
How can this be implemented? How can one qubit be prepared this way? In fact, all the transformations are rotations; they are unitary. We start from the zero state and, reading the next bit at each step, apply rotations R_1, R_2, and so on, with different angles. Gradually accumulating the angles, we end up with what we need.
Now that we have a good one-way quantum function, how could we translate Lamport's signature into the quantum setting?
As before, we select a word w and associate it with 0, and a word v associated with 1. These are the secret keys for 0 and 1. Then we prepare the public keys: the quantum states corresponding to the word w (I showed how it can be prepared on one qubit) and, in the same way, to the word v.

We send Bob the quantum states corresponding to 0 and 1. The process cannot be reversed, thanks to Holevo's result. Then, if we wish, we send Bob the pair: the word v and the bit 1. Bob, having the corresponding state and the word v, can check whether the quantum state ψ(v) was indeed obtained from this word. We propose a REVERSE test for this.
Since the transformation is unitary, that is, invertible: there was a word w, and the state was prepared starting from zero by accumulating unitary rotations. Now that Bob has received the word, and knows the algorithm, he can construct the inverse transformation; it is easy to do. He applies it to the received quantum state. If the two words are the same, the result must be the state |0⟩.

We know which state to compare against. If the words are equal, the REVERSE test accepts with a probability determined by the following formula: the squared inner product of the vector |0⟩ and the state after the inverse transformation.

If the two words are the same, this is always 1; here we cannot be wrong. But if the words are not equal, then, working with only one qubit, the probability can still be close to one. This happens when the two states ψ(v) and ψ(w) are close.

We need to arrange things so that two different words lead to states that are almost orthogonal to each other. That will be optimal for us.
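For the one-qubit encoding above, the acceptance probability of the REVERSE test reduces to cos² of the angle difference between the two words' rotations. The sketch below illustrates both sides of the problem: equal words always pass, while different but numerically close words still pass with probability near 1.

```python
# Acceptance probability of the one-qubit REVERSE test:
# Bob undoes the rotation for word w, then measures; the chance of
# seeing |0> is cos^2 of the residual angle 2*pi*(v - w)/2^k.
import math

def reverse_test_prob(w, v, k):
    theta = 2 * math.pi * ((v - w) % 2**k) / 2**k
    return math.cos(theta) ** 2

# Equal words pass with certainty; words whose angles differ by 90
# degrees are rejected with certainty; close words are the bad case.
```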
The next step: instead of one qubit we take a register of s qubits.

From one qubit we pass to such a system. We now work in a Hilbert space of dimension 2^s, and the corresponding state is described just as in the one-qubit case: the a_i are amplitudes, complex numbers, and the norm of the vector is still 1. The squared absolute values of these numbers are the probabilities of obtaining the basis state i if we measure.
That is, we now consider functions arranged as follows: words of length k are mapped into an ensemble of s qubits. The notation is as shown.

I will not explain how this is organized; it can be done by analogy with the one-qubit case.
The result is now this. What do we demand? We call a function ψ that maps words of length k into an ensemble of s qubits a δ-resistant (|Σ^k|, s)-quantum hash function if the following condition holds: any two different words of length k generate states that are almost orthogonal, δ-orthogonal.

This condition guarantees the following. If we then apply the REVERSE test to check whether two states are the same, the probability of answering "yes" for different words is at most δ². If the words are the same, we say "yes" with probability 1. This is exactly what we need.
The theorem is proved quite simply. It turns out that if we want to build δ-resistant (|Σ^k|, s)-quantum functions, then s cannot be less than log k − log log (1 + √(2/(1 − δ))) − 1.

So there is a lower bound. Can such good δ-resistant quantum functions, close to this theoretical lower bound, actually be constructed? Surprisingly, it turns out they can.
Looking ahead, let me announce a result: if we have a linear error-correcting code that encodes words of length k into words of length n over a field F_q, then for any preselected δ, and for a Δ determined by the characteristics of the code (its Hamming distance d and length n), we can construct a quantum hash function. Moreover, the number of qubits we need is no more than about log n.

Jumping ahead again: we instantiated the result with the very beautiful Reed-Solomon code, and the following characteristics came out. If the code length exceeds the original message length by a constant factor, then for an error threshold of about (k − 1)/q plus a small term we can construct the corresponding quantum hash with roughly log(q log q) qubits plus some additive overhead. And this is quite close to the lower bound shown here.
This surprised us. When I presented this at a seminar at the Institute of Physics and Technology, Alexander Semenovich Holevo, hearing the lower bound, said: plausible, yes, but how do you construct it? When we discussed with him the option of doing it, for example, with the Reed-Solomon code, he agreed that something could be done in this direction.

That is the outline of the talk presented here. Now let me go into a bit more detail.
The first example was this: we have a word and we hash it into one quantum bit. Now, given an ensemble of s qubits, we need to determine how to encode and hash into it. What should the mechanism be? To answer this question, we propose the notion of a quantum hash generator.
First examples. As always, it turns out that something similar already existed in the world, just spoken in a slightly different language: quantum fingerprinting. It was introduced in 2002 by Harry Buhrman et al. for a special communication model of computation with a referee, and it turns out to be a special binary case of quantum hashing.

One of the challenges in coding theory is moving from binary codes to larger alphabets. Reed-Solomon codes are fundamentally non-binary; that is a qualitative difference. Moving beyond the binary field really pushes things forward.

I will show the first examples of what has been done, and then how to take error-correcting codes and embed them into this construction.
There is an ε-universal hash family. Using a certain family of functions, I will generate a quantum function. There is a set of classical linear functions over the field F_q: we take coefficients from it and obtain a set of functions H = {h_1, ..., h_T}. Now, using each function h, we generate one-qubit quantum functions as follows. For simplicity I will work not with phases but with real amplitudes. In a sense this is more intuitive, although the physicists asked us to rewrite it all in the language of phases for their implementation, because phases are easier to implement.
Now we prepare a set of such quantum functions: |ψ_{h_j}(w)⟩ = cos(2π h_j(w)/q) |0⟩ + sin(2π h_j(w)/q) |1⟩.
Then we collect these functions into one as follows: |ψ_H(w)⟩ = (1/√T) Σ_{j=1}^{T} |j⟩ |ψ_{h_j}(w)⟩.
Meaningfully, this notation says that we simultaneously, with equal amplitudes, start computing all T functions h_j, and we organize the computation this way because quantum physics allows us to. Before us is a mathematical record of what physics permits.

The register |j⟩ should be treated as the binary notation of j. If so, then log T qubits suffice for it, and the states correspond to them; I will not describe in detail how this is done. These log T qubits control the computation on the last qubit: each subspace controls a rotation of the single last qubit, into which all the information is wrapped. The last qubit is rotated in all these T ways, provided by the log T control qubits.
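The superposition state above can be written out numerically. The linear functions h_j(w) = (j·w) mod q and the small parameters below are illustrative assumptions; the overlap of two hash states then measures how distinguishable they are under the REVERSE test.

```python
# A numeric sketch of |psi_H(w)> = (1/sqrt(T)) * sum_j |j>|psi_{h_j}(w)>
# with real amplitudes. Illustrative assumption: h_j(w) = (j*w) mod q.
import math

q = 17
H = [1, 3, 5, 7]          # coefficients of the linear functions h_j
T = len(H)

def qhash(w):
    """State vector over the basis |j>|b>, b in {0, 1}."""
    state = []
    for j in H:
        angle = 2 * math.pi * ((j * w) % q) / q
        state += [math.cos(angle) / math.sqrt(T),
                  math.sin(angle) / math.sqrt(T)]
    return state

def overlap(u, v):
    """Inner product of two real state vectors."""
    return sum(a * b for a, b in zip(u, v))
```

Equal words give overlap 1; for different words the averaging over the T rotations drives the overlap down, which is exactly the δ-resistance being built.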
Let me pass to the concrete example proposed by Buhrman, Cleve, Watrous, and de Wolf. They took an arbitrary binary error-correcting code with Hamming distance d, in which words of length k are encoded by words of length n, and considered exactly what I showed on the previous slide.

A system of log n qubits runs here. Each control value rotates the last qubit according to the value of E_i(w). Since the code is binary, the rotation is only by 90 degrees: the last qubit either stays at zero or rotates by 90 degrees. The whole ensemble is wrapped into a system of log(n) + 1 qubits, in which the control register governs the behavior of the last one.

But they did not speak this language; they had a different task, and their construction was later rewritten in this form. It turned out that this function is good: it is δ-resistant for words of length k with s qubits, for the s shown. Everything works here.
The next step Sasha Vasiliev and I took. In 2009, using this construction, he defended his thesis. We also did not know at the time that it was hashing; we presented quantum fingerprints over an arbitrary finite field. The construction was invented, and, as always, it turned out that this construction, in an entirely different language, had been used by Avi Wigderson and Razborov in 1993 for other tasks. Interpreting their result in our language, we can choose a rather small set of such functions, of order log q: T = ⌈(2/δ²) ln(2q)⌉.

Thus a set of linear functions, not very large compared with log k, yields a very good quantum hash function: it is δ-resistant, it encodes the q field elements, interpreted as words of length log q, and s qubits suffice.
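The bound T = ⌈(2/δ²) ln(2q)⌉ is easy to evaluate; the parameters δ = 0.5 and q = 2^64 below are examples of mine, not values from the talk.

```python
# Size of the function set in the construction described above:
# T = ceil((2/delta^2) * ln(2q)); about log2(T) + 1 qubits then suffice
# (log2(T) control qubits plus the one target qubit).
import math

def num_functions(delta, q):
    return math.ceil((2 / delta**2) * math.log(2 * q))

def qubits_needed(delta, q):
    return math.ceil(math.log2(num_functions(delta, q))) + 1

T = num_functions(0.5, 2**64)   # illustrative parameters
s = qubits_needed(0.5, 2**64)
```

Even for a 64-bit field the register stays around ten qubits, which is the whole point: the cost grows only logarithmically.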
Summarizing these constructions, we arrive at the notion of a quantum hash generator. We say that a family of functions G is a quantum hash generator if for each of its functions we can build a corresponding quantum function mapping into an ensemble of l qubits, and, putting them together, we obtain a quantum hash function that is δ-resistant with characteristics such that words of length k are encoded into a register of log|G| + l qubits.

The point of the definition is that both the construction of Buhrman and his co-authors and our 2008 construction fit into this concept.

If we have a quantum hash generator, the quantum functions can be built from it in different ways; most likely a hash generator defines a quantum hash function non-uniquely. One can pursue these topics further.
Now for the central theorem, which ties everything together. First condition: we take an ε-universal hash family with the stated characteristics, N functions encoding words of length k into the field F_q. Second condition: as H we take some ready-made quantum hash generator, and such generators exist. In particular, it is convenient for us to take the quantum hash generator that Alexander Vasiliev and I devised.

Composing these two families, first applying the functions of the family F and then wrapping them in the functions from the set H, we find that the new family is a quantum hash generator with the following characteristics. It is Δ-resistant and encodes words of length k into an ensemble of s qubits, where Δ ≤ ε + δ combines the characteristic of the ε-universal hash family with the δ-resistance of the ready-made quantum hash generator we took. The number of qubits depends on the number of functions in the ε-universal family and on the number of functions in the ready-made generator.

And here we need only the logarithms of the numbers of the corresponding functions. In a sense, this ensures that quantum hash functions are exponentially cheaper than the ingredients would suggest at first sight.
A whole group of new examples appeared, using constructions already existing in computer science. Freivalds, a mathematician from Riga, proposed in 1979 a technique now called classical fingerprinting, based on the idea that to compare two different words we compare them modulo various numbers. The Chinese remainder theorem is at play here: if two words are equal, they are equal modulo any small modulus p_i; and if two words are not equal, then for most p_i the corresponding residues are also not equal.

The first paper was published in the collection "Problems of Cybernetics" at the turn of the 1970s and 80s and, as I recall, was presented quite well at Oleg Borisovich Lupanov's seminar at Moscow State University.

It turns out that this family is, of course, a universal hash family with the corresponding characteristics: a 1/c-universal family, where c is determined by the way the number of functions is chosen. Since it is an ε-universal hash family, combining it with the family that Alexander Vasiliev and I devised (which, in turn, rests on the Razborov-Wigderson result), we find that Freivalds' classical fingerprinting generates a quantum hash function with the following characteristics: it is Δ-resistant, words of length k map to s qubits, Δ depends on the small δ of our family and on the 1/c characteristic of Freivalds' family, the number of qubits shown suffices, and the lower bound is as shown. The lower and upper bounds are fairly close.
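Freivalds' fingerprinting can be sketched like this; the prime list and the number of trials below are illustrative choices of mine, not parameters from the talk.

```python
# Freivalds-style classical fingerprinting: compare two large numbers
# by their residues modulo randomly chosen small primes.
import random

PRIMES = [101, 103, 107, 109, 113, 127, 131, 137, 139, 149]

def fingerprint(w, p):
    """The fingerprint of the word (read as a number) modulo p."""
    return w % p

def probably_equal(w, v, trials=3):
    """Equal numbers agree modulo every prime; unequal ones disagree
    for most primes, since their difference has few prime divisors
    (the Chinese remainder theorem argument from the talk)."""
    return all(fingerprint(w, p) == fingerprint(v, p)
               for p in random.sample(PRIMES, trials))
```

The drawback the speaker mentions is visible here: in the classical scheme one has to store the whole pool of primes.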
Classical hash families based on Freivalds' construction are not used in practice, because there are many primes and you would need to store them all. Other schemes are used instead; a very famous example is the universal linear hash family.

Every algorithms textbook has this example. Carter and Wegman, the creators of universal hash families, showed around 1979-1980 that the set of such linear functions is a universal classical hash family. The proof is simple and beautiful.
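The Carter-Wegman linear family can be verified exhaustively for a small prime; p = 13 below is an illustrative choice. With h_{a,b}(x) = (a·x + b) mod p and a ranging over all of F_p, the only colliding choice for distinct x, y is a = 0, so the collision probability is exactly 1/p: a universal family.

```python
# Exhaustive check of the Carter-Wegman linear universal family
# h_{a,b}(x) = (a*x + b) mod p over a small prime field (p = 13 assumed).
p = 13

def h(a, b, x):
    return (a * x + b) % p

def collision_prob(x, y):
    """Probability of h(x) == h(y) over a uniformly random (a, b)."""
    hits = sum(1 for a in range(p) for b in range(p)
               if h(a, b, x) == h(a, b, y))
    return hits / (p * p)
```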
Naturally, we take it together with our family, the one Sasha Vasiliev and I devised; here it is. The result is a construction: a quantum hash generator that generates a quantum hash function with these characteristics. Here ε is the characteristic of the linear universal family and δ is ours; the upper bound is as shown, and the number of qubits is what is needed here.

And it turned out that this family is exponentially more expensive than the existing lower bound allows: the lower bound is log k, while here the upper bound is k. So these well-known and widely used tools, it turns out, are better not applied in the quantum world.
Here is the result I announced at the beginning. If there is a linear [n, k, d] error-correcting code over the field F_q that encodes words of length k into words of length n, then there is a quantum hash generator. So we can build the corresponding quantum hash function, whose resistance is Δ = (1 − d/n) + δ, where (1 − d/n) is the characteristic of the error-correcting code and δ is the characteristic of the family we use. The number of qubits, in turn, is s ≤ log(n) + log(log(q)) + 2·log(1/δ) + 4.

The proof relies on a 1994 result stating that if we have an error-correcting code with the corresponding characteristics, then we can also build a universal hash family from it. And vice versa: if we have a universal hash family with such characteristics, then we can build the corresponding error-correcting code.
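The parameters of this theorem are easy to tabulate. The Reed-Solomon instance below, with n = q − 1 and d = n − k + 1, uses illustrative numbers of mine, not ones from the talk.

```python
# Parameters of the code-based quantum hash generator described above:
# an [n, k, d]_q code plus a delta-resistant generator gives resistance
# Delta = (1 - d/n) + delta with s <= log n + log log q + 2*log(1/delta) + 4.
import math

def quantum_hash_params(n, d, q, delta):
    Delta = (1 - d / n) + delta
    s = (math.log2(n) + math.log2(math.log2(q))
         + 2 * math.log2(1 / delta) + 4)
    return Delta, math.ceil(s)

# Illustrative Reed-Solomon parameters: n = q - 1, d = n - k + 1.
q = 257
n, k = q - 1, 16
d = n - k + 1
Delta, s = quantum_hash_params(n, d, q, 0.1)
```

Note how small s stays: a few dozen qubits even for these sizes, close to the logarithmic lower bound.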
Universal hash families were defined and appeared at the turn of the 1970s and 80s. Error-correcting codes are, in turn, classics: all the named codes appeared around the 1950s and 60s. Surprisingly, this very simple relationship was discovered only in 1994. Those who know it, know it; the rest will find it interesting and useful to read.

Using this theorem, we take the Reed-Solomon code, a very beautiful code described in all textbooks and courses, and build a hash generator and the corresponding hash function.

This is what I announced in the first part of the talk. This construction turned out to be optimal.
So we can now extend all of this to other code constructions, and there are a lot of them. In particular, people ask me: why not Reed-Muller? Of course, that works too. BCH (Bose-Chaudhuri-Hocquenghem) codes also fit, beautifully. But the Reed-Solomon code is simply the classic, the kind of thing you could teach in school.

This can be described in more detail; you can read it, there is no point in going through it all here.

Here is what we are working on and where it is all heading. I specifically wrote out a cheat sheet; I will leave this text, of course, and will return to it if there are questions.
What real physical problems arise here? Physicists are now working on so-called quantum memory. They understand it as follows: the carrier of a qubit is a photon. It flies into a substance, and the substance absorbs it. After a while, by acting on this substance, you can emit a photon in exactly the same state. The state of the photon is the state of the qubit.

If such elements are chained together, you get a signal amplifier, a repeater. People are now talking about building quantum repeaters: photons travel about 100 km through optical fiber, then they are absorbed and begin to disappear. If a quantum repeater that repeats the signal is installed every 100 km, the problem of transmitting information is solved. So the first task is to build a repeater.
Quantum computers on which Shor's algorithm could be implemented are still far away. The first task is to build a quantum repeater. What can be done with one? The substance absorbs a photon; before re-emission you can act on it, transform the state, and extract it in a new state. The current push is that if an ensemble of about a dozen qubits can be stored and retrieved, that amount would already be adequate for constructing a hash function.

Look: if s is of the order of 10, for which fields can this be done? What if s is about 100? Technology now seems to be approaching the point where a dozen or a hundred qubits will be workable. If s of order 100 becomes available, then, using such a purely theoretical result as the Reed-Solomon construction, which functions can be stably hashed, and over which fields?
Let me say right away with whom we cooperate. In Moscow, it is Moscow State University and the International Laser Center of Moscow State University. At the CMC faculty, Academician Valiev organized the Department of Quantum Informatics; after his death a new department was formed, I do not remember what it is called. Voevodin works on supercomputing, supercomputers; within it, a community working in quantum informatics has been preserved.

At the Steklov Institute: Semenov and, of course, Alexander Semenovich Holevo. And the Institute of Physics and Technology of the Russian Academy of Sciences is becoming central: it now oversees these questions well, and we interact actively. But it turns out that almost all the people I listed came from physics. We can offer algorithms, but before the algorithms, good physics is needed. It is now a matter of technology.
There is also a corresponding laboratory and a department at the Academy of Cryptography of the Russian Federation, a good group in Chernogolovka, groups in Novosibirsk and at ITMO. There is a worldwide conference series called "Quantum Cryptography": this year it was held in Paris, next year it will be in Tokyo, and it has taken place in Zurich and Singapore. The communities there are groups dealing with quantum key distribution. The famous quantum protocol BB84, by Bennett and Brassard, was invented in 1984, and now a whole conference has grown around it: more than a hundred people gather, and physicists compete in implementing such things.

There are not so many new algorithms there; what is happening is the physical polishing of it all. Everyone is now waiting for what is called quantum memory to finally be ready, for the experiments to be reproducible in many places, so that protocol building can begin.
In Kazan there is a group working specifically on quantum memory. They are currently most interested in working with phases; in that language they can arrange fairly good experiments.

The laboratories are almost ready, but it turned out that the dollar doubled in price, and the good equipment the guys wanted to order before no longer works out. But that is a separate question.
Perhaps that's all. Thanks.
- How do you assess the likelihood of encrypted information being intercepted via an algorithm for recovering the key?
- I interact with Sergey Kulik from the International Laser Center. I am not a specialist in physics, but as for the classical BB84 protocol, the guys say it completely falls apart in practice. In physics there are whole series of works on the basis of which nearly everything has been redone. BB84 is good for teaching students the principles. That is, as always, there is theoretical, educational cryptography, and when it comes to practical matters, everything is somewhat different.

This year I was lucky enough to attend the quantum cryptography conference. There was a day when Peter Shor was seated on stage, with Bennett and Brassard drinking coffee and telling how wonderful everything is. They say: it leaks, it breaks, everyone does things differently. The Chinese presented; they say they already have a network between Beijing and Hong Kong. The Italians say they are working on a channel with a satellite. It is different, not BB84, but it is still quantum key generation. Other schemes already work; everything is done differently.
- When you listed who is working on this, as far as I understood, mostly physicists were mentioned. Do mathematicians get involved? I understand this is fairly complicated physics, but the mathematics here is quite complicated too.
- Yes, I prepared for this question. There are interesting groups, though not yet in Russia. Richard Jozsa came from theoretical physics but became a computer scientist. He is probably still a bit more of a physicist, but he understands perfectly what is happening in computer science, not only in the quantum field but also, for example, in complexity theory in general. He heads the center in Cambridge. There are such people.
In Amsterdam, Harry Buhrman heads the corresponding center for quantum informatics, and he is not a physicist but rather a mathematician.
Scott Aaronson is at MIT. He graduated from Berkeley; a young man, one of the most recognized people in this field.
Umesh Vazirani is at Berkeley; Scott Aaronson is his student. Andris Ambainis in Riga is also a Berkeley graduate. That is roughly the picture among mathematicians. There are, of course, many algorithms in this area. After Peter Shor came up with his algorithm, the next one, who came up with fast search in a quantum database, was Grover. This is a serious, but very theoretical result. Imagine: you need to load a large database into a superposition and keep it in a quantum register. Then search in this unordered database can be done in roughly the square root of the time of exhaustive search. Mathematically it is beautiful. A lot has since been built around it, other problems have appeared; Latvian mathematicians work a lot in that area.
As for mathematics, I would say that almost all major universities have such centers. But, unfortunately, not in Russia. In Russia there is Moscow State University, the department of quantum informatics at the VMK faculty; it is now called a little differently. Our group works on algorithms and complexity bounds. I can say right away that Russia is still far behind.
What has been done in this regard in Singapore? A famous country. They gathered Europeans: someone from Cambridge leads the physicists, someone from mathematics holds a principal investigator position in France, Hartmut Klauck from Frankfurt works there. This community communicates well with each other and holds conferences. Zurich has a good group in quantum informatics, where mathematicians work together with physicists and they visit each other. Russia still stands a little apart. We need to start interacting more closely with this community.
- I would like to add about Scott Aaronson. He came to Russia, gave a talk at the Computer Science in Russia conference, and received a best paper award. In addition, he has a very interesting blog, scottaaronson.com/blog. A lot is written there about quantum computing in a popular exposition; he is not only a scientist but also a great popularizer of science. He also has an interesting book, so far available only in English, called Quantum Computing since Democritus. I read it and enjoyed it a lot.
- Yes, and the other day I learned from a newsletter that they are discussing the emergence of a brain module, or some other entity, that supposedly solves an NP-complete problem quickly. There is a dispute, and Aaronson is involved. Have a look, it is curious. By the way, it was he who created the Complexity Zoo, the catalog of all complexity classes. He started it as a graduate student at Berkeley, and it was later taken over by others. A good piece of work.
Thanks.
We can say that individual qubits, 500 of them, 1024, and more, can probably be fabricated. But a quantum computer becomes good, productive, and truly powerful only if these qubits are in a linked, entangled state. In essence, this is a situation where the system can be programmed so that the qubits feel each other. That is almost impossible to do.
If it were possible to create such a quantum register, never mind a full quantum computer, with entangled states on the order of 500 qubits, this would guarantee performance allowing simultaneous work with 2^500 states. That is comparable to the number of particles in the universe, and that is good. Such a system would beat all existing computers.
Shor's algorithm mentioned here is the first result, from 1994. Surely many people know that if you ask Google or Yandex what the Quantum Algorithm Zoo is, it immediately returns a page with more than 50 quantum algorithms that are an order of magnitude better than existing classical algorithms. So potentially there is already a set of algorithms that could be run on quantum machines. And the first is Shor's algorithm, the factoring algorithm. All the excitement around this business began when Shor's algorithm appeared. Factoring means that the classic RSA schemes will be beaten by a quantum computer.
In response, a new direction immediately appeared in the cryptographic community: post-quantum cryptography. There is a book on it, indicated here. In it, the cryptographic community proposed a whole set of tools meant to withstand a potential quantum computer.
Cryptographers also point out that, for example, hash-based digital signature schemes, such as Lamport's signature system, will not be beaten by a quantum computer either.
Let me recall what Lamport's digital signature is. It is a fairly simple thing, proposed in 1979. Leslie Lamport is known as the creator of LaTeX, and he is one of the most cited people in theoretical and practical computer science; he is cited enormously, in large part thanks to LaTeX.
Lamport's digital signature is a system that lets you sign the bits 0 and 1. A word w is associated with 0, and another word v is associated with 1. These are the two private keys, for 0 and for 1. Alice prepares the values f(w) and f(v) of a one-way function f and sends these images to Bob. If Alice wants to sign 1, she sends Bob the original word v together with 1, and Bob, quickly computing f(v) from v, checks whether it matches.
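As a toy illustration, the scheme above can be sketched in a few lines of Python. SHA-256 stands in for the one-way function f; that choice is an assumption for illustration only, not part of the original scheme.

```python
# A minimal sketch of Lamport's one-bit signature. SHA-256 is used
# here as a stand-in for the one-way function f (an assumption for
# illustration; any one-way function would do).
import hashlib
import os

def f(x: bytes) -> bytes:
    """The one-way function: easy to compute, hard to invert."""
    return hashlib.sha256(x).digest()

# Key generation: secret words w (for bit 0) and v (for bit 1);
# the public key consists of their images under f.
w, v = os.urandom(32), os.urandom(32)
public_key = (f(w), f(v))

def sign(bit: int) -> bytes:
    """Alice reveals the preimage corresponding to the bit."""
    return v if bit else w

def verify(bit: int, signature: bytes) -> bool:
    """Bob recomputes f on the revealed word and compares."""
    return f(signature) == public_key[1 if bit else 0]
```

Note that each key pair (w, v) can be used to sign only a single bit, once: signing reveals the preimage.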
One-wayness, very roughly, means the following: given the value of the function, it is hard to find the argument, while given the argument, the value is easy to compute. Based on this, you can do the same for long messages...
The next step that will be useful to us: let us recall what an ε-universal hash family is.
Take a set of N hash functions. By a hash function we mean a function that compresses words of length K into words of length M, with K > M. Collisions are possible here, situations where two different words are mapped to the same image; this is unavoidable when K > M.
We collect a set F of such hash functions. F is called ε-universal if the following holds: when f is chosen uniformly at random from the set, the probability that two different words get the same image is small, no more than ε. Such a family is called an ε-universal hash family. In the case where ε = 1/N, it is called simply a universal family.
Using ε-universal hash families, you can build digital signature systems. I will not dwell on this in detail. I will talk about the mathematical part, and tell you where this is implemented and which communities in Russia work on it.
The central idea: we generalize the notion of hashing and of hash families to the quantum case. The point is this: we will map words into quantum states, into an ensemble of qubits. At the same time, we want two requirements to hold. First, collisions must be rare: meaningfully, the slightest change in the argument word should lead to a significant change in the hash image, just as required classically. And second, of course, the function must be one-way.
As for one-wayness, let me jump back to the classics. The notion of a one-way function is, roughly speaking, connected with the problem P ≠ NP: if one-way functions exist, then P ≠ NP. Whether they exist is open, so in practice we work with candidate one-way functions. There is such a problem; we all know about it.
We propose a construction that, in a certain sense, allows building optimal quantum hash functions: a construction that, starting from classical error-correcting codes, lets you build whole families of different quantum hash functions. This is a contribution to post-quantum cryptography.
If Shor's algorithm fights against classical cryptography, then the quantum part here, in a sense, strengthens the fight against the future quantum computer.
Very briefly, what a qubit is. A qubit is a unit vector in a two-dimensional complex Hilbert space: |ψ⟩ = a₀|0⟩ + a₁|1⟩, where the coordinates a₀ and a₁ are complex numbers with |a₀|² + |a₁|² = 1. A complex number is two-dimensional, so formally this is a point in four dimensions. But the normalization constraint, together with the irrelevance of the global phase, reduces the dimension, and as a result a qubit can be represented as a vector in three-dimensional space.
This means it has two degrees of freedom: the angle φ and the angle θ. Everything happens in polar coordinates: |ψ⟩ = cos(θ/2)|0⟩ + e^{iφ} sin(θ/2)|1⟩. The cosine and sine components are the real amplitudes, and the factor e^{iφ} is called the phase.
|a₀|² and |a₁|² are the probabilities that we obtain 0 or 1, respectively, if we measure the qubit.
In a sense, a good analogy for a qubit: we take a coin and flip it, and while it is in the air, it is simultaneously 0 and 1. It lands, we measure it, the classical probabilistic bit ceases to live and becomes either 0 or 1. You can toss two coins, but they will be independent. A system of quantum bits, if they are entangled, is something that has no analogue in the classical world.
How can a word be encoded with a single qubit? We interpret the binary word w as a number. Here only the real amplitudes are involved; there is no phase component. If we treat w as a number modulo 2^k, then the word is encoded into a unit vector determined by an angle depending on the word we read: |ψ(w)⟩ = cos(2πw/2^k)|0⟩ + sin(2πw/2^k)|1⟩. Everything works out. That is a way to encode a word into one qubit.
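A minimal numerical sketch of this encoding, assuming NumPy; the angle convention 2πw/2^k follows the formula above.

```python
# A sketch of encoding a k-bit word into a single qubit with real
# amplitudes: the word, read as an integer, fixes the rotation angle.
import numpy as np

def encode(word: str) -> np.ndarray:
    k = len(word)
    w = int(word, 2)               # interpret the binary word as a number
    theta = 2 * np.pi * w / 2**k   # angle determined modulo 2**k
    return np.array([np.cos(theta), np.sin(theta)])  # a unit vector

state = encode("1011")
print(np.linalg.norm(state))           # 1.0: a valid qubit state
# Two different words may land on close states: one qubit is not enough.
print(abs(encode("1011") @ encode("1010")))  # overlap close to 1
```

The large overlap of the two different words in the last line is exactly the weakness discussed next: a single qubit cannot keep different words distinguishable.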
Physically, such an encoding is a one-way function. There is a result of Alexander Semenovich Holevo, who works at the Steklov Institute. In the early 1970s he showed that if we have an ensemble of qubits, then the maximum amount of classical information that can be extracted from S qubits is S bits. I am speaking very loosely, on my fingers; precise language stands behind this.
So if we have one qubit and have packed a word into it, then from words of length K we can extract only 1 bit of classical information. This is precisely physical one-wayness: two words are encoded in different states, but that is all the information there is.
What is bad here? Two different words may be encoded into close states and turn out to be practically indistinguishable. One qubit is not enough.
Everything I said can be rewritten in the language of the phase: instead of varying the real amplitudes, we keep the basis states |0⟩ and |1⟩ with fixed amplitudes and attach the word to the phase factor. One can work with the phase too, but here we will not need it.
How can this be implemented? How can one qubit be prepared this way? All the transformations involved are rotations; they are unitary. We start from the zero state and, reading the next bit at each step, apply rotations R1, R2, and so on, each with its own angle. The angles differ from step to step; bit by bit we accumulate the angle, and as a result we get what we need.
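Here is a sketch of that preparation by rotations, assuming NumPy and assuming (for illustration) that the rotation for bit i is proportional to its weight 2^(k-1-i); rotations about a fixed axis commute, so the angles simply add up to the total angle 2πw/2^k.

```python
# Preparing |psi(w)> by a chain of rotations: starting from |0>,
# each set bit triggers a rotation by its own angle.
import numpy as np

def R(theta: float) -> np.ndarray:
    """A real rotation (a unitary transformation) by angle theta."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def prepare(word: str) -> np.ndarray:
    k = len(word)
    state = np.array([1.0, 0.0])          # start in |0>
    for i, bit in enumerate(word):
        if bit == "1":                    # bit i contributes its own angle
            state = R(2 * np.pi * 2**(k - 1 - i) / 2**k) @ state
    return state

# The chain of rotations reproduces the direct encoding by the
# total angle 2*pi*w/2**k:
word = "1011"
w, k = int(word, 2), len(word)
direct = np.array([np.cos(2 * np.pi * w / 2**k), np.sin(2 * np.pi * w / 2**k)])
print(np.allclose(prepare(word), direct))
```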
Now that we have a good one-way quantum function, how could we translate Lamport's signature into the quantum setting?
As before, we pick a word w and associate it with 0, and a word v and associate it with 1. These are the secret keys for 0 and 1. Then we prepare the public keys. These are the quantum states corresponding to the words: I showed how the word w can be encoded into one qubit, and the word v is encoded in the same way.
We send Bob the quantum state for 0 and the quantum state for 1. Reversing this process is impossible due to Holevo's result. Then, if we want to sign 1, we send Bob the pair: the word v and the bit 1. Bob, having the corresponding state and the word v, can check whether it is true that the quantum state ψ(v) was obtained from this word. For this we propose the REVERSE test.
Since the preparation transformation is unitary, that is, invertible: there is a word w, a quantum state has been prepared; we started from zero, applied unitary transformations, accumulating the needed rotation, and obtained the corresponding state. Now, having received the word w, Bob, knowing the algorithm, can organize the inverse transformation; it is easy to do. He applies this inverse transformation to the received quantum state. If the two words are the same, the result must be the state |0⟩.
So we know which state to compare with. The probability that the test declares the words equal is given by the following formula: the squared inner product of the vector |0⟩ and the state obtained after the inverse transformation.
If the two words are the same, this is always 1; here we cannot be wrong. But if the states are not equal, then, dealing with only one qubit, the probability can still be close to one. This happens if the two states ψ(v) and ψ(w) were close.
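A numerical sketch of the REVERSE test for the single-qubit encoding, assuming NumPy: Bob applies the inverse of the preparation rotation for his word and reads off the probability of outcome 0, which equals cos² of the angle difference.

```python
# The REVERSE test: apply R(theta_w)^-1 to the received state
# |psi(v)> and compute Pr[outcome 0] = |<0| R^-1 |psi(v)>|^2.
import numpy as np

def angle(word: str) -> float:
    return 2 * np.pi * int(word, 2) / 2**len(word)

def psi(word: str) -> np.ndarray:
    t = angle(word)
    return np.array([np.cos(t), np.sin(t)])

def reverse_test(received: np.ndarray, claimed_word: str) -> float:
    t = angle(claimed_word)
    inverse = np.array([[ np.cos(t), np.sin(t)],
                        [-np.sin(t), np.cos(t)]])   # R(t)^-1 = R(-t)
    return float(np.array([1.0, 0.0]) @ inverse @ received) ** 2

print(reverse_test(psi("1011"), "1011"))  # same word: accepted with probability 1
# Different but nearby words can still pass with high probability:
print(reverse_test(psi("1011"), "1010"))
```

The second line of output illustrates the problem stated above: with one qubit, close words are accepted almost as often as equal ones.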
We need to arrange a situation where two different words give states that are almost orthogonal to each other. That will be optimal for us.
The next step: instead of one qubit, we take a register of S qubits.
From one qubit we can move to such a system. We then work in a Hilbert space of dimension 2^S, and the corresponding ensemble is described just as in the case of one qubit: the aᵢ are the amplitudes, complex numbers; the norm of the vector is still 1; and the squared norms of these numbers are the probabilities of obtaining the corresponding basis state i if we measure.
That is, we now consider functions arranged as follows: words of length K are mapped into an ensemble of S qubits. That is how the notation will read.
I will not say how it was organized: this can be done by analogy with how it was done for one qubit.
The result is now as follows. What will we demand? We call a function ψ that maps words of length k into an ensemble of s qubits a δ-resistant (|Σ^k|, s)-quantum hash function if the condition holds that any two different words of length k generate almost orthogonal states: δ-orthogonal.
This condition provides the following. If we then apply the REVERSE test to check whether two states are the same, the probability of wrongly answering "yes" for different words turns out to be no more than δ². If the words are the same, we answer "yes" with probability 1. This is exactly what we need.
The theorem is proved quite simply. It turns out that if we want to build such δ-resistant (|Σ^k|, s)-quantum functions, then s cannot be less than log(k) − log(log(1 + √(2/(1 − δ)))) − 1.
This is a lower bound. Can we construct good δ-resistant quantum functions that come close to this theoretical lower bound? Surprisingly, it turns out that we can.
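For a feel of the numbers, the bound can be evaluated as written (here for binary words of length k; logs are base 2):

```python
# Evaluating the lower bound
#   s >= log2(k) - log2(log2(1 + sqrt(2/(1 - delta)))) - 1
# for a few word lengths.
import math

def s_lower_bound(k: int, delta: float) -> float:
    inner = math.log2(1 + math.sqrt(2 / (1 - delta)))
    return math.log2(k) - math.log2(inner) - 1

# The bound grows only logarithmically in the word length k:
for k in (2**10, 2**20, 2**30):
    print(k, round(s_lower_bound(k, delta=0.5), 2))
```

The logarithmic growth in k is the point: even very long words need only a handful of qubits, which is what the constructions below try to achieve.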
Looking ahead, I will announce a result which says the following: if we have a linear error-correcting code in which words of length k are encoded by words of length n, and we work in the corresponding field F_q, then for any pre-selected δ we can construct a quantum hash function whose resistance Δ is determined by the characteristics of the code, its Hamming distance d and its length n. Moreover, the number of qubits needed for this is no more than about log(n).
Jumping ahead again: we applied this result to the very beautiful Reed-Solomon code and looked at what comes out. The following characteristics were obtained. If the code length exceeds the length of the original message by a constant factor, then for a resistance threshold of about (k−1)/q + δ we can construct the corresponding quantum hash with about log(q·log(q)) qubits plus some additional overhead. And this is quite close to the lower bound presented here.
This surprised us. When I presented this at a seminar at the Institute of Physics and Technology, Alexander Semenovich Holevo, having heard the lower bound, said: plausible, yes, but how do you construct it? When we discussed with him the option that this can be done, for example, using the Reed-Solomon code, he agreed that something could be done in this direction.
This is the description of the report that is being presented here. And now I’ll talk a little more in detail about this ...
The first example was this: we have a word, and we hash it into one quantum bit. Now, having an ensemble of s qubits, we need to determine how to encode and hash everything there. What kind of mechanism should it be? To answer this question, we propose the notion of a quantum hash generator.
First examples. As always, it turns out something similar already existed in the world, spoken in a slightly different language. There was such a thing as quantum fingerprinting. It was done in 2001 by Harry Buhrman and co-authors for a special model of communication computation with a referee. And it turned out that this is a special, binary case of quantum hashing.
One of the recurring moves in coding theory is to pass from binary codes to a larger-than-binary alphabet. Reed-Solomon codes are fundamentally non-binary, and that is where the qualitative gain is. When we move beyond the binary case to a larger field, things really move forward.
I will show the first examples of what has been done. And then I'll show you how to go from error-correcting codes and invest in this design.
Take an ε-universal hash family. Using a certain family of functions, I will generate one quantum function. There is a set of classical linear functions over the field F_q: we take coefficients from it and obtain a set of functions H = {h₁, ..., h_T}. Now, using the functions h, we generate one-qubit quantum functions as follows. For simplicity, I will work not with phases but with real amplitudes. In a sense this is more intuitive, although the physicists asked us to rewrite it all in the language of the phase for their implementation, because the phase is easier to implement.
Now we prepare a set of such quantum functions: |ψ_{h_j}(w)⟩ = cos(2πh_j(w)/q)|0⟩ + sin(2πh_j(w)/q)|1⟩.
Then we collect these functions into one, as follows: |ψ_H(w)⟩ = (1/√T) Σ_{j=1}^{T} |j⟩|ψ_{h_j}(w)⟩.
Meaningfully, this notation says that we simultaneously, with equal amplitudes, start computing all T functions h, and we build the computation in exactly this way; quantum physics allows us to do so. Before us is a mathematical notation of what physics permits.
The register j should be treated as the binary notation of the number j. If so, then log(T) qubits are enough for it; their states correspond to the values of j. I will not describe in detail how this is done. The log(T) qubits steer the computation on the last qubit: each of the corresponding subspaces controls the rotation of the single last qubit, into which all the information is wrapped. The last qubit is rotated according to all T possibilities, and log(T) quantum bits provide for this.
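A sketch of the combined state as a plain vector, assuming NumPy; the field size q and the coefficients a_j below are toy values for illustration, not the specially chosen set from the construction.

```python
# Building the state (1/sqrt(T)) * sum_j |j>|psi_{h_j}(w)> for toy
# linear functions h_j(x) = a_j * x mod q over F_q. Basis ordering:
# component 2*j is |j>|0>, component 2*j + 1 is |j>|1>.
import numpy as np

q = 97                      # a prime field F_q (toy size)
a = [3, 10, 45, 78]         # T = 4 linear functions h_j(x) = a_j * x mod q
T = len(a)

def quantum_hash(w: int) -> np.ndarray:
    state = np.zeros(2 * T)
    for j, aj in enumerate(a):
        theta = 2 * np.pi * (aj * w % q) / q
        state[2 * j]     = np.cos(theta) / np.sqrt(T)   # |j>|0> amplitude
        state[2 * j + 1] = np.sin(theta) / np.sqrt(T)   # |j>|1> amplitude
    return state

h1, h2 = quantum_hash(17), quantum_hash(18)
print(np.linalg.norm(h1))   # 1.0: a valid quantum state
print(abs(h1 @ h2))         # overlap of the hashes of two different words
```

With a well-chosen set of coefficients, the theorem below guarantees that this overlap stays small for every pair of distinct words, not just for one pair as here.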
Let me pass to the concrete example proposed by Buhrman, Cleve, Watrous, and de Wolf. They took an arbitrary binary error-correcting code with Hamming distance d, in which words of length k are encoded by words of length n, and considered exactly what I showed on the previous slide.
A system of log(n) qubits runs the process. Each register is such that it rotates the last qubit according to the value of E_i(w). Since the code is binary, the rotation is only by 90 degrees: the last qubit either stays at zero or rotates by 90 degrees. And this ensemble is wrapped into a system of log(n) + 1 qubits, which controls the behavior of the last one.
But they did not speak this language; they had a different task, and their construction has been rewritten in this form. It turned out that this function is good: it is δ-resistant for words of length k with the corresponding number s of qubits. Everything works out here.
The next step we took together with Sasha Vasiliev; in 2009 he defended his thesis using this construction. Back then we also did not know that this was hashing; we built quantum fingerprints over an arbitrary finite field. The construction was invented, and, as always, it turned out that essentially the same construction, in a completely different language, had been used by Avi Wigderson and Razborov in 1993 for other tasks. Interpreting their result in our language: we can choose a rather small set of such linear functions, of size T = ⌈(2/δ²) ln(2q)⌉, on the order of log(q).
Thus a set of linear functions, not very large, makes it possible to form a very good quantum hash function. It is δ-resistant, it encodes elements of the field F_q, interpreted as words of length log(q), and s qubits suffice for it.
Summarizing these constructions, we arrive at the notion of a quantum hash generator. We say that a family of functions G is a quantum hash generator if for each of its functions we can build a corresponding quantum function mapping into an ensemble of l qubits, and, putting them together as above, we obtain a quantum hash function that is δ-resistant and encodes words of length k into log(T) + l qubits.
The point of the definition is that the construction of Buhrman and his co-authors and our 2008 construction both fit into this concept.
If we have a quantum hash generator, the resulting functions can be built in different ways. Most likely, a hash generator defines a quantum hash function non-uniquely. One can pursue these topics further.
Now, the central theorem, which ties everything together. First condition: take an ε-universal hash family with the given characteristics, N functions, encoding words of length k into the field F_q. Second condition: as H, take some ready-made quantum hash generator; such generators exist. In particular, it is most convenient for us to take the quantum hash generator that Alexander Vasiliev and I came up with.
Composing these two families, we first apply the functions of the family F and then wrap them in functions from the set H. Then the new family is a quantum hash generator with the following characteristics. It is Δ-resistant, it encodes words of length k into an ensemble of s qubits, and Δ ≤ ε + δ, where ε is the characteristic of the ε-universal hash family and δ is the resistance characteristic of the ready-made quantum hash generator we took. The number of qubits depends on the number of functions in the ε-universal family and on the number of functions in the ready-made generator.
And here only the logarithms of the numbers of the corresponding functions are needed. In a sense, this ensures that the quantum hash functions are exponentially more compact than the classical objects they are built from.
A whole group of new examples appeared, using constructions already existing in computer science. Freivalds, a mathematician from Riga, in 1979 proposed a technique called classical fingerprinting, based on the idea that to compare two different words we compare them modulo different numbers. The Chinese remainder theorem is the key here. I will not go into detail, but if two words are equal, then they are equal modulo every small modulus p_i. And if two words are not equal, then for most p_i the corresponding residues are also not equal.
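A sketch of this fingerprinting idea in Python: compare two large numbers via their residues modulo a few randomly chosen small primes (the prime range and trial count here are illustrative choices).

```python
# Freivalds-style classical fingerprinting: equal words agree modulo
# every prime; different words can agree only modulo the few primes
# dividing their difference, so random primes catch inequality fast.
import random

PRIMES = [p for p in range(2, 200)
          if all(p % d for d in range(2, int(p**0.5) + 1))]

def fingerprint(w: int, p: int) -> int:
    return w % p

def probably_equal(w1: int, w2: int, trials: int = 5) -> bool:
    return all(fingerprint(w1, p) == fingerprint(w2, p)
               for p in random.sample(PRIMES, trials))

print(probably_equal(123456789, 123456789))  # equal words always agree
print(probably_equal(123456789, 123456790))  # caught: no prime divides 1
```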
The first paper was published in the collection "Problems of Cybernetics" at the turn of the 70s-80s and, as I recall, was presented quite well at Oleg Borisovich Lupanov's seminar at Moscow State University.
It turns out that this family is, of course, a universal hash family with the corresponding characteristics: a 1/c-universal family, where c is determined by the way the functions are chosen. And since it is an ε-universal hash family, combining it with the family that Alexander Vasiliev and I came up with, which in turn rests on Wigderson's results, we find that Freivalds' family of classical fingerprints generates a quantum hash function with the following characteristics: it is Δ-resistant, words of length k are mapped into s qubits, and Δ combines the small δ, the characteristic of our family, with the 1/c characteristic of Freivalds' family; this many qubits suffice, and the lower bound is as shown. The lower and upper bounds here are fairly close.
Classical hash families based on Freivalds' construction are not used in practice, because there are many primes and you would need to store them all. Other constructions are used instead; for example, there is the very famous universal linear hash family.
Every textbook on algorithms has this example. Carter and Wegman, the creators of universal hash families, showed around 1979-1980 that the collection of such linear functions is a universal classical hash family. The proof is simple and beautiful.
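The Carter-Wegman linear family can be sketched and checked empirically; the modulus p and range size m below are toy values chosen for illustration.

```python
# The linear family h_{a,b}(x) = ((a*x + b) mod p) mod m, with an
# empirical estimate of its collision probability on two fixed,
# distinct inputs; for a (near-)universal family it is close to 1/m.
import random

p, m = 101, 10            # prime modulus and target range size

def h(a: int, b: int, x: int) -> int:
    return ((a * x + b) % p) % m

x1, x2 = 7, 42            # two fixed, distinct inputs
trials = 20000
collisions = 0
for _ in range(trials):
    a = random.randrange(1, p)    # a random member of the family
    b = random.randrange(p)
    if h(a, b, x1) == h(a, b, x2):
        collisions += 1

print(collisions / trials)  # empirical collision rate, near 1/m
```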
Naturally, we combine it with our family, the one Sasha Vasiliev and I came up with. Here it is. The result is the following construction: a quantum hash generator that generates a quantum hash function with these characteristics. One term is the characteristic of the linear universal family, and δ is the characteristic of our family; the upper bound and the required number of qubits are as shown.
And it turned out that this family is exponentially more expensive than the known lower bound allows: the lower bound is log(k), while here the upper bound is on the order of k. Such well-known, widely used things, it turns out, are better not carried over into the quantum world.
Here is the result I announced at the beginning. If there is a linear error-correcting (n, k, d)-code that encodes words of length k into words of length n over the field F_q, then there is a quantum hash generator. So we can build the corresponding quantum hash function, whose resistance is Δ = (1 − d/n) + δ, where (1 − d/n) is the characteristic of the error-correcting code and δ is the characteristic of the family we use. The number of qubits s, in turn, satisfies s ≤ log(n) + log(log(q)) + 2·log(1/δ) + 4.
The proof is based on a result from 1994 which says that if we have an error-correcting code with the corresponding characteristics, then we can also build a universal hash family from it. And vice versa: if we have a universal hash family with suitable characteristics, then we can build a corresponding error-correcting code.
Universal hash families were defined and appeared at the turn of the 70s-80s. Error-correcting codes are, in turn, classics: all the named codes appeared at the turn of the 1950s-60s. Surprisingly, this very simple relationship was discovered only in 1994. Those who know it, know it; for the others, it is worth reading about. It is interesting.
Using this theorem, we take the Reed-Solomon code - it is very beautiful, it is described in all textbooks and courses - and we build a hash generator, the corresponding hash function.
This is what I announced in the first part of the story. This design was optimal.
So we can now extend all this to other constructions, and there are a lot of them. In particular, people ask me: why not Reed-Muller? Of course, that is also possible. Bose-Chaudhuri codes, of course, also fit in, beautifully. But the Reed-Solomon code is simply a classic; you could even tell it to schoolchildren.
This can be described in more detail; you can read about it, there is no point in going through everything here.
This is what we are working on and where it is all heading. I specifically wrote such a cheat sheet. I will leave this text here, of course, and if there are questions, I will address them.
What real physical problems appear here? Physicists are now working on so-called quantum memory. They understand it as follows: the carrier of a qubit is a photon. It flies into a substance, and the substance absorbs it. After a while, by acting on this substance, you can make it emit a photon in exactly the same state. The state of the photon is the state of the qubit.
If such elements are lined up, you get a signal amplifier, a repeater. Now the talk is about building quantum repeaters. Photons travel about 100 km through optical fiber; then they get absorbed and the signal starts to vanish. If a quantum repeater that regenerates the signal is installed every 100 km, the problem of transmitting information will be solved. So the first task is to build a repeater.
Quantum computers on which Shor's algorithm could be implemented are still far away. The first task is to build a quantum repeater. What can be done with one? The photon is absorbed by the substance; before re-emission, one can act on it, transform the state, and extract it in a new state. The current struggle is this: if an ensemble of about a dozen qubits can be stored and extracted, that amount will already be adequate for constructing a hash function.
See, if s is of the order of 10, for which fields can this be done? Or if s is about 100? Technology now seems to be approaching the point where a dozen, or a hundred, qubits will be workable. If s turns out to be of the order of 100, using such a purely theoretical result as the Reed-Solomon code, then what functions can be resistantly hashed, and over which fields?
I can immediately say with whom we are cooperating. In Moscow, this is Moscow State University and the International Laser Center of Moscow State University. At the VMK, Academician Valiev organized the Department of Quantum Informatics, after his death a department was formed ... I don’t remember what it is called. Voevodin is engaged in supercomputing, supercomputers. Inside, a community that works in the field of quantum informatics is preserved.
At the Steklov Institute there is, of course, Alexander Semenovich Holevo. And the Institute of Physics and Technology of the Russian Academy of Sciences is becoming central: it now oversees these issues well, and we interact with it actively. But it turns out that all the people I have listed mostly came from physics. We can offer algorithms, but good physics is needed before the algorithms. Right now it is a matter of technology.
There is also a corresponding laboratory, a department at the Academy of Cryptography of the Russian Federation, and good groups in Chernogolovka, Novosibirsk, and ITMO. There is a series of conferences in the world called “Quantum Cryptography.” This year it was held in Paris, next year it will be in Tokyo, and so on; it has also taken place in Zurich and Singapore. The communities there are groups that deal with quantum key distribution. The best-known quantum protocol is BB-84, by Bennett and Brassard. They invented it in 1984, and now a whole conference has grown around it: more than a hundred people gather, and physicists compete in implementing these things.
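Since BB-84 keeps coming up, here is a toy simulation of its key-sifting step, assuming an ideal noiseless channel and no eavesdropper (the function name and interface are invented for this sketch):

```python
import random

def bb84_sift(n, seed=0):
    """Toy BB-84 run without an eavesdropper.

    Alice encodes random bits in random bases (0 = rectilinear, 1 = diagonal);
    Bob measures each photon in his own random basis. When the bases match,
    Bob's bit equals Alice's; otherwise his outcome is random and the position
    is discarded during sifting."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]
    bob_bits = [bit if ab == bb else rng.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Sifting: publicly compare bases, keep only the positions where they agree.
    key_a = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

key_a, key_b = bb84_sift(64)
print(len(key_a), "shared key bits out of 64 photons")
```

On average half the photons survive sifting; in the real protocol, Eve's measurements would disturb the states and show up as a detectable error rate in a sacrificed sample of the sifted key, which is exactly the property the physical implementations compete on.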
There are not many new algorithms here anymore; what is happening is a physical polishing of all this. And now everyone is waiting for the moment when what is called quantum memory is finally ready, when this can be reproduced experimentally in many places, so that protocol-building can begin.
In Kazan, there is a group that deals specifically with quantum memory. Right now they are most interested in working with phases; in that language they can set up fairly good experiments.
The laboratories are almost ready, but the dollar became twice as expensive, and the good equipment the group wanted to order before is now out of reach. But that is a separate question.
Perhaps that's all. Thanks.
- How do you assess the likelihood of encrypted information being intercepted, that is, of someone constructing an algorithm for recovering the key?
- I interact with Sergey Kulik from the International Laser Center. I am not a specialist in physics, but as for the classic BB-84 algorithm, the guys said it falls apart completely in practice. In physics there are whole series of works on the basis of which they have redone almost everything anew. BB-84 is good for explaining the principles to students. That is, as always, there is theoretical, educational cryptography, and when it comes to practical matters, everything is somewhat different.
This year I was lucky enough to attend a conference on quantum cryptography. There was a day with Peter Shor seated on stage, and Bennett and Brassard drinking coffee there and telling how wonderful everything is. They say: it leaks, it breaks, everyone does things differently. The Chinese presented; they say they already have a network between Beijing and Hong Kong. The Italians say they are working on a channel with a satellite. It is different, not BB-84, but it is still precisely quantum key generation. Other schemes are already working; everything is done differently now.
- When you said who is working on this, as far as I understood, you mostly mentioned physicists. Do mathematicians get involved? I understand that the physics is fairly complicated, but the mathematics here is quite complicated too.
- Yes, I prepared an answer to this for myself as well. There are interesting groups, though we do not have them in Russia yet. Richard Jozsa came from theoretical physics but became a specialist in computer science. He is probably still a bit more of a physicist, but he understands perfectly what is happening in computer science, not only in the quantum field but also in complexity theory in general. He heads a center in Cambridge. There are such people.
In Amsterdam, Harry Buhrman heads the corresponding center for quantum informatics, and he is not a physicist but rather a mathematician.
Scott Aaronson is at MIT. He graduated from Berkeley; a young man, one of the most recognized people in this field.
Umesh Vazirani is at Berkeley; Scott Aaronson is his student. Andris Ambainis in Riga is also a Berkeley graduate. That is roughly how it looks among mathematicians. There are, of course, a lot of algorithmic results in this area. After Peter Shor came up with his algorithm, the next one, who came up with fast search in a quantum database, was Grover. This is a serious but thoroughly theoretical result. Imagine: you need to load a large database into a superposition and keep it in a quantum register. Then a search in this unordered database can be done in roughly the square root of the time of exhaustive search. Mathematically, it is beautiful. A lot has since been built around it, other problems have appeared, and the Latvian mathematicians work a lot in this area.
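The square-root speedup of Grover's algorithm can be seen in a small statevector simulation (the function below is an illustrative NumPy sketch, not anyone's reference implementation):

```python
import numpy as np

def grover(n_qubits, marked, iterations=None):
    """Statevector simulation of Grover search over N = 2**n_qubits items."""
    N = 2 ** n_qubits
    if iterations is None:
        # The optimal iteration count is about (pi/4) * sqrt(N) --
        # this is where the square-root speedup comes from.
        iterations = int(round(np.pi / 4 * np.sqrt(N)))
    state = np.full(N, 1.0 / np.sqrt(N))     # uniform superposition over all items
    for _ in range(iterations):
        state[marked] *= -1.0                # oracle: flip the phase of the marked item
        state = 2.0 * state.mean() - state   # diffusion: inversion about the mean
    return state

state = grover(10, marked=123)               # N = 1024 items, about 25 iterations
probs = state ** 2
print(int(probs.argmax()), float(probs[123]))
```

After roughly 25 iterations instead of an expected ~512 classical probes, almost all of the probability mass sits on the marked item, so measuring the register returns it with near certainty.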
As for mathematics, I would say that almost all major universities have such centers. But, unfortunately, not in Russia. In Russia there is Moscow State University and the department of quantum informatics, now called a little differently, at the CMC faculty. Our own group works on algorithms and benchmarking. I can say right away that Russia is still far behind.
What has been done in this regard in Singapore, a famous example? They gathered Europeans: someone from Cambridge leads the physicists, someone from mathematics holds a principal-investigator position coming from France, and Hartmut Klauck from Frankfurt works there. This community communicates well and holds conferences. Zurich has a good group in quantum informatics, where mathematicians work with physicists and they visit each other. Russia is still somewhat aloof. We need to start interacting with this community more closely.
- I would like to add something about Scott Aaronson. He came to Russia, gave a talk at the Computer Science in Russia conference, and received a best-paper award. In addition, he has a very interesting blog, scottaaronson.com/blog, where a lot is written about quantum computing in a popular exposition; he is not only a scientist but also a great popularizer of science. He also has an interesting book, so far available only in English, called “Quantum Computing since Democritus.” I read it and enjoyed it a lot.
- Yes, and the other day I received a newsletter saying that they are discussing the emergence of a brain module, or some other entity, that supposedly solves an NP-complete problem quickly. There is a dispute, and Aaronson is involved; have a look, it is curious. By the way, it was he who created the Complexity Zoo, the catalog of all complexity classes. He started it as a graduate student at Berkeley, and later it passed into other hands. Good work was done.
Thanks.