“A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer [no doubt he meant quantum], cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects.” — Robert A. Heinlein, Time Enough for Love: The Lives of Lazarus Long (1973)
Ah, little lad, you’re staring at my fingers. Would you like me to tell you the little story of right-hand/left-hand? The story of bit and qbit?
B-I-T! It was with this left hand that old brother Claude struck the blow that laid all these PCs and iPhones around. Q-B-I-T! You see these fingers, dear hearts? These fingers has veins that run straight to the tabloids. The right hand, friends, the hand of supremacy. Now watch, and I’ll show you what happening now. Those fingers, dear hearts, is a-warring and a-tugging, one agin t’other. Now watch ’em! Old brother left hand, left hand he’s a fighting, and it looks like bit is a goner. But wait a minute! Hot dog, bit is a winning! Yessirree! It’s bit that’s won, and right hand qbit is down for the count!
- Essentially classical algorithms: numerical methods that integrate the Schrödinger equation forward in time as a particular PDE (tailored to account for the unitary character of the evolution, though sometimes not even that): split-step methods, Krylov subspaces, Magnus schemes, etc. Old-fashioned but reliable tools, the way an old handy hammer is reliable when you need to strike a nail.
- Tensor-based classical algorithms. Initiated by the rise of many-body quantum physics in the late 1990s and rooted in low-rank tensor approximation ideas, they marked a substantial advance in the fight with the Curse of Dimensionality. They turned out to be very successful in simulating many-body systems with chain-like topology and short-range entanglement, and were a decisive factor in the development of many-body localization and transport in quantum disordered systems. But their time is over, because the most interesting quantum life goes on outside the bubble of weakly and short-range entangled states.
- Digital quantum simulators. Qubitization and Trotterization are not new ideas (Lloyd, 1996), but they were catalyzed by the QC hype; these two, plus gates and measurements, brought several digital Q-simulation algorithms. Interesting and inspiring, though a bit formal and ‘fundamental’ (at the moment). Research activity and expectations around these new algorithms have been going up steadily during the last 6-7 years, heated by the fast progress in digital QC technology and strong competition between the main players on the market (Rigetti, IBM, Google, and Microsoft). But if we see no new quantitative results and interesting model-specific simulations in the next few years, the algorithms will slide down from the Peak of Inflated Expectations (error correction and mitigation techniques are a different story). A very natural field to play with QC emulators.
- Machine Learning & ANN algorithms. Train your ANN well and it will simulate your quantum model better than any algorithm from the first two generations! A good combination of hard-core quantum physics and AI/ML, provoking and partially controversial (see the previous post); these algorithms will certainly climb up the curve during the next five-six years.
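To make the first generation concrete, here is a minimal split-step Fourier propagator for a 1D wave packet in a harmonic trap; the grid size, trap, and time step are illustrative choices of mine, not from any specific code. The symmetric splitting respects the unitary character of the evolution, so the norm stays at 1 to machine precision:

```python
import numpy as np

# Split-step Fourier propagation of a 1D Schroedinger wave packet
# in a harmonic trap (hbar = m = 1). All parameters are illustrative.
N, L = 256, 20.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)   # momentum grid
dt = 0.01
V = 0.5 * x**2                            # harmonic potential

psi = np.exp(-(x - 2.0)**2)               # displaced Gaussian packet
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

expV = np.exp(-0.5j * V * dt)             # half-step in the potential
expT = np.exp(-0.5j * k**2 * dt)          # full kinetic step in Fourier space

for _ in range(1000):                     # symmetric (Strang) splitting
    psi = expV * psi
    psi = np.fft.ifft(expT * np.fft.fft(psi))
    psi = expV * psi

norm = np.sum(np.abs(psi)**2) * dx        # unitarity check
print(round(norm, 6))                     # → 1.0
```

Each factor in the splitting is a pure phase, which is why the scheme is unconditionally norm-preserving, the property the bullet above refers to.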
The concept of High-Concept crystallized in Hollywood in the 1980s: a movie whose idea is delivered by its title and whose plot can be explained in two sentences and understood by an eighth-grader. It is highly visual and appeals to all types of viewers, from retired army colonels to nurses and young university lecturers. You can enter this movie at any point of its narrative, enjoy it, and have a good feeling at the end. None of this means, however, that high-concept movies are stupid and made for dummies: they are original, made in a thoughtful way, and perfectly orchestrated. These movies are planned to be box-office hits (“We have no obligation to make history. We have no obligation to make art. We have no obligation to make a statement. Our obligation is to make money”, Don Simpson). Examples: “Top Gun” and “Jurassic Park”.
Low-concept movies are on the other side of the spectrum: complex plots (or sometimes the complete opposite: plots so trivial that you cannot even pitch them, like “two guys are sitting in a restaurant and talking”) and character-driven narratives. These movies target a specific audience; they are more demanding and often full of allusions to literature, art, and other movies. Prospects for commercial success are slim. Examples: “My Nights Are More Beautiful Than Your Days” and “Faces”.
The High/Low concept has been extended to literature. It can also be extended to science: a high-concept paper is a scientific paper presenting exciting results which are explained by the paper’s title and whose idea can be grasped by a layperson. Such a paper gets good media coverage, and journalists are happy to write about it. Examples: “Washing Away Postdecisional Dissonance” (2010) and “Speech synthesis from neural decoding of spoken sentences” (2019). Low-concept papers: traditional scientific papers which can be comprehended by experts only. Examples: open a random issue of the “Journal of Topology” and choose a random paper.
“Quantum supremacy using a programmable superconducting processor” (October 2019) is a trickster, a low-concept wolf in a high-concept skin. The title delivers the message: the supremacy of quantum computers over classical ones has been demonstrated. Media coverage: 375 news stories, 34 blog posts, 6,026 tweets, and 9 Wikipedia mentions. It was in newspapers and magazines, it was discussed in podcasts. By now, many laypeople have heard of it.
But what is the content? How precisely was QS demonstrated? What is the idea of the reported proof-of-concept experiment? Even physicists and IT experts are hardly able to get answers to these questions by reading the paper, because it is too cryptic, too demanding, too technical. It is a Klein bottle: a promise of the joy of learning something deep and exciting which turns out to be a depressive demonstration of the growing gap between society and technology and the shallowness of media hype.
When discussing the recent RMP review “Machine learning and the physical sciences”, some colleagues voiced a critical opinion which can be summarized in the following statement: “This does not provide us with understanding”. To unpack: there is no way to categorize and create a “theory”; ML devices are not scientists, and they are not able to do what up to now has been considered a part of theory-building activity.
It is indeed a legitimate opinion, but it pulls everything to the bottom of the philosophical depth (and rings many bells, à la “Can machines think?” -> “Can machines categorize and build theories?”). It is better to bounce off the bottom, return to the surface, and stay afloat there, pragmatic and doer-oriented. The situation (in a broader context) is described by the Susskinds: “…With this has come a third technology-related bias that we call the ‘AI fallacy’. This is the mistaken supposition that the only way to develop systems that perform tasks at the level of experts or higher is to replicate the thinking processes of human specialists. This anthropocentric view of ‘intelligent’ systems is limiting. It emboldens both professionals and commentators, for example, to leap from the observation that computers cannot ‘think’ to the unwarranted conclusion that systems cannot undertake tasks at a higher standard than human beings. As we show in this book, however, systems of today are increasingly out-performing human experts, not by copying high-performing people but by exploiting the distinctive capabilities of new technologies, such as massive data-storage capacity and brute-force processing”.
With neural networks we are able to grasp many-body quantum states which are more complex than matrix-product states (using equal computational resources). But unlike the case of MPS and PEPS states, we do not have theories, in the canonical sense, for these NN states (otherwise we would not need the ANNs). Here, however, ignorance is bliss: we can profit from NNs by reaching many more states than before, for the same price in terms of computational or technological cost. We only need to make an effort and accept the fact that a neural network which encodes some many-body quantum state is a theory of this state.
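As a deliberately tiny illustration of what “a network encoding a state” means, here is a restricted Boltzmann machine ansatz for a handful of spins. The weights below are random where in practice they would be trained (e.g. by variational Monte Carlo); the sizes and seed are illustrative assumptions, not from any particular work:

```python
import numpy as np

# RBM quantum state: psi(s) = exp(a.s) * prod_j 2*cosh(b_j + (W s)_j)
# for N spin-1/2 variables s_i in {-1, +1}. Untrained random weights.
rng = np.random.default_rng(0)
N, M = 4, 8                             # visible spins, hidden units
a = 0.01 * rng.standard_normal(N)
b = 0.01 * rng.standard_normal(M)
W = 0.01 * rng.standard_normal((M, N))

def amplitude(s):
    """Unnormalized wavefunction amplitude for a spin configuration s."""
    theta = b + W @ s
    return np.exp(a @ s) * np.prod(2 * np.cosh(theta))

# Enumerate all 2^N configurations and normalize (feasible only for tiny N;
# the whole point of the ansatz is that sampling scales to large N).
configs = np.array([[1 - 2 * int(bit) for bit in format(i, f"0{N}b")]
                    for i in range(2**N)])
amps = np.array([amplitude(s) for s in configs])
probs = amps**2 / np.sum(amps**2)
print(np.isclose(probs.sum(), 1.0))     # → True
```

The O(N*M) parameters of this network play the role that the bond-dimension matrices play for an MPS: they are the compressed description, i.e. the “theory”, of the state.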
A tardigrade on a silicon nitride membrane which is in a superposition of two eigenstates (the ground state plus the first excited state). This is a sketch; the size of a real tardigrade has to be much smaller than the size of the membrane.
Gröblacher is also interested in experiments involving living creatures. He is currently working on putting a sheet of nitride into a superposition of states. By using a laser, it is theoretically possible to get a barely visible membrane of silicon nitride measuring around one millimetre across into a superposition of vibrations … Gröblacher reckons they are within a couple of years of achieving this superposition of vibrations.
“A superposition state of these membranes would allow us to demonstrate that objects that are visible to the naked eye still behave quantum, and we can really test decoherence – the transition between classical and quantum mechanics,” he says.
He then hopes to extend the experiment by placing tiny living organisms called tardigrades onto the membrane of silicon nitride, putting them into superposition too.
This is an idea with high outreach potential. A first quantum vivisection, a Schrödinger tardigrade… These micro-animals (their official title) are famous for being tough: they can enter a ‘hibernation state’ of near-complete dehydration, with the metabolic rate decreased by a factor of 1,000, and, in this state, survive exposure to outer space (an almost perfect vacuum), high-intensity radiation of all kinds (including gamma rays), and pressures up to 1,200 atm. Therefore, they should withstand the cryogenic environment required to achieve ground-state cooling of the membrane. In short, if there is an animal able to survive superposition, it must be a tardigrade.
What would be interesting is survival statistics similar to those obtained in an experiment in 2007: “…dehydrated tardigrades were taken into low Earth orbit on the FOTON-M3 mission carrying the BIOPAN astrobiology payload. For 10 days, groups of tardigrades, some of them previously dehydrated, some of them not, were exposed to the hard vacuum of outer space, or vacuum and solar UV radiation. Back on Earth, over 68% of the subjects protected from solar UV radiation were reanimated within 30 minutes following rehydration, although subsequent mortality was high; many of these produced viable embryos. In contrast, hydrated samples exposed to the combined effect of vacuum and full solar UV radiation had significantly reduced survival, with only three subjects of Milnesium tardigradum surviving.” Would there be any difference between just the ground state and a superposition of the ground state and the first excited state of the membrane? Or would it be negligible against the background of the mere exposure to the cryogenic environment?
The problem is that the typical size of the silicon nitride membranes Gröblacher (and other teams) are currently dealing with is <= 0.5 mm. This is also the typical size of tardigrades. I do not know the masses but expect them to be comparable. It is not possible to maintain the extra-high quality factor of such membranes after placing a tardigrade on them, unless the membranes are purpose-designed and curved, with ‘nests’ for tardys (à la the seats of Space Jockeys).
This is the first textbook on QC I have come across that covers quantum annealing, linear-system solvers, sampling of random bit strings from a distribution, and, consequently, Quantum Supremacy. I can only welcome this attempt; it is really a big step forward.
As a reference book for an IT-oriented MSc course, I would use this one. It is the closest to what I had in mind as an optimal teaching strategy. Perhaps the first chapter (electrons and other ‘physics’) and the chapter on Bell inequalities (what is the need for them in the QC context?) should be dropped. Also the last chapter, with its rather weak and amorphous discussion of the future of QC.
Would you like me to tell you the little story of Right Hand-Left Hand – the story of bit and qbit?
If we continue with a metaphor of Quantum Computing as the art
“to choreograph things [i.e., gates] such that for each wrong answer, some of the paths leading there have positive amplitudes and others have negative amplitudes, so they cancel each other out, while the paths leading to the right answer reinforce”
a bit further, we would arrive at the idea of melody/music produced by a quantum circuit.
I am pretty sure that many people have already arrived at this idea, and Google could confirm my feeling (I am too lazy to check). I also believe that in most cases these people ended up with the idea of harmonics of different frequencies corresponding to different computational basis states (specified by sequences of bits). It is very intuitive but also boring.
We can think of something more advanced, more digital, e.g., of granular synthesis. Each basis state is represented by a grain of a particular shape. The shape can reproduce the corresponding binary sequence (as in the sketch), but this is not necessary; it might be better to shape the grains following some other criteria (aesthetic?). The grain is then repeated periodically, so we have a specific beat: unce-unce-unce (agree that this is already better than the monotonous sound of a single harmonic). Another basis state is encoded with another grain and another beat with a different period: tyyyunz—tyyyunz—tyyyunz. The amplitudes of the basis vectors/states should be changed after every gate (we can deal with real amplitudes only… need to think about this more), but in a continuous manner. This again can be done with the granular synthesis toolbox.
The wish is that all this results in a techno-like piece for a randomly generated Q-circuit. If, in particular, the circuit is not random and the result of the computation is a single basis state, then the final coda is a single beat. Potentially, this could be realized with Ableton Live, The Mangler, and one well-motivated PhD student with a background in digital sound processing.
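A toy numerical sketch of the mapping, far from Ableton but enough to test the idea; the grain shapes, per-state beat periods, and sample rate are all my illustrative assumptions:

```python
import numpy as np

# Granular-synthesis toy: each computational basis state gets a short grain
# whose waveform encodes its bit string, looped with a state-specific period;
# the grain's loudness follows the state's probability |amplitude|^2.
SR = 8000                        # sample rate, Hz (illustrative)
DUR = 1.0                        # seconds of audio per rendered step

def grain(bits, length=80):
    """The bit string as a +-1 waveform, Hann-windowed into a grain."""
    wave = np.repeat([1.0 if b == "1" else -1.0 for b in bits],
                     length // len(bits))
    wave = np.resize(wave, length)
    return wave * np.hanning(length)

def render(amplitudes):
    """Mix one looped grain per basis state of an n-qubit state vector."""
    n = int(np.log2(len(amplitudes)))
    out = np.zeros(int(SR * DUR))
    for idx, amp in enumerate(amplitudes):
        p = np.abs(amp)**2
        if p < 1e-6:
            continue                       # silent basis states
        g = grain(format(idx, f"0{n}b"))
        period = 500 + 250 * idx           # distinct beat period per state
        for start in range(0, len(out) - len(g), period):
            out[start:start + len(g)] += p * g
    return out

# Equal superposition of |00> and |11>, as after a Bell-state circuit:
# two interleaved beats of equal loudness.
audio = render(np.array([1, 0, 0, 1]) / np.sqrt(2))
print(audio.shape[0])   # → 8000
```

If the final state collapses to a single basis state, only one beat survives in the mix, which is exactly the single-beat coda described above.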
We are not aiming at the advancement of Quantum Computing (QC) as a scientific Math/Phys field or discipline. Rather, we want to explore possibilities of using it as a means to solve challenging classical problems: to handle large-scale optimization tasks, to sort Big Data, and to do quantum modeling (in Feynman’s sense, the next stage in the evolution of computational quantum physics). This direction can be called applied QC. In short, it is about the advancement of classical IT fields and computational quantum physics by implementing QC.
There is an analogy in computational physics which, as a research field, has two ‘lawns’: on one, researchers develop new computational methods and algorithms, while on the other, their colleagues implement these developments to model and investigate physical phenomena. E.g., one community develops symplectic methods to propagate quantum systems (on classical computers!), while another community uses them to study the ionization of atoms by strong EM pulses.
Note added: Another example is complex analysis. It is a branch of mathematics, but it provides the best tool to analyze AC circuits in electrical engineering.
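The AC-circuit example in one glance: phasor analysis of a series RC circuit, where complex impedances turn a differential equation into arithmetic. Component values are illustrative:

```python
import numpy as np

# Phasor analysis of a series RC circuit driven by a sinusoidal source.
# Complex impedance reduces the steady-state ODE to Ohm's law Z = V/I.
R = 1000.0            # ohms
C = 1e-6              # farads
f = 159.15            # Hz, chosen so that omega*R*C ~ 1
omega = 2 * np.pi * f

Z = R + 1 / (1j * omega * C)     # resistor + capacitor in series
V = 5.0                          # 5 V source phasor, zero phase
I = V / Z                        # current phasor

print(round(abs(I) * 1000, 3), "mA")             # → 3.536 mA
print(round(np.degrees(np.angle(I)), 1), "deg")  # → 45.0 deg (current leads)
```

The magnitude and phase of the complex number I are exactly the amplitude and phase shift of the physical current, which is the whole trick.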
Finally, the Norwegian IT industry and commercial R&D enterprises are the least interested in the advancement of QC as a scientific field. What they expect from QC are new computational tools to solve their practical problems.